As balloonists we have a number of general-purpose tools to track a flight, of which the main ones are the live map and live imagery page. Sometimes though it’s nice to have a custom dashboard showing these items, the telemetry and anything else of interest.

A small number of flights have had these dashboards, including some of mine. Mostly I used the now-defunct thdash.com site to design and host them, and for my Telnet flight I built one in Delphi using the rather old IntraWeb framework. Now though, for Delphi programmers, there’s a new kid on the block in the form of TMS Web Core. Whereas IntraWeb built a server executable or module, TMS Web Core creates a web app consisting of HTML and JavaScript. For those like me with extensive Delphi experience and pretty much zero JavaScript experience, it makes the creation of web apps a fairly familiar process. I use TMS software extensively in my day job, so I know it’s all high-quality stuff.

Mission Control

One of my next planned flights will be of an Apollo 11 CSM/LM combination, to commemorate the first manned moon landing 50 years ago next month. With the payload ready I started to consider what else I could do for the flight, and soon thought of creating a virtual replica mission control console. I started with the image that I created for that Telnet flight, and made some modifications to produce this background image:


The next job was to place a few display controls on it, and to write code that automatically resizes the image, and resizes and repositions the controls, to fit the browser window size.
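The resizing logic boils down to scaling everything by the same factor. A minimal sketch in JavaScript (the control coordinates and design size here are illustrative, not the real console layout):

```javascript
// Sketch of proportional resizing: compute one scale factor that fits
// the background image into the browser window, then apply the same
// factor to each control's designed position and size.
function scaleLayout(winW, winH, designW, designH, controls) {
  // Use the smaller ratio so the whole background stays visible.
  const scale = Math.min(winW / designW, winH / designH);
  return controls.map(c => ({
    left:   Math.round(c.left   * scale),
    top:    Math.round(c.top    * scale),
    width:  Math.round(c.width  * scale),
    height: Math.round(c.height * scale)
  }));
}
```

In TMS Web Core the same arithmetic would live in the form's OnResize handler, written in Delphi rather than JavaScript.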

Live Data

For the telemetry, I wanted to have live data displayed promptly after reception from the payload. For this and other reasons I recently built a web server application that accepts telemetry and other HAB-related data, and makes it available to web clients. Here’s the structure showing the data flow:

The LoRa gateway uses a UDP port to broadcast any telemetry that it receives, and this is then received by a separate Web Feeder application (run locally) that bounces the message up to the web server (running in an AWS instance). The Web App connects to a web socket on the server, tells the server what payload ID(s) it is interested in, and thereafter receives any new telemetry for those payloads. The latency here is measured in milliseconds, so it is a lot quicker than, for example, the standard map, which polls the habhub server every 10 seconds.
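On the browser side, the subscription step can be sketched like this. To be clear, the message format, field names and server URL below are my own illustrative guesses, not the actual protocol:

```javascript
// Sketch of the dashboard's web-socket subscription. The JSON shape
// ({action, payloads}) and the URL are hypothetical placeholders.
function makeSubscribeMessage(payloadIDs) {
  return JSON.stringify({ action: "subscribe", payloads: payloadIDs });
}

// In the browser it would be used roughly like so:
// const ws = new WebSocket("wss://example-hab-server/telemetry");
// ws.onopen    = () => ws.send(makeSubscribeMessage(["APOLLO"]));
// ws.onmessage = (ev) => updateDashboard(JSON.parse(ev.data));
```

Because the server pushes each packet as it arrives, there is no polling interval at all, which is where the millisecond latency comes from.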

The web feeder also uploads other data (e.g. from my weather station). Rather than ask other HAB listeners to install it, I will make the LoRa gateway upload directly to the server, so there will be a new version of that soon.

Main Window

As well as the telemetry, I wanted my dashboard to provide some other functions:

  • Launch / Chase Video Stream
  • Live SSDV Images
  • Map with payload, landing position and chase car
  • 3D visualisation
  • Raw Telemetry Data

The video stream is trivial to add – drop a TMS YouTube component on the screen and fill in the VideoID. The only code needed is to make the YouTube component visible when the Video button is pressed and let the video play. Job jobbed. For the source I’ll use a Raspberry Pi running ffmpeg which, I’ve no doubt, has a greyscale mode for that authentic 1960s look! Meanwhile I’m testing with a familiar video …
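ffmpeg can indeed produce greyscale output via its filter chain. A rough sketch of streaming a Pi camera to YouTube Live might look like the following; the device, bitrate and stream key are placeholders, and the exact camera input options will depend on the Pi setup:

```shell
# Sketch only: stream a Raspberry Pi camera to YouTube Live in greyscale.
# hue=s=0 sets the saturation to zero, which gives the 1960s mono look.
ffmpeg -f v4l2 -i /dev/video0 \
  -vf hue=s=0 \
  -c:v libx264 -preset veryfast -b:v 2500k \
  -f flv rtmp://a.rtmp.youtube.com/live2/STREAM-KEY
```

YouTube may also expect an audio track, in which case a silent one can be added with ffmpeg's anullsrc source.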

SSDV is just as simple, this time using a TMS Web Browser component and filling in the URL. Again, I wanted it in greyscale so I set up my Raspberry Pi balloon tracker to take greyscale photos.

Mapping

For the map, I started by embedding the standard spacenear.us map, however that map keeps hitting the Google API limits. Also, the icons are fixed and I wanted a bit more flexibility, so I used the TMS Google Maps control. Putting the map on the screen was a simple drag-and-drop job, plus obtaining a Google API key for the project and filling that in. After that, I created markers for the balloon etc., added those, and woohoo a map with balloon on it. All just like when I’ve used the equivalent TMS control in VCL (Delphi Windows library) or FMX (Delphi cross-platform library, which I primarily use for Android).

However, the next step of moving the balloon marker around the map wasn’t so simple. Unlike under VCL and FMX, no methods are provided to move the marker. Under VCL/FMX, AddMarker() returns a marker which I can then move with Marker.Latitude := … etc.; for Web Core I was stuck with no option in the current library other than to remove all the markers then add them all back. Not a pretty solution for the code or the user.

All was not lost though; TMS provide the source code to the libraries, so I grabbed the file containing the Google Maps class, copied just that class to a fresh unit in my project, renamed it (cleverly) as TMyGoogleMaps, saved as MyMap.pas (you can tell how expert I am at naming things …), and replaced use of the TMS map with “my” map. So far so easy.

The next job was to have AddMarker return a reference to the marker that it has added, so the code can refer to that later when moving the marker. The way that TMS Web Core works is that the Delphi source code is “compiled” not into machine code but into JavaScript, and where the library needs to include JavaScript directly it does that inside an “asm” block (yes, short for assembler, as older Delphi programmers may remember from when we used to embed assembler code to squeeze a bit more performance out of our code). The Delphi-to-JS compiler just leaves the contents of the asm block as they are when emitting JS. This is used in the AddMarker procedure where it accesses the Google Maps JS object. To see how this works, here’s my modified version of the AddMarker procedure:

function TMyGoogleMaps.AddMarker(Lat, Lon: double; Title: string): TJSElement;
var
  map: TJSHTMLElement;
begin
  map := GetMap;
  asm
    var myLatLng = {lat: Lat, lng: Lon};
    var marker = new google.maps.Marker({
          position: myLatLng,
          map: map,
          title: Title
        });
    this.FMarkers.push(marker);
    Result = marker;
  end;
end;

All I did was change it from a procedure to a function returning a TJSElement, then had it return the created marker. Not hard at all, even for a JS neophyte like me.

There are several overloaded AddMarker procedures so I made the same change to all of them. I also added a MoveMarker procedure (this isn’t how the VCL/FMX maps do it, but it was expedient):

procedure TMyGoogleMaps.MoveMarker(Marker: TJSElement; Lat, Lon: double);
begin
  asm
    Marker.setPosition(new google.maps.LatLng(Lat, Lon));
  end;
end;

So with those changes, I had a working live map. I’ve sent TMS a feature request asking them to make markers work the same as in their other Google Map products, and to add poly-lines so I can draw the balloon path on the map, but the changes I’ve made will do for now.

The last step was to get the map running in greyscale, so it doesn’t look out of place in my Apollo console. Once I understood how styles are set in a TMS Web Core app this was trivial: add the style to the form’s HTML file:

    <style>
        .gmap_grey {
            -webkit-filter: grayscale(100%);
            filter: grayscale(100%);
        }
    </style>

and reference that style from the map control. So easy.

3D Visualisation

Before I started this project, I looked into building a dashboard with an embedded Google Earth visualisation. A quick search for ideas on how to do that led me to the CesiumJS site, which has an extensive API for visualising the globe in 3D in a web browser, complete with plenty of examples with source code. I created an account and obtained an API key, and soon had something working.

So I took that initial test project, copied the Cesium module I’d written over to my new dashboard project, and set it to open up CesiumJS in the Apollo console’s window. After that it’s as simple as telling the Cesium camera to fly to each telemetry position as it comes in.
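That camera step is a one-liner in the Cesium API. A sketch, assuming CesiumJS is loaded on the page and `viewer` is a Cesium.Viewer; the telemetry field names (lat, lon, alt) are my own guesses rather than the real packet format:

```javascript
// Sketch: fly the Cesium camera to each telemetry position as it arrives.
function flyToTelemetry(viewer, Cesium, telem) {
  viewer.camera.flyTo({
    // Cesium takes longitude first, then latitude, then height in metres.
    destination: Cesium.Cartesian3.fromDegrees(telem.lon, telem.lat, telem.alt),
    duration: 1.0 // seconds; kept short so the camera keeps up with new packets
  });
}
```

Hooking this to the web-socket message handler gives a 3D view that follows the balloon in near real time.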

Except … Cesium draws lovely visualisations in colour; I want grey only! I searched the API (which is as poorly organised as it is extensive) for a “grayscale” or “mono” setting or function, but found nothing. So then I scrolled through the examples till I found something that was in greyscale, and then checked the source and found … that I should have been searching for “BlackAndWhite” instead! I simply needed to add this code:

        var stages = viewer.scene.postProcessStages;
        var blackAndWhite = stages.add(Cesium.PostProcessStageLibrary.createBlackAndWhiteStage());
        blackAndWhite.uniforms.gradations = 256.0;

Even the “256.0” took some effort to find; the default is 5 gradations of grey, which looks awful, and putting in “256” made no difference: it has to be a float! With that, it worked very well indeed.

Finally, I added some fun tricks like switch clunks when pressing the buttons, analogue TV “lost signal” lines when changing screens, and Apollo Quindar tones. If you want to view the dashboard, you can see it online here, though I might not be uploading telemetry when you visit.
