Web Dashboards

When following a flight – whether it’s your own or someone else’s – there are some great online tools to see what’s going on.  The main one of course is the map, but there are also pages for showing live images and for displaying sensor values graphically.

HAB flights in the UK in particular are often community events, with the launch and chase teams keeping in touch via the #highaltitude IRC chatroom.  This is useful both for keeping other balloonists up to date with what’s happening (e.g. letting them know the launch is delayed) and for asking for help when recovering the flight.  Some launchers also provide video streams from the launch, and sometimes the chase and recovery too, and all of this helps with the community spirit – balloonists sometimes dedicate hours to helping receive data from a flight, so providing these extras is a form of payback for their efforts.

For a while now I’ve wanted to add a custom dashboard page for my flights, to combine elements that are of interest at that stage of the flight in one simple screen.  This screen needs to change during the flight – for example the map isn’t of much interest before launch, and the launch video isn’t useful after the launch.  So I’ve designed 4 custom dashboard screens for my next flight, for launch, ascent, descent and landing.


I wanted a visual dashboard design tool that is simple to use, has useful widgets, produces pretty results, is responsive to new data, adjusts to different screen sizes, and is flexible enough to allow the embedding of various items and representations that are of interest during a balloon flight.  I soon came across www.thedash.com which mostly fits the bill.  One issue was that I wanted to include a Twitter feed, and that’s currently not a live item on thedash (it doesn’t update, so it’s pretty useless).  Another issue, with the free version, is that dynamic items are polled and only refresh every 30 seconds (this can be worked around by embedding HTML/JavaScript, but that breaks the “simple to use” requirement).  Both these issues can be avoided by paying for an account ($9.99/month or $99/year), because that allows new data to be pushed to the dashboard over HTTP.

Launch Dashboard

Here I’ve added a basic description of the flight, current time, a video from the launch site, and a list of status updates.

The video is a live feed, using the “embed” URL from YouTube.  For now this links to a recorded video but for the actual launch it will be streamed live of course.

The “Status Updates” list was originally intended to be a Twitter feed, and I wrote a Python script to generate the tweets automatically from the telemetry downloaded from the habhub server.  However as I said earlier, the Twitter widgets on thedash.com are not live (one is supposed to be but is currently broken due to changes at the Twitter end), so I replaced the Twitter feed with a live table widget.  That table has data pushed at it from a Python script that grabs the current telemetry from habhub, decides if anything interesting has changed, and if so generates a suitable message which it pushes to thedash using the Python requests library.
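The push side is simple enough to sketch in Python.  Everything here – the message thresholds, the dashboard endpoint and its JSON fields – is made up for illustration; the real script polls habhub and uses the requests library, but a stdlib-only sketch looks like this:

```python
import json
import urllib.request

# Threshold for "something interesting changed" -- this value is a guess
# for illustration, not the one used in the real script.
ALTITUDE_STEP = 1000   # metres

def status_message(previous, current):
    """Return a human-readable status line if the new telemetry is
    interesting compared with the previous sample, else None."""
    if previous is None:
        return "Telemetry received: altitude %dm" % current["altitude"]
    # Report each time we cross another 1000m boundary
    if current["altitude"] // ALTITUDE_STEP != previous["altitude"] // ALTITUDE_STEP:
        return "Altitude now %dm" % current["altitude"]
    # Report a switch from ascent to descent
    if current["altitude"] < previous["altitude"] - 50:
        return "Balloon is descending!"
    return None

def push_to_dashboard(url, message):
    """Push one message to the dashboard (hypothetical endpoint and fields)."""
    data = json.dumps({"text": message}).encode()
    req = urllib.request.Request(url, data=data,
                                 headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(req)
```

In use, the script would poll habhub every few seconds, call `status_message` with the previous and latest telemetry, and only call `push_to_dashboard` when it returns a message.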


Ascent Dashboard

For this part of the flight, the launch video stream is no longer of interest, but the map is:



Descent Dashboard

Normally there would be no point having a separate dashboard for the descent phase, but for this flight there will be some extra things happening, so I intend to add some extra items to this screen soon.  For now it just shows the altitude as well as the usual map and status.


Landing Dashboard

This is more interesting as I intend to stream the landing video on this flight.  The map is also needed (so we can see the approach to the landing spot).  I’ve also added some servo positions which should be some clue as to what’s going on.


Posted in Weather Balloon | Leave a comment

Pi In The Sky Photography

I’ve had a few emails asking how to get camera images off of the PITS SD card, and have seen flights appear on the live image page with poorly configured image settings, so I’m going to cover both of these subjects in this article.

Image Configuration

The image-taking is quite flexible, and the software comes pre-configured with options that are suitable for the majority of flights, but still you may wish to adjust the settings yourself.

Image Taking

The camera is controlled by a script (~/pits/tracker/camera) which runs a basic loop to take photos for the radio transmitter and also for permanent storage.  The script looks for and runs separate small scripts for the RTTY channel (the standard transmitter on the PITS board), for LoRa (if you have the LoRa add-on board), and for permanent storage without transmission.  These small scripts are created by the main tracker program (~/pits/tracker/tracker) according to settings in the configuration file (/boot/pisky.txt).  There is some basic information in the manual and online but I will go into more depth here.

The camera software uses a simple scheduler to take a photograph of a certain resolution every n seconds.  The resolution can be set separately for low or high altitudes – there’s no point in transmitting large images before the balloon is launched!

There are several such schedules running independently, so you can separately control the rate of image taking for RTTY, LoRa etc.  This is useful because for example LoRa typically transmits data more quickly than RTTY does, so you might want there to be more frequent LoRa images than RTTY.
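As a rough sketch (the channel names are from this post; the structure and periods are invented for illustration), independent schedules amount to a per-channel timestamp check:

```python
# One schedule per image channel; the periods here are examples only.
SCHEDULES = {
    "RTTY": {"period": 60, "last": 0},
    "LORA0": {"period": 30, "last": 0},
    "FULL": {"period": 60, "last": 0},
}

def due_channels(now, schedules=SCHEDULES):
    """Return the channels whose next photograph is due at time 'now',
    and mark them as having just been serviced."""
    due = []
    for name, s in schedules.items():
        if now - s["last"] >= s["period"]:
            s["last"] = now
            due.append(name)
    return due
```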


The next thing to remember is that although it’s possible to take a photograph every few seconds, it takes a lot longer to transmit them.  The transmission time depends on various factors but the main one under your control is of course the image size in pixels; do not make the mistake of one team who used images so large that it didn’t have time to fully transmit even one image!

Since it is possible to take images much more quickly than transmit them, the software is able to choose what it considers to be the “best” of recent images for transmission.  Suppose that it takes 5 minutes to transmit an image, and one is taken every 30 seconds; that gives the software 10 images to choose from.  The “algorithm” (if I can call it that) is simply to select the largest JPG file out of those images, and this works surprisingly well in selecting good images and not selecting poor ones (e.g. those pointing at the black sky above).
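The selection itself is tiny.  A sketch of the idea (illustrative, not the actual PITS code):

```python
import glob
import os

def best_recent_image(directory):
    """Pick the 'best' image to transmit: simply the largest JPG file.
    Near-black sky shots compress to small files, so they lose."""
    candidates = glob.glob(os.path.join(directory, "*.jpg"))
    if not candidates:
        return None
    return max(candidates, key=os.path.getsize)
```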

Having selected a image to transmit, that image and all the ones it rejected are moved to a dated folder, so they aren’t later considered for transmission.  This means that (using the above example) the next image will be chosen from the next 10 photographs.  So, at any point, the transmitted image will only be a few minutes old.

Since images are sent using the same radio transmitter(s) as telemetry, only one or the other can be sent at any one time on a particular transmitter.  So, the software interlaces image data and telemetry, which means that it will send a certain number of image packets, then a telemetry packet, and then repeat the cycle.  The ratio can be fairly high when the balloon is high – we don’t need frequent telemetry updates at that time – but is forced to be 1:1 at lower altitudes (so we get frequent position updates as the balloon comes in to land).
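The interleaving can be sketched as a simple modulo cycle (again illustrative, not the actual PITS code):

```python
def packet_sequence(image_packets_per_telemetry, count):
    """Return 'count' packet types, interleaving image and telemetry
    packets in the configured ratio (e.g. 4 image : 1 telemetry)."""
    out = []
    cycle = image_packets_per_telemetry + 1
    for i in range(count):
        out.append("IMG" if i % cycle < image_packets_per_telemetry else "TEL")
    return out
```

With a ratio of 4 this gives IMG, IMG, IMG, IMG, TEL, repeating; at low altitude the ratio drops to 1, alternating IMG and TEL.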


High and Low

As I mentioned, the software can use different image sizes for low altitude and high altitude, with the changeover controlled by this line in the configuration file:
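The configuration line itself is missing from this copy of the post; assuming the key name, and with the altitude in metres, it looks something like:

high=2000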


So, above 2000m the software uses settings for “high” images, and below that it uses those for “low” images.  The same changeover altitude is used for all image channels (RTTY, LoRa and “FULL” for SD card only) and each channel has separate settings for high and low image sizes.

RTTY Images

You can set the image size for low or high images, plus how often images are taken, plus the ratio of image packets to telemetry packets (when above the “high” setting above):
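The settings themselves are missing from this copy; matching the output below, and with key names from memory (image_period is confirmed later in this post – do check your own pisky.txt), they look something like:

low_width=320
low_height=240
high_width=640
high_height=480
image_packets=4
image_period=60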




When the tracker program starts and reads this file, it displays the following in response, explaining what the options do:

RTTY Low image size 320 x 240 pixels
RTTY High image size 640 x 480 pixels
RTTY: 1 Telemetry packet every 4 image packets
RTTY: 60 seconds between photographs

For typical 300 baud RTTY (the default speed for RTTY in PITS), this provides an image every few minutes, which is fine.  Images will be transmitted more often at high altitude because the black sky compresses very well, which helps.  I do not recommend using larger image sizes for RTTY.  Also, do not transmit images at all if you are using 50 baud (set image_period=0 to disable images for RTTY).

Full Size Images for SD Card Only

Since these images are not transmitted, they might as well be set to the full camera resolution when above the “high” setting:
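The settings are missing from this copy; matching the output below, and assuming the key names, they look something like:

full_low_width=640
full_low_height=480
full_high_width=2592
full_high_height=1944
full_image_period=60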


Which results in the following output:

Full Low image size 640 x 480 pixels
Full High image size 2592 x 1944 pixels
Full size: 60 seconds between photographs

LoRa Images

For LoRa, remember to specify the LoRa channel (0 or 1, for CE0 or CE1, according to which position(s) are occupied on your LoRa board).  Settings are very similar to those for RTTY:
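Again the settings themselves are missing from this copy; matching the output below, and assuming the key names (with a per-channel suffix – here _0 for CE0 – following the same convention as the other LORA_* settings), they look something like:

LORA_low_width_0=320
LORA_low_height_0=240
LORA_high_width_0=640
LORA_high_height_0=480
LORA_image_packets_0=4
LORA_image_period_0=60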


LoRa transmissions should be set to mode 1 (see the LoRa section in the manual) as this is the best option for imaging.  The resulting speed is about 4 times that for 300 baud RTTY, so you will get more frequent updates, or you could increase the image sizes.

LORA0 Low image size 320 x 240 pixels
LORA0 High image size 640 x 480 pixels
LORA0: 1 Telemetry packet every 4 image packets
LORA0: 60 seconds between photographs

Other Settings

If you don’t want images at all, either simply remove the camera, or disable the software from using it with:
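The setting is missing from this copy; assuming the key name, it is something like:

camera=N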


Assuming use of a Pi camera, the software uses the raspistill program to take photographs.  You can add parameters that are passed to raspistill, which is handy if for example the camera is upside-down (happens…).  To do this, first check the online raspistill documentation, and then add your parameters using “camera_settings” e.g.:

camera_settings=-vf -hf

which does vertical and horizontal flips to rotate the image by 180 degrees.


Images From SD Card

So, you’ve flown your flight, recovered the payload, and want to get those images from the SD card onto another computer.  There are many methods:

  • If you have a PC or Pi running Linux, you can pop the SD card into a card reader, and access the files directly
  • With a Windows PC, and a card reader, you can use DiskInternals Linux Reader to access the files.
  • If you connect the Pi to a network, you can use WinSCP to access files from a Windows PC on the same network.
  • Or, also with the Pi on a network, you can install SAMBA on the Pi to share its files.

Here we will cover the WinSCP option.


First, install WinSCP from the official page.  Start the program and you will see this initial screen; enter the IP address of the Pi (which it displays on the monitor at the end of the boot process) and the Pi user name (“pi”).  Click Save to save the settings.

Then click Login and the program will connect to the Pi, requesting a password (even if you entered one above!):

and then, again if this is the first time you’ve connected to the Pi, it will ask for confirmation:

Once connected, you will see a file manager window with the directory structure on the left, and any files on the right:

Now, select the images directory (/home/pi/pits/tracker/images) and you will see the following directories:

  • FULL – for the full-sized images
  • LORA0 – for LoRa CE0 images
  • LORA1 – for LoRa CE1 images
  • RTTY – for RTTY images

Choose whichever you like, and you will then see dated directories within:

Open up whichever matches the date of your flight, and you will (hopefully!) see lots of image files:

You can drag individual files, multiple files or even entire directories to your hard drive or any other storage.



Posted in Weather Balloon | 2 Comments

YouTube Streaming From a DSLR

One of the nice things about high altitude ballooning in the UK and Europe is the community spirit and help that launchers get from others who will receive and upload the launcher’s balloon transmissions, and freely offer advice during the flight.  I think it’s a very good idea to return that favour by providing live video streams of the launch, chase and recovery where possible.  A few HABbers do this and it would be nice if more did.

There are various methods of uploading to different streaming services, using a phone or Pi with Pi camera or a laptop with webcam.  For a balloon launch though it would be good to use a camera with zoom lens so that the balloon can be streamed once airborne.  To do this requires an SLR.

So, how to get video from an SLR into a laptop?  Again, multiple options – either use a HDMI video capture device (higher quality but a tad expensive) or send the video over USB.  Here we will explore the USB option.

So we need some software on the PC to receive the video stream from the camera, and to upload to a streaming service.  Again, each of these has a choice.  I’m using a Canon EOS 760D and that comes with “EOS Utility” which displays the video in a window (which we can then capture), but a neater option is a £50 program called SparkoCam, which makes a modern Canon or Nikon DSLR appear as a regular webcam thus making it easy to pass the stream onto another program.  If you want to spend nothing then instead you can use the EOS utility or Nikon equivalent plus OBS (see below) to capture from the PC screen.

To upload to a streaming service, we need a suitable video encoder.  I’ve used Adobe’s Flash Media Live Encoder which works well, and which works from a webcam including SparkoCam’s virtual webcam.  Here though we are going to use OBS (Open Broadcasting Software) which is rather more powerful and flexible.


First, install SparkoCam.  You can try it for free but it will watermark the video stream.

Connect your DSLR and switch on.  It should automatically be selected by SparkoCam, and you will hear the mirror latch up as it switches to live video mode.  If nothing happens, check that the camera is on, that you used a data USB lead not just a charging one, and that your DSLR is supported by SparkoCam.

Open Broadcasting Software

Now, install OBS and run either the 32-bit or (if available on your PC) 64-bit version.  OBS can be a pain to get running initially depending on whether your PC has the required DLLs or not, and you may find that the 64-bit version doesn’t work but the 32-bit one does.  Or vice versa.  Error messages from OBS can be a bit cryptic too, but once it starts it works very well.

The opening screen is a bit cryptic too till you realise what you need to do.  First, you need to add a video source to accept video from SparkoCam’s virtual webcam; click the “+” below the Sources panel:

and choose “Video Capture Device” from the popup menu.  Leave the name as “Video Capture Device” or change it to something appropriate e.g. “Canon DSLR”.  Click OK to save.

A Properties window will appear with a preview; you can just click OK to accept the defaults.  Now the video stream is inside the main window in OBS.

This window is what will be streamed, and can contain several video sources if you want to get clever, but for now we’ll just expand the video source window, using the red drag lines, to fill the OBS source window.

If it doesn’t fit exactly, choose Settings –> Video to change the aspect ratio of the window to match the DSLR’s aspect ratio, and then expand to fit.


The following is for YouTube; other live streaming sites should have similar options.

Go to your YouTube Live Dashboard and either choose “Stream Now” or create an “Event”; we’ll do the former.  With “Stream now” selected on the left of the screen, look at the “Encoder Setup” in the “Basic Info” section, where you will see the Server URL and Stream name/key.  Click “Reveal” and copy the stream name to your clipboard.

Now, in OBS, click the Settings button and then click on “Stream”.  Choose YouTube as the Service, and paste your key into “Stream key”.  Click OK to save.

Now to start the streaming from your PC, just click the “Start Streaming” button in OBS.

YouTube likes to buffer, but after a few seconds your YouTube page should show the stream as “Live”.

And that’s it !


Posted in Weather Balloon | Leave a comment

DIY Lightweight Pi Tracker with SSDV

A few months back, Raspberry Pi brought out a very very nice little case for the Pi Zero, including 3 different front plates, one of which accepts the Sony Pi camera.  After several minutes measuring the internal dimensions, I reckoned I could just about fit the parts for a HAB tracker inside, and came up with this:

Just add batteries.

That one was for 434MHz, and I wanted another for 868MHz, so I thought I’d document the build in case anyone else wants to make one.


First, you need these parts for the build:

  1. Raspberry Pi Zero or Zero W
  2. Pi Zero case
  3. Pi Sony Camera
  4. Some solid core hookup wire
  5. UBlox GPS with chip antenna from Uputronics
  6. LoRa module from Uputronics
  7. SD card 8GB or larger

Plus a soldering iron, solder, wire cutters and a Dremel with cutting disc.  I assume that you also have the parts required to power and operate a Pi Zero (all the Zero suppliers provide kits).  For a flight, you will also need 3 Lithium AAA or AA cells, flexible hookup wire, plus Styrofoam or similar to enclose and protect the tracker.

If you are new to soldering, practice on something else first!  We are going to solder wires directly to the Pi GPIO holes, plus those on the radio and GPS boards, which isn’t the most delicate soldering operation ever but may be daunting for those with no soldering experience.


First, cut 4 lengths of the solid-core wire (see the important note below about wire length), and solder to the Pi Zero as shown (making sure that the wires are on the top of the board!).

  • IMPORTANT – Although this build worked for me, I have heard from others who have had poor or non-existent GPS reception due to electrical noise received from the Pi by the GPS antenna.  So please, use longer wires so that the GPS module is around 50mm from the Pi board.  That significantly reduces the received noise and enables the GPS receiver to get a good position lock.

I’ve left a very short piece of insulation on the bottom-right wire, but you can remove that completely if you wish.

Next, bend the two top-right wires out of the way, and fold over the leftmost wire and cut to the length shown – this wire will connect to the Vcc hole (top one) on the GPS.

The next part is moderately fiddly: Push the short wire on the left into the Vcc hole, and then push the GPS module over the short bottom-right wire so that this wire goes through the GND hole on the GPS module:

Then push the GPS module down flat on top of the SD socket on the Pi, and solder those 2 wires (Vcc and GND) on the GPS module:

Those last 2 wires can now be bent round and connected to the GPS; the wire in the right of the above photo (Tx on the Pi) above goes to the RXD hole whilst the other (Rx on the Pi) goes to the TXD hole:

Cut the wires to length, bare the ends, push slightly into the holes then solder them.

That’s the GPS sorted.


This is the radio module for communication with the ground.  This has a few more connections to make, and is a bit more fiddly.

First, place wires in these holes as shown, and solder them in place:

Be sure to use the correct holes, by counting from the right edge of the Pi Zero; don’t do it relative to any components because those can vary in position (the Zero and Zero W have the CPU in a different position, for a start!).

Now add 3 bare wires as shown:

The next step is optional.  We need to provide some mechanical security for the radio, to keep it slightly away from the Pi so nothing gets shorted.  This could be a double-sided sticky pad or, as here, a 4th solid wire but this time soldered directly to a capacitor on the Pi.  If that sounds daunting, use the pad!  Here’s the wire, ‘cos that’s how I roll:

Once soldered, remove the insulation.

Now it’s time to place the LoRa module on those 3/4 bare wires:

If you are using a sticky pad, place it now, on the underside of the LoRa module, then push the module down so it’s stuck to the Pi.

If instead you are using the 4th wire, push the LoRa module down but maintain a 1-2mm gap between it and any components on the Pi.

Then cut to length and solder them to the LoRa module.

Now we can cut each of the other wires to length and solder them to the LoRa module:

Until we have the tracker completely soldered together:


Using a Dremel or similar with cutting disc, cut a slot in the case for the GPS module to poke out.  This will take some trial-and-error till the module fits comfortably.

Then drill a hole in the opposite end, in line with the corner pin on the LoRa module.  The hole diameter needs to be wide enough to push a wire through it.

Connect the short flat camera cable (which came with the case) to the Pi, then insert in the case.


Cut a piece of wire to length (164mm for 434MHz), bare and tin a few mm at one end, insert it through the hole and solder it to the corner pin on the LoRa module.  Finally, connect the camera to the cable, push-fit the camera into the lid, and close the lid on the case.

SD Card

First, follow the standard instructions to build a standard PITS SD image.

We then need to modify the configuration file (/boot/pisky.txt) to tell it that we are using this tracker instead of a standard PITS tracker.  Here’s a sample pisky.txt file to work with:
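The sample file itself is missing from this copy; a minimal pisky.txt containing the important lines (all taken from the list below) would be:

Disable_RTTY=Y
gps_device=/dev/ttyAMA0
LORA_Frequency_1=434.200
LORA_Payload_1=CHANGEME
LORA_Mode_1=1
LORA_DIO0_1=23
LORA_DIO5_1=29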




The following lines are important:

  • Disable_RTTY=Y  – this disables RTTY (we don’t have a PITS RTTY transmitter)
  • gps_device=/dev/ttyAMA0 – this specifies that we have a serial GPS, not I2C as on PITS
  • LORA_Frequency_1=434.200 – this sets the frequency of our LoRa module
  • LORA_Payload_1=CHANGEME – you must set this to a name for your flight
  • LORA_Mode_1=1 – this sets LoRa mode 1, which is the one usually used for SSDV
  • LORA_DIO0_1=23 – this specifies the Pi pin we connected the LoRa DIO0 pin to
  • LORA_DIO5_1=29 – this specifies the Pi pin we connected the LoRa DIO5 pin to


You will need to have or make a LoRa gateway to receive transmissions from your tracker.

You will also need to provide a power supply to the tracker.  This can be any USB powerbank with enough capacity; however, the batteries may stop working if they get cold during flight.  An alternative is a powerbank that takes AA cells, in which case you can use Energizer AA Lithiums.  Finally, and this is the option you will want for a lightweight payload, simply solder 3 Energizer Lithium cells directly to the 5V/GND pads on the Pi.

Posted in Weather Balloon | 25 Comments

Landing Prediction

As I mentioned in my previous post, I was planning to enable my landing prediction code for my next flight.  This code is based on some work that Steve Randall did a few years ago, but using a slightly different technique as I was using a Pi and therefore had plenty of RAM available for storing wind data (Steve used a PIC).  I wrote the code as the first stage in having a HAB guide itself to a predetermined landing spot, and knew that it worked pretty well using stored data from one of Steve’s flights, but hadn’t got round to trying it for real.

The way my code works is this:

  1. During ascent, it splits the vertical range into 100-metre sections, into which it stores the latitude and longitude deltas as degrees per second.
  2. Every few seconds, it runs a prediction of the landing position based on the current position, the data in that array, and an estimated descent profile that uses a simple atmospheric model (from Steve) plus default values for payload weight and parachute effectiveness.
  3. During descent, the parachute effectiveness is measured, and the actual figure is used in the above calculation in (2).

So, basically, for each vertical 100m band, the software calculates the estimated time to fall through that band, and applies that to the latitude/longitude deltas measured during ascent.  It then sums all the resulting deltas for descent to 100m (typical landing altitude), adds them to the current position, and emits the result in the telemetry as the predicted landing position.
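The steps above can be sketched in Python.  The linear descent-rate model and all the constants here are placeholders for illustration – they are neither Steve’s atmospheric model nor my actual tracker code:

```python
# Sketch of the onboard landing prediction: wind deltas are stored per
# 100m band during ascent as (dlat, dlon) in degrees/second, then summed
# over the estimated time spent falling through each band.

BAND_SIZE = 100  # metres

def band(altitude):
    return int(altitude // BAND_SIZE)

def store_ascent_delta(wind, altitude, dlat_per_s, dlon_per_s):
    """During ascent, record lat/lon drift for the current 100m band."""
    wind[band(altitude)] = (dlat_per_s, dlon_per_s)

def descent_rate(altitude, sea_level_rate=5.0):
    """Placeholder descent model: faster in thinner air."""
    return sea_level_rate * (1.0 + altitude / 10000.0)

def predict_landing(wind, lat, lon, altitude, landing_altitude=100):
    """Sum, for each 100m band below us, time-in-band x measured drift."""
    for b in range(band(landing_altitude), band(altitude)):
        mid_alt = (b + 0.5) * BAND_SIZE
        seconds = BAND_SIZE / descent_rate(mid_alt)
        dlat, dlon = wind.get(b, (0.0, 0.0))
        lat += dlat * seconds
        lon += dlon * seconds
    return lat, lon
```

In the real code, step (3) replaces the default parachute effectiveness in `descent_rate` with the value measured during descent.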

Although the habhub online map does its own landing prediction, an onboard prediction has some advantages:

  • It has more descent data to work with, so can more accurately profile the parachute performance
  • It is using more recent wind data, measured during ascent
  • Ground chase crews can see the landing prediction without having internet access

There are disadvantages too.  Because it uses wind data from the ascent, if the wind has changed (due to the landing being in a different area, or because the wind is changing with time) then those factors will introduce errors.

Also, I have a suspicion that the live map consistently overestimates the horizontal distance travelled by a descending flight.  This can be seen by watching its landing prediction which, as the flight descends, will move back towards the actual flight position.

So I was keen to see how well the onboard prediction fares against the habhub prediction.  Steve Randall was also interested in this, and was kind enough to record the descent on his screen.  He has sped up and annotated the video which you can see here:

From that you can see that:

  • Until close to landing, it’s a lot more accurate than the habhub prediction (for this flight – might not be the case generally!)
  • The noise in the estimated landing position is mainly along the direction of the descent track.

Here’s a screenshot from the map, edited to show the movement of the landing position during descent:

Steve produced a chart showing the parachute effectiveness (relative coefficient of drag – which is what the code is trying to measure) against altitude:

Noise at low altitudes is less important, as it’s being applied to a short remaining distance to fall, but the noise higher up – between say 5,000 and 15,000m – is more important.

For my next flight, I’ll apply some filtering to hopefully make the prediction more consistently accurate.  I have all the GPS data from this flight and I can run that back into the tracker code to test how well it would have worked on that last flight.

Posted in Weather Balloon | 2 Comments

Strat The Bat’s Big Adventure

I’ve been a high altitude balloonist for about 6 years, but a fan of Jim Steinman’s music since Bat Out Of Hell hit the world in 1977.  The latter has had something of a renaissance lately with the musical version previewing in Manchester, and soon to open at The Coliseum in London.  I saw the show on opening night and it blew me away about as much as Meat and Karla did on The Old Grey Whistle Test back when I was barely 17!

Those who know me know that I like puns, and I found an obvious one in the name of the musical’s lead character – Strat.  So I figured that somehow I should send Strat into the Stratosphere – a joke that I’ve been milking since I thought of it a few weeks ago.  The actual Strat would need a very large balloon to enter the stratosphere, and would probably die more permanently than he does in the show, so I decided instead to send a little foam bat:

that I bought in Paris at another musical with Jim Steinman’s music – Tanz der Vampire.

So, how to send the little critter (now renamed as Strat) up into the stratosphere?  Well, the sending-up part is really easy – get a weather balloon, fill with hydrogen, attach bat to balloon and let go – the balloon goes up, getting gradually larger in the rarified air as it rises, and eventually gets as large as it can before bursting, at which point everything heads Earthward very very quickly.  A parachute is needed to slow the flight down as the air gets thicker at lower altitudes, with the payload (bat) hopefully landing safely in a field somewhere.  I wanted to have a video of Strat’s flight, so I glued him on to a short rod for positioning in front of the camera:

That “somewhere” is planned in advance using wind prediction software, but the actual flight could land a few miles from the target so it needs to be tracked.  For that, the payload carries a GPS device so it knows where it is, a radio transmitter to relay that position down to the ground, and a small computer board that glues those components together and adds various functions such as providing a live prediction of where the landing might be, and adding photographs from a tiny camera.  I added a video camera to take a continuous video of the entire flight, and some extra trackers I was testing.  This is the entire payload including batteries and foam boxes to protect the contents and whatever it all lands on.

And here I’m testing the view on the camera:

The total weight was 550g, which is about average, and I chose a 500g balloon which was calculated to send the flight up to about 30km which is plenty high enough to get good pictures.  I have no photographs of the launch (I didn’t have any help) but it was easy enough.

The flight was expected to take about 2.5 hours, with a landing near Glastonbury about 2 hours away, so I set off in the chase car (fully equipped with radio tracking kit and other items – more on those later).  Here’s the entire flight path that it actually took (and was very close to the prediction):

You may be used to the view from 30,000 feet or so, where commercial jets fly, but that’s just the top of the Troposphere.  Above that is the Stratosphere, where the sky becomes increasingly black (because there’s almost no atmosphere above).  Weather balloons can, depending on payload weight and balloon size, get well above 100,000 feet – higher than any commercial or military jet – and to get any higher requires a large rocket.  This photo is from early in the flight, still with a blue sky:

and this is later, where the sky has gone black:

and you can see the thin blue line of the atmosphere on the horizon.

Meanwhile, the video camera was recording Strat’s flight.  Here’s the launch (from the field behind my house):

Strat Bat in the Stratosphere:

and Strat Bat about to leave the Stratosphere …

Initial descent is typically up to about 200mph in the very thin air (about 1% as dense as it is at ground level), then the parachute automatically opens and slows the flight down, so it lands at a nice gentle 10mph.

Well, I say “land”, and that’s always the aim, but unfortunately for us high-altitude balloonists, the UK has a rather high population of trees.  So this is what actually happened:

One of the extra things I had programmed for this flight was for the tracker to live stream video once it was low enough to get a 3G signal, which it did.  When I saw the video come up on my phone in the chase car, it was pretty obvious that the payload was swinging around in a tree!  I was about half a mile away at the time, thanks to the live telemetry telling me where the payload was throughout the flight, so I soon arrived at the “landing” site.  This was next to a building site, so I checked with the site foreman and got permission to walk over to where the payload was hanging:

If you look at the very top of the tree, you’ll see a lime green and orange parachute.  That’s quite high up – about 17 metres – but fortunately I used about 10 metres of line down from the parachute to the payload:

Also fortunately, I’d packed some long telescopic poles in the car.  One of those has a hook taped to the end, and once I’d hooked that round the payload line it was pretty easy to pull the lot out of the tree:

Here’s Strat looking none the worse for his journey:

So, payload including Strat The Bat all recovered intact, despite the attentions of a fairly tall tree.  I waited till I got home before I checked the video, but as you can see that worked too :-).

I went up to Manchester for the final 2 shows, with a seat near the front of the stalls for the final performance.  Here’s Strat The Bat at the interval, covered in blood group A4 from the motorcycle crash at the end of Act 1:

and then, at stage door after that final show, came the chance to hand the much-travelled Strat to Andrew Polec (who plays Strat in the musical):

Posted in Weather Balloon | 2 Comments

HAB with Calling Mode and 3G Streaming

This, my first flight of the season, will test a few new things:

  • LoRa Calling Mode
  • 3G Modem functions:
    • Upload of balloon position to habitat
    • Upload of full-size photographs to a web server
    • Streaming of launch and landing/recovery to Youtube
  • Landing Prediction
  • Standalone GSM/GPS Tracker

LoRa Calling Mode

“Calling Mode” is where the tracker periodically sends out a special message, on a standard frequency using standard settings, announcing the particular frequency and other settings that it is using.  This allows unattended receivers to be set up on the calling channel, from where they will automatically switch to any balloon that announces itself there.

To see how this works, see this page that I wrote last year.

The calling frequency is 433.650MHz, chosen as the quietest part of the 433/434MHz ISM band.
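As a sketch of the idea, here’s how a receiver parked on the calling channel might decode such an announcement and retune.  The field layout used here is made up purely for illustration; the real calling message format is described on the page linked above.

```python
# Hedged sketch of a receiver on the 433.650MHz calling channel.
# Assumed (illustrative) layout: $$PAYLOAD,frequency,bandwidth,spreading_factor*CRC

CALLING_FREQ = 433.650

def parse_calling_message(sentence):
    """Extract the downlink settings announced in a calling-mode message."""
    body = sentence.strip().lstrip('$').split('*')[0]   # drop $$ prefix and CRC
    fields = body.split(',')
    return {
        'payload': fields[0],
        'frequency': float(fields[1]),
        'bandwidth': fields[2],
        'spreading_factor': int(fields[3]),
    }

settings = parse_calling_message('$$BUZZ,434.450,125K,7*ABCD')
# a real gateway would now retune its LoRa module, e.g.
# lora.set_frequency(settings['frequency']), and start receiving telemetry
```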


3G Modem

The tracker for this flight has a Huawei 3G USB modem, connected to the O2 network via GiffGaff, using the Sakis3G script.  The tracker runs a Python program that knows the current GPS altitude (passed to it from the PITS C software using a named pipe) so that it knows when it is worth attempting to connect to the internet via 3G (i.e. below 2000m).  That same script controls the various functions that operate over 3G – video streaming, ftp image upload, and direct habitat telemetry upload.
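The altitude gate can be sketched as below.  The pipe path and the Sakis3G invocation are placeholders for illustration, not the actual names used on my tracker.

```python
import os

PIPE_PATH = '/tmp/gps_altitude'   # hypothetical named pipe fed by the PITS C code
CONNECT_BELOW_M = 2000

def should_connect(altitude_m, already_connected):
    # once online, stay online; otherwise only try below the threshold
    return already_connected or altitude_m < CONNECT_BELOW_M

def main_loop():
    connected = False
    with open(PIPE_PATH) as pipe:       # blocks until the writer opens its end
        for line in pipe:               # one altitude value per line
            altitude = float(line)
            if should_connect(altitude, connected) and not connected:
                os.system('sakis3g connect')   # placeholder invocation
                connected = True
```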

Direct Habitat Upload

As well as the usual ISM (LoRa) radio telemetry, this flight will upload telemetry more directly whilst it has a 3G connection.   Most usefully, assuming it lands in an area with 3G coverage via O2, this will mean that the landing position is uploaded automatically.
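For illustration, here’s roughly what a direct upload involves, as I understand the habitat API: the telemetry sentence is base64-encoded, the document ID is the SHA256 hash of that encoding, and the document goes to habitat’s CouchDB update handler.  Treat the details as a sketch rather than a reference.

```python
import base64
import hashlib
import json
from datetime import datetime, timezone

HABITAT = 'http://habitat.habhub.org/habitat'

def build_upload(sentence, callsign):
    """Return the (url, json_body) pair for one telemetry sentence."""
    raw = base64.b64encode(sentence.encode()).decode()
    doc_id = hashlib.sha256(raw.encode()).hexdigest()
    now = datetime.now(timezone.utc).replace(microsecond=0).isoformat()
    doc = {
        'data': {'_raw': raw},
        'receivers': {callsign: {'time_created': now, 'time_uploaded': now}},
    }
    url = '%s/_design/payload_telemetry/_update/add_listener/%s' % (HABITAT, doc_id)
    return url, json.dumps(doc)

url, body = build_upload('$$BUZZ,1,12:00:00,52.0,-2.5,1000*ABCD\n', 'M0XYZ')
# a real tracker would now PUT this with urllib.request or requests
```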

Photograph Uploads

Again most useful on landing, the tracker will upload full-sized images (taken by the Pi camera) to a web server via ftp.  That server (balloon.photos) will automatically build thumbnails of the uploaded images.
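A minimal sketch of the upload side, using Python’s ftplib; the credentials and file-naming scheme are placeholders.

```python
from ftplib import FTP

def remote_name(counter):
    # sequential names keep the server-side thumbnail page in order
    return 'image_%04d.jpg' % counter

def upload_image(local_path, counter):
    ftp = FTP('balloon.photos')              # server named above
    ftp.login('username', 'password')        # placeholder credentials
    with open(local_path, 'rb') as f:
        ftp.storbinary('STOR ' + remote_name(counter), f)
    ftp.quit()
```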

Youtube Streaming

For both launch and landing, live video will be streamed to my Youtube channel, at this URL.
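One common way to stream a Pi camera to YouTube is to pipe raspivid’s H.264 output into ffmpeg’s RTMP muxer.  This sketch just builds the command line; the exact flags are illustrative rather than the ones my tracker uses, and the stream key comes from the YouTube dashboard.

```python
def youtube_stream_command(stream_key, width=1280, height=720, bitrate=1000000):
    """Build a raspivid | ffmpeg pipeline streaming to YouTube's RTMP ingest."""
    raspivid = 'raspivid -o - -t 0 -w %d -h %d -fps 25 -b %d' % (width, height, bitrate)
    ffmpeg = ('ffmpeg -re -i - -vcodec copy -an -f flv '
              'rtmp://a.rtmp.youtube.com/live2/%s' % stream_key)
    return raspivid + ' | ' + ffmpeg

cmd = youtube_stream_command('xxxx-xxxx-xxxx-xxxx')
# the tracker would run this (e.g. via os.system) once the 3G link is up
```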

Landing Prediction

The tracker will predict its own landing position, sending the result out over LoRa as a “XX” which then appears on the map as a large red “X” (marks the spot).  The prediction is only useful after burst, and uses both the measured wind speeds/directions on the way up and the effectiveness of the parachute on the way down.
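The idea can be sketched as follows: record the wind vector in each altitude band on the way up, then fall back down through the same bands, applying each band’s wind for the time spent in it.  The descent-rate model and band size here are placeholders, not the values used by the real tracker.

```python
import math

LAYER = 100.0   # altitude band height in metres (placeholder)

def descent_rate(altitude):
    # placeholder model: faster in thin air, about 5 m/s near the ground
    return 5.0 * math.exp(altitude / 16000.0)

def predict_landing(lat, lon, altitude, winds):
    """winds: {band_index: (dlat_per_s, dlon_per_s)} measured on ascent."""
    while altitude > 0:
        band = int(altitude / LAYER)
        dlat, dlon = winds.get(band, (0.0, 0.0))
        seconds_in_band = LAYER / descent_rate(altitude)
        lat += dlat * seconds_in_band
        lon += dlon * seconds_in_band
        altitude -= LAYER
    return lat, lon
```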

Standalone GSM/GPS Tracker

These are inexpensive devices that use SMS or GPRS to automatically send out their position.  They don’t have a good track record for HAB, partly because they tend to use cheap and rather deaf GPS and GSM hardware, and also because HABs tend to land in remote areas away from GSM coverage.  Regardless, they have some use as a backup to a regular radio tracker.

For this flight, I’ve set up a TK102 tracker to connect to the internet via GPRS and to send its position to a traccar server.  traccar is an open-source tracking system which displays multiple car tracking devices on a map.  Here, I’m using a small Python script to extract data from traccar (via its log file) and to then send the position of my particular tracker on to the habitat system so the TK102 appears on the usual HAB map.


Posted in Weather Balloon | Leave a comment

BBC Microbit Balloon Tracker

I’ve been meaning to do this for a while, and a short gap between projects gave me some time to try.

The Microbit is (yet another) educational SBC, sitting somewhere between the Codebug and a Raspberry Pi.  Its processor has plenty of flash memory and RAM to run a basic tracker (but more on that later), plus it has accelerometer and compass chips.

Importantly, the Microbit has SPI and I2C busses plus a serial port, all brought out to the edge connector on the bottom. Rather than solder directly to the pads, I bought an edge connector and teeny prototyping board:

I also bought a battery holder with cable and plug to suit the micro JST connector on the Microbit.

Balloon Tracker Hardware

To make a balloon tracker, we also need to connect a suitable GPS (by which I mean, one that still sends positions when at high altitudes) and an ISM band radio transmitter.  I chose a UBlox module from Uputronics:

Usefully, this design includes an I2C port as well as the usual serial port.  Since the Microbit serial port is normally used by the debug connection to a PC, software development becomes more difficult if we use that serial port for the GPS, so I2C makes life much much easier.

Now for the radio.  The most popular HAB option is the NTX2B radio transmitter, but that also needs a serial port, so instead I opted for a LoRa transceiver from Uputronics:

This has an SPI interface, so the serial port remains free for debug purposes.

The first job was to get the devices wired together.  There’s not much space on this prototyping board, and it can be useful to keep the GPS away from the other devices anyway (less interference), so I put the GPS and radio on wire tails:

GPS Software

There are several options for writing code for the Microbit, and I opted for MicroPython as I’ve been writing a lot of Python lately, using the Mu editor/downloader.  I started with some simple code to grab the NMEA data stream from the GPS, and this took just minutes to get going:

I then ported my Pi Python GPS NMEA parser (which just meant changing the code to use the Microbit I2C library rather than the Pi serial port).  You can see my test program here (but please don’t use that for a flight, as it was written for car use and therefore doesn’t put the GPS into flight mode!).
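For reference, here’s a cut-down GGA parser with checksum verification, written in plain Python so it runs under MicroPython too.  It’s a simplified sketch of the sort of parsing described above, not my flight code (and again, it does nothing about flight mode!).

```python
def nmea_checksum_ok(sentence):
    # the checksum is the XOR of all characters between '$' and '*'
    body, _, given = sentence.strip().lstrip('$').partition('*')
    calc = 0
    for ch in body:
        calc ^= ord(ch)
    return given != '' and calc == int(given, 16)

def parse_gga(sentence):
    """Return (lat, lon, altitude_m) from a $GPGGA sentence, or None."""
    fields = sentence.split(',')
    if not fields[0].endswith('GGA') or not nmea_checksum_ok(sentence):
        return None
    lat = float(fields[2][:2]) + float(fields[2][2:]) / 60.0
    if fields[3] == 'S':
        lat = -lat
    lon = float(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[5] == 'W':
        lon = -lon
    return lat, lon, float(fields[9])
```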

LoRa Radio Software

I also have LoRa Python code from another project, so after testing that the device was connected OK (a few commands typed into the Microbit REPL interpreter), I ported that over.  The changes were for the SPI library, plus I had to remove all the LoRa register/value definitions as they made the program source too large; the source is compiled on the device, so the compiler has a rather limited RAM workspace.  You can see the resulting test program here.

To receive LoRa transmissions, you need another LoRa device as a receiver, plus suitable software.  I used my C LoRa Gateway code for the receiver:

Balloon Tracker Program

So far so easy, and the end goal seemed close; once you have GPS and radio modules working, then you just need a small amount of extra code to format the GPS data as a string, adding a prefix (“$$” and the payload ID) and suffix (“*” then CRC then a line-feed), and then transmit the result over radio.
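That sentence format can be sketched in Python, assuming the standard UKHAS CRC16-CCITT (polynomial 0x1021, initial value 0xFFFF) calculated over everything between the “$$” and the “*”; the field list is a minimal example rather than a full flight sentence.

```python
def crc16_ccitt(data):
    """CRC16-CCITT as used for UKHAS telemetry sentences."""
    crc = 0xFFFF
    for byte in data.encode():
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

def build_sentence(payload_id, counter, time_str, lat, lon, alt):
    fields = '%s,%d,%s,%.5f,%.5f,%d' % (payload_id, counter, time_str, lat, lon, alt)
    return '$$%s*%04X\n' % (fields, crc16_ccitt(fields))

sentence = build_sentence('MICROBIT', 1, '12:00:00', 52.12345, -2.54321, 1000)
# e.g. '$$MICROBIT,1,12:00:00,52.12345,-2.54321,1000*<CRC>\n', ready to transmit
```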

However, as soon as I combined the GPS and LoRa code, the result wouldn’t even compile.  Remember that compilation happens on the Microbit, and my code was too large for that process:

Fortunately it wasn’t too much larger, so I removed some code that wasn’t strictly necessary (mainly, the code that switches off unused GPS NMEA sentences) and soon the compiler was happy.

The resulting code however was not happy.  Once the compiler has finished, the resulting bytecode is loaded into the Microbit’s RAM, which it shares with any data used by the program (variables, stack, temporary work areas).  The nature of Python is that memory gets allocated all the time, and freed up when necessary (i.e. when there’s little free memory available), and my program would run for a short while before crashing with an “out of memory” error when it tried to allocate more memory than was available.  Here it is working, before it crashed:

So, I had to reduce the memory footprint.  I’m used to doing that in C on microcontrollers, but MicroPython needs different techniques.  For example, C on a micro usually sits in flash memory, which is often less of a limit than the working data in RAM, so you can sometimes rewrite the code to use less RAM without worrying that the new code uses more code memory.  Not so for MicroPython, where everything shares RAM, so some things I tried actually made the situation (checked by calling gc.mem_free() in the main loop) worse.  For the most part, I managed to increase free RAM by removing code that I didn’t need.  Having done so, the program was stable, though free memory went up and down cyclically as memory was allocated each loop and then eventually freed up.

Some easy improvements came from removing the code to display the GPS satellite count on the LEDs, and from importing only the required modules instead of the whole Microbit library.  The most relevant part of the code turned out to be the part that builds up an NMEA sentence.  In C you simply allocate enough memory for the longest sentence you need to parse, then place incoming bytes into that memory using a pointer or index, checking of course for buffer overruns.  In Python, strings are immutable so you can’t do this, and the temptation then is to do “string = string + new_character”.  The Python interpreter then allocates memory for the resulting string, marking the old string as “no longer in use” so it can be freed up sometime later, and it’s pretty easy to end up with lots of unused memory waiting to be freed.

For now, my NMEA code explicitly frees up memory as each new byte comes in.  I did briefly change the code to use bytearrays, which are close to what I would do in C, but free memory reduced slightly (I assume the source took more space) so I went back to the original code.  Longer term, I’ll ditch NMEA and write code to use the UBX binary protocol instead.
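For comparison, here’s a sketch of the fixed-buffer approach mentioned above, written the way I would in C: a preallocated bytearray plus an index, so nothing is allocated per byte.  On the Microbit the bytes would come from the I2C read rather than a literal.

```python
BUF_SIZE = 90                   # NMEA sentences are at most 82 characters
buf = bytearray(BUF_SIZE)
length = 0

def add_byte(b):
    """Accumulate one incoming byte; return a completed line or None."""
    global length
    if b == 0x0D:               # ignore carriage returns
        return None
    if b == 0x0A:               # line feed marks the end of a sentence
        line = bytes(buf[:length]).decode()
        length = 0
        return line
    if length < BUF_SIZE:       # guard against buffer overrun, as in C
        buf[length] = b
        length += 1
    return None

line = None
for b in b'$GPGGA,x\r\n':
    line = add_byte(b) or line
# line is now '$GPGGA,x'
```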

The code has been running continuously now for over 12 hours, and the free-memory figure is solid (measured at the same point each time round the main loop).  I do need to add the flight-mode code, but that’s small and shouldn’t cause an issue :-).  If all is well then I hope to fly this (weather-permitting of course) on Sunday.

Finally, here’s the result of receiving the telemetry on a Python LoRa gateway program that I’ve been working on lately:

Posted in Weather Balloon | 7 Comments

Quick RTL SDR Comparison

As part of a recent project, I’ve used a few different RTL SDR devices, and was surprised how drifty some of them are, one in particular.  For their intended application – decoding wideband transmissions – this isn’t an issue, but if you want to use one to decode RTTY then it certainly is – the signal will soon drift outside of the audio passband unless the SDR is retuned.

My project is on a Raspberry Pi, where I found that all but one (see the test results below) was basically unusable.  So I did some quick tests on my desk, with a Windows PC running Airspy, for a crude visual comparison of drift rates.  I tested 4 devices:

  1. NooElec Nano 2
  2. A very old E4000-based SDR
  3. Current model R820T2 SDR
  4. NooElec Aluminium-cased SDR

1 – NooElec Nano2

Poorest of the bunch.

2 – E4000


3 – R820T2

No better.

4 – Ali cased

Much, much better.

As such, the metal-cased NooElec is the only one I could recommend.

Of course, there are much better SDRs out there – the Funcubes, SDR Play and Airspy models, and for chasing or tracking balloons you should really spend the extra money – but for bench testing then this particular RTL SDR is just fine.

Posted in Weather Balloon | Leave a comment

Raspberry Pi SSDV with a Compact Camera or SLR

Many HAB flights now use SSDV to transmit images “live” from the balloon down to the ground, using a camera connected to the flight computer, providing an immediacy that is missing when just flying standalone cameras.  Early SSDV flights used serial cameras connected to a microcontroller, but image quality (and, ease of programming) took a step forward when the Raspberry Pi arrived with simple access to webcams.  My first SSDV flight and the following 3 used a webcam, sending SSDV down over RTTY.



Webcams do not provide great image quality, which then improved when the Raspberry Pi camera came out.  There was also some excellent work by Chris Stubbs who managed to program a Canon compact camera to send SSDV directly, again over RTTY.


With RTTY we are limited to sending quite small images – around 500×300 pixels – because larger images take too long to transmit.   It is possible to increase the speed from 300 baud to 600, though it then becomes more difficult to receive, or by using the trick of multiple RTTY transmitters on the same tracker, such as this one:

Another option though is to replace RTTY with LoRa, which on 434MHz provides a speed increase of about 4x over typical RTTY speeds; that increases by another tenfold if using 868MHz (all due to the varying bandwidth limits allowed within IR2030).  Further, LoRa allows the receiver to request (via an uplink) that any missing packets be re-sent, and in the 868MHz band this produces some impressive results (the inset is an image from my first Pi SSDV flight, showing the improvement we now have in quality and resolution):

So, these bandwidths allow us to send rather higher quality images than before, to the point that the image compression is limiting quality.  With this in mind, Philip Heron added a quality setting to his excellent SSDV encoder/decoder to control the amount of compression applied.

With reduced image compression and higher bandwidths, the remaining factor is camera quality.  Whilst the Raspberry Pi cameras (especially the newer Sony one) are quite good, they do have tiny sensors and simple plastic lenses.  A step up would be to use a compact camera, mirrorless system camera or an SLR.  These also potentially offer wider-angle lenses, making for more impressive HAB photographs.  However, we need to get those images to the flight computer.

Pretty much every modern camera allows for a USB PTP (Picture Transfer Protocol) connection to a computer, allowing it to be controlled by a computer to a greater or lesser extent.  For most cameras all we get to do is download images from the camera – and that’s all that most people need – but we also need to be able to take images under control of our flight computer.


To take and transfer images we can use the Linux program gphoto2, with a compatible camera that supports remote operation (i.e. the ability to take an image via a command on the Pi).  The compatibility list includes few modern compact cameras, as the remote functions are typically only available on SLRs.  Canon, for example, used to include remote capture in their Powershot models but stopped this practice in 2009, presumably to persuade people to buy their SLRs instead.  I tested with an old Canon SLR (EOS 400D) and pretty much every function is supported – remote shooting, control of ISO, control of aperture/shutter (if the camera is set to semi-auto or manual mode).  However I’m not especially keen on flying something as heavy and solid as an SLR with a wide-angle lens, so I checked the compatibility list for smaller, lighter alternatives.  Sadly none of my other cameras fitted the bill, so I purchased a Nikon S3300 compact.  This provides remote shooting (albeit without any control over aperture etc.), has a wide-angle lens (26mm equivalent for 35mm sensors) and a 16MP sensor, is small and light, and charges from USB (so the Pi should be able to keep it charged during flight).

Once gphoto2 has been installed (sudo apt-get install gphoto2), the first thing to do is connect the camera and check that it can be seen:

gphoto2 --auto-detect

This should produce a result like this:

Model                           Port
Nikon Coolpix S3300 (PTP mode)  usb:001,012

So far so good.  Now to find out what capabilities the camera offers:

gphoto2 --summary

Which will give you something like this (some parts removed):

Camera summary:
Manufacturer: Nikon Corporation
Model: S3300
 Version: COOLPIX S3300 V1.0
Vendor Extension ID: 0xa (1.0)
Vendor Extension Description: microsoft.com: 1.0;

Capture Formats: JPEG
Display Formats: Association/Directory, Defined Type, JPEG, DPOF, MS AVI, Apple Quicktime, MS Wave

Device Capabilities:
 File Download, File Deletion, File Upload
 Generic Image Capture, No Open Capture, No vendor specific capture

Device Property Summary:
Property 0xd407:(read only) (type=0x6) 1
Property 0xd406:(readwrite) (type=0xffff) ''
Property 0xd002:(readwrite) (type=0x6) Enumeration [1,2,3,4,5,6,7] value: 6
Date & Time(0x5011):(readwrite) (type=0xffff) '20161111T143911'
Flash Mode(0x500c):(readwrite) (type=0x4) Enumeration [1,2,3,4] value: Flash off (2)
Focus Mode(0x500a):(readwrite) (type=0x4) Enumeration [2,3] value: Automatic (2)
Focal Length(0x5008):(read only) (type=0x6) Enumeration [3500,4600,5300,6100,7300,8600,10500] value: 35 mm (3500)
Battery Level(0x5001):(read only) (type=0x2) Enumeration [2,5,25,50,65,80,100] value: 80% (80)
Property 0xd303:(read only) (type=0x2) 1

From this we can see that the camera supports “Generic Image Capture” (woohoo!) but no control over zoom (focal length is read-only).  Given that for a HAB flight I want the lens at its default widest setting anyway, that’s not an issue.

Taking a photo is simple:

gphoto2 --capture-image-and-download --force-overwrite --filename dave.jpg

This will extend the lens if it’s retracted, focus the lens, set the exposure (using whatever options are set within the camera), take a photograph and then download it to the Pi.

For more advanced cameras you may be able to control the exposure manually (aperture and/or shutter), control the ISO etc.  The available settings, and the specific commands to set them, vary from camera to camera but your starting point should be to list them all:

gphoto2 --list-config

The latest version of the Pi In The Sky software includes options for the use of cameras via gphoto2 (see instructions in the README).  With “Camera=G” in the pisky.txt file, gphoto2 and imagemagick installed, and a compatible camera connected and powered on, then PITS should take images on that camera and transmit them via SSDV.

Unlike with the Pi camera, images are taken at full resolution (or whatever resolution is set within the camera), and are then stored on the Pi SD card at that resolution.  The resizing for transmission is then done by imagemagick, which is why that has to be installed.

In testing, the Nikon has been completely reliable, running for 11 hours continuously till eventually the battery was discharged (remember, it charges to some degree over USB, hence the long run time).  So this is looking good for a flight.  Here’s a sample test image as sent via SSDV/LoRa.


Posted in Weather Balloon | 3 Comments