RTTY Through LoRa Module

Although the current Pi In The Sky boards include an FM transmitter for use with RTTY, there are some reasons for wanting to transmit RTTY through a LoRa module:

  • You can use 1 aerial for RTTY and LoRa, instead of 2, which makes payload design a lot easier
  • Extra RTTY bandwidth
  • You’re making your own tracker and want to avoid the expense of 2 radio transmitters

This article is about using the latest Pi In The Sky software to achieve this.


To transmit RTTY we need to be able to frequency modulate the carrier at a specific rate (e.g. every 20ms for 50 baud).  This timing needs to be accurate.  We could …

  1. Switch the LoRa chip into a mode where it just generates a carrier, and use the RTTY bit stream to set the frequency to one of 2 values every 20ms.
  2. Switch the LoRa chip into a mode where it transmits one of 2 frequencies, with the selection controlled by a DIO pin, and then send the RTTY stream to this pin.
  3. Switch the LoRa chip into FSK (Frequency Shift Keying) mode, with preamble and checksum etc. disabled, and have it send data from a buffer where that data is the RTTY bitstream.

(1) requires accurate timing, which is not so easy with a non-real-time operating system.  I did try this option with a small test program written in Python, and it worked tolerably well at 50 baud, but really isn’t a good option as the timing varies depending on the processor’s workload.

(2) requires an accurately timed bitstream applied to the DIO pin.  This is possible on the Pi using the serial port, but we use that anyway for the standard RTTY radio.  It’s also possible using a software serial port and the Pigpio driver which uses the DMA hardware for timing, but this increases our reliance on a particular external driver and besides, on our LoRa boards the appropriate DIO pin is not connected.

Which leaves us with (3).  There is a difficulty here, which is that the buffer used for FSK is small (64 bytes) and we need to oversample (store the same bit several times) to get the lower baud rates needed for decent range on HAB, but these are just coding challenges.  So this is the option I chose.

Essentially, the code does the following:

  1. Put the LoRa chip into FSK mode at the desired frequency
  2. Set the bit rate to a suitable value for the desired baud rate (50 and 300 supported currently).  “Suitable” means a low bit rate (reduces CPU workload refilling the buffer) for which 1 bit in the data results in 1 or more bytes in the FSK buffer (so we don’t need to mess around with individual bits in the buffer, which complicates the code)
  3. Set a buffer warning level so we can quickly sense when to refill the FSK buffer
  4. Tell the chip to transmit
  5. Fill the FSK buffer
  6. Monitor the buffer level and refill as necessary
  7. At the end of the RTTY sentence/packet, allow the buffer to empty
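To make the oversampling in step 2 concrete, here's a minimal Python sketch (my own illustration, not the actual PITS code) of how one RTTY character expands into FSK buffer bytes, assuming the FSK bit rate is set to 8× the RTTY baud rate so that each RTTY bit becomes exactly one byte in the buffer:

```python
def rtty_char_to_fsk_bytes(ch, data_bits=7, stop_bits=2):
    """Expand one RTTY character into oversampled FSK buffer bytes.

    Assumes the chip's FSK bit rate is 8x the RTTY baud rate, so each
    RTTY bit becomes exactly one FIFO byte: 0x00 for a 0 bit (one tone),
    0xFF for a 1 bit (the other tone).  Framing is 1 start bit (0),
    data bits LSB first, then stop bits (1).
    """
    code = ord(ch) & ((1 << data_bits) - 1)
    bits = [0]                                            # start bit
    bits += [(code >> i) & 1 for i in range(data_bits)]   # data, LSB first
    bits += [1] * stop_bits                               # stop bits
    return bytes(0xFF if b else 0x00 for b in bits)

def sentence_to_fsk(sentence):
    """A whole telemetry sentence as one FSK byte stream, ready to be fed
    into the 64-byte FIFO in chunks as the buffer-level interrupt fires."""
    return b"".join(rtty_char_to_fsk_bytes(c) for c in sentence)
```

With 7N2 framing each character becomes 10 bytes, so at this oversampling ratio the 64-byte FIFO holds a little over 6 characters; the buffer warning level from step 3 tells us when to top it up.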

I am indebted to Matt Brejza for the idea, and for providing the code he used to implement this in his tracker, which I then incorporated/mangled to work with the existing PITS software.


First, you need the new version of the Pi In The Sky software (released 26th September 2018) or later.

To understand the settings, first note that PITS has a concept of “radio channels”, where a channel is a particular radio transmitter, not a mode (RTTY or LoRa).  We are using one of the LoRa devices (channels) to transmit RTTY, so our settings are associated with the particular LoRa module (in the CE0 or CE1 position).  Essentially we are overriding the normal LoRa functionality by telling the software to transmit RTTY as well as, or instead of, the LoRa packets.

These are the new settings (shown for channel 0):

  • LORA_RTTY_Frequency_0=<RTTY Frequency>.  Without this, RTTY will use the same frequency as LoRa.  I recommend that you keep the frequencies apart so that your RTTY receiving software does not try to track the LoRa transmissions.
  • LORA_RTTY_Baud_0=<baud rate>.  Choose 50 (better range) or 300 (faster, allows for SSDV, easier for dl-fldigi to lock to).
  • LORA_RTTY_Shift_0=<carrier shift in Hz>.  The carrier shift must be numerically greater than the baud rate.  Note that the LoRa chip steps in multiples of 30.5Hz.
  • LORA_RTTY_Count_0=<count>.  This is how many RTTY packets are sent one after the other before transmitting any LoRa packets.  2 is recommended in case the RTTY decoder misses the start of the first packet.
  • LORA_RTTY_Every_0=<count>.  This is how many LoRa packets are sent one after the other before transmitting any RTTY packets.  Set to zero to disable LoRa (and only send RTTY).
  • LORA_RTTY_Preamble_0=<bits>.  Sets the length of preamble (constant carrier) before sending RTTY data.  Default is 8 and seems to be plenty.
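Putting those together, a settings block for channel 0 might look like this (the values are purely illustrative, not recommendations; note that the 488Hz shift is a multiple of 30.5Hz and greater than the 300 baud rate):

```
LORA_RTTY_Frequency_0=434.200
LORA_RTTY_Baud_0=300
LORA_RTTY_Shift_0=488
LORA_RTTY_Count_0=2
LORA_RTTY_Every_0=3
LORA_RTTY_Preamble_0=8
```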





Some current restrictions:

  • Only 50 baud and 300 baud are currently supported.  This may change in future releases.
  • If you choose to interleave RTTY and LoRa, then any SSDV packets are only sent out over LoRa.
  • If you want to transmit SSDV over RTTY, then you need to disable LoRa transmissions on that module, and use 300 baud.
  • You cannot have separate payload IDs for RTTY and LoRa on the same module.


Posted in Weather Balloon | 1 Comment

Making a Pi Zero GSM/GPS HAB Backup Tracker

This is an update of a previous post, but post-flight (so I know that it works!) and with instructions on how to make your own tracker, and your own gateway to upload to the live HAB map.


GSM-based trackers are quite rightly frowned upon for HAB tracking, mainly because they only work at low altitudes (within range of a mobile phone tower, and towers generally aim their signals downwards).  So they don’t provide tracking throughout a flight, which is a problem as then you don’t know where the payload is until it lands.

If you’re lucky.

There are 2 problems here – one is that GSM coverage isn’t 100%, and the other is that the popular GSM trackers don’t seem to like high altitudes.  I don’t know if they get confused, or they don’t like the cold, but I’ve tried these things several times and only had one work once.

A GSM/GPS tracker that actually works would be useful though, as a backup to a main tracker.  Having had little success with commercial offerings, I thought I’d make one.  I found a Waveshare model that uses the SIM868 GSM/GPS module, plus supporting electronics on a Pi Zero HAT.  So that plus a Pi Zero and suitable power supply would make a fairly small backup tracker.

The device supports GSM (calls, texts) and GPRS (2G, i.e. slow data).  It also has a GPS receiver.  It seemed attractive to use GPRS to provide internet access (via PPP), but that would lock out the single serial port, thus making GPS unavailable.  So I decided to just send SMS from the device instead, using a script that gets the GPS position, then builds and sends an SMS containing that position.  I wrote this in Python using the PyGSM library, which makes things very easy (generally no need to mess around with AT commands).  PyGSM doesn’t know about the SIM868 GPS functions, but it was simple to add those.  So my test script requests and parses the GPS position, then formulates a text message and sends it to my mobile phone.
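As a sketch of that extension: the SIM868 returns its position in response to AT+CGNSINF, and parsing the reply plus building the SMS body can be as simple as this (the field order is as I understand it from SIMCom's documentation, and the helper names are mine, not PyGSM's):

```python
def parse_cgnsinf(reply):
    """Parse a SIM868 'AT+CGNSINF' reply into (time, lat, lon, alt).

    Field order assumed from the SIM868 manual: run status, fix status,
    UTC datetime (yyyymmddHHMMSS.sss), latitude, longitude, altitude.
    Returns None if there is no GPS fix yet.
    """
    fields = reply.replace("+CGNSINF:", "").strip().split(",")
    if len(fields) < 6 or fields[1] != "1":
        return None
    utc = fields[2]
    hhmmss = utc[8:10] + ":" + utc[10:12] + ":" + utc[12:14]
    return hhmmss, float(fields[3]), float(fields[4]), float(fields[5])

def position_text(payload_id, pos):
    """Build the SMS body: '<id> position: <time>, <lat>, <lon>, <alt>'."""
    time_, lat, lon, alt = pos
    return "%s position: %s, %.5f, %.5f, %.1f" % (payload_id, time_, lat, lon, alt)
```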

Live Map

It would also be useful to have the balloon position automatically uploaded to the live map, so I decided to have the device send a second SMS but this time to a gateway based at home.  This gateway is another Pi with a USB 3G modem attached.  I used the same library, but a different script to poll for new messages, determine whether an incoming message is of the correct format, and if so build a UKHAS telemetry sentence, finally uploading it to habhub for the live map:
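The UKHAS telemetry sentence the gateway builds is a comma-separated string protected by a CRC16-CCITT checksum.  The real script uses the crcmod package (installed below), but a self-contained sketch of the format looks like this (the field list shown is the minimal one; real sentences may carry more fields):

```python
def crc16_ccitt(data):
    """CRC16-CCITT (poly 0x1021, init 0xFFFF), as used by UKHAS telemetry."""
    crc = 0xFFFF
    for byte in data.encode("ascii"):
        crc ^= byte << 8
        for _ in range(8):
            crc = (crc << 1) ^ 0x1021 if crc & 0x8000 else crc << 1
            crc &= 0xFFFF
    return crc

def ukhas_sentence(payload_id, counter, time_, lat, lon, alt):
    """Build '$$<id>,<counter>,<time>,<lat>,<lon>,<alt>*<crc>\\n'.

    The checksum covers everything between the '$$' and the '*'.
    """
    body = "%s,%d,%s,%.5f,%.5f,%.0f" % (payload_id, counter, time_, lat, lon, alt)
    return "$$%s*%04X\n" % (body, crc16_ccitt(body))
```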

Hardware Build

For the tracker, mount the Waveshare GSM/GPS board on a suitable Pi (the Pi Zero is ideal – less power and weight).

For the gateway, use a Pi B+ (V2 or 3 or whatever you have handy), and connect a USB modem (e.g. Huawei E173, which is very common and works well on the Pi).

You can use the software without a gateway if you wish, in which case you will only have texts sent to a smartphone and not uploaded to the live HAB map.

Software Installation

Use the same instructions for both tracker and gateway.  Build a bootable SD card in the usual way, using Raspbian Lite.

Next, install git and the other dependencies:

sudo apt-get install git python-setuptools python-pip wiringpi

Install the pygsm library:

git clone https://github.com/adammck/pygsm.git
cd pygsm
sudo python setup.py install

and other Python dependencies:

sudo pip install crcmod

Install the tracker software:

cd ~
git clone https://github.com/daveake/GSMTracker.git



First, start the gateway software:

cd ~/GSMTracker
python gateway.py

Assuming the 3G modem is connected and working, with a valid SIM card, you should see something like this:

Modem details ...
Manufacturer = huawei
Model = E173

Phone number = +4476543210

Waiting for messages ...

Take a note of the phone number.

Payload Document

For the tracker to appear on the UKHAS map, it needs to have a payload document.  Create one using the habhub page, with these fields:

Set the payload ID to something meaningful (but please not “GSM” as that’s what I use!) and set the checksum type to “crc16-ccitt”.


First, start the GSM/GPS module by pressing the button on the side.

Now run the tracker program, using that phone number and also the number of your smartphone.  The format is:

python gsmtrack.py <payload_ID> <phone_number> [gateway_number]

For example:

python gsmtrack.py GSM 07987654321 07876543210

where the “payload_ID” must exactly match the ID you used in the payload document; the phone number is that for your smartphone, and the optional gateway number is that of your gateway to upload to the map.  You should see something like this:

Texts will be sent to mobile phone 07987654321
Texts will be sent to gateway number 07876543210

Modem details ...
Manufacturer = SIMCOM_Ltd
Model = SIMCOM_SIM868

Switching GPS on ...
Position: 16:49:54, 52.12345, -1.23456, 155.611
Send because of timeout
Send because of horizontal movement
Send because of vertical movement
Sending to mobile 07987654321: GSM position: 16:49:54, 51.1....
Sending text to gateway
Sending to gateway 07876543210: HAB:GSM,1,16:49:54,51.1...
Position: 16:50:05, 51.12335, -1.23458, 153.100


For the tracker, you should have it start up automatically.  This should include automatically starting the GSM/GPS device as it does not start up when power is applied.  Earlier we did that by pressing the button on the board, but we can automate that in a script:

#!/bin/bash
cd /home/pi/GSMTracker

gpio mode 7 output
gpio write 7 0
sleep 1
gpio write 7 1
gpio mode 7 input
sleep 5

while :
do
    python gsmtrack.py GSM 07987654321 07876543210
    sleep 5
done
Start that when Raspbian starts, using your preferred startup method.


For flight, package the tracker in a small foam polystyrene container, using a suitable power source.  I used a powerbank that accepts AA cells, populated with Energizer Lithiums; this is the safest option.  Remember to connect the GSM and GPS aerials, and have the latter at the top of the payload.

The tracker will send texts to your phone and gateway when it first gets a position, every 10 minutes thereafter, or more often if it detects horizontal or vertical movement.  The update rates are in the code and can be easily changed.  It will only attempt to send out texts when below 2000m.
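That decision logic can be sketched as a pure function (the thresholds here are illustrative guesses; the actual values are constants in gsmtrack.py and can be changed there):

```python
def should_send(pos, last_sent, seconds_since_send,
                timeout=600, max_alt=2000.0,
                h_threshold=0.001, v_threshold=50.0):
    """Decide whether to text the current position, returning the reason.

    pos and last_sent are (lat, lon, alt) tuples; h_threshold is in
    degrees and v_threshold in metres.  All thresholds are my own
    guesses at sensible values, not the ones in the real code.
    """
    lat, lon, alt = pos
    if alt > max_alt:
        return None                          # GSM won't work up there anyway
    if last_sent is None:
        return "first fix"
    if seconds_since_send >= timeout:
        return "timeout"
    if (abs(lat - last_sent[0]) > h_threshold
            or abs(lon - last_sent[1]) > h_threshold):
        return "horizontal movement"
    if abs(alt - last_sent[2]) > v_threshold:
        return "vertical movement"
    return None
```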

Posted in Weather Balloon | 5 Comments

Pi Zero GPS/GSM Tracker and Habitat Gateway

GSM-based trackers are quite rightly frowned upon for HAB tracking, mainly because they only work at low altitudes (within range of a mobile phone tower, and towers generally aim their signals downwards).  So they don’t provide tracking throughout a flight, which is a problem as then you don’t know where the payload is until it lands.

If you’re lucky.

There are 2 problems here – one is that GSM coverage isn’t 100%, and the other is that the popular GSM trackers don’t seem to like high altitudes.  I don’t know if they get confused, or they don’t like the cold, but I’ve tried these things several times and only had one work once.

A GSM/GPS tracker that actually works would be useful though, as a backup to a main tracker.  Having had little success with commercial offerings, I thought I’d make one.  I found a model that uses the SIM868 GSM/GPS module, plus supporting electronics on a Pi Zero HAT.  So that plus a Pi Zero and suitable power supply would make a fairly small backup tracker, and maybe even one that works.

The device supports GSM (calls, texts) and GPRS (2G, i.e. slow data).  It also has a GPS receiver.  It seemed attractive to use GPRS to provide internet access (via PPP), but that would lock out the single serial port, thus making GPS unavailable.  So I decided to just send SMS from the device instead, using a script that gets the GPS position, then builds and sends an SMS containing that position.  I wrote this in Python using the PyGSM library, which makes things very easy (generally no need to mess around with AT commands).  PyGSM doesn’t know about the SIM868 GPS functions, but it was simple to add those.  So my test script requests and parses the GPS position, then formulates a text message and sends it to my mobile phone:

It would also be useful to have the balloon position automatically uploaded to the live map, so I decided to have the device send a second SMS but this time to a gateway based at home.  This gateway is another Pi with a USB 3G modem attached.  I used the same library, but a different script to poll for new messages, determine whether an incoming message is of the correct format, and if so build a UKHAS telemetry sentence, finally uploading it to habhub for the live map:

UPDATE: Software now uploaded to github: https://github.com/daveake/GSMTracker

Note: Not tested in flight yet so I don’t know if the software and/or hardware chosen will work or not, so user beware!!

Posted in Weather Balloon | Leave a comment

Pi Zero W Streaming Dashcam

Tidying my office a few days ago, I came across some car reversing monitors that I used to use as cheap Pi displays for use in the chase car, to show the distance and direction to the payload; these days I use the official Pi touchscreen as it’s a lot better for that application.  One of the monitors is a flip-up model, and I wondered how much space there was inside.  I use the Pi Zero a lot for balloon trackers, as it’s small and light compared to other Pi models, but perhaps one could fit one inside the base to make a smart dashcam – one that can stream my balloon chases to Youtube as well as record to SD.

About the same time, Michael Horne (author of the excellent Pi Pod blog) posted a picture of a similar-looking model on Twitter, asking how to power it from 5V.  That’s the opposite of what I wanted to do (power the Pi Zero from the 5V rail inside the monitor) but I felt I might be able to help so I opened up my unit to find where the 5V could be tapped.  As it turned out, Michael’s unit had a very different PCB to mine, but the seed was sown so I decided to start building my dashcam.

Pi Zero to Monitor

First job was to connect the display to a Pi Zero W (W because I want to be able to stream the camera video).  This requires the 5V and GND lines on the GPIO pins, plus the composite video output pin, to be wired to their counterparts in the monitor.  Once I’d used the correct video pin this worked without issue!

I don’t know how much spare current capacity the display has on the 5V rail, but it dropped slightly from 5V to 4.9V which is OK.  The Pi Zero booted and ran continuously overnight with no issues and with nothing on the display PCB getting hot.

I connected the Pi Zero to my LAN via a USB LAN adapter, ssh’d to it, then set the video aspect ratio to match the monitor (16:9).



I also updated/upgraded Raspbian, and set up the WiFi.


Next steps were to enable and connect the camera, and to install ffmpeg, which is what I use to stream to YouTube.  I used these instructions to install the library:

cd /usr/src
git clone git://git.videolan.org/x264
cd x264
./configure --host=arm-unknown-linux-gnueabi --enable-static --disable-opencl
make
sudo make install

and then ffmpeg.  This takes several hours to build, so it’s a good time to find something else to do!

cd /usr/src
git clone git://source.ffmpeg.org/ffmpeg.git
cd ffmpeg/
sudo ./configure --arch=armel --target-os=linux --enable-gpl --enable-libx264 --enable-nonfree
sudo make
sudo make install

Aspect Ratio

I then tested the video streaming to YouTube with a command like this:

raspivid -o - -t 0 -w 640 -h 360 -fps 25 -b 500000 -g 50 | ffmpeg -re -ar 44100 -ac 2 -acodec pcm_s16le -f s16le -ac 2 -i /dev/zero -f h264 -i - -vcodec copy -acodec aac -ab 128k -g 50 -strict experimental -f flv rtmp://a.rtmp.youtube.com/live2/<stream ID>

which worked to YouTube; however, the preview displayed on the monitor was distorted.  I thought that this was to do with Raspbian not correctly applying the screen resolution, but nothing I tried (specifying the resolution explicitly in config.txt, or specifying the preview window dimensions) fixed it.  Eventually I concluded that the issue was within raspivid, and I soon found a very relevant forum post which explained how to modify raspivid:

git clone https://github.com/raspberrypi/userland.git
cd userland

Then add this line near line 116 of RaspiPreview.c:

param.noaspect = MMAL_TRUE;

then rebuild.  After following these steps, the problem was gone!


With the base plate removed, I threaded the flat camera cable up through the back of the base, behind the support that goes up to the top of the display, connected the camera and fixed it in position with Sugru:

I then added a piece of black duct tape covering the cable, to stop it snagging and to make it look tidy.


The Pi Zero has 3 wires connected – 5V, 0V and video, which on my monitor go to a convenient electrolytic capacitor and the back of a PCB socket.  Finally, a switch is mounted near the front of the monitor’s base, and connected to a GPIO pin and GND:

Everything was then insulated with duct tape before screwing on the metal base.


I wanted the dashcam to work in one of 2 modes – to record as a dashcam normally would, and to also live stream to YouTube (via the WiFi connection to a phone or MyFi device, for example).  So I wrote a small script that switches modes according to the position of a switch connected to a GPIO pin.  On startup, or when the switch position changes, the script runs the appropriate shell command for that mode.  For regular dashcam recording, that’s a simple raspivid command; for streaming it pipes the raspivid output through ffmpeg (see command above).  At present I’m not recording the video, so I still need to add that, using rotating file names (over-writing old files before the SD card fills up) and recording at a high resolution while streaming at a lower resolution.
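Sketched in Python, the switch-to-command mapping might look like this (the command strings are abbreviated; the recording options and filename pattern are my own guesses, and <stream ID> must be replaced with your YouTube stream key):

```python
# Hypothetical command strings - resolutions, the -sg/-wr rotation options
# and the filename pattern are illustrative, not the script's actual values.
RECORD_CMD = ("raspivid -t 0 -w 1920 -h 1080 -fps 25 "
              "-sg 60000 -wr 10 -o /home/pi/dash_%04d.h264")
STREAM_CMD = ("raspivid -o - -t 0 -w 640 -h 360 -fps 25 -b 500000 -g 50 | "
              "ffmpeg ... -f flv rtmp://a.rtmp.youtube.com/live2/<stream ID>")

def command_for_switch(switch_high):
    """Map the GPIO switch position to the pipeline to run."""
    return STREAM_CMD if switch_high else RECORD_CMD
```

The script polls the GPIO pin, and when the switch changes it kills the current pipeline and starts the other one with the appropriate command.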



Posted in Weather Balloon | 2 Comments

How To Choose A Tracker

I sometimes receive emails asking which HAB tracker is best to buy, so here I will compare the ones I have direct knowledge of.  First though, if you have or want to have the ability to build and code your own tracker, do that instead!  It’s much more educational to walk that path, rather than take the easy option of buying a ready-made tracker.  It’s also far more rewarding.  Or if you prefer, you could build your own tracker but use existing software, or buy a pre-built tracker and write your own software.

If you want to DIY the electronics and/or software, and if you can you should, then check out these resources:

Now for the comparison, including the PITS (Pi In The Sky) board, PITS Zero and HABDuino:

| Feature | PITS (Full Size) | PITS Zero | HABDuino |
| --- | --- | --- | --- |
| Transmits RTTY | Yes | Yes | Optional |
| Transmits LoRa | Optional (choose from 434, 868 and 915MHz add-on cards) | Yes (choose from 434, 868 and 915MHz modules) | No |
| Transmits APRS | Optional (choose from 144.390MHz (USA), 144.800MHz (Europe) and 145.175MHz (Australia)) | No | Optional (choose from 144.390MHz (USA), 144.800MHz (Europe) and 145.175MHz (Australia)) |
| Transmits Live Images | Yes, over RTTY or LoRa | Yes, over RTTY or LoRa | No |
| Soldered connections for external DS18B20 | Yes | No | No |
| Soldered connections for external BMP085/BMP180/BME280 | Yes | No | No |
| Supports Pi Sense Hat | Yes | No | No |
| Software predicts landing position during descent | Yes | Yes | No |
Posted in Weather Balloon | 1 Comment

I’ve Got No Strings

NASA have given us many iconic photographs, such as “EarthRise” and “Man On The Moon“, but there’s one which is more easily replicated if you’re not NASA or Elon Musk, and that is the image of Bruce McCandless floating freely in space during the first untethered spacewalk:

So several months ago I set about replicating this as best I could, under a high altitude balloon.

Revell Model

Revell USA made an astronaut/MMU kit back in 1984; samples now are rare and expensive, but eventually I found an unused kit in the UK on ebay.  It was a good price, complete and original:

The kit arrived late last year, and I assembled it over the Xmas break, in preparation for a launch on (hopefully) February 7th, 34 years to the day after the original flight.


I wanted to include several cameras for the flight, both video cameras and still cameras for downloading live to the ground during the flight.  Having tried several different action cameras in the past, my current favourite is the Gitup Git2 camera – reliable, inexpensive and plenty of options in the firmware.  I combined 2 of these (one Git2 with wide-angle lens, and one Git2P with a normal lens) with some AA-powered powerbanks to extend the run time from about 90 minutes to several hours using 64GB SD cards.

I also wanted live images, ideally from different viewpoints.  The only reliable live image cameras I’ve used are the Pi models, and these are one-per-Pi.  So I built a small network with 3 Pi boards using the built-in wireless modules to pass image files between them; a Pi 3 as an access point, and Pi Zero Ws as clients.  All live images were then downlinked in sequence by one Pi using LoRa.

In the end it wasn’t really feasible to set up vastly different viewpoints as the astronaut model is quite large and the payload would have then been huge (and cumbersome, and delicate), so I had the cameras all quite close to each other.

Payload Build

I decided to place each Pi in its own Hobbycraft box.  The Zeroes are very small of course and even with an AA powerbank there was space to fit a video camera with its own powerbank inside the box:

Next came the main Pi, with its own camera, 3G (later removed due to insufficient power from the powerbank) and UBlox USB GPS, all inside 2 of the same boxes glued together:

Finally, I added a backup tracker in case the main one failed for any reason, and to provide a programmed cutdown to prevent the flight drifting too far:

There was one last thing to do – my Revell model isn’t identical to the version that NASA flew, and most prominently was missing a camera.  Easily fixed with some foam polystyrene and a plastic cap!

Bruce Junior was now ready for flight!

The Flight

The flight predictions for my chosen date were not ideal, but quite good for the time of year.  Initially I was going to have help from another HABber but he couldn’t make it that day, so I launched alone.  To make that task easier I removed some items from the flight, allowing for a smaller balloon and less gas.  I also chose to launch later in the day than planned, which meant I didn’t need to overfill the balloon so much to keep it away from the sea.  Here’s the predicted flight path:

Less gas means less lift which makes it easier to tie the balloon and handle it afterwards. Aside from the cold, it was a very nice day to launch – fairly clear skies and not too much ground wind.  Here’s the partially-inflated balloon:

Meanwhile the payload cameras were recording and transmitting to the other HAB enthusiasts online:

With the balloon fully inflated, tied off and tied to the parachute and payloads, it was time to launch:

I then finished getting aerials set up for the flight, finished filling the car with kit, and then set off on the chase.  I knew that the flight was going to land some time before I arrived, so I wasn’t in as much of a rush as usual.  Meanwhile the payload continued to rise till, at over 30km and just under 100,000 feet, the balloon burst.  Here’s what happens to an aerodynamically asymmetrical payload when a high altitude balloon bursts and gravity takes over!

The flight computer includes its own landing prediction which, as I’ve seen every time before, is more accurate than the one the online map uses.  Here “X” is the last prediction from the tracker, with “O” being the actual landing spot and the red line showing the online prediction:

That last position was from my chase car, which was still on the M5 and over an hour away at the time!  Here’s what happened when I was trying to catch up:

Normally when I get close to a landed balloon the radio signal reappears and I can get the landing position easily.  This time though the landing was on a farm behind some metal cowsheds, blocking the signal from the nearby roads.  After driving past where I thought the signal should reappear, I found a hill, connected a Yagi aerial to my handheld receiver and got a position that way.  Following that target I still didn’t regain the position until I got to the farm, when I could see a row of sheds between me and the landing position.  It’s always fun explaining to farmers why I’ve suddenly turned up, and this time one of them had actually seen it land.  Retrieval was easy, though muddy and rather smelly …

Here you can see the balsa-wood frame (for lightness and deliberate fragility) with pivoting support (again, to help prevent damage to whatever it lands on), with the balsa painted matt black so it disappears against the black sky at altitude.

The Payback

The point of all this effort was to replicate as closely as I could those original NASA images, so once home I went through the camera footage to select these …



Posted in Weather Balloon | 11 Comments

“Untethered” Spacewalk Flight

My next flight is planned for this coming Wednesday, 7th February, to commemorate the first untethered spacewalk by Bruce McCandless on the same day in 1984, by trying to recreate the classic image:

There will be a total of 3 Pi cameras with different viewpoints and different lenses, to best capture images of a Revell model astronaut and MMU (Manned Maneuvering Unit), sending live pictures down to the ground.

The flight will have 2 trackers, one a Pi with 434 and 868 LoRa modules and the other a simple AVR LoRa tracker with cutdown:

  • BRUCE: Pi, LoRa, 869.850MHz, Mode 3, SSDV and telemetry
  • MMU: Pi, LoRa: 434.225MHz, Mode 1, SSDV and telemetry
  • EVA: AVR, LoRa: 434.450MHz, Mode 2, telemetry only

The main tracker is a Pi 3 plus LoRa and UBlox boards.  The SSDV images will cycle between 3 cameras – one on the Pi 3 and 2 more on a pair of Pi ZeroW boards, all connected via wifi (the Pi 3 is an access point).  The cameras are (currently – may change): Pi V2 (Sony) camera, Pi V1 (Omnivision) camera, and a PiHut “fisheye” Omnivision camera.  The cameras are arranged on the payload to get different views, so you will see different views during the flight.  Same applies for both LoRa transmitters, with different images being sent to each.  If for any reason the wireless stops (though it’s been 100% reliable in testing), then the Pi3 will just send its own images.

Both include landing prediction fields, and those from MMU will be re-uploaded by a Python script to appear as “XX” on the map.

The LoRa signals will stop for a few seconds each minute, during which time one of my gateways will be sending a message up to the tracker to request a re-send of missing SSDV packets on the 869.85MHz link.

The Pi 3 also has a USB 3G modem on it, which will attempt to connect while below 2km.  When connected it will:

  • Upload telemetry directly to habitat every 1 minute, as payload ID STS41B
  • Stream video to YouTube
  • Copy images to a web server

The video uses the Pi 3’s camera, so there will be no SSDV from this camera before launch or after landing – all SSDV will be from the other cameras.

The backup tracker will be attached just above the parachute, and will cut the balloon away if the flight gets south of 51.1 latitude (I may change that depending on predictions), or on a specific upload from the ground, to prevent a watery death.

And just because the payload isn’t heavy enough already, and doesn’t have enough trackers on it, we are adding a couple of car GSM/GPRS trackers, both sending messages to Anthony’s traccar server from where a Python script will send them to SNUS, as:

  • HTGSM1 – Upu’s car tracker
  • HTGSM2 – My cheap car tracker

There will be 2 YouTube streams – one from the payload (launch and landing, hopefully) and one at launch only using a camcorder connected to a laptop.  Both streams will appear on a web dashboard – see links below:


Posted in Weather Balloon | Leave a comment

Bruce McCandless Spacewalk Commemoration

This is to commemorate the very first untethered spacewalk by Bruce McCandless on 7th February 1984, as part of Space Shuttle mission STS-41B, when he used an MMU (Manned Maneuvering Unit) to fly up to 300′ away from the Challenger Space Shuttle, and to replicate as best I can the famous photograph of him floating in space …

Sadly, Mr McCandless died late last year.

Of course the “untethered” aspect can’t be repeated under a balloon, gravity being what it is, but careful use of black supports against a black sky will make it look untethered.

The Kit

Fortunately, Revell USA made a combined astronaut and MMU kit in 1984.  Unfortunately they soon stopped making it, and examples are expensive and fairly rare.  I watched listings on ebay for a few months, but all were in the USA with expensive postage, until one popped up in the UK.  Not only was this the least expensive I’d seen, with reasonable postage, but also it was a completely original sample with the parts still in sealed plastic wrappers.

It’s probably 45 years or more since I assembled a plastic kit, so I had to buy the glue, paints and brushes before I started.  Assembly wasn’t difficult though the plastic in general was much thinner than the small Airfix kits that I remember from my childhood.

Live Cameras

The flight will include 2 LoRa downlinks: one in the 868MHz ISM band (more bandwidth for larger images) and one in the 434MHz ISM band (better range).

I want to be able to take photographs from different viewpoints, ideally:

  1. Straight shot from distance
  2. Side shot
  3. Close-up

One option would be to move the camera around with motors, but that would be delicate and likely to fail during flight.  Instead I’ve opted for 3 separate cameras.  This could possibly be done from a single Pi using USB webcams, but in my testing those are much less reliable than a Pi camera, and not as good quality either.

Another idea is to use a separate Pi for each camera.  I could then build 3 separate trackers, but for them all to use the 868MHz band I would need to have them take turns transmitting.  All do-able but a bit messy, plus there would be a lot of aerials!

Airborne Network

So instead, I decided to have one central Pi that has a camera, GPS and radio transmitters, plus 2 extra Pi boards just with cameras.  Networking 3 Pis could be done with a network switch and cabling, but wireless is a lighter option.  Recent Pis have built-in wireless networking (saving a bit more weight, and more reliable in my experience) so I settled on a Pi 3 as the tracker and access point, and 2 Pi Zero W boards as clients.  So that’s 3 Pi boards in total and 3 cameras.

Setting up a Pi 3 as an access point takes quite a few steps, especially when bridging the wireless LAN to the wired LAN, but there are clear instructions on the RPi web site.


I needed to modify my PITS software to cope with 3 cameras.  Normally, the tracker program (which is the one transmitting image packets) requests new images periodically according to the schedule in the configuration file, then chooses the “best” image and requests a conversion from JPG to SSDV format shortly before it finishes transmitting the current image.  One option was to modify this code to request 3 photos instead of 1, with 2 of those taken on the Pi Zeroes, and of course to cycle between the cameras for conversion and transmission.  Separately, a bash script takes photos and does the conversions.

I felt it would be simpler to remove some of this responsibility from the tracker, so that it just chooses which photo to send, choosing each time from a different camera.  So this means that we need a simple script to take photographs, and a simple script to do the conversion to SSDV.  The first of these scripts is run on each Pi, with the Pi Zero scripts also copying the photo files to the Pi 3.
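As an illustration, the per-camera photo script might look something like the sketch below.  This is a hypothetical reconstruction, not the actual PITS script – the filename scheme, hostname and paths are invented – but it assumes the real Raspbian tools: raspistill for capture and scp for copying photos from the Pi Zeroes to the Pi 3:

```python
# Hypothetical per-Pi photo script; filenames/hostname/paths are invented.
import subprocess
import time

def photo_filename(camera_id, timestamp):
    """Timestamped name so the tracker can pick the newest photo."""
    return "cam%d_%s.jpg" % (camera_id,
                             time.strftime("%H%M%S", time.gmtime(timestamp)))

def capture_and_copy(camera_id, dest="pi@pi3.local:/home/pi/photos/"):
    filename = photo_filename(camera_id, time.time())
    # Take the photo (raspistill is the standard Raspbian camera tool)
    subprocess.call(["raspistill", "-o", filename, "-w", "640", "-h", "480"])
    # On the Pi Zeroes, push the file to the central Pi 3
    subprocess.call(["scp", filename, dest])
```

On the Pi 3 itself the scp step would simply be omitted, since the photos are already local.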

Here’s the SSDV page showing the result, on the 434MHz channel (smaller images), with the Pi 3 cycling through all 3 cameras:

Weather permitting, I’ll fly this on February 7th, 34 years to the day after the original flight.

Posted in Weather Balloon | 2 Comments

Telnet Flight

This was a fun flight to provide a remote serial terminal on a Pi, between ground and a high-altitude balloon, using a bi-directional radio link.

LoRa Possibilities

Most high altitude balloon flights use a simple unidirectional data stream, sending the telemetry (balloon position and sensor data) and sometimes images too, from balloon to ground.  Most often this is RTTY or (in the USA) APRS, but there are alternatives such as LoRa, which more easily provides a means of reliably transmitting data to the balloon as well as from it.  This greatly expands the range of things we can do during a balloon flight, for example:

  • A ground station can request re-sends down to the ground of missing data (image data or anything else) – see http://www.daveakerman.com/?p=2195
  • A balloon can repeat data from other balloons, which might be flying or have landed – see http://www.daveakerman.com/?p=1850
  • Uplink to request cutdowns
  • Uplink to provide a guided parachute or parafoil with a new target landing position

You are in a maze of twisty little passages

Another possibility is to run a terminal session between ground (client) and balloon (host), allowing programs to be run on the balloon tracker as requested by a ground station:

This could even be used to change the tracker program, or have that program use new configuration parameters.  Here though I’m going to use an idea provided by Philip Heron – run an old text adventure game.  And to make this a group experience, I added a web dashboard that displays the terminal window in real (ish) time.  The following diagram shows how this is achieved in software:

The LoRa gateway is the standard release with modifications added to provide a server socket on a specified port, to which any network terminal program (e.g. putty) can connect.  In this case I have written a simple terminal program, in Delphi, that screen-scrapes the terminal window and sends the contents to a Python script, which then updates a web dashboard so that anyone with the URL can see what I see in my terminal program.  Separately (and not shown on this diagram) another Python script updates the same dashboard with the current telemetry, using data from the habitat system.
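The real gateway is a C program, but the server-socket idea can be sketched in a few lines of Python – the port number and the byte-by-byte queueing below are my own illustration, not the gateway’s actual code:

```python
# Sketch of the gateway's server socket (illustrative, not the real C code).
import socket
import queue

uplink_queue = queue.Queue()   # keystrokes waiting to be radioed up

def handle_connection(conn):
    """Read keystrokes from a connected terminal and queue them for uplink."""
    while True:
        data = conn.recv(64)
        if not data:           # terminal disconnected
            break
        for byte in data:      # queue each keystroke individually
            uplink_queue.put(byte)
    conn.close()

def serve_terminal(port=6004):
    """Accept terminal connections (e.g. putty) on the given port."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("", port))
    server.listen(1)
    while True:
        conn, _ = server.accept()
        handle_connection(conn)
```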

Each time a key is pressed in the terminal window, the key character is sent to the gateway, which stores it ready for radio upload to the balloon tracker.  Normally balloon trackers transmit all the time, but instead this tracker sits listening for an uplink, to which it replies immediately.  So the gateway program reads the keys sent to it from the terminal program, adds them to a custom message, and sends the result to the tracker.  Assuming the message arrives intact, the tracker replies with an ACK; any other reply (or no reply at all) results in the message being re-sent.
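The send/ACK/re-send logic might be sketched like this – radio_send() and radio_receive() are placeholders for the real LoRa driver calls, and the message framing here is invented purely for illustration:

```python
# Illustrative sketch only: framing and function names are invented,
# with radio_send()/radio_receive() standing in for the LoRa driver.
def build_uplink(keys):
    """Wrap pending keystrokes in a simple framed message."""
    payload = bytes(keys)
    return b"KEYS" + bytes([len(payload)]) + payload

def send_reliably(keys, radio_send, radio_receive, max_tries=5):
    message = build_uplink(keys)
    for _ in range(max_tries):
        radio_send(message)
        reply = radio_receive(timeout=2.0)   # wait for the tracker's reply
        if reply == b"ACK":
            return True                      # tracker got it intact
    return False                             # give up; caller can re-queue
```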

At the tracker, these messages result in those key codes being sent to the telnetd program (telnet daemon) which is an installable program on Raspbian.  That program provides a regular command interface – same as a login on a Pi using a keyboard and monitor – and any responses to those key codes are sent back from telnetd to the tracker program, where they are included in messages sent back to the ground.

Periodically, when the tracker receives an uplink it will reply with a telemetry string so that the gateway can upload the balloon position to habitat as usual.  The string is standard except for the addition of some status information about the uplink.  There’s also a timeout so that if no uplink is received for a while, telemetry is sent anyway (useful for tracking after landing).

Houston, We Have Another Idea

As the Apollo missions of the 1970s were a major part of my inspiration for my very first high altitude balloon launch, it seemed entirely appropriate to push the retro theme of this flight a stage further and try to replicate an Apollo mission control console.  So I grabbed a suitable photograph from the web, edited it fairly heavily, and incorporated it into a new web server program that populates the screen with balloon telemetry and the terminal session.  I wrote this in Delphi, with some Python to grab the telemetry from Habitat.  I opted for a green-screen monitor though (I later noticed) the Apollo screens were actually white.  I think green looks better!

Radio Waterfall

This is what it looks like in action, minus the web dashboard:


LoRa packets are up to 255 bytes, so long sections of text are downloaded in chunks of a bit less than that length (there’s some overhead of course), and the terminal window is updated in chunks too.  That window is screen-scraped every second, and the results are pushed to the web dashboard at that rate or slower (depending on the time needed to post to the server).  The following video shows the terminal window and dashboard, for a short session that includes logging in to the tracker, running a couple of basic Linux commands, and then starting the Colossal Cave adventure game.


Choice Of Frequency

We have a range of frequencies available to us for balloon flights, with different restrictions according to power and duty cycle.  For this flight the duty cycle (proportion of time spent transmitting) is between 50% and near 100%, so I had to choose a frequency in a band that allows that.

It’s also important to choose a frequency that doesn’t get a lot of use from other devices.  Receivers on the ground may be near ISM (Industrial, Scientific and Medical) devices such as oil level senders, weather stations etc., which can be bothersome when transmitting near the receiver.  The flight itself, though, can potentially hear transmitters from 100’s of miles away, so about a year ago I did a test flight to scan the spectrum and report on the signal levels as received by the balloon.  The results of that test showed that some frequencies are 15dB (which is a lot) better than others:

The quietest area does not allow 100% duty cycle (which is probably why it’s the quietest!) so I chose a frequency centred on the quietest area that does; namely 434.225MHz.


Normally I would use a PITS tracker, but as I didn’t need RTTY I decided to use a custom GPS+LoRa board, atop a Raspberry Pi model A+.  Power came from an AA “emergency phone charger” with 4 Energizer Ultimate Lithium cells.  The lid of this was firmly taped down with duct tape, and the cells held in place with double-sided pads, but even so the Pi rebooted when it landed.  Not an issue but a reminder that soldered cells are best!

As usual, the lot went into a Hobbycraft polystyrene box, with GPS aerial on the top and a 1/4 wave aerial on the bottom, made from an SMA bulkhead plug and 5 pieces of guitar wire.
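For anyone making their own, the element length for a quarter-wave aerial is easy to calculate.  This sketch uses the plain free-space figure for the 434.225MHz downlink; builders often trim a few percent for velocity factor:

```python
# Quarter-wave element length for a given frequency.
# Free-space figure only; real wire elements end up slightly shorter.
def quarter_wave_mm(freq_hz, c=299_792_458.0):
    wavelength_m = c / freq_hz       # speed of light / frequency
    return wavelength_m / 4 * 1000   # quarter wave, in millimetres

# For the 434.225MHz downlink this gives roughly 173mm per element.
length = quarter_wave_mm(434.225e6)
```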

The Launch

The conditions were favourable, without too much ground wind (makes it difficult to launch) or high level winds (can take the flight a long way away); I wanted the flight to stay fairly close to give the uplink the best chance of working throughout the flight.

First step was to get all the ground-station software started: LoRa gateway, Telnet terminal client, web server (for the Apollo dashboard), and Python scripts for updating the two dashboards (Apollo and thedash.com) with telemetry and terminal data.  I intended to use a gateway up on my Clark mast, for the best range, but that position is beyond the reach of my house wifi signal, and the TP-Link repeater I bought the day before completely failed to extend the signal far enough.  Soon I’ll have my shed (next to the mast) wired to the house network, and that problem will go away.  Meanwhile though, I had to go with the LoRa gateway in the house, using a short colinear aerial in the loft.

With the software all set up, I started the tracker, checked that the 2-way communications were all working as expected, and then filled the balloon.  I chose a 350g Hwoyee, a 24″ Sphereachute, and a gas fill to have the flight land south of Monmouth.  I needed to launch by about midday, as after that the flight would land further east, increasing the risk of a tree landing.  The launch itself was easy, with little wind.

The Flight

As mentioned, I only had a loft aerial to communicate with the flight, but that worked very well, both for uplink and downlink; very few missing packets on the uplink, and a last position from an altitude of 390 metres.

Once I got back to my PC after the launch, I typed a few commands into my remote terminal window, with the results then being relayed to the dashboard web pages along with telemetry:

The upper section shows the latest telemetry, as received either by my gateway or by at least one of the other gateways operated by the HAB community.  The left side shows what the payload is reporting for messages it has received from the ground; the right side is basic GPS information.

The lower section shows a copy of the terminal window from my PC.  There is some latency in the system, with screen updates relying on the approx 1400 bps downlink from the balloon, plus some delays as the terminal is polled then changes sent to the web server and distributed over the web.  It was though entirely usable.


Next step was to run the text adventure “Colossal Cave”, which I remember playing around 1980.  This can be installed on a Pi with

sudo apt-get install bsdgames

I’d already installed it (as the payload doesn’t have internet access!), so I just needed to run it by typing “adventure” into my terminal window.


One of the other balloonists on IRC asked if I could do some ASCII art, and conveniently I’d already installed figlet which does that!


Another request was to reboot the Pi, so I obliged:

I don’t have a screenshot from the live reboot, but I do have one from when I previously tested this on the ground:


As mentioned, this flight wasn’t expected to go far, so I left the chase until shortly before the flight landed.  We had a last position at 390 metres altitude which is plenty good enough to then find a position within radio range of wherever the payload actually is.  Here’s the path that the flight took:

We parked up near the last position, switched on our mobile LoRa gateway, and soon received a new position with the landing spot.  This was close to the lane that we were on, so we parked opposite and took a look.  The payload was hanging from a small tree in someone’s front garden.  We rang the doorbell, several times, but nobody was in, so as the payload was very close we just grabbed it from the tree.

and took it back to the car:

So, a very successful flight, and though the remote terminal and game-playing was just a bit of fun, it did show just how reliable the LoRa uplink is, and I’ll use that for other purposes in forthcoming flights.



Posted in Weather Balloon | 3 Comments

Pi LoRa 868MHz Flight

This was a simple flight, partly to try out a fisheye camera for the Pi Zero, partly to try streaming the launch to YouTube from a new camcorder, and partly to get a launch in while the weather is good for it!


I wrote previously about how to stream to YouTube from a DSLR.  That technique used the USB connection from the camera, but better quality is possible by feeding the HDMI output from a suitable camera into an HDMI to USB video capture device.  However, many DSLRs superimpose focus rectangles and other items onto their HDMI output, which is not what we want.  For some Canon DSLRs (but sadly not my EOS 760D!) some third-party firmware can make the HDMI output clean, so I needed to find another camera to use.  One option was to buy a cheap/older Canon DSLR, but in the end I opted for a Panasonic V160 which has a 38x optical zoom, is small and very light (perhaps too light!), and can accept a larger than standard battery for longer run time (or can run from USB power).  I paired this with a cheapish fluid head on a heavyish tripod.

For capturing the HDMI feed, I bought an AverMedia HDMI to USB capture device which had HDMI in and out sockets, and a USB connection to a PC.  It’s really very very good, once you get past the rather odd software user-interface which wants to be a game rather than a program.  The software can stream to various services including YouTube, and will authenticate to YouTube so you don’t have to mess around with URLs or video ID codes.  Provided you have the uplink bandwidth (which for me means using 4G rather than the pathetically slow FTTC connection) then streaming is very smooth indeed, mainly as a result of the device doing H.264 compression internally.


I streamed the launch live to YouTube, and you can see the results here in my channel.


I made the tracker a few weeks ago, similar to this design but with a fisheye camera from Pi Hut, and using an 868MHz LoRa module so I could send fairly large images down during the flight.  As the range of these wideband transmissions is much shorter than for more normal settings, I set up a LoRa gateway connected to a high-gain Yagi antenna atop an ex-Army 12 metre Clark mast.  This gateway uplinked messages to the flight, for re-sending any missing image packets, and separately another gateway listened only, using a 1/4 wave antenna through a filtered pre-amp.

For the flight, I removed the tracker from that case and placed it inside a foam plastic box from Hobbycraft, powered by a cheap AA*2 powerbank.  However, this combination failed to gain a GPS lock, so I swapped the powerbank for a 4*AA model and used 2 cases taped together so I could keep the powerbank away from the GPS aerial (1/4 wave wire as in the photo above).  The result had no problem at all getting a good GPS lock with plenty of satellites, however the weight went up from 95g to 195g.  Balloon was a 1600g Hwoyee with hydrogen.


The launch was delayed by the GPS issue and by wanting to wait until I had some help, by which time the wind had gone from “nothing at all” to “mainly blustery”.  So filling the balloon was fun, as can be seen on the video, and I had to take the balloon down near some trees before launching it.  After that, the flight itself went smoothly, and pretty close to the prediction, albeit a bit higher, finally bursting at 43,014 metres (I expected 41-42km).

We tracked the flight in our chase car, both via the live map and also with direct reception using the 868MHz LoRa gateway in the dashboard.  As I mentioned, range on 868 isn’t that good, so for a long time we had no direct data or images, but once we got in range reception was very good.  We were about 10 minutes away from the flight when it landed, and got our last position when it was at about 775m altitude.  We then tapped the predicted landing position (as sent by the tracker itself) into my phone and drove up to the landing spot easily.  When we got there, Julie first spotted the payload and parachute just metres away from the road.

Even better, there was an open gate just behind where we parked, so a very very easy recovery.

Finally, some photos from the flight.

Just after launch:

Near peak altitude:

Just before landing:

Posted in Weather Balloon | Leave a comment