BBC Microbit Balloon Tracker

I’ve been meaning to do this for a while, and a short gap between projects gave me some time to try.

The Microbit is (yet another) educational SBC, sitting somewhere between the Codebug and a Raspberry Pi.  Its processor has enough flash memory and RAM to run a basic tracker (but more on that later), plus it has accelerometer and compass chips.

Importantly, the Microbit has SPI and I2C busses plus a serial port, all brought out to the edge connector on the bottom. Rather than solder directly to the pads, I bought an edge connector and teeny prototyping board:

I also bought a battery holder with cable and plug to suit the micro JST connector on the Microbit.

Balloon Tracker Hardware

To make a balloon tracker, we also need to connect a suitable GPS (by which I mean, one that still sends positions when at high altitudes) and an ISM band radio transmitter.  I chose a UBlox module from Uputronics:

Usefully, this design includes an I2C port as well as the usual serial port.  Since the Microbit serial port is normally used by the debug connection to a PC, software development becomes more difficult if we use that serial port for the GPS, so I2C makes life much, much easier.

Now for the radio.  The most popular HAB option is the NTX2B radio transmitter, but that also needs a serial port, so instead I opted for a LoRa transceiver from Uputronics:

This has an SPI interface, so the serial port remains free for debug purposes.

The first job was to get the devices wired together.  There’s not much space on this prototyping board, and it can be useful to keep the GPS away from the other devices anyway (less interference), so I put the GPS and radio on wire tails:

GPS Software

There are several options for writing code for the Microbit, and I opted for MicroPython as I’ve been writing a lot of Python lately, using the Mu editor/downloader.  I started with some simple code to grab the NMEA data stream from the GPS, and this took just minutes to get going:

I then ported my Pi Python GPS NMEA parser (which meant just changing the code to use the Microbit I2C library rather than the Pi serial port).  You can see my test program here (but please don’t use that for a flight, as it was written for car use and therefore doesn’t put the GPS into flight mode!).
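The I2C read loop is simple enough to sketch here (this is an illustration rather than my exact flight code): u-blox modules answer on I2C address 0x42 and pad the stream with 0xFF filler bytes when no data is waiting, so we just skip the filler and split on line-feeds.

```python
# Accumulate NMEA bytes read over I2C from a u-blox GPS.
# 0xFF is the u-blox "no data pending" filler byte and is discarded.

def feed(buf, chunk):
    """Append non-filler bytes to buf; return (buf, list of complete sentences)."""
    sentences = []
    for b in chunk:
        if b == 0xFF:              # filler: nothing waiting in the GPS buffer
            continue
        buf.append(b)
        if b == 0x0A:              # LF terminates an NMEA sentence
            sentences.append(bytes(buf).decode("ascii", "ignore").strip())
            buf[:] = []
    return buf, sentences

# On the micro:bit itself, the loop would look something like:
#   from microbit import i2c
#   buf = bytearray()
#   while True:
#       buf, lines = feed(buf, i2c.read(0x42, 32))
#       for line in lines:
#           process(line)          # your NMEA parser
```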

LoRa Radio Software

I also have LoRa Python code from another project, so after testing that the device was connected OK (a few commands typed into the Microbit REPL interpreter), I ported that over.  The changes were for the SPI library, plus I had to remove all the LoRa register/value definitions as they made the program source too large; the source is compiled on the device, so the compiler has a rather limited RAM workspace.  You can see the resulting test program here.
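That REPL sanity check is worth showing: SX127x-family LoRa chips (which the RFM9x modules use) return 0x12 from their RegVersion register (0x42), so a single SPI read proves the wiring.  This sketch assumes NSS on pin8 – an assumption, so use whichever pin you actually wired for chip select.

```python
# SX127x register reads send the address with the MSB clear, then a
# dummy byte while the chip clocks out the register value.

def read_frame(reg):
    """SPI bytes for an SX127x register read."""
    return bytes([reg & 0x7F, 0x00])

# Typed at the micro:bit REPL (SPI defaults to pins 13/14/15):
#   from microbit import spi, pin8
#   spi.init(baudrate=1000000)
#   resp = bytearray(2)
#   pin8.write_digital(0)              # assert chip select
#   spi.write_readinto(read_frame(0x42), resp)
#   pin8.write_digital(1)              # release chip select
#   print(hex(resp[1]))                # expect 0x12 on an SX1276/8
```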

To receive LoRa transmissions, you need another LoRa device as a receiver, plus suitable software.  I used my C LoRa Gateway code for the receiver:

Balloon Tracker Program

So far so easy, and the end goal seemed close; once you have GPS and radio modules working, you just need a small amount of extra code to format the GPS data as a string, adding a prefix (“$$” and the payload ID) and a suffix (“*”, then the CRC, then a line-feed), and then transmit the result over the radio.
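The sentence format is the standard UKHAS one, with a CRC16-CCITT (polynomial 0x1021, initial value 0xFFFF) calculated over everything between the “$$” and the “*”.  A sketch, with an illustrative payload ID and field list:

```python
def crc16_ccitt(data, crc=0xFFFF):
    # CRC16-CCITT as used for UKHAS telemetry sentences
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

def build_sentence(payload_id, fields):
    """$$<id>,<fields>*<CRC>\n -- CRC covers everything between $$ and *."""
    body = ",".join([payload_id] + [str(f) for f in fields])
    return "$$" + body + "*%04X\n" % crc16_ccitt(body.encode())

# e.g. build_sentence("MICROBIT1", [42, "12:00:00", 51.95, -2.54, 23456])
```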

However, as soon as I combined the GPS and LoRa code, the result wouldn’t even compile.  Remember that compilation happens on the Microbit, and my code was too large for that process:

Fortunately it wasn’t too much larger, so I removed some code that wasn’t strictly necessary (mainly the code that switches off unused GPS NMEA sentences) and soon the compiler was happy.

The resulting code, however, was not happy.  Once the compiler has finished, the resulting bytecode is loaded into the Microbit’s RAM, which it shares with any data used by the program (variables, stack, temporary work areas).  The nature of Python is that memory gets allocated all the time, and freed up when necessary (i.e. when there’s little free memory available), and my program would run for a short while before crashing with an “out of memory” error when it tried to allocate more memory than was available.  This is it working before it crashed:

So, I had to reduce the memory footprint.  I’m used to doing that in C on microcontrollers, but MicroPython needs different techniques.  For example, C code on a micro usually sits in flash memory, which is often less of a limit than the working data in RAM, so you can sometimes rewrite the code to use less RAM without worrying that the new code uses more code memory.  Not so for MicroPython, where everything shares RAM, so some things I tried actually made the situation (checked by calling gc.mem_free() in the main loop) worse.  For the most part, I managed to increase free RAM by removing code that I didn’t need.  Having done so, the program was stable, though free memory went up and down cyclically as memory was allocated each loop and then eventually freed up.

Some easy improvements came from removing the code to display the GPS satellite count on the LEDs, and from importing only the required modules instead of the whole Microbit library.  The most relevant part of the code turned out to be the part that builds up an NMEA sentence.  In C you simply allocate enough memory for the longest sentence you need to parse, then place incoming bytes into that memory using a pointer or index, checking of course for buffer overruns.  In Python, strings are immutable so you can’t do this, and the temptation then is to do “string = string + new_character”.  The Python interpreter then allocates memory for the resulting string, marking the old string as “no longer in use” so it can be freed up sometime later.  It’s pretty easy to end up with lots of unused memory waiting to be freed.  For now, my NMEA code explicitly frees up memory as each new byte comes in.  I did briefly change the code to use bytearrays, which are close to what I would do in C, but free memory reduced slightly (I assume the source took more space) so I went back to the original code.  Longer term, I’ll ditch NMEA and write code to use the UBX binary protocol instead.
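To illustrate the allocation problem (this is an illustration, not my actual parser): building a sentence with “+=” allocates a brand-new string for every incoming byte, whereas collecting characters in a list and joining once defers the allocation to a single join at the end.

```python
MAX_NMEA = 82   # longest legal NMEA sentence, including "$" and CR/LF

def parse_naive(stream):
    line = ""
    for ch in stream:
        line = line + ch            # new string object every iteration
    return line

def parse_lean(stream):
    chars = []
    for ch in stream:
        if len(chars) < MAX_NMEA:   # crude overrun guard, as in the C version
            chars.append(ch)
    return "".join(chars)           # one allocation for the final string
```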

The code has been running continuously now for over 12 hours, and the free-memory figure is solid (measured at the same point each time round the main loop).  I do need to add the flight-mode code, but that’s small and shouldn’t cause an issue :-).  If all is well then I hope to fly this (weather-permitting of course) on Sunday.

Finally, here’s the result of receiving the telemetry on a Python LoRa gateway program that I’ve been working on lately:

Posted in Weather Balloon | 1 Comment

Quick RTL SDR Comparison

As part of a recent project, I’ve used a few different RTL SDR devices, and was surprised how drifty some of them are, one in particular.  For their intended application – decoding wideband transmissions – this isn’t an issue, but if you want to use one to decode RTTY then it certainly is – the signal will soon drift outside of the audio passband unless the SDR is retuned.

My project is on a Raspberry Pi, where I found that all but one (see the test results below) was basically unusable.  So I did some quick tests on my desk, with a Windows PC running Airspy, for a crude visual comparison of drift rates.  I tested 4 devices:

  1. NooElec Nano 2
  2. A very old E4000-based SDR
  3. Current model R820T2 SDR
  4. NooElec Aluminium-cased SDR

1 – NooElec Nano2

Poorest of the bunch.

2 – E4000

Better.

3 – R820T2

Not better.

4 – Aluminium-cased

Much, much better.

As such, the metal-cased NooElec is the only one I could recommend.

Of course, there are much better SDRs out there – the Funcubes, SDR Play and Airspy models, and for chasing or tracking balloons you should really spend the extra money – but for bench testing then this particular RTL SDR is just fine.

Posted in Weather Balloon | Leave a comment

Raspberry Pi SSDV with a Compact Camera or SLR

Many HAB flights now use SSDV to transmit images “live” from the balloon down to the ground, using a camera connected to the flight computer, providing an immediacy that is missing when just flying standalone cameras.  Early SSDV flights used serial cameras connected to a microcontroller, but image quality (and, ease of programming) took a step forward when the Raspberry Pi arrived with simple access to webcams.  My first SSDV flight and the following 3 used a webcam, sending SSDV down over RTTY.

PIE1 SSDV Image

Webcams do not provide great image quality, which then improved when the Raspberry Pi camera came out.  There was also some excellent work by Chris Stubbs who managed to program a Canon compact camera to send SSDV directly, again over RTTY.


With RTTY we are limited to sending quite small images – around 500×300 pixels – because larger images take too long to transmit.  It is possible to increase the speed from 300 baud to 600, though it then becomes more difficult to receive, or to use the trick of multiple RTTY transmitters on the same tracker, such as this one:

Another option though is to replace RTTY with LoRa, which on 434MHz provides a speed increase of about 4x over typical RTTY speeds, and roughly another tenfold on top of that at 868MHz (all due to the varying bandwidth limits allowed within IR2030).  Further, LoRa allows for uplinks requesting that any missing packets be re-sent, and in the 868MHz band this produces some impressive results (the inset is an image from my first Pi SSDV flight, showing the improvement we now have in quality and resolution):

So, these bandwidths allow us to send rather higher quality images than before, to the point that the image compression is limiting quality.  With this in mind, Philip Heron added a quality setting to his excellent SSDV encoder/decoder to control the amount of compression applied.

With reduced image compression and higher bandwidths, the remaining factor is camera quality.  Whilst the Raspberry Pi cameras (especially the newer Sony) are quite good, they do have tiny sensors and simple plastic lenses.  A step up would be to use a compact camera, mirrorless system camera or an SLR.  These also potentially offer wider-angle lenses, making for more impressive HAB photographs.  However, we need to get those images to the flight computer.

Pretty much every modern camera allows for a USB PTP (Picture Transfer Protocol) connection, allowing it to be controlled by a computer to a greater or lesser extent.  For most cameras all we get to do is download images from the camera – and that’s all that most people need – but we also need to be able to take images under the control of our flight computer.


To take and transfer images we can use the Linux program gphoto2, with a compatible camera that supports remote operation (i.e. the ability to take an image via a command on the Pi).  The compatibility list includes few modern compact cameras, as the remote functions are typically only available on SLRs.  Canon, for example, used to include remote capture in their Powershot models but stopped this practice in 2009, presumably to persuade people to buy their SLRs instead.  I tested with an old Canon SLR (EOS 400D) and pretty much every function is supported – remote shooting, control of ISO, control of aperture/shutter (if the camera is set to semi-auto or manual mode).  However, I’m not especially keen on flying something as heavy and solid as an SLR with a wide-angle lens, so I checked the compatibility list for smaller, lighter alternatives.  Sadly none of my other cameras fitted the bill, so I purchased a Nikon S3300 compact.  This provides remote shooting (albeit without any control over aperture etc.), has a wide-angle lens (26mm equivalent for 35mm sensors) and a 16MP sensor, is small and light, and charges from USB (so the Pi should be able to keep it charged during flight).

Once gphoto2 has been installed (sudo apt-get install gphoto2), the first thing to do is connect the camera and check that it can be seen:

gphoto2 --auto-detect

This should produce a result like this:

Model Port
----------------------------------------------------------
Nikon Coolpix S3300 (PTP mode) usb:001,012

So far so good.  Now to find out what capabilities the camera offers:

gphoto2 --summary

Which will give you something like this (some parts removed):

Camera summary:
Manufacturer: Nikon Corporation
Model: S3300
 Version: COOLPIX S3300 V1.0
Vendor Extension ID: 0xa (1.0)
Vendor Extension Description: microsoft.com: 1.0;

Capture Formats: JPEG
Display Formats: Association/Directory, Defined Type, JPEG, DPOF, MS AVI, Apple Quicktime, MS Wave

Device Capabilities:
 File Download, File Deletion, File Upload
 Generic Image Capture, No Open Capture, No vendor specific capture

Device Property Summary:
Property 0xd407:(read only) (type=0x6) 1
Property 0xd406:(readwrite) (type=0xffff) ''
Property 0xd002:(readwrite) (type=0x6) Enumeration [1,2,3,4,5,6,7] value: 6
Date & Time(0x5011):(readwrite) (type=0xffff) '20161111T143911'
Flash Mode(0x500c):(readwrite) (type=0x4) Enumeration [1,2,3,4] value: Flash off (2)
Focus Mode(0x500a):(readwrite) (type=0x4) Enumeration [2,3] value: Automatic (2)
Focal Length(0x5008):(read only) (type=0x6) Enumeration [3500,4600,5300,6100,7300,8600,10500] value: 35 mm (3500)
Battery Level(0x5001):(read only) (type=0x2) Enumeration [2,5,25,50,65,80,100] value: 80% (80)
Property 0xd303:(read only) (type=0x2) 1

From this we can see that the camera supports “Generic Image Capture” (woohoo!) but no control over zoom (focal length is read-only).  Given that for a HAB flight I want the lens at its default widest setting anyway, that’s not an issue.

Taking a photo is simple:

gphoto2 --capture-image-and-download --force-overwrite --filename dave.jpg

This will extend the lens if it’s retracted, focus the lens, set the exposure (using whatever options are set within the camera), take a photograph and then download it to the Pi.

For more advanced cameras you may be able to control the exposure manually (aperture and/or shutter), control the ISO etc.  The available settings, and the specific commands to set them, vary from camera to camera but your starting point should be to list them all:

gphoto2 --list-config

The latest version of the Pi In The Sky software includes options for the use of cameras via gphoto2 (see instructions in the README).  With “Camera=G” in the pisky.txt file, gphoto2 and imagemagick installed, and a compatible camera connected and powered on, then PITS should take images on that camera and transmit them via SSDV.

Unlike with the Pi camera, images are taken at full resolution (or whatever resolution is set within the camera), and are then stored on the Pi SD card at that resolution.  The resizing for transmission is then done by imagemagick, which is why that has to be installed.
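The capture-and-resize step is easy to script yourself if you’re not using PITS; here’s a rough sketch using subprocess (the filenames and the 640-pixel width are just examples):

```python
import subprocess

def capture_cmd(filename):
    # gphoto2 takes a photo and downloads it to the Pi in one command
    return ["gphoto2", "--capture-image-and-download",
            "--force-overwrite", "--filename", filename]

def resize_cmd(src, dst, width=640):
    # giving imagemagick just a width preserves the aspect ratio
    return ["convert", src, "-resize", str(width), dst]

def take_photo(full="full.jpg", small="ssdv.jpg"):
    subprocess.run(capture_cmd(full), check=True)        # camera -> Pi
    subprocess.run(resize_cmd(full, small), check=True)  # full-res -> SSDV size
```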

In testing, the Nikon has been completely reliable, running for 11 hours continuously till eventually the battery was discharged (remember, it charges to some degree over USB, hence the long run time).  So this is looking good for a flight.  Here’s a sample test image as sent via SSDV/LoRa.


Posted in Weather Balloon | Leave a comment

LoRa PC Gateway

I generally use the Raspberry Pi to receive LoRa transmissions from balloons, and to upload the packets to the HabHub servers.  However, it might sometimes be more convenient to use a PC or Mac, or a phone or tablet, for internet connectivity, in which case we need some way of interfacing a LoRa module to those devices.

Hardware

Here I have used an Arduino Mini Pro, connected to the LoRa module via SPI and 2 I/O pins, and using software derived from my Handheld LoRa Receiver to allow control of the module via the Arduino’s serial interface.  I’ve built 2 such devices, the first of which connects to a PC via USB, using a Prolific PL2303 USB-serial adapter:


The second device uses the same firmware, but connects to the PC (or Mac, tablet, phone) via Bluetooth using an HC-06 Bluetooth serial adapter.  Power comes from a small LiPo, using a USB charging module.


Firmware

The firmware handles incoming packets directly, copying them to memory before sending to the host PC or mobile device.  It also sends various status values – current RSSI every second, and packet RSSI, SNR and frequency error before each packet.  It accepts simple commands to set the LoRa frequency, bandwidth, spreading factor, error coding, packet header type and low data-rate optimisation.
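Host-side, the serial stream is easy to consume.  The exact line format is of course specific to my firmware; purely as an illustration, suppose status values arrive as “Key=Value” lines – then a parser (with a hypothetical pyserial read loop in the comments) might look like:

```python
# "Key=Value" is an illustrative line format, not the firmware's actual one.

def parse_status(line):
    """Return (key, value) for a 'Key=Value' line, or None if it isn't one."""
    key, sep, value = line.strip().partition("=")
    if not sep:
        return None
    try:
        return key, float(value)
    except ValueError:
        return None

# A hypothetical pyserial read loop on the host:
#   import serial
#   with serial.Serial("COM5", 57600, timeout=1) as port:
#       for raw in port:
#           status = parse_status(raw.decode("ascii", "ignore"))
#           if status:
#               print(status)      # e.g. ("CurrentRSSI", -104.0)
```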

Software

Currently I’ve produced Windows software that communicates with either device using a virtual serial port (USB or Bluetooth), and expect to make that cross-platform soon (OSX, Android, iOS).  This program allows the LoRa parameters (frequency, bandwidth, spreading factor etc.) to be set, displays incoming telemetry packets, and optionally uploads those packets to Habitat (so the balloon position is displayed on the live map).  SSDV isn’t supported yet.


Build

To make your own device, you will need:

  • Arduino Mini Pro
  • Programmer for above
  • LoRa Module (RFM96 for 868MHz or RFM98 for 434MHz)
  • Wire and solder
  • SMA Socket
  • Suitable case

Plus for the USB version:

  • FTDI or Prolific USB-serial adapter

or, for the Bluetooth version:

  • HC-06 Bluetooth interface
  • LiPo battery
  • USB LiPo Charger
  • On/Off Switch

Connections from the Arduino to LoRa are described in the firmware.  Remember to connect GND on Arduino to GND on LoRa, and Vcc on Arduino to 3.3V on LoRa.

For the serial version, first check whether your USB adapter supplies 5V or 3.3V or both; for 5V you need to connect the 5V line to the Arduino “Raw” supply input; for 3.3V connect to the Vcc pin instead.  Also, connect 0V/GND from the USB adapter to the Arduino GND pin.  The USB Tx connects to Arduino Rx, and USB Rx to Arduino Tx.

For the Bluetooth version, the LiPo connects to the Arduino Raw pin via a switch.  The Bluetooth device then takes power from the Arduino 3.3V line.  Rx/Tx pins connect as above.  All GNDs connect together of course, and to the battery -ve wire.  The LiPo charger connects to the battery directly.

Download the firmware and program the Arduino with it.

To connect the USB version to a PC, just plug it in and hope that Windows installs the driver OK; if not, then download and install the driver appropriate to your device.  Check in Device Manager to see which serial port number it has installed.

For the Bluetooth version, connect and install a USB Bluetooth adapter if one is not already installed.  Power on your LoRa/Bluetooth receiver and then search for the Bluetooth device in Windows.  You should see “HC-06” show up.  If you are asked for a PIN, it is 1234.  Check in Device Manager to see which serial port number it has installed; if it doesn’t show then be prepared to wait – for some reason it can take several minutes.

If you are using my Windows software, download that to a new folder and just run the .exe file.  Choose the serial port that was installed earlier, and within a couple of seconds you should see the “Current RSSI” value start to update.  Choose your LoRa settings and click the “Set” button.  Once you’ve done that, you should start seeing packets arrive (assuming you have a tracker running using those LoRa settings, of course).

 

Posted in Weather Balloon | 4 Comments

One Little Cloud In The Blue Sky

On Saturday I helped with a school launch by Greg Tomlin, who drove here from Coventry with his SKYBLUE payload and a minibus full of excited schoolchildren. It was their first launch, but not Greg’s, as he’d launched twice before with his previous school.

Predictions were for a fairly gusty and showery day overall, but with a chance of launching in the morning before the wind got up and the clouds and rain arrived. Landing predictions were also good for the morning, but poorer later as the winds would take the balloon down to the Severn Estuary. I ran through several permutations of balloon size and gas fill, and finally opted for a 1600g balloon with a standard ascent rate of 5m/s.  To help keep the flight away from a watery end, I chose a slightly undersized parachute so the final descent wouldn’t drift too far south.

Greg kindly offered a free ride for one of my trackers if I had anything to test, and I did. I’ve been working on a tracker that uses a servo-controlled parafoil to guide a payload to a specific landing spot. I’ve run this through emulated flights but hadn’t flown it for real, so this was an opportunity to do just that, with the fallback of another tracker in case things went wrong (which they did!). So, I quickly put together RTLS1 (Return To Launch Site), without servos of course, but with all the software intact.  Hardware was an original Pi Zero (I didn’t want to fly a camera this time) and prototype PITS Zero board with LSM303 compass/accelerometer connected.  As an extra test, I added code for a BME280 pressure/temperature/humidity sensor. Together with the RTLS compass data, and various landing prediction and flight control values, there was quite a lot of telemetry to send, so I opted for a 140 bytes/second LoRa mode for transmission.


I had another reason to fly something. My wife’s maiden name is Cloud, so last year I bought a cloud necklace, with the intention of sending it up to near space so she owned a very high-flying cloud! I hadn’t got round to actually flying it, but with our 30th wedding anniversary in a few days this was a good opportunity! The launch day also turned out to be the 12th anniversary of when Julie’s dad died, so it was particularly poignant. To add one more coincidence, he used to be a CB operator with callsign “Skyblue”.

With Greg and team en route, I prepared for launch so we could get the balloon flying as soon as possible (delays would mean a higher chance of a wet landing). So when they arrived, I had my tracker online and payload sealed with line attached, groundsheet out, balloon tied to the filler, and lines tied to the parachute. Greg and his team wasted little time in getting cameras started, tracker running and online, and payload sealed up and tied to the parachute and my payload.


Meanwhile I inflated the balloon. The wind by now was quite gusty, but with quiet periods where I could get on with filling with gas and checking the neck lift.

After sealing the balloon and tying it to the payloads, I took it out to the field, followed by Greg and his team carrying the payloads, parachute and line.  Out in the middle of the field the wind was quite gentle, and as I let the balloon up it wasn’t far off being vertically above me.  Holding the lower (my) payload, I took a few steps downwind and launched.  The entire flight train then rose into the grey sky above.


Back in the house, I checked the transmissions and map.  Initially the live prediction was a bit alarming, showing a landing south-east of the Severn (I’d aimed for north-west), but then I remembered that the live predictor generally assumes a burst altitude of 30km, and ours should be 36km or so.  Also, the initial ascent rate with hydrogen is lower than the average, whilst the predictor assumes the ascent rate will be constant.

With a fairly high flight landing not far away, we didn’t have to rush into the chase vehicles.  So we had time to watch the flight progressing, and I had time to finish getting my chase car set up.  Part of that was starting up my LCARS-based touchscreen, and when I did I noticed that the RTLS1 position wasn’t updating.  A quick check of the telemetry showed that it had stopped at about 12km altitude, which was a very strong indication that the GPS wasn’t in flight mode.  It later turned out I hadn’t re-enabled that code after disabling it for my emulated tests.  Oops.  At least the SKYBLUE1 tracker was still working fine, and I knew that RTLS1 would start sending the correct GPS data once the flight descended back through 12km, and that I would have plenty of test data from it anyway.

Once it was certain that the landing point was going to be west of the Severn, we drove down to Monmouth to wait for the balloon to burst.  Parking at a convenient and free location (Lidl!), Julie bought some supplies as we all watched the flight proceed and then burst.  We waited a few minutes for the predicted landing position to stabilise, and then set off for the most likely landing area.  We had to change our target a couple of times, and were a couple of miles away when the flight landed (which isn’t bad considering how narrow and winding the roads are in that area!).  My LCARS system had a last position at 166m altitude, which was only 46 metres above the landing position, and 4 metres below the road near that position!  I later found out that my home receiver had a last position at 368 metres altitude, which again was very good considering the hills between the launch and landing sites.


Using that last position, we drove through a small forest (usually a bad sign when chasing a payload!) to a track which, according to the satnav, was the closest point we could get to by road, with the payload about 350 metres away.  I still had no signal from either tracker, which seemed very odd as normally I’d get a signal 1km or so away.  With a single tracker I’d wonder if the tracker hadn’t survived the landing, but it seemed unlikely that both trackers would stop.  So we kept going in the hope of getting a signal further along the road.  We still couldn’t get a signal, so we parked up and got out the Yagi which, with radials horizontal (meaning the payload was on its side or the aerial was squished against the ground), finally got a good, decodable signal.  Tapping that position into the satnav, we were directed back to the track we had passed earlier.  So we parked up, and Greg and I went to the adjacent house to find out who owned the land that we’d just dropped our payloads on, and to gain permission and hopefully directions too!


With that done, we opted to walk down the track, which got progressively more muddy and after a while wasn’t getting us any closer to the payload. By then we managed to get satellite mapping loaded on my phone, and it became clear that it was going to be better to go back to the house and find a different route. When we got there I chatted with the landowners – a retired couple in the house – and they couldn’t have been more helpful. The husband was recovering from an operation, but the wife offered to come out with us, so once a quick rain shower subsided she got her wellies and we all followed her down a footpath and across (in single file!) a field to a second field where the children soon spotted their blue payload. They seemed pretty excited!


I, of course, was relieved that I hadn’t lost Julie’s necklace!


Back at the house, the SKYBLUE team showed some of their photos to the landowners, and after some more chat we all left.


Julie and I decided to stop in Monmouth to have lunch by the river.


So, a very good flight, and I now have lots of real data to peruse before I start testing my RTLS project with a parafoil.

Posted in Raspberry Pi, Weather Balloon | 1 Comment

To Zero, And Beyond

As many reading this will know, I flew the new Pi Zero on the day it was announced, in order to test a prototype of our new PITS-Zero tracker board.  I’d been pleading with Eben, since I first saw a prototype of the original Pi Zero, that its low weight would make it ideal for live-imaging HAB applications, if only it had a camera port.  The camera is pretty much the entire reason for using a Pi for HAB – if you don’t want pictures then a smaller/lighter/simpler AVR or PIC microcontroller will easily do the job (and with less battery power) – so I felt that the CSI-less Pi Zero was a missed opportunity.  Eben agreed, and said he would try to make it happen.


So, when I received a sample Pi Zero with CSI port, I was keen to try it out.  However, launching an unreleased device, to possibly parachute down in front of a curious Pi fan, might not be the best idea in the world, so I had to wait.  Fortunately the wind predictions were good for a balloon launch on the Pi Zero CSI launch day, and the flight went well, albeit with the burst rather lower than predicted (balloons vary).

Sony Camera

I had hoped to fly the new Sony camera for the Pi, but in testing the camera would become invisible to raspistill after about 2 hours and roughly 200-300 photos.  2 hours isn’t long enough for a regular flight, and mine was expected to take more than 3 hours just to ascend, so this wasn’t good.  I searched the Pi forum and found that a couple of people using time-lapse photography had hit the same problem, and as it was a new issue with no fix or workaround yet, I had to opt for the Omnivision camera instead.  This of course gave me a reason to fly the same tracker again as soon as there was a solution for the Sony firmware issue; once there was, I tested it and planned my next flight.

Waiting For Baudot

"It's currently a problem of access to gigabits through punybaud"

I’ve written previously about LoRa, but the key points about these Long Range Radio Modules when compared to the old (first used from the air in 1922) RTTY system are:

  • Higher data rates
  • Longer range with the same rate/power
  • Can receive as well as transmit
  • Low host CPU requirements even for receiving

The higher data rates mean that we can send larger images more quickly (throughput is up to 56 times that of 300 baud RTTY), and the receiving capability makes it easy to have the payload respond to messages sent up from the ground.  For this flight, those messages are used to request the tracker to re-send any missing packets (ones that the receiving stations didn’t hear), thus reducing the number of annoying missing image sections down to about zero.  To give you an idea of the improvement, the following single large picture was sent in about a quarter of the time taken by the inset picture (from my first Pi flight, and at the same pixel scale):
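Semtech’s data-rate formula shows where that improvement comes from: the raw LoRa bit rate is Rb = SF × (BW / 2^SF) × CR, where CR is the 4/5…4/8 coding-rate fraction.  A quick calculator:

```python
def lora_bitrate(sf, bw_hz, cr_denom):
    """Raw LoRa bit rate (bits/s) for spreading factor sf, bandwidth bw_hz,
    and coding rate 4/cr_denom (cr_denom from 5 to 8)."""
    return sf * (bw_hz / 2 ** sf) * (4.0 / cr_denom)
```

For example, SF7 at 125kHz with 4/5 coding comes out at about 5.5kb/s, against roughly 240 bits/s of useful data from 300 baud RTTY.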


LCARS Chase Car Computer

For this flight, I tried out my new chase-car computer.  This has a Pi B V2, Pi touchscreen, LoRa module, GPS receiver and WLAN (to connect to a MiFi in the chase car).  The user interface mimics the Star Trek LCARS panels, and was written in Python with PyQt.  It receives telemetry both locally (LoRa, or RTTY via a separate PC) and also from the central UKHAS server if connected via GSM.

The Flight

As per the previous Pi Zero flight, this was under a 1600g balloon filled with hydrogen.  Predicted burst altitude was 42km, and I hoped that this time it might achieve that!  The payload was the same as last time:


except of course for the new Sony camera (manually focused for infinity, but not beyond) and a new set of batteries.

On the launch day the weather was overcast but forecast to improve a little, so I decided to wait for a gap in the clouds.  When that came, the wind did too (that wasn’t forecast!), which made filling the balloon interesting.

No my head hasn’t turned into a giant clove of garlic

Fortunately, the wind did drop for launch, and the balloon ascended towards the gap I’d mentioned in the clouds:


The LoRa system worked well (especially once I remembered to enable the “missing packet re-send” software!), and the new camera acquitted itself well too.  I used ImageMagick onboard to apply some gamma to the images (to replace contrast lost in the atmosphere) and to provide a telemetry overlay, including this one, which I believe is the highest image sent down live from an amateur balloon.

Cjaear8WEAA1ENN

Burst was a few metres later, comfortably beating my previous highest live-image flight.

And this was the last image it sent.  I guessed why.  Remember the camera stuck to the outside?  My guess was that after burst – when the payload suddenly finds itself without support – the line up to the balloon found its way behind the camera which it then removed as the balloon remnants pulled on it.  So, I can’t show you any images from the descent, but I can show you this shot of the Severn Estuary (processed to improve contrast) from the ascent:

15_20_48_shopped

In the chase car, I stopped at a point with a good view towards the landing area, so I could get the best (lowest) last position I could.  With the payload transmitting both LoRa and RTTY, I had my LCARS Pi receiving the former, and a Yaesu 817 with laptop PC receiving the latter.  With no images, the LoRa side dropped to sending telemetry only, which was handy as I was able to receive a lot of packets as the balloon descended. Overall LoRa seemed to be much more reliable from the car than RTTY did, despite the much higher data rate, and I now would be quite happy to chase a balloon transmitting high bandwidth LoRa and nothing else.

With the final position logged, I carefully tapped that into the car sat nav and then drove off to get the payload back.  10 minutes later I remembered that I’d coded exactly that function into my LCARS program!  2 screen-taps later, I had on-screen navigation (via Navit); I would also have had voice navigation but I hadn’t connected a speaker yet.

Both Navit and the car sat nav took me to a hill with the payload about 300 metres away.  I waited for another HABber to arrive – his first time chasing – and meantime I updated the other enthusiasts online, and took some photographs of the scenery; Shropshire is very pretty.

P1110661

Once Andy arrived, we walked down to the payload, watched (as often the case) by the local populace:

Ewe Looking At Me ??

As expected, the camera was missing, so if anyone wants a free Sony Pi camera, I can give you a 5-mile radius area to search.

P1110664

You don’t need CSI to see what went wrong here …

A lot of the balloon was still attached, which helps to explain how the camera was forcibly removed:

P1110665

So, a nice flight and recovery.  The Sony camera worked well; 868 LoRa worked well; the LCARS chase car tracker worked well.  Next time, more duct tape!

Posted in Weather Balloon | 4 Comments

Birthday Balloon

This was my second test flight of my 868MHz LoRa fast SSDV (Slow Scan Digital Video) software, to celebrate my 58th flight and 56th birthday.

Fixes

I made some changes to fix some issues with the previous test flight:

  • Set RTTY baud rate to 300 to prevent RTTY Tx during LoRa Rx period
  • New MTX2 programming code
  • Replace the UBX GPS code with NMEA code

The RTTY overrun was due to accidentally leaving the tracker set to 50 baud after a brief test.  The timing had been set using 300 baud RTTY (ensuring that the RTTY sentence had finished before the LoRa uplink slot), so when set to 50 baud the RTTY was still sending when the tracker was trying to listen on the LoRa frequency.  Although the frequencies are very different, this still slightly deafened the LoRa receiver and meant that only 12 messages were uploaded during that flight.

When the Pi starts up, the MTX2 radio transmitter is programmed to the configured frequency, and this process failed under some circumstances (depending on Pi operating system version and PITS version).  This was resolved and extensively tested.

During the previous flight, the payload went silent for long periods (1 minute or more) a few times, as the new GPS UBX code failed to get an updated time and position.  I traced this to some very slow responses to UBX requests via I2C, but didn’t have time to resolve the issue so I reverted to tried and tested NMEA code instead.

Additions

I also made some changes to the SSDV code, storing telemetry in the JPEG file comment field, by using the EXIV2 program.  This information is then available (via EXIV2) to the image processing script which can then overlay the downloaded image with on-screen text and/or graphics.  I added such a script using ImageMagick for the rendering.
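The EXIV2-plus-ImageMagick step can be sketched as a pair of command builders driven from Python.  This is a minimal sketch: the gamma value, font size and overlay placement here are illustrative choices of mine, not the values used on the flight.

```python
import subprocess

def exiv2_comment_cmd(jpeg_path, telemetry):
    """Command to store a telemetry string in the JPEG comment field."""
    return ["exiv2", "-c", telemetry, jpeg_path]

def overlay_cmd(jpeg_path, text, out_path):
    """ImageMagick command: apply some gamma and draw the telemetry
    text along the bottom of the image (styling is illustrative)."""
    return ["convert", jpeg_path,
            "-gamma", "1.3",
            "-gravity", "south",
            "-fill", "white", "-pointsize", "24",
            "-annotate", "+0+10", text,
            out_path]

def process_image(jpeg_path, telemetry, out_path):
    """Tag the image, then render the overlaid copy for display."""
    subprocess.run(exiv2_comment_cmd(jpeg_path, telemetry), check=True)
    subprocess.run(overlay_cmd(jpeg_path, telemetry, out_path), check=True)
```

Splitting the command construction out from the `subprocess.run` calls makes the pipeline easy to test without exiv2 or ImageMagick installed.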

Payload

I re-used the same payload as my previous test flight – a Pi A+, PITS+, LoRa board with single 868MHz module, Pi camera and 4 (fresh!) AA batteries.

The 868MHz antenna was below the payload with a 434MHz stubby antenna above (not an ideal position, but usable, and avoids compromising the 868MHz antenna).

Total weight including 18″ Spherachute and cord is less than 250g.  Ascent rate was targeted at 4.5m/s with an expected burst altitude of 31-32km using a 350g Hwoyee balloon filled with hydrogen.

Flight

The launch was easy enough, with very little wind (good, as I was alone).  There was some thick and fairly low cloud cover, and once through that the payload mostly sent fairly boring images of nothing but blue sky and white cloud.  After a while, though, it was high enough to see some better weather, and we started to receive more interesting images:

Note the overlay with telemetry along the bottom (altitude, latitude, longitude and UTC time), plus an altitude graphic with the balloon on a translucent background.

As the flight approached peak altitude, the pictures improved markedly, with this probably being the best one.  For comparison, I’ve inserted a live image from my very first Pi flight, at the same scale in pixels.  Roughly, the new image has 10 times the number of pixels, and took about 1/10th of the time to transmit.

For this flight there were 2 other gateways set up by other HAB enthusiasts, as well as my own.  Even with this limited coverage we still lost very, very few packets.  I had my Python script running to check for missing packets, and creating uplink messages to request repeats of those packets.  Throughout the flight, 35 such messages were uplinked, which is 3 times more than last time, but I still noticed that the uplink stopped working before the downlink did, so this is something to try and improve for next time.
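The core of that missing-packet script is simple set arithmetic.  Here is a minimal sketch; the uplink message format is my own illustration, not the actual protocol used on the flight.

```python
def missing_packets(received_ids, highest_id):
    """Return the packet IDs not yet received for an image,
    given the highest packet ID seen so far."""
    return sorted(set(range(highest_id + 1)) - set(received_ids))

def build_resend_request(image_id, missing, max_ids=16):
    """Format an uplink message asking the tracker to re-send some
    packets (message format is illustrative; payload uplink packets
    are limited in size, hence the cap on IDs per request)."""
    wanted = missing[:max_ids]
    return "RESEND,{},{}".format(image_id, ",".join(map(str, wanted)))
```

For example, if packets 0, 1, 3 and 4 of a 6-packet image have arrived, the request would ask for packets 2 and 5.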

With 3-4 gateways running we again saw an issue with SSDV uploads timing out.  I’d seen this before when running 3 gateways at my location, and assumed my internet connection was the cause, but now we were seeing the same thing with 3 gateways at 3 different locations.  Phil Heron checked the server and saw that it was, at times, struggling to keep up, so that’s something to look at for next time.

The balloon burst at an altitude of over 32km, and produced this image during descent (note, no missing packets which is impressive for the first image during descent):

Recovery

Since the flight was mainly about testing the radio side, I stayed at home checking the radio communications, aiming and swapping aerials, until the flight had descended beyond my range.  I then prepared the chase car and checked the final descent positions and landing predictions.  The landing area was just south of Avebury, near Cherhill White Horse, and about 1 mile from the nearest road.  From the Ordnance Survey mapping I could see that the area was hilly and that unless the payload happened to land on top of a hill then I was unlikely to receive a signal from the road.  I made sure to pack walking boots, backpack and mobile tracking gear as this wasn’t going to be a quick and easy recovery!

The landing area was about 90 minutes drive from home, and when I got there I set up a rooftop aerial and drove along the nearest road hoping to hear a signal from the payload.  Nothing.  As expected the area was hilly and I could see a bridleway up to a monument at the top of a hill, so I parked up, packed my backpack, and walked up the hill with the radio on, listening for the payload.

As I got near the top, I had a choice of taking the path up to the monument, or a short walk across a field to a ridge overlooking the valley the other side of the hill.  That sounded like a better (easier!) bet, especially as I thought the payload would be in that valley.  Sure enough, as I got closer, the radio burst into life with that familiar RTTY sound.  I sat down to get the laptop set up, and the signal almost disappeared – it really made a huge difference how high the aerial was.  So, with laptop and radio on the ground, I held the aerial up to get a decoded position.

Unsurprisingly then, the payload position turned out to be in the valley below:

listening

I then had a choice of a direct route down a very steep hill (see those contour lines!) or a more leisurely, slightly longer and rather safer walk across the ridge and down a more gentle slope:

choice

On the way I caught my first glimpse of the payload below (parachute on left; payload on right):

25380800024_a640c33646_k

So in the end, not a difficult recovery if we ignore the steep climbs!

For more photos, click this image:

26013649245_87bafab256_h

Posted in Weather Balloon | Leave a comment

Fast HAB Imaging

It’s now quite common for high altitude balloon flights to send down live images during the flight, as well as the usual telemetry.  However data rates from balloons are rather low, so each image can take several minutes to send even for rather low resolutions.

"It's currently a problem of access to gigabits through punybaud"

Maximum data rates are a function of several things – emitted power, receiver sensitivity, path loss, aerial gain, duty cycle (proportion of time that the transmitter is active), bandwidth (amount of the radio spectrum used by the signal) and the type of modulation – and these items have legal and practical limits.

For UK flights we use licence-exempt radio transmitters and so operate within the limits defined by IR2030.  For almost all flights, operators choose the band from 434.04-434.79MHz, because this range allows for continuous transmission (100% duty cycle) and is shared with the amateur radio 70cm band, meaning that receiving equipment is readily available.  Maximum power in this band is 10mW e.r.p. (effective radiated power), and the modulation scheme most often used is RTTY.  To allow reception by ham radios, the modulation has to fit within a frequency range that those radios will pass, which is about 4kHz.  Historically, radio transmitters have drifted in frequency considerably, so if the modulation uses a large proportion of this bandwidth then receivers need to re-tune almost continuously.  With this, and the increased packet loss at higher baud rates, 600 baud is a practical limit for transmission speeds with RTTY, and most flights with live imaging settle for 300 baud.  With modern temperature-compensated radio transmitters this could probably be pushed to 1200 baud with fair success.

LoRa Modulation

Anyone who follows this blog will be aware that I’ve been experimenting with LoRa modules, which use their own (patented, sadly) modulation scheme.  However LoRa modules are cheap and work well, so until something else comes along that is as good and open-source, then I’ll continue flying them.

LoRa chips offer bandwidths up to 250kHz, which is way beyond the practical limit with most ham radio receivers and indeed beyond that allowed in the 434MHz band that I mentioned.  Higher bandwidths can allow for higher throughput, though the receiver will then see more noise (as it has a wider frequency range to listen to) so the signal-to-noise ratio is worsened, meaning that range is compromised.
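The SNR penalty of wider bandwidth can be put in numbers: an ideal receiver's thermal noise floor is -174dBm/Hz plus 10·log10(bandwidth), so widening from 20.8kHz to 250kHz raises the noise floor — and therefore worsens the SNR for a given signal — by nearly 11dB:

```python
import math

def noise_floor_dbm(bandwidth_hz):
    """Thermal noise floor of an ideal receiver at room temperature:
    -174 dBm/Hz plus 10*log10(bandwidth in Hz)."""
    return -174 + 10 * math.log10(bandwidth_hz)

narrow = noise_floor_dbm(20800)    # about -130.8 dBm
wide = noise_floor_dbm(250000)     # about -120.0 dBm
penalty = wide - narrow            # about 10.8 dB worse SNR
```

That ~10.8dB has to come from somewhere — a Yagi, a pre-amplifier, or simply accepting reduced range.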

Some bands around 434MHz do allow for wider bandwidths, but only at limited duty cycles of 10% or less.  There’s not much point in using a bandwidth 10 times wider if you can only transmit for 10% of the time, as the throughput is back where you started!  However there’s a section of the 868MHz range that allows for both high bandwidth (300kHz) and continuous operation.  The downside (there is always a downside) is that transmitted power is even lower, at 5mW ERP.

LoRa devices operating within the 434MHz band mentioned above can use up to 20.8kHz bandwidth, giving a data rate equivalent to about 1400 baud RTTY.  Going from 20.8kHz to 250kHz gives nearly 17,000 baud equivalent.  This is still pathetically slow compared with 3G or home broadband, but it’s much, much faster than the 300 baud that we’re used to.  If, of course, it works.
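The 17,000 baud figure follows directly from the fact that, with the spreading factor and coding rate unchanged, LoRa throughput scales linearly with bandwidth:

```python
def scaled_rate(base_rate_baud, base_bw_hz, new_bw_hz):
    """LoRa throughput scales linearly with bandwidth when the
    spreading factor and coding rate stay the same."""
    return base_rate_baud * new_bw_hz / base_bw_hz

# ~1400 baud RTTY-equivalent at 20.8 kHz scales to roughly 16,800
# baud equivalent at 250 kHz
rate = scaled_rate(1400, 20800, 250000)
```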

Need For Speed

Higher data rates mean that, for example, instead of taking 5 minutes for a low resolution (about 600 x 300 pixels) image, we can send a 1280×640 image in about 70 seconds.  Here’s an example image being transferred with a bandwidth of 250kHz:

Packet Loss

As discussed, wideband modulation reduces the SNR seen by the receiver, so range is limited compared to narrowband modulation.  The lower transmission power (5mW vs 10mW) makes the SNR a little worse, as does the higher path loss at 868MHz vs 434MHz.  The end result is increased packet loss as the signal level drops, as the balloon payload swings around and as it gets further away.

We can mitigate these factors somewhat by using a Yagi aerial at the receiver and a filtered pre-amplifier.  Even so, and with error correction in the image packets, some packets will be lost resulting in blocks missing from the received images.  What can we do if that happens?

Play It Again

Another nice feature of LoRa modules is that they are transceivers, so we have the option of sending messages back up to the payload from the ground.  So, why not have a ground station take note of any missing packets, and then send a message to the payload asking for those packets to be sent again?  All we need is a way of spotting missing packets, and a scheme (e.g. TDM) that provides an uplink slot where the payload stops transmitting and starts listening for a short time.
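A TDM scheme like this boils down to carving each transmission cycle into a long downlink period and a short listen slot.  The cycle and slot lengths below are illustrative, not the values I fly with:

```python
def in_uplink_slot(seconds_into_cycle, cycle_length=60, slot_start=55):
    """TDM scheme: the tracker transmits for most of each cycle, then
    stops and listens for uplinks in a short slot at the end of it."""
    return slot_start <= seconds_into_cycle < cycle_length

# With a 60-second cycle the tracker would transmit for 55 seconds,
# then listen for the last 5:
assert not in_uplink_slot(30)   # mid-cycle: still transmitting
assert in_uplink_slot(57)       # end of cycle: listening for uplink
```

Both ends just need a shared notion of where they are in the cycle, which the regular downlink telemetry provides.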

It’s not difficult for a listening station (LoRa gateway) to build a list of missing image packets – it just needs to take note of which packets have been received.  However there may be several listening stations – this is a key feature of the UKHAS listening network – and there’s no point asking for a packet to be re-sent if it was already received by another station.  So, this logic needs to be done centrally.  With this in mind I spoke to Philip Heron who invented the SSDV (Slow Scan Digital Video) standard, wrote the encoding and decoding software that converts between JPEG and SSDV, and wrote the code on the SSDV server that accepts packets from listening stations and combines them into JPEG files which it displays on web pages.  Philip very kindly wrote a web API for the server, providing functions which a LoRa gateway can use to produce a list of missing image packets.

I had already written code in my LoRa gateway to produce this list locally, so I removed most of that code and instead created some new code that queries the SSDV server and produces a list of missing packets for the latest 2 SSDV images, for a particular payload.  I chose 2 because if the payload has recently started transmitting a new image, we need to know if it has to go back and re-transmit any missing packets from the previous image.
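The gateway-side logic then reduces to collating the server's per-image answers into one list of packets to request.  The response shape below is an assumption for illustration — the real SSDV server API may differ:

```python
def collate_missing(images):
    """Given decoded API responses for the latest images -- assumed here
    to be dicts with an 'image_id' and a list of 'missing' packet IDs --
    produce the combined, ordered list of (image_id, packet_id) pairs
    that the gateway should ask the payload to re-send."""
    wanted = []
    for image in images:
        for packet_id in sorted(image["missing"]):
            wanted.append((image["image_id"], packet_id))
    return wanted
```

Asking for the older image's packets first means the payload can finish off the previous image before getting too far into the new one.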

Test flight will happen when the weather is a bit better!

Posted in Weather Balloon | 4 Comments

Portable LoRa Gateway

This is a packaged version of my LoRa gateway, for use when chasing high altitude balloons.  Based on a Pi V2 and the official Pi screen, it currently features:

  • Receipt and upload of LoRa telemetry messages to http://tracker.habhub.org/
  • Receipt and upload of LoRa SSDV imagery to ssdv.habhub.org
  • Local storage of telemetry log and SSDV images
  • Chase car position upload to http://tracker.habhub.org/
  • LCARS (Star Trek) user interface with display of balloon position and images
  • WiFi connection to internet (e.g. via MiFi or mobile phone WiFi hotspot)
  • Approx 24 hours of run time from powerbank

Future plans include display of the balloon position on a map, and navigation to the balloon position including voice guidance.

P1110312

The parts are all readily available:

  • Raspberry Pi V2 (B+ or A+ will also work, but the extra speed is useful)
  • Raspberry Pi touchscreen display
  • Hofbauer XtraBag 200 Case, Black
  • LoRa board for the Pi
  • Powerbank
  • 2 Micro USB cables
  • Double-sided tape
  • Door trim
  • SMA plug-socket cable
  • 70cm aerial

There are instructions here for building and installing the LoRa gateway.  I will post again shortly with instructions on installing the LCARS interface.

I used a Dremel with cutting tool to cut a hole in the case, then stuck the display down with double-sided tape.  A length of rubber door insulation was then cut and placed around the display to provide some protection.  The powerbank was held down with Velcro, and connected to the Pi and display using short micro USB cables.  Finally, a short SMA cable was used so that aerials can be connected externally.

P1110304

The Pi is set up to auto-start the LoRa gateway and LCARS program.  These two programs talk via a socket so that the LCARS UI can display the current payload position:
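The socket protocol between the two programs can be as simple as one JSON object per line; the field names here are my illustration, not the actual message format:

```python
import json

def encode_position(callsign, lat, lon, alt):
    """Encode a payload position as one newline-terminated JSON line,
    as the gateway might send over the local socket."""
    return (json.dumps({"payload": callsign, "lat": lat,
                        "lon": lon, "alt": alt}) + "\n").encode()

def decode_position(line):
    """Decode one such line on the LCARS UI side."""
    return json.loads(line.decode())
```

Newline-delimited JSON keeps the two programs loosely coupled — either side can be restarted or replaced without the other caring.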

P1110314

Posted in Weather Balloon | 2 Comments

Astro Pi Flight – Update

I’ve made some progress on my “tweet uplink” software, and now all stages in the process are fully automated except for a required manual approval step, in case anyone tries to spam the flight!

The key new element is a Windows program that lists incoming tweets and offers Accept and Reject options for each tweet that mentions a particular hashtag, for example:

tweet

The tweets are received via a simple Python script using the tweepy library, which saves each as a separate file which the Windows program then sees and displays:
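The one-file-per-tweet handoff is the key part of that script; with the tweepy wiring omitted, it is just this (directory name is illustrative, the `<Twitter ID>.tweet` naming is as described):

```python
import os

def save_tweet(tweet_id, text, directory="incoming"):
    """Save one tweet as its own file, named <Twitter ID>.tweet,
    where the Windows program will pick it up and display it."""
    os.makedirs(directory, exist_ok=True)
    path = os.path.join(directory, "{}.tweet".format(tweet_id))
    with open(path, "w", encoding="utf-8") as f:
        f.write(text)
    return path
```

Using the filesystem as the queue means the two programs need no shared library or socket — just a shared folder.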

incoming

Each accepted tweet is then moved to an accepted list (the tweet file is renamed from <Twitter ID>.tweet to <Twitter ID>.accepted):

accepted

The next step is a bit more complex.  To make the most of the available radio bandwidth, tweets are combined together into radio packets.  So whenever there are enough tweets to combine, or the uplink queue is empty, the program seeks to find the best combination of tweets to join together into a single radio packet (max 255 bytes).  These packets are then placed in a queue on a Raspberry Pi running my LoRa gateway:
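A simple greedy first-fit pass approximates that "best combination" search well enough to sketch here (the separator byte between tweets is my own assumption):

```python
def pack_tweets(tweets, max_bytes=255, sep="\n"):
    """Greedily combine queued tweets into radio packets of at most
    max_bytes each, joining them with a separator character."""
    packets = []
    current = ""
    for tweet in tweets:
        candidate = tweet if not current else current + sep + tweet
        if len(candidate.encode()) <= max_bytes:
            current = candidate
        else:
            if current:
                packets.append(current)
            current = tweet
    if current:
        packets.append(current)
    return packets
```

With 140-character tweets, a 255-byte packet usually fits one or two of them, so the greedy approach loses little against an exhaustive search.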

queue

Here we just have a single tweet, because only one was generated in the above example.

Periodically, the balloon tracker stops transmitting down to the ground, and starts listening for an uplink from the gateway.  If the queue has at least one entry then the gateway program will send one entry to the balloon.  In turn, the balloon tracker will receive the packet and save it as a file.  Meanwhile, a small Python script will see that file and start displaying it on the Astro Pi LED.

The gateway needs to know if the uplink was received OK or not, so the tracker includes a “last message ID received” value in the telemetry packets that it sends to the ground:

downlink

The gateway then parses this information and removes the matching message from the queue (renames <ID>.sms as <ID>.ack).  The final step is for the Windows program to then remove the message from its list of queued messages and add it to the list of acknowledged messages:
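The acknowledgement step is another rename in the file-based queue, roughly like this:

```python
import os

def ack_message(queue_dir, message_id):
    """When telemetry reports a 'last message ID received', mark the
    matching queued message as acknowledged: <ID>.sms -> <ID>.ack.
    Returns True if a queued message was found and renamed."""
    src = os.path.join(queue_dir, "{}.sms".format(message_id))
    dst = os.path.join(queue_dir, "{}.ack".format(message_id))
    if os.path.exists(src):
        os.rename(src, dst)
        return True
    return False
```

Because telemetry packets repeat, the same ID may be reported more than once; returning False for an already-acknowledged message makes the operation safely idempotent.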

acked

Posted in Weather Balloon | Leave a comment