YouTube Streaming From a DSLR

One of the nice things about high altitude ballooning in the UK and Europe is the community spirit and help that launchers get from others who will receive and upload the launcher’s balloon transmissions, and freely offer advice during the flight.  I think it’s a very good idea to return that favour by providing live video streams of the launch, chase and recovery where possible.  A few HABbers do this and it would be nice if more did.

There are various methods of uploading to different streaming services, using a phone or Pi with Pi camera or a laptop with webcam.  For a balloon launch though it would be good to use a camera with zoom lens so that the balloon can be streamed once airborne.  To do this requires an SLR.

So, how to get video from an SLR into a laptop?  Again, multiple options – either use an HDMI video capture device (higher quality but a tad expensive) or send the video over USB.  Here we will explore the USB option.

So we need some software on the PC to receive the video stream from the camera, and to upload to a streaming service.  Again, there are choices for each.  I’m using a Canon EOS 760D, which comes with “EOS Utility” software that displays the video in a window (which we can then capture), but a neater option is a £50 program called SparkoCam, which makes a modern Canon or Nikon DSLR appear as a regular webcam, thus making it easy to pass the stream on to another program.  If you want to spend nothing, then instead you can use the EOS Utility or Nikon equivalent plus OBS (see below) to capture from the PC screen.

To upload to a streaming service, we need a suitable video encoder.  I’ve used Adobe’s Flash Media Live Encoder which works well, and which works from a webcam including SparkoCam’s virtual webcam.  Here though we are going to use OBS (Open Broadcasting Software) which is rather more powerful and flexible.


First, install SparkoCam.  You can try it for free but it will watermark the video stream.

Connect your DSLR and switch it on.  It should automatically be selected by SparkoCam, and you will hear the mirror flip up as the camera switches to live view mode.  If nothing happens, check that the camera is on, that you used a data USB lead and not just a charging one, and that your DSLR is supported by SparkoCam.

Open Broadcasting Software

Now, install OBS and run either the 32-bit or (if available on your PC) 64-bit version.  OBS can be a pain to get running initially, depending on whether your PC has the required DLLs, and you may find that the 64-bit version doesn’t work but the 32-bit one does.  Or vice versa.  Error messages from OBS can be a bit cryptic too, but once it starts it works very well.

The opening screen is a bit cryptic too till you realise what you need to do.  First, you need to add a video source to accept video from SparkoCam’s virtual webcam; click the “+” below the Sources panel:

and choose “Video Capture Device” from the popup menu.  Leave the name as “Video Capture Device” or change it to something appropriate, e.g. “Canon DSLR”.  Click OK to save.

A Properties window will appear with a preview; you can just click OK to accept the defaults.  Now the video stream is inside the main window in OBS.

This window is what will be streamed, and can contain several video sources if you want to get clever, but for now we’ll just expand the video source window, using the red drag lines, to fill the OBS source window.

If it doesn’t fit exactly, choose Settings –> Video to change the aspect ratio of the window to match the DSLR’s aspect ratio, and then expand to fit.


The following is for YouTube; other live streaming sites should have similar options.

Go to your YouTube Live Dashboard and either choose “Stream Now” or create an “Event”; we’ll do the former.  With “Stream now” selected on the left of the screen, look at the “Encoder Setup” in the “Basic Info” section, where you will see the Server URL and Stream name/key.  Click “Reveal” and copy the stream name to your clipboard.

Now, in OBS, click the Settings button and then click on “Stream”.  Choose YouTube as the Service, and paste your key into “Stream key”.  Click OK to save.

Now to start the streaming from your PC, just click the “Start Streaming” button in OBS.
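If you’d rather script the whole thing instead of using the OBS GUI, ffmpeg can push a webcam to the same YouTube RTMP ingest point.  This is just a sketch – the device name and stream key are placeholders (on Windows, list capture devices with `ffmpeg -list_devices true -f dshow -i dummy`):

```python
import subprocess

def ffmpeg_stream_cmd(device, stream_key):
    # Build an ffmpeg command that captures a DirectShow webcam (e.g. the
    # SparkoCam virtual webcam), encodes it in real time, and pushes it to
    # YouTube's RTMP ingest.  Device name and key below are placeholders.
    return ['ffmpeg',
            '-f', 'dshow', '-i', 'video=' + device,    # capture the webcam
            '-c:v', 'libx264', '-preset', 'veryfast',  # real-time H.264 encode
            '-b:v', '2500k', '-pix_fmt', 'yuv420p',
            '-f', 'flv',                               # YouTube ingest expects FLV
            'rtmp://a.rtmp.youtube.com/live2/' + stream_key]

cmd = ffmpeg_stream_cmd('SparkoCam Virtual Webcam', 'xxxx-xxxx-xxxx-xxxx')
# subprocess.run(cmd, check=True)  # uncomment to actually stream
```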

YouTube likes to buffer, but after a few seconds your YouTube page should show the stream as “Live”.

And that’s it!


Posted in Weather Balloon | Leave a comment

DIY Lightweight Pi Tracker with SSDV

A few months back, Raspberry Pi brought out a very very nice little case for the Pi Zero, including 3 different front plates, one of which accepts the Sony Pi camera.  After several minutes measuring the internal dimensions, I reckoned I could just about fit the parts for a HAB tracker inside, and came up with this:

Just add batteries.

That one was for 434MHz, and I wanted another for 868MHz, so I thought I’d document the build in case anyone else wants to make one.


First, you need these parts for the build:

  1. Raspberry Pi Zero or Zero W
  2. Pi Zero case
  3. Pi Sony Camera
  4. Some solid core hookup wire
  5. UBlox GPS with chip antenna from Uputronics
  6. LoRa module from Uputronics
  7. SD card 8GB or larger

Plus a soldering iron, solder, wire cutters and a Dremel with cutting disc.  I assume that you also have the parts required to power and operate a Pi Zero (all the Zero suppliers provide kits).  For a flight, you will also need 3 Lithium AAA or AA cells, flexible hookup wire, plus Styrofoam or similar to enclose and protect the tracker.

If you are new to soldering, practice on something else first!  We are going to solder wires directly to the Pi GPIO holes, plus those on the radio and GPS boards, which isn’t the most delicate soldering operation ever but may be daunting for those with no soldering experience.


First, cut 4 short lengths of the solid-core wire, and solder to the Pi Zero as shown (making sure that the wires are on the top of the board!).

I’ve left a very short piece of insulation on the bottom-right wire, but you can remove that completely if you wish.

Next, bend the two top-right wires out of the way, and fold over the leftmost wire and cut to the length shown – this wire will connect to the Vcc hole (top one) on the GPS.

The next part is moderately fiddly: Push the short wire on the left into the Vcc hole, and then push the GPS module over the short bottom-right wire so that this wire goes through the GND hole on the GPS module:

Then push the GPS module down flat on top of the SD socket on the Pi, and solder those 2 wires (Vcc and GND) on the GPS module:

Those last 2 wires can now be bent round and connected to the GPS; the wire on the right of the above photo (Tx on the Pi) goes to the RXD hole whilst the other (Rx on the Pi) goes to the TXD hole:

Cut the wires to length, bare the ends, push slightly into the holes then solder them.

That’s the GPS sorted.


This is the radio module for communication with the ground.  This has a few more connections to make, and is a bit more fiddly.

First, place wires in these holes as shown, and solder them in place:

Be sure to use the correct holes, by counting from the right edge of the Pi Zero; don’t do it relative to any components because those can vary in position (the Zero and Zero W have the CPU in a different position, for a start!).

Now add 3 bare wires as shown:

The next step is optional.  We need to provide some mechanical security for the radio, to keep it slightly away from the Pi so nothing gets shorted.  This could be a double-sided sticky pad or, as here, a 4th solid wire, but this time soldered directly to a capacitor on the Pi.  If that sounds daunting, use the pad!  Here’s the wire, ‘cos that’s how I roll:

Once soldered, remove the insulation.

Now it’s time to place the LoRa module on those 3 (or 4) bare wires:

If you are using a sticky pad, place it now, on the underside of the LoRa module, then push the module down so it’s stuck to the Pi.

If instead you are using the 4th wire, push the LoRa module down but maintain a 1-2mm gap between it and any components on the Pi.

Then cut the bare wires to length and solder them to the LoRa module.

Now we can cut each of the other wires to length and solder them to the LoRa module:

Until we have the tracker completely soldered together:


Using a Dremel or similar with cutting disc, cut a slot in the case for the GPS module to poke out.  This will take some trial-and-error till the module fits comfortably.

Then drill a hole in the opposite end, in line with the corner pin on the LoRa module.  The hole diameter needs to be wide enough to push a wire through it.

Connect the short flat camera cable (which came with the case) to the Pi, then insert the Pi in the case.


Cut a piece of wire to length (164mm for 434MHz), bare and tin a few mm at one end, insert it through the hole and solder it to the corner pin on the LoRa module.  Finally, connect the camera to the cable, push-fit the camera into the lid, and close the lid on the case.

SD Card

First, follow the standard instructions to build a standard PITS SD image.

We then need to modify the configuration file (/boot/pisky.txt) to tell it that we are using this tracker instead of a standard PITS tracker.  Here’s a sample pisky.txt file to work with:




The important lines are:

  • Disable_RTTY=Y  – this disables RTTY (we don’t have a PITS RTTY transmitter)
  • gps_device=/dev/ttyAMA0 – this specifies that we have a serial GPS, not I2C as on PITS
  • LORA_Frequency_1=434.200 – this sets the frequency of our LoRa module
  • LORA_Payload_1=CHANGEME – you must set this to a name for your flight
  • LORA_Mode_1=1 – this sets LoRa mode 1, which is the mode usually used for SSDV
  • LORA_DIO0_1=23 – this specifies the Pi pin we connected the LoRa DIO0 pin to
  • LORA_DIO5_1=29 – this specifies the Pi pin we connected the LoRa DIO5 pin to


You will need to have or make a LoRa gateway to receive transmissions from your tracker.

You will also need to provide a power supply to the tracker.  This can be any USB powerbank with enough capacity, however the batteries may stop working if they get cold during flight.  An alternative is a powerbank that takes AA cells, in which case you can use Energizer AA Lithiums.  Finally, and this is the option you will want for a lightweight payload, simply solder 3 Energizer Lithium cells directly to the 5V/GND pads on the Pi.


Posted in Weather Balloon | 13 Comments

Landing Prediction

As I mentioned in my previous post, I was planning to enable my landing prediction code for my next flight.  This code is based on some work that Steve Randall did a few years ago, but using a slightly different technique as I was using a Pi and therefore had plenty of RAM available for storing wind data (Steve used a PIC).  I wrote the code as the first stage in having a HAB guide itself to a predetermined landing spot, and knew that it worked pretty well using stored data from one of Steve’s flights, but hadn’t got round to trying it for real.

The way my code works is this:

  1. During ascent, it splits the vertical range into 100-metre sections, into which it stores the latitude and longitude deltas as degrees per second.
  2. Every few seconds, it runs a prediction of the landing position based on the current position, the data in that array, and an estimated descent profile that uses a simple atmospheric model (from Steve) plus default values for payload weight and parachute effectiveness.
  3. During descent, the parachute effectiveness is measured, and the actual figure is used in the above calculation in (2).

So, basically, for each vertical 100m band, the software calculates the estimated time to fall through that band, and applies that to the latitude/longitude deltas measured during ascent.  It then sums all the resulting deltas for descent to 100m (typical landing altitude), adds them to the current position, and emits the result in the telemetry as the predicted landing position.
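The calculation above can be sketched in Python.  This is not the flight code itself – the exponential-atmosphere scale height, the 5m/s landing rate and the 100m band size are illustrative assumptions:

```python
import math

SCALE_HEIGHT = 7640.0  # metres - simple exponential-atmosphere assumption

def descent_time(altitude, band=100.0, landing_rate=5.0):
    # Descent rate scales with 1/sqrt(air density); for an exponential
    # atmosphere that means landing_rate * exp(altitude / (2 * scale height))
    rate = landing_rate * math.exp(altitude / (2.0 * SCALE_HEIGHT))
    return band / rate

def predict_landing(lat, lon, altitude, deltas, landing_alt=100.0, band=100.0):
    # deltas[i] = (dlat/sec, dlon/sec) stored for vertical band i during ascent.
    # Sum the drift for each band from the current altitude down to landing.
    while altitude > landing_alt:
        i = int(altitude // band)
        if i < len(deltas) and deltas[i] is not None:
            dlat, dlon = deltas[i]
            t = descent_time(altitude, band)
            lat += dlat * t
            lon += dlon * t
        altitude -= band
    return lat, lon
```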

Although the habhub online map does its own landing prediction, an onboard prediction has some advantages:

  • It has more descent data to work with, so can more accurately profile the parachute performance
  • It is using more recent wind data, measured during ascent
  • Ground chase crews can see the landing prediction without having internet access

There are disadvantages too.  Because it uses wind data from the ascent, if the wind has changed (due to the landing being in a different area, or because the wind is changing with time) then those factors will introduce errors.

Also, I have a suspicion that the live map consistently overestimates the horizontal distance travelled by a descending flight.  This can be seen by watching its landing prediction which, as the flight descends, will move back towards the actual flight position.

So I was keen to see how well the onboard prediction fares against the habhub prediction.  Steve Randall was also interested in this, and was kind enough to record the descent on his screen.  He has sped up and annotated the video, which you can see here:

From that you can see that:

  • Until close to landing, it’s a lot more accurate than the habhub prediction (for this flight – might not be the case generally!)
  • The noise in the estimated landing position is mainly along the line of the descent track.

Here’s a screenshot from the map, edited to show the movement of the landing position during descent:

Steve produced a chart showing the parachute effectiveness (relative coefficient of drag – which is what the code is trying to measure) with altitude:

Noise at low altitudes is less important, as it’s being applied to a short remaining distance to fall, but the noise higher up – between say 5,000 and 15,000m – matters more.

For my next flight, I’ll apply some filtering to hopefully make the prediction more consistently accurate.  I have all the GPS data from this flight and I can run that back into the tracker code to test how well it would have worked on that last flight.

Posted in Weather Balloon | 1 Comment

Strat The Bat’s Big Adventure

I’ve been a high altitude balloonist for about 6 years, but a fan of Jim Steinman’s music since Bat Out Of Hell hit the world in 1977.  The latter has had something of a renaissance lately with the musical version previewing in Manchester, and soon to open at The Coliseum in London.  I saw the show on opening night and it blew me away about as much as Meat and Karla did on The Old Grey Whistle Test back when I was barely 17!

Those who know me know that I like puns, and I found an obvious one in the name of the musical’s lead character – Strat.  So I figured that somehow I should send Strat into the Stratosphere – a joke that I’ve been milking since I thought of it a few weeks ago.  The actual Strat would need a very large balloon to enter the stratosphere, and would probably die more permanently than he does in the show, so I decided instead to send a little foam bat:

that I bought in Paris at another musical with Jim Steinman’s music – Tanz der Vampire.

So, how to send the little critter (now renamed Strat) up into the stratosphere?  Well, the sending-up part is really easy – get a weather balloon, fill it with hydrogen, attach bat to balloon and let go – the balloon goes up, getting gradually larger in the rarefied air as it rises, and eventually gets as large as it can before bursting, at which point everything heads Earthward very very quickly.  A parachute is needed to slow the flight down as the air gets thicker at lower altitudes, with the payload (bat) hopefully landing safely in a field somewhere.  I wanted a video of Strat’s flight, so I glued him on to a short rod for positioning in front of the camera:

That “somewhere” is planned in advance using wind prediction software, but the actual flight could land a few miles from the target so it needs to be tracked.  For that, the payload carries a GPS device so it knows where it is, a radio transmitter to relay that position down to the ground, and a small computer board that glues those components together and adds various functions such as providing a live prediction of where the landing might be, and adding photographs from a tiny camera.  I added a video camera to take a continuous video of the entire flight, and some extra trackers I was testing.  This is the entire payload including batteries and foam boxes to protect the contents and whatever it all lands on.

And here I’m testing the view on the camera:

The total weight was 550g, which is about average, and I chose a 500g balloon which was calculated to send the flight up to about 30km, which is plenty high enough to get good pictures.  I have no photographs of the launch (I didn’t have any help) but it was easy enough.

The flight was expected to take about 2.5 hours, with a landing near Glastonbury about 2 hours away, so I set off in the chase car (fully equipped with radio tracking kit and other items – more on those later).  Here’s the entire flight path that it actually took (and was very close to the prediction):

You may be used to the view from 30,000 feet or so, where commercial jets fly, but that’s just the top of the Troposphere.  Above that is the Stratosphere, where the sky becomes increasingly black (because there’s almost no atmosphere above).  Weather balloons can, depending on payload weight and balloon size, get well above 100,000 feet – well above any commercial or military jet – and getting any higher requires a large rocket.  This photo is from early in the flight, still with a blue sky:

and this is later, where the sky has gone black:

and you can see the thin blue line of the atmosphere on the horizon.

Meanwhile, the video camera was recording Strat’s flight.  Here’s the launch (from the field behind my house):

Strat Bat in the Stratosphere:

and Strat Bat about to leave the Stratosphere …

Initial descent is typically up to about 200mph in the very thin air (about 1% as dense as it is at ground level), then the parachute automatically opens and slows the flight down, so it lands at a nice gentle 10mph.

Well, I say “land”, and that’s always the aim, but unfortunately for us high-altitude balloonists, the UK has a rather high population of trees.  So this is what actually happened:

One of the extra things I had programmed for this flight was for the tracker to live stream video once it was low enough to get a 3G signal, which it did.  When I saw the video come up on my phone in the chase car, it was pretty obvious that the payload was swinging around in a tree!  I was about half a mile away at the time, thanks to the live telemetry telling me where the payload was throughout the flight, so I soon arrived at the “landing” site.  This was next to a building site, so I checked with the site foreman and got permission to walk over to where the payload was hanging:

If you look at the very top of the tree, you’ll see a lime green and orange parachute.  That’s quite high up – about 17 metres – but fortunately I used about 10 metres of line down from the parachute to the payload:

Also fortunately, I’d packed some long telescopic poles in the car.  One of those has a hook taped to the end, and once I’d hooked that round the payload line it was pretty easy to pull the lot out of the tree:

Here’s Strat looking none the worse for his journey:

So, payload including Strat The Bat all recovered intact, despite the attentions of a fairly tall tree.  I waited till I got home before I checked the video, but as you can see that worked too :-).

I went up to Manchester for the final 2 shows, with a seat near the front of the stalls for the final performance.  Here’s Strat The Bat at the interval, covered in blood group A4 from the motorcycle crash at the end of Act 1:

and then, at stage door after that final show, came the chance to hand the much-travelled Strat to Andrew Polec (who plays Strat in the musical):

Posted in Weather Balloon | 1 Comment

HAB with Calling Mode and 3G Streaming

This, my first flight of the season, will test a few new things:

  • LoRa Calling Mode
  • 3G Modem functions:
    • Upload of balloon position to habitat
    • Upload of full-size photographs to a web server
    • Streaming of launch and landing/recovery to Youtube
  • Landing Prediction
  • Standalone GSM/GPS Tracker

LoRa Calling Mode

“Calling Mode” is where the tracker periodically sends out a special message, on a standard frequency using standard settings, announcing the particular frequency and other settings that it is using.  This allows unattended receivers to be set up on the calling channel, where they will automatically switch to any balloon using that channel.

To see how this works, see this page that I wrote last year.

The calling frequency is 433.650MHz, chosen as the quietest part of the 433/434MHz ISM band.


The tracker for this flight has a Huawei 3G USB modem, connected to the O2 network via GiffGaff, using the Sakis3G script.  The tracker runs a Python program that knows the current GPS altitude (passed to it from the PITS C software using a named pipe) so that it knows when it is worth attempting to connect to the internet via 3G (i.e. below 2000m).  That same script controls the various functions that operate over 3G – video streaming, ftp image upload, and direct habitat telemetry upload.
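The altitude-gating logic can be sketched like this.  The pipe path and line format here are assumptions for illustration, not the actual PITS interface:

```python
CONNECT_BELOW = 2000.0  # metres - only worth trying 3G near the ground

def parse_altitude(line):
    # Assumed line format from the named pipe: "time,lat,lon,altitude"
    # (illustrative only - not the real PITS pipe format)
    try:
        return float(line.strip().split(',')[3])
    except (IndexError, ValueError):
        return None

def want_3g(altitude):
    return altitude is not None and altitude < CONNECT_BELOW

# On the tracker, a loop would read lines from the pipe and start or stop
# the Sakis3G connection whenever want_3g() changes state.
```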

Direct Habitat Upload

As well as the usual ISM (LoRa) radio telemetry, this flight will upload telemetry more directly whilst it has a 3G connection.   Most usefully, assuming it lands in an area with 3G coverage via O2, this will mean that the landing position is uploaded automatically.

Photograph Uploads

Again most useful on landing, the tracker will upload full-sized images (taken by the Pi camera) to a web server via ftp.  That server will automatically build thumbnails of the uploaded images.

Youtube Streaming

For both launch and landing, live video will be streamed to my Youtube channel, at this URL.

Landing Prediction

The tracker will predict its own landing position, sending the result out over LoRa as a “XX” which then appears on the map as a large red “X” (marks the spot).  The prediction is only useful after burst, and uses both the measured wind speeds/directions on the way up and the effectiveness of the parachute on the way down.

Standalone GSM/GPS Tracker

These are inexpensive devices that use SMS or GPRS to automatically send out their position.  They don’t have a good track record for HAB, partly because they tend to use cheap and rather deaf GPS and GSM hardware, and also because HABs tend to land in remote areas away from GSM coverage.  Regardless of the above, they have some use as a backup device to a regular radio tracker.

For this flight, I’ve set up a TK102 tracker to connect to the internet via GPRS and to send its position to a traccar server.  traccar is an open-source tracking system which displays multiple car tracking devices on a map.  Here, I’m using a small Python script to extract data from traccar (via its log file) and to then send the position of my particular tracker on to the habitat system so the TK102 appears on the usual HAB map.


Posted in Weather Balloon | Leave a comment

BBC Microbit Balloon Tracker

I’ve been meaning to do this for a while, and a short gap between projects gave me some time to try.

The Microbit is (yet another) educational SBC, sitting somewhere between the Codebug and a Raspberry Pi.  Its processor has enough flash memory and RAM to run a basic tracker (but more on that later), plus it has accelerometer and compass chips.

Importantly, the Microbit has SPI and I2C busses plus a serial port, all brought out to the edge connector on the bottom. Rather than solder directly to the pads, I bought an edge connector and teeny prototyping board:

I also bought a battery holder with cable and plug to suit the micro JST connector on the Microbit.

Balloon Tracker Hardware

To make a balloon tracker, we also need to connect a suitable GPS (by which I mean, one that still sends positions when at high altitudes) and an ISM band radio transmitter.  I chose a UBlox module from Uputronics:

Usefully, this design includes an I2C port as well as the usual serial port.  Since the Microbit serial port is normally used by the debug connection to a PC, software development becomes more difficult if we use that serial port for the GPS, so I2C makes life much much easier.

Now for the radio.  The most popular HAB option is the NTX2B radio transmitter, but that also needs a serial port, so instead I opted for a LoRa transceiver from Uputronics:

This has an SPI interface, so the serial port remains free for debug purposes.

The first job was to get the devices wired together.  There’s not much space on this prototyping board, and it can be useful to keep the GPS away from the other devices anyway (less interference), so I put the GPS and radio on wire tails:

GPS Software

There are several options for writing code for the Microbit, and I opted for MicroPython as I’ve been writing a lot of Python lately, using the Mu editor/downloader.  I started with some simple code to grab the NMEA data stream from the GPS, and this took just minutes to get going:

I then ported my Pi Python GPS NMEA parser (which meant just changing the code to use the Microbit I2C library rather than the Pi serial port).  You can see my test program here (but please don’t use that for a flight, as it was written for car use and therefore doesn’t put the GPS into flight mode!).
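The parsing side can be sketched like this – a minimal GGA parser, not the actual ported code (no checksum validation, and real flight code needs more care with missing fields):

```python
def parse_gga(sentence):
    """Parse an NMEA GGA sentence into (time, lat, lon, alt, sats).
    Minimal sketch - no checksum validation, assumes a valid fix."""
    fields = sentence.split(',')
    if not fields[0].endswith('GGA'):
        return None

    def to_degrees(value, direction):
        # NMEA packs position as ddmm.mmmm (lat) / dddmm.mmmm (lon)
        if not value:
            return 0.0
        dot = value.find('.')
        degrees = float(value[:dot - 2])
        minutes = float(value[dot - 2:])
        result = degrees + minutes / 60.0
        return -result if direction in ('S', 'W') else result

    return (fields[1],                          # time
            to_degrees(fields[2], fields[3]),   # latitude
            to_degrees(fields[4], fields[5]),   # longitude
            float(fields[9] or 0),              # altitude (metres)
            int(fields[7] or 0))                # satellite count
```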

LoRa Radio Software

I also have LoRa Python code from another project, so after testing that the device was connected OK (a few commands typed into the Microbit REPL interpreter), I ported that over.  The changes were for the SPI library, plus I had to remove all the LoRa register/value definitions as they made the program source too large; the source is compiled on the device, so the compiler has a rather limited RAM workspace.  You can see the resulting test program here.

To receive LoRa transmissions, you need another LoRa device as a receiver, plus suitable software.  I used my C LoRa Gateway code for the receiver:

Balloon Tracker Program

So far so easy, and the end goal seemed close; once you have GPS and radio modules working, then you just need a small amount of extra code to format the GPS data as a string, adding a prefix (“$$” and the payload ID) and suffix (“*” then CRC then a line-feed), and then transmit the result over radio.
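That framing step can be sketched in Python.  The checksum is the standard UKHAS CRC16-CCITT (polynomial 0x1021, initial value 0xFFFF) over everything after the “$$”; the field layout here is just an example:

```python
def crc16_ccitt(data, crc=0xFFFF):
    # CRC16-CCITT (poly 0x1021, init 0xFFFF), the standard UKHAS checksum
    for byte in data.encode('ascii'):
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) if (crc & 0x8000) else (crc << 1)
            crc &= 0xFFFF
    return crc

def build_sentence(payload_id, counter, time_str, lat, lon, alt):
    # "$$" + payload ID + fields, then "*", the CRC in hex, and a line-feed
    core = "{},{},{},{:.5f},{:.5f},{}".format(
        payload_id, counter, time_str, lat, lon, alt)
    return "$${}*{:04X}\n".format(core, crc16_ccitt(core))
```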

However, as soon as I combined the GPS and LoRa code, the result wouldn’t even compile.  Remember that compilation happens on the Microbit, and my code was too large for that process:

Fortunately it wasn’t too much larger, so I removed some code that wasn’t strictly necessary (mainly the code that switches off unused GPS NMEA sentences) and soon the compiler was happy.

The resulting code however was not happy.  Once the compiler has finished, the resulting bytecode is loaded into the Microbit’s RAM, which it shares with any data used by the program (variables, stack, temporary work areas).  The nature of Python is that memory gets allocated all the time, and freed up when necessary (i.e. when there’s little free memory available), and my program would run for a short while before crashing with an “out of memory” error when it tried to allocate more memory than was available.  Here it is working before it crashed:

So, I had to reduce the memory footprint.  I’m used to doing that in C on microcontrollers, but MicroPython needs different techniques.  For example, C on a micro usually sits in flash memory, which often is less of a limit than the working data in RAM, so you can sometimes rewrite the code to use less RAM without worrying that the new code uses more code memory.  Not so for MicroPython, where everything shares RAM, so some things I tried actually made the situation (checked by calling gc.mem_free() in the main loop) worse.  For the most part, I managed to increase free RAM by removing code that I didn’t need.  Having done so, the program was stable, though free memory went up and down cyclically as memory was allocated each loop and then eventually freed up.

Some easy improvements came from removing the code to display GPS satellite count on the LEDs, and specifically importing only the required modules instead of the whole Microbit library.  The most relevant part of the code turned out to be the part that builds up an NMEA sentence.

In C you simply allocate enough memory for the longest sentence you need to parse, then place incoming bytes into that memory using a pointer or index, checking of course for buffer overruns.  In Python, strings are immutable so you can’t do this, and the temptation then is to do “string = string + new_character”.  Of course, the Python interpreter then allocates memory for the resulting string, marking the old string as “no longer in use” so it can be freed up sometime later.  It’s pretty easy to end up with lots of unused memory waiting to be freed.

For now, my NMEA code explicitly frees up memory as each new byte comes in.  I did briefly change the code to use bytearrays, which are close to what I would do in C, but free memory reduced slightly (I assume the source took more space) so I went back to the original code.  Longer term, I’ll ditch NMEA and write code to use the UBX binary protocol instead.
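For comparison, the C-style fixed-buffer approach looks roughly like this in Python.  This is a sketch, not the flight code (and as noted above, on the Microbit the bytearray version actually cost slightly more in my tests):

```python
class NMEABuffer:
    """Accumulate NMEA bytes into complete sentences using one fixed
    bytearray, avoiding repeated string reallocation (the C-style approach)."""
    MAXLEN = 100  # longest NMEA sentence we expect

    def __init__(self):
        self.buf = bytearray(self.MAXLEN)
        self.length = 0

    def add_byte(self, b):
        # '$' starts a new sentence; discard anything accumulated so far
        if b == ord('$'):
            self.length = 0
        if self.length < self.MAXLEN:
            self.buf[self.length] = b
            self.length += 1
        # '\n' ends the sentence; return it and reset for the next one
        if b == ord('\n'):
            line = bytes(self.buf[:self.length]).decode('ascii')
            self.length = 0
            return line
        return None
```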

The code has been running continuously now for over 12 hours, and the free-memory figure is solid (measured at the same point each time round the main loop).  I do need to add the flight-mode code, but that’s small and shouldn’t cause an issue :-).  If all is well then I hope to fly this (weather-permitting of course) on Sunday.

Finally, here’s the result of receiving the telemetry on a Python LoRa gateway program that I’ve been working on lately:

Posted in Weather Balloon | 5 Comments

Quick RTL SDR Comparison

As part of a recent project, I’ve used a few different RTL SDR devices, and was surprised how drifty some of them are, one in particular.  For their intended application – decoding wideband transmissions – this isn’t an issue, but if you want to use one to decode RTTY then it certainly is – the signal will soon drift outside of the audio passband unless the SDR is retuned.

My project is on a Raspberry Pi, where I found that all but one (see the test results below) was basically unusable.  So I did some quick tests on my desk, with a Windows PC running Airspy, for a crude visual comparison of drift rates.  I tested 4 devices:

  1. NooElec Nano 2
  2. A very old E4000-based SDR
  3. Current model R820T2 SDR
  4. NooElec Aluminium-cased SDR

1 – NooElec Nano2

Poorest of the bunch.

2 – E4000


3 – R820T2

No better.

4 – Ali cased

Much, much better.

As such, the metal-cased NooElec is the only one I could recommend.

Of course, there are much better SDRs out there – the Funcubes, SDR Play and Airspy models, and for chasing or tracking balloons you should really spend the extra money – but for bench testing then this particular RTL SDR is just fine.

Posted in Weather Balloon | Leave a comment

Raspberry Pi SSDV with a Compact Camera or SLR

Many HAB flights now use SSDV to transmit images “live” from the balloon down to the ground, using a camera connected to the flight computer, providing an immediacy that is missing when just flying standalone cameras.  Early SSDV flights used serial cameras connected to a microcontroller, but image quality (and ease of programming) took a step forward when the Raspberry Pi arrived with simple access to webcams.  My first SSDV flight and the following 3 used a webcam, sending SSDV down over RTTY.



Webcams do not provide great image quality, which then improved when the Raspberry Pi camera came out.  There was also some excellent work by Chris Stubbs who managed to program a Canon compact camera to send SSDV directly, again over RTTY.


With RTTY we are limited to sending quite small images – around 500×300 pixels – because larger images take too long to transmit.   It is possible to increase the speed from 300 baud to 600 baud, though the signal then becomes more difficult to receive, or to use the trick of running multiple RTTY transmitters on the same tracker, such as this one:

Another option though is to replace RTTY with LoRa, which on 434MHz provides a speed increase of about 4x over typical RTTY speeds, and roughly tenfold more again if using 868MHz (all due to the varying bandwidth limits allowed within IR2030).  Further, LoRa allows for an uplink so that any missing packets can be repeated, which in the 868MHz band produces some impressive results (the inset is an image from my first Pi SSDV flight, showing the improvement we now have in quality and resolution):
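As a back-of-envelope comparison, here is a sketch of the airtime for a single 256-byte SSDV packet, assuming 8N2 RTTY framing (11 bits per byte) and taking the rough 4x and 40x LoRa multipliers above at face value:

```python
# Airtime for one 256-byte SSDV packet over 300 baud RTTY, assuming
# 8N2 framing (start + 8 data + 2 stop = 11 bits per byte), versus the
# rough 4x (434MHz) and 40x (868MHz) LoRa multipliers quoted above.
# All figures are illustrative, not measured throughput.

SSDV_PACKET_BYTES = 256

def rtty_seconds(baud=300, bits_per_byte=11):
    return SSDV_PACKET_BYTES * bits_per_byte / baud

rtty = rtty_seconds()
print(f"RTTY 300 baud  : {rtty:5.1f} s/packet")
print(f"LoRa 434MHz ~4x: {rtty / 4:5.1f} s/packet")
print(f"LoRa 868 ~40x  : {rtty / 40:5.1f} s/packet")
# A 500x300 image runs to dozens of packets, so the savings multiply.
```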

So, these bandwidths allow us to send rather higher quality images than before, to the point that the image compression is limiting quality.  With this in mind, Philip Heron added a quality setting to his excellent SSDV encoder/decoder to control the amount of compression applied.

With reduced image compression and higher bandwidths, the remaining factor is camera quality.  Whilst the Raspberry Pi cameras (especially the newer Sony) are quite good, they do have tiny sensors and simple plastic lenses.  A step up would be to use a compact camera, mirrorless system camera or an SLR.  These also potentially offer wider-angle lenses, making for more impressive HAB photographs.  However we need to get those images to the flight computer.

Pretty much every modern camera allows for a USB PTP (Picture Transfer Protocol) connection to a computer, allowing it to be controlled by a computer to a greater or lesser extent.  For most cameras all we get to do is download images from the camera – and that’s all that most people need – but we also need to be able to take images under control of our flight computer.


To take and transfer images we can use the Linux program gphoto2, with a compatible camera that supports remote operation (i.e. the ability to take an image via a command on the Pi).  The compatibility list includes few modern compact cameras, as the remote functions are typically only available on SLRs.  Canon, for example, used to include remote capture in their Powershot models but stopped this practice in 2009, presumably to persuade people to buy their SLRs instead.  I tested with an old Canon SLR (EOS 400D) and pretty much every function is supported – remote shooting, control of ISO, control of aperture/shutter (if the camera is set to semi-auto or manual mode).  However I’m not especially keen on flying something as heavy and solid as an SLR with a wide-angle lens, so I checked the compatibility list for smaller, lighter alternatives.  Sadly none of my other cameras fitted the bill, so I purchased a Nikon S3300 compact.  This provides remote shooting (albeit without any control over aperture etc.), has a wide-angle lens (26mm equivalent for 35mm sensors), a 16MP sensor, is small and light, and charges from USB (so the Pi should be able to keep it charged during flight).

Once gphoto2 has been installed (sudo apt-get install gphoto2), the first thing to do is connect the camera and check that it can be seen:

gphoto2 --auto-detect

This should produce a result like this:

Model                          Port
Nikon Coolpix S3300 (PTP mode) usb:001,012

So far so good.  Now to find out what capabilities the camera offers:

gphoto2 --summary

Which will give you something like this (some parts removed):

Camera summary:
Manufacturer: Nikon Corporation
Model: S3300
 Version: COOLPIX S3300 V1.0
Vendor Extension ID: 0xa (1.0)
Vendor Extension Description: 1.0;

Capture Formats: JPEG
Display Formats: Association/Directory, Defined Type, JPEG, DPOF, MS AVI, Apple Quicktime, MS Wave

Device Capabilities:
 File Download, File Deletion, File Upload
 Generic Image Capture, No Open Capture, No vendor specific capture

Device Property Summary:
Property 0xd407:(read only) (type=0x6) 1
Property 0xd406:(readwrite) (type=0xffff) ''
Property 0xd002:(readwrite) (type=0x6) Enumeration [1,2,3,4,5,6,7] value: 6
Date & Time(0x5011):(readwrite) (type=0xffff) '20161111T143911'
Flash Mode(0x500c):(readwrite) (type=0x4) Enumeration [1,2,3,4] value: Flash off (2)
Focus Mode(0x500a):(readwrite) (type=0x4) Enumeration [2,3] value: Automatic (2)
Focal Length(0x5008):(read only) (type=0x6) Enumeration [3500,4600,5300,6100,7300,8600,10500] value: 35 mm (3500)
Battery Level(0x5001):(read only) (type=0x2) Enumeration [2,5,25,50,65,80,100] value: 80% (80)
Property 0xd303:(read only) (type=0x2) 1

From this we can see that the camera supports “Generic Image Capture” (woohoo!) but no control over zoom (focal length is read-only).  Given that for a HAB flight I want the lens at its default widest setting anyway, that’s not an issue.

Taking a photo is simple:

gphoto2 --capture-image-and-download --force-overwrite --filename dave.jpg

This will extend the lens if it’s retracted, focus the lens, set the exposure (using whatever options are set within the camera), take a photograph and then download it to the Pi.
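The same capture command can be driven from a flight-computer script.  Here is a minimal Python sketch using subprocess – the function names and example filename are my own illustration, not part of any existing software:

```python
# Minimal sketch of calling gphoto2 from a Python flight-computer
# script.  Function names and the example filename are illustrative.
import subprocess

def capture_command(filename):
    """Build the gphoto2 command line used above to take and fetch one image."""
    return ["gphoto2", "--capture-image-and-download",
            "--force-overwrite", "--filename", filename]

def take_photo(filename):
    # gphoto2 exits non-zero if no camera is connected, focus fails, etc.
    return subprocess.run(capture_command(filename)).returncode == 0

print(capture_command("dave.jpg"))
```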

For more advanced cameras you may be able to control the exposure manually (aperture and/or shutter), control the ISO etc.  The available settings, and the specific commands to set them, vary from camera to camera but your starting point should be to list them all:

gphoto2 --list-config

The latest version of the Pi In The Sky software includes options for the use of cameras via gphoto2 (see instructions in the README).  With “Camera=G” in the pisky.txt file, gphoto2 and imagemagick installed, and a compatible camera connected and powered on, then PITS should take images on that camera and transmit them via SSDV.

Unlike with the Pi camera, images are taken at full resolution (or whatever resolution is set within the camera), and are then stored on the Pi SD card at that resolution.  The resizing for transmission is then done by imagemagick, which is why that has to be installed.
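The resize sums themselves are simple.  A sketch, assuming (as the SSDV encoder expects) dimensions that are multiples of 16, and using the S3300’s 4608×3456 full resolution and a 512-pixel transmission width as example figures:

```python
# Sketch of the resize arithmetic: scale the full-resolution image to a
# transmission width, keep the aspect ratio, and round both dimensions
# down to multiples of 16 as the SSDV encoder expects.  The 4608x3456
# source and 512-pixel target are example figures.

def ssdv_size(src_w, src_h, target_w=512):
    # Integer maths avoids float rounding on exact ratios.
    w = target_w // 16 * 16
    h = (src_h * target_w // src_w) // 16 * 16
    return w, h

print(ssdv_size(4608, 3456))        # -> (512, 384)
print(ssdv_size(4608, 3456, 768))   # -> (768, 576)
```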

In testing, the Nikon has been completely reliable, running for 11 hours continuously until eventually the battery was discharged (remember, it charges to some degree over USB, hence the long run time).  So this is looking good for a flight.  Here’s a sample test image as sent via SSDV/LoRa.


Posted in Weather Balloon | 2 Comments

LoRa PC Gateway

I generally use the Raspberry Pi to receive LoRa transmissions from balloons, and to upload the packets to the HabHub servers.  However it sometimes might be more convenient to use a PC or Mac, or a phone or tablet, for internet connectivity, in which case we need some way of interfacing a LoRa module to those devices.


Here I have used an Arduino Mini Pro, connected to the LoRa module via SPI and 2 I/O pins, and using software derived from my Handheld LoRa Receiver to allow control of the module via the Arduino’s serial interface.  I’ve built 2 such devices, the first of which connects to a PC via USB, using a Prolific PL2303 USB-serial adapter:



The second device uses the same firmware, but connects to the PC (or Mac, tablet, phone) via bluetooth using a HC-06 bluetooth serial adapter.  Power comes from a small LiPo, using a USB charging module.



The firmware handles incoming packets directly, copying them to memory before sending to the host PC or mobile device.  It also sends various status values – current RSSI every second, and packet RSSI, SNR and frequency error before each packet.  It accepts simple commands to set the LoRa frequency, bandwidth, spreading factor, error coding, packet header type and low data-rate optimisation.


Currently I’ve produced Windows software that communicates with either device using a virtual serial port (USB or Bluetooth), and expect to make that cross-platform soon (OSX, Android, iOS).  This program allows the LoRa parameters (frequency, bandwidth, spreading factor etc.) to be set, displays incoming telemetry packets, and optionally uploads those packets to Habitat (so the balloon position is displayed on the live map).  SSDV isn’t supported yet.
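Before uploading, the gateway can sanity-check each telemetry sentence.  UKHAS sentences take the form $$CALLSIGN,fields*CHECKSUM, where the checksum is CRC16-CCITT (polynomial 0x1021, initial value 0xFFFF) over everything between the $$ and the *.  A sketch of the check – the example sentence content is made up:

```python
# Validate a UKHAS telemetry sentence ($$CALLSIGN,fields*CHECKSUM)
# before uploading it to Habitat.  The checksum is CRC16-CCITT,
# polynomial 0x1021, initial value 0xFFFF, computed over everything
# between the leading "$$" and the "*".

def crc16_ccitt(data, crc=0xFFFF):
    for byte in data.encode("ascii"):
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) & 0xFFFF if crc & 0x8000 else (crc << 1) & 0xFFFF
    return crc

def checksum_ok(sentence):
    """True if a '$$...*XXXX' sentence matches its own checksum."""
    body, _, check = sentence.strip().lstrip("$").partition("*")
    return check != "" and crc16_ccitt(body) == int(check, 16)

# Made-up example sentence with a freshly computed checksum:
sentence = "$$EXAMPLE,123,12:00:00,51.95,-2.54,1000"
sentence += "*%04X" % crc16_ccitt(sentence.lstrip("$"))
print(checksum_ok(sentence))   # -> True
```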



To make your own device, you will need:

  • Arduino Mini Pro
  • Programmer for above
  • LoRa Module (RFM96 for 868MHz or RFM98 for 434MHz)
  • Wire and solder
  • SMA Socket
  • Suitable case

Plus for the USB version:

  • FTDI or Prolific USB-serial adapter

or, for the Bluetooth version:

  • HC-06 Bluetooth interface
  • LiPo battery
  • USB LiPo Charger
  • On/Off Switch

Connections from the Arduino to LoRa are described in the firmware.  Remember to connect GND on Arduino to GND on LoRa, and Vcc on Arduino to 3.3V on LoRa.

For the serial version, first check if your USB adapter supplies 5V or 3.3V or both; for 5V you need to connect the 5V line to the Arduino “Raw” supply input; for 3.3V connect to the Vcc pin instead.  Also, connect 0V/GND from the USB adapter to the Arduino GND pin.  The USB Tx connects to Arduino Rx, and USB Rx to Arduino Tx.

For the Bluetooth version, the LiPo connects to the Arduino Raw pin via a switch.  The Bluetooth device then takes power from the Arduino 3.3V line.  Rx/Tx pins connect as above.  All GNDs connect together of course, and to the battery -ve wire.  The LiPo charger connects to the battery directly.

Download the firmware and program the Arduino with it.

To connect the USB version to a PC, just plug it in and hope that Windows installs the driver OK; if not then download and install the driver appropriate to your device.  Check in Device Manager to see which serial port number it has installed.

For the Bluetooth version, connect and install a USB Bluetooth adapter if one is not already installed.  Power on your LoRa/Bluetooth receiver and then search for the Bluetooth device in Windows.  You should see “HC-06” show up.  If you are asked for a PIN, it is 1234.  Check in Device Manager to see which serial port number it has installed; if it doesn’t show then be prepared to wait – for some reason it can take several minutes.

If you are using my Windows software, download that to a new folder and just run the .exe file.  Choose the serial port that was installed earlier, and within a couple of seconds you should see the “Current RSSI” value start to update.  Choose your LoRa settings and click the “Set” button.  Once you’ve done that, you should start seeing packets arrive (assuming you have a tracker running using those LoRa settings, of course).


Posted in Weather Balloon | 7 Comments

One Little Cloud In The Blue Sky

On Saturday I helped with a school launch by Greg Tomlin, who drove here from Coventry with his SKYBLUE payload and a minibus full of excited schoolchildren. It was their first launch, but not Greg’s, as he’d launched twice before with his previous school.

Predictions were for a fairly gusty and showery day overall, but with a chance of launching in the morning before the wind got up and the clouds and rain arrived. Landing predictions were also good for the morning, but poorer later as the winds would take the balloon down to the Severn Estuary. I ran through several permutations of balloon size and gas fill, and finally opted for a 1600g balloon with a standard ascent rate of 5m/s.  To help keep the flight away from a watery end, I chose a slightly undersized parachute so the final descent wouldn’t drift too far south.

Greg kindly offered a free ride for one of my trackers if I had anything to test, and I did. I’ve been working on a tracker that uses a servo-controlled parafoil to guide a payload to a specific landing spot. I’ve run this through emulated flights but hadn’t flown it for real, so this was an opportunity to do just that, with the fallback of another tracker in case things went wrong (which they did!). So, I quickly put together RTLS1 (Return To Launch Site), without servos of course, but with all the software intact.  Hardware was an original Pi Zero (I didn’t want to fly a camera this time) and prototype PITS Zero board with LSM303 compass/accelerometer connected.  As an extra test, I added code for a BME280 pressure/temperature/humidity sensor. Together with the RTLS compass data, and various landing prediction and flight control values, there was quite a lot of telemetry to send, so I opted for a 140 bytes/second LoRa mode for transmission.


I had another reason to fly something. My wife’s maiden name is Cloud, so last year I bought a cloud necklace, with the intention of sending it up to near space so she owned a very high-flying cloud! I hadn’t got round to actually flying it, but with our 30th wedding anniversary in a few days this was a good opportunity! The launch day also turned out to be the 12th anniversary of when Julie’s dad died, so it was particularly poignant. To add one more coincidence, he used to be a CB operator with callsign “Skyblue”.

With Greg and team en route, I prepared for launch so we could get the balloon flying as soon as possible (delays would mean a higher chance of a wet landing). So when they arrived, I had my tracker online and payload sealed with line attached, groundsheet out, balloon tied to the filler, and lines tied to the parachute. Greg and his team wasted little time in getting cameras started, tracker running and online, and payload sealed up and tied to the parachute and my payload.


Meanwhile I inflated the balloon. The wind by now was quite gusty, but with quiet periods where I could get on with filling with gas and checking the neck lift.

After sealing the balloon and tying it to the payloads, I took the balloon out to the field, followed by Greg and his team carrying the payloads, parachute and line. Out in the middle of the field, the wind was quite gentle and as I let up the balloon it wasn’t far off being vertically above me. Holding the lower (my) payload …
… I took a few steps downwind and launched. The entire flight train rose into the grey sky above.


Back in the house, I checked the transmissions and map. Initially the live prediction was a bit alarming, showing a landing south-east of the Severn (I’d aimed for north-west), but then I remembered that the live predictor generally assumes a burst altitude of 30km, and ours should be 36km or so. Also, the initial ascent rate with hydrogen is lower than the average, whilst the predictor assumes the ascent rate will be constant.

With a fairly high flight landing not far away, we didn’t have to rush into the chase vehicles. So we had time to watch the flight progressing, and I had time to finish getting my chase car set up. Part of that was starting up my LCARS-based touchscreen, and when I did I noticed that the RTLS1 position wasn’t updating. A quick check of the telemetry showed that it had stopped at about 12km altitude, which was a very strong indication that the GPS wasn’t in flight mode. It later turned out I hadn’t re-enabled that code after disabling it for my emulated tests. Oops. At least the SKYBLUE1 tracker was still working fine, and I knew that RTLS1 would start sending the correct GPS data once the flight descended back through 12km, and that I would have plenty of test data from it anyway.

Once it was certain that the landing point was going to be west of the Severn, we drove down to Monmouth to wait for the balloon to burst. Parking at a convenient and free location (Lidl !), Julie bought some supplies as we all watched the flight proceed and then burst. We waited a few minutes for the predicted landing position to stabilise, and then set off for the most likely landing area. We had to change our target a couple of times, and were a couple of miles away when the flight landed (which isn’t bad considering how narrow and winding the roads are in that area!). My LCARS system had a last position of 166m altitude which was only 46 metres above the landing position, and 4 metres below the road near that position! I later found out that my home receiver had a last position of 368 metres altitude, which again was very good considering the hills between the launch and landing sites.


Using that last position, we drove through a small forest (usually a bad sign when chasing a payload!) to a track which, according to the satnav, was the closest point we could get to by road, with the payload about 350 metres away. I still had no signal from either tracker, which seemed very odd as normally I’d get a signal 1km or so away. With a single tracker I’d have wondered if it had survived the landing, but it seemed unlikely that both trackers would stop. So we kept going in the hope of getting a signal further along the road. We still couldn’t get a signal so parked up and got out the Yagi which, with radials horizontal (meaning the payload was on its side or the aerial was squished against the ground) finally got a good, decodable signal. Tapping that into the satnav, we were directed back to that track we passed earlier. So we parked up, and Greg and I went to the adjacent house to find out who owned the land that we’d just dropped our payloads on, and to gain permission and hopefully directions too!


With that done, we opted to walk down the track, which got progressively more muddy and after a while wasn’t getting us any closer to the payload. By then we had managed to get satellite mapping loaded on my phone, and it became clear that it was going to be better to go back to the house and find a different route. When we got there I chatted with the landowners – a retired couple in the house – and they couldn’t have been more helpful. The husband was recovering from an operation, but the wife offered to come out with us, so once a quick rain shower subsided she got her wellies and we all followed her down a footpath and across (in single file!) a field to a second field where the children soon spotted their blue payload. They seemed pretty excited!


I, of course, was relieved that I hadn’t lost Julie’s necklace!



Back at the house, the SKYBLUE team showed some of their photos to the landowners, and after some more chat we all left.


Julie and I decided to stop in Monmouth to have lunch by the river.


So, a very good flight, and I now have lots of real data to peruse before I start testing my RTLS project with a parafoil.

Posted in Raspberry Pi, Weather Balloon | 1 Comment