Earlier this year I posted an article about the TTGO Watch with the optional LoRa/GPS board, and how to program it to track LoRa high altitude balloons. Since then I’ve used it on a real flight where I found it ideal for checking that the balloon tracker was running fine while I filled the balloon.
Since then, LilyGo have started shipping a new watch which is slimmer (good!) but lacks the option of adding GPS or LoRa. So at first glance it doesn’t seem useful for HAB. However the watch does still have Bluetooth so it can be linked to a suitable GPS/LoRa device via a wireless link. For tracking balloons this is actually a better option, since the LoRa device can be much more easily connected to a good aerial, like so:
So here we have a compact 434MHz Yagi antenna, connected to a TTGO T-Beam LoRa/GPS device with battery, housed in a 3D-printed enclosure and connected over Bluetooth to the T-Watch 2020. This is ideal for that “last mile” tracking once the balloon has landed.
TTGO T-Watch 2020
As you can see, this is much slimmer than the previous model (which is why there’s no space for the LoRa/GPS board), and looks more like a regular watch and less like something that only a geek would wear! Though if this is a concern then you probably don’t want to be seen carrying a Yagi either!
Otherwise it’s very similar to the original watch, with the same touchscreen display and the same processor. There are some differences internally but those are taken care of by the supplied Arduino library – you just tell it which model you have.
TTGO T-Beam Board
This is a useful board that can be used as a LoRa receiver or even a LoRa HAB tracker. It has an ESP32 processor plus UBlox GPS and LoRa module (be careful to order one for the correct frequency band!).
It also includes a battery holder for a standard 18650 lithium cell, so all it needs for this project is a suitable case. If you have access to a 3D printer then you’ll find no shortage of designs to choose from on Thingiverse. Note that many designs include a hole for an OLED display, which most T-Beam boards do not have, so select appropriately.
For chasing, a small Yagi aerial is ideal as it provides all you need – directional, easy to carry, and some gain. I got this model from Aliexpress:
This model has an SO239 socket so I added an adapter to SMA and then a short SMA-SMA cable to connect to the T-Beam.
Both the T-Beam and T-Watch can be programmed using the Arduino IDE, and LilyGo supply a library with examples for the latter.
To program the T-Beam, install an ESP32 board package into the IDE, and connect the board to the host computer via a micro USB cable. In the IDE, choose the “TTGO T-Beam” board option and select the correct serial port.
To program the T-Watch 2020, install the Lilygo library, and connect the watch to the host computer via a micro USB cable. In the IDE, choose the “TTGO T-Watch” board option and select the correct serial port.
The aim here is to provide the following capabilities for the host ESP32:
Receive GPS latitude, longitude, altitude and direction
Receive packets received by the LoRa module
Set the LoRa module frequency and other settings
Provide Bluetooth (BLE) serial link
Essentially, this is the same as what I have done before with Bluetooth-connected receivers, but I had to make a change to the BLE code to make it possible for the ESP32 in the watch to connect easily. Get the code from github.
This is an idea that has been on my mind for a long while, and with the lockdown I’ve finally had some time to work on it.
ADS-B is used by aircraft to transmit their position to the ground. This information isn’t encrypted, and anyone can build themselves a receiver to take those transmissions and draw the aircraft on a map. This is very popular now especially with the availability of inexpensive radio receivers (e.g. the RTL SDR) and single-board computers (e.g. Raspberry Pi). The software is free and typically it’s used to feed online maps such as https://uk.flightaware.com/. In fact sites like that usually provide a Raspberry Pi image file already set up to feed data to their servers where data from all the feeders is collated and used to drive an online map.
Why Put One On A Balloon?
Also, it would be interesting to see how much further the receiver can “see” once above local geography and how well that improves with altitude as Earth’s curvature becomes less of an issue.
So, my plan is to fly an ADS-B receiver, have it relay selected aircraft telemetry to the ground, and monitor how many aircraft it can see and the maximum distance of a received ADS-B transmission.
First thing is to set up an ADS-B receiver. I used the prebuilt Raspberry Pi image from flightaware, but others are available or the required software can be installed onto a vanilla Pi Raspbian image. For development I’m using a Pi 3B+, which I will swap for an A+ for the flight. The RTL SDR is one of these:
This is the program that deals with incoming ADS-B packets. Usefully, it emits a datastream on network port 30003, which can be examined on Linux with a command like this:
nc localhost 30003
and even in lockdown, this shows a very fast stream of messages flying up the screen:
My first coding job was to connect to this network port, apply some filtering to discard the messages I don’t need, and then parse and process the ones that I do. Specifically, those are the messages that contain aircraft positions.
So I took my standard PITS software, added a new thread for ADS-B, and had this open port 30003 and filter/parse/process the incoming messages.
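The PITS thread itself is written in C, but the port-30003 stream is just CSV in the SBS (“BaseStation”) format, where MSG,3 records carry airborne positions. As an illustration only, the filter/parse step can be sketched in Python like this (field indices as per the SBS format):

```python
import socket

def parse_sbs_position(line):
    """Parse one SBS (BaseStation) CSV line from port 30003.
    Returns a dict for airborne position messages (MSG,3),
    or None for everything else."""
    fields = line.strip().split(',')
    if len(fields) < 16 or fields[0] != 'MSG' or fields[1] != '3':
        return None
    try:
        return {'icao': fields[4],
                'altitude': int(fields[11]),
                'latitude': float(fields[14]),
                'longitude': float(fields[15])}
    except ValueError:   # some MSG,3 lines have empty position fields
        return None

def positions(host='localhost', port=30003):
    """Connect to dump1090's SBS output and yield parsed positions."""
    with socket.create_connection((host, port)) as s:
        buffer = ''
        while True:
            data = s.recv(4096)
            if not data:
                break
            buffer += data.decode('ascii', errors='ignore')
            while '\n' in buffer:
                line, buffer = buffer.split('\n', 1)
                parsed = parse_sbs_position(line)
                if parsed:
                    yield parsed
```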
Next job was to maintain a list of received aircraft, containing the latest position plus other information such as when it was received, how far away the aircraft is, etc.
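The C implementation isn’t shown here, but the idea – latest position per ICAO id, distance from the receiver via the haversine formula, and expiry of stale entries (the 60-second expiry is my assumption) – can be sketched as:

```python
import math, time

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in km."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

class AircraftList:
    """Latest position per aircraft, keyed on ICAO id, with aging."""
    def __init__(self, max_age_seconds=60):
        self.max_age = max_age_seconds
        self.aircraft = {}

    def update(self, icao, lat, lon, alt, rx_lat, rx_lon, now=None):
        now = time.time() if now is None else now
        self.aircraft[icao] = {
            'lat': lat, 'lon': lon, 'alt': alt,
            'distance_km': haversine_km(rx_lat, rx_lon, lat, lon),
            'last_seen': now,
        }

    def active(self, now=None):
        """Aircraft heard recently, most distant first."""
        now = time.time() if now is None else now
        fresh = {k: v for k, v in self.aircraft.items()
                 if now - v['last_seen'] <= self.max_age}
        return sorted(fresh.items(),
                      key=lambda kv: kv[1]['distance_km'], reverse=True)
```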
Megabits Through Punybaud
For transmission to the ground we only have very low power and hence low bandwidth. I could have chosen something faster but I settled on a LoRa transmission in the 434MHz band, with a throughput of approx 200 bytes per second.
Each LoRa packet is a maximum of 255 bytes which, after the balloon position itself, leaves room for a number of aircraft positions per packet. I could have used binary packets, but that would have meant changing the receiver software and having all the other balloon receivers do a software update, so I compromised by packing each aircraft position into binary and then converting to base64 ASCII before appending it to a standard UKHAS ASCII position sentence. The result was up to 7 aircraft positions per packet, which equates to just over 5 aircraft updates per second.
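The exact binary record layout isn’t given, though the log output further down suggests roughly 17 bytes per aircraft (51 binary bytes become 68 base64 characters for 3 aircraft). Here is a hypothetical pack-then-base64 round trip, with field sizes chosen purely for illustration to total 17 bytes – the tracker would encode, and the ground side decode:

```python
import base64
import struct

# Hypothetical 17-byte record: 3-byte ICAO id, float32 lat/lon,
# and 16-bit altitude (feet), track and speed. The real layout is
# not documented here; these fields are an assumption.
RECORD = struct.Struct('<3sffHHH')

def encode_aircraft(records):
    """Pack aircraft records into binary, then base64 for appending
    to the UKHAS ASCII telemetry sentence."""
    blob = b''.join(
        RECORD.pack(bytes.fromhex(r['icao']), r['lat'], r['lon'],
                    r['alt'], r['track'], r['speed'])
        for r in records)
    return base64.b64encode(blob).decode('ascii')

def decode_aircraft(encoded):
    """Reverse the packing on the ground side."""
    blob = base64.b64decode(encoded)
    out = []
    for offset in range(0, len(blob), RECORD.size):
        icao, lat, lon, alt, track, speed = RECORD.unpack_from(blob, offset)
        out.append({'icao': icao.hex().upper(), 'lat': lat, 'lon': lon,
                    'alt': alt, 'track': track, 'speed': speed})
    return out
```

With 17-byte records, 3 aircraft pack into 51 bytes and base64 to 68 characters, matching the log output below.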
That rate is nowhere near the incoming ADS-B update rate, however I don’t need aircraft to update quickly on the map – one update every 20 seconds, say, would be fine, which allows for 100 active aircraft. As I’m particularly interested in the most distant aircraft, I built a scheme where that aircraft is given priority in the transmission queue, so it is updated every 7 seconds while all the others are updated cyclically. Only active aircraft are transmitted, so aircraft that are now too distant will not be included in the cycle.
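The queueing details are my own sketch rather than the actual code, but the scheme described above – a guaranteed slot in each packet for the most distant aircraft, with everything else cycled round-robin – can be expressed like this:

```python
class TxScheduler:
    """Choose aircraft for each outgoing LoRa packet: one slot is
    reserved for the most distant aircraft, and the remaining slots
    cycle round-robin through the others. (A sketch, not the real
    implementation; the slot count of 7 matches the article.)"""

    def __init__(self):
        self.next_index = 0   # round-robin position

    def choose(self, active_ids, most_distant, slots=7):
        """Return up to `slots` aircraft ids for the next packet."""
        chosen = []
        if most_distant in active_ids:
            chosen.append(most_distant)          # priority slot
        others = [a for a in active_ids if a != most_distant]
        take = min(slots - len(chosen), len(others))
        for _ in range(take):
            self.next_index %= len(others)
            chosen.append(others[self.next_index])
            self.next_index += 1
        return chosen
```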
Here’s the log output from the tracker showing the data packets and some of the operation:
FLIGHT ID 4CA643
Array 119 used 51 --> 68 bytes 'TKZDJJZPQj/jBsBwlGKIMQE5XWebWFFCkuhDwBhHZ14rAjxLLyeCU0LBHI3AoIx1jl8G'
POSITIONS IN SENTENCE 3 - 3 are '71,5,67,'
FLIGHT ID ACE5FC
Array 119 used 68 --> 92 bytes 'rOX8PCBQQu/m+b9YGy8TnwE5XWeRVVFCsvRDwDFHZ14pAjxLLyeCU0LBHI3AoIx1jl8GTKZDMZZPQvQaB8BwlGKILwE='
POSITIONS IN SENTENCE 4 - 3 are '33,5,67,71,'
FLIGHT ID 3C4B2F
This is my standard LoRa Gateway receiving the packets:
I’ve set this feeder to deliver incoming packets to my hab.link server, using these settings:
That server used to be on an Amazon EC2 Ubuntu machine, but Amazon pricing is less predictable than CV-19 transmission so I’ve recently moved it to OVH; their simpler control panel is a welcome bonus.
The server program is written in Delphi, which can (amongst other things) target Windows or Linux for console apps. I’m testing on Windows but transfer to the actual Linux server is trivial.
I used this same scheme last year for my Apollo flight where, as this time, I wrote a custom web dashboard that was fed from the server. Since then I’ve started reworking the system so that it maintains a small database (a kind of habhub-lite) for flights, payloads and listener stats etc.
So this server accepts messages from the LoRa Gateway, does some processing (such as extracting the flight telemetry from the base64 strings) and then feeds balloon and aircraft positions to a web application. Here’s the log from the server in action:
The app connects to hab.link over a web socket, from which it receives balloon telemetry (which feeds the status box at top-right and draws the balloon on the map) plus aircraft telemetry which it displays in the grid on the right and as aircraft icons (pointing in the approx correct direction) on the map.
I don’t have a date for this yet, but with lockdown easing it might be feasible soon. I have some final things to do in the software, such as retiring aircraft from the map/grid when they haven’t been seen for a while, but I don’t expect major changes.
One of the nice things about LoRa, compared with the traditional radio modulation techniques used by high altitude balloons, is that LoRa chips include decoders. So whereas RTTY and APRS generally require a PC to decode the signal, LoRa needs only a basic microcontroller, making LoRa receivers cheap and small.
I’ve made several different LoRa receivers, from Raspberry Pi gateways down to a handheld Arduino-based receiver. The latter is battery powered, quick to start, lightweight and portable. It shows distance and direction to the payload so it’s ideal for the “last mile” chase of a balloon on foot, which is typically across a field, but it’s long been a dream of mine to make something even smaller that can be worn like a watch.
One option for this is a smartwatch linked to a LoRa receiver, but suitable watches are expensive. So I was very interested to discover the LilyGo T-Watch, which is about £50 delivered including a GPS/LoRa board – making the watch a completely self-contained LoRa chase watch.
It packs a lot of devices into a rather thick but still practical watch – not something you’d want to wear all the time but absolutely fine for an activity like chasing a balloon. The watch contains:
ESP32 processor running at 240MHz
16 MB flash
8 MB PSRAM
Bluetooth / BLE
AXP202x battery management
240×240 IPS backlit LCD touch-screen
power and custom buttons
(optional) LoRa + GPS board (S78G for 434MHz).
Of those, the CPU, battery and power management, touchscreen and LoRa/GPS are what we need.
S78G LoRa/GPS Board
Rather than having those devices connect directly to the ESP32 on the main board, they connect to a small STM32 processor. The GPS connects to the STM32 by serial, the SX1276 LoRa chip connects via SPI, and the STM32 connects back to the ESP32 over another serial connection.
The STM32 comes programmed with stock LoRaWAN firmware. This firmware does not provide access to all LoRa modes, does not report GPS altitude, and is generally unsuitable for our purposes. Alternative firmware exists as part of the SoftRF system, which helps on the GPS side by sending raw NMEA packets from the GPS, but does not solve the LoRa limitations. So I had to write my own firmware.
Both the main (ESP32) and backplane (STM32) processors can be programmed using the Arduino IDE, and LilyGo supply a library with examples for the ESP32.
To program the ESP32, install an ESP32 board package into the IDE, install the Lilygo library, and connect the watch to the host computer via the supplied USB C cable. In the IDE, choose the “TTGO T-Watch” board option, select the correct serial port, and then load one of the sample programs. Note that to test the GPS you must use the S7XG/GPS example and not the regular GPS example.
To program the STM32, install an STM32 board package and select the Nucleo 64 / L073RZ board. There are some other board settings which have to be set correctly, since the S78G uses different serial port connections to those on the L073RZ board; my S7XG firmware on github shows the options. Also, you need to buy and connect an ST-Link USB programmer (inexpensive) and make up a custom cable to connect from the programmer to the 5 programming pins on the S78G.
The aim here is to provide the following capabilities for the host ESP32:
1. Receive GPS latitude, longitude, altitude and direction
2. Receive packets received by the LoRa module
3. Set the LoRa module frequency and other settings
Item (1) is something that any balloon tracker does, so I wasn’t short of code to borrow. However, the S78G uses a Sony GPS and not the usual UBlox, so I had to use code that handles standard NMEA messages instead of the UBlox custom protocol.
Items (2) and (3) I’ve also already coded, for example in my LoRa OTG device for phones and tablets. This code provides a serial protocol which I can use here to allow the ESP32 to set frequency etc. and to receive LoRa packets.
So it didn’t take long to merge my existing source code together and have something that should work on the S78G, assuming I knew the pin allocations for STM32 to GPS and LoRa. Documentation on this proved elusive, but fortunately the SoftRF firmware I mentioned is open source on github, so I searched that code for the information I needed.
To change to the next screen, swipe up or left, or press the user button briefly (less than 0.5 seconds); to change to the previous screen, swipe down or right, or press the user button for at least 0.5 seconds. The screen sequence is:
Logo –> GPS –> LoRa –> Direction –> Settings
The screens are in a loop so going “back” from Logo takes you to the Settings screen.
Note that the direction relies on knowing the user’s direction as reported by the GPS, so if you have not moved recently then take a few steps forward so that the GPS can measure your direction.
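Under the hood, the direction screen needs the bearing from the user to the payload, compared with the user’s GPS-reported direction of travel. The firmware itself is Arduino C++, so this Python version of the maths is just an illustration:

```python
import math

def bearing_degrees(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from (lat1,lon1) to (lat2,lon2):
    0 = north, 90 = east."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(y, x)) % 360

def relative_direction(user_track, payload_bearing):
    """Angle to turn (degrees, -180..180) from the user's current
    GPS track to face the payload."""
    return (payload_bearing - user_track + 180) % 360 - 180
```

If the GPS reports no recent movement, `user_track` is stale – which is exactly why the note above suggests taking a few steps forward first.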
Just touch the buttons to adjust the LoRa mode up/down.
Touch the frequency up/down buttons to adjust the frequency in 1kHz (</>) or 25kHz (<</>>) steps.
I’ve had a few emails asking how to post in the HAB IRC (Internet Relay Chat) channels. This used to be easy, but due to some cretin spamming a great many IRC channels last year, both of the UKHAS IRC channels (#highaltitude and #habhub) have been locked down so that only registered users can post. Registration is fairly easy but not entirely obvious, so I decided to write a short guide.
We have 2 channels for High Altitude Ballooning on the freenode server:
#highaltitude – this is for general chat about ballooning. It’s the perfect place to ask any questions you may have, or to announce your flight.
#habhub – this is for getting flight documents approved (see https://ukhas.org.uk/guides:documents to learn about flight and payload documents). It’s also used for adding APRS trackers to the spacenear.us map, and for asking about getting hourly predictors set up.
Fill in your chosen nickname (not that one!). The channel is filled in for you, but you can change that if you want to join a different channel. Finally, check the “I’m not a robot” box, even if you are a robot, and click the Start button.
You will then be joined to the #habhub channel, and you will see the list of current members (people online at the moment) and any new messages that they post. However you will not yet be able to post messages.
Look at the bottom of the screen, and you will see a box with your nickname in it, and a blank box next to that where you type the following:
This is planned for Saturday morning, when Steve Randall and I are launching a single balloon, probably around 8am but ISH applies. If you can’t manage to get up that early, don’t worry; it could be a fairly long flight at around 3.5 hours.
There are 3 trackers planned:
Pi tracker payload ID “Xpi” with SSDV and “XX” Landing spot prediction, LoRa mode 1 on 434.170MHz
AVR tracker payload IDs “UBX” and “L80” (sends one, then a gap, then the other), both LoRa Mode 1 on 434.450MHz, plus calling mode (433.650MHz Mode 5)
Pi tracker payload ID “PTE” with SSDV, and with the landing prediction included in the sentence, LoRa Mode 1 on 434.325MHz
We will live stream the launch on YouTube:
There may be other live streams too including from a drone.
Those of, cough, my generation will remember watching with awe as a President’s clear and concise statement of intent was brought to fruition, played out on our monochrome TV sets as grainy, ghostly images accompanied by words that will last for as long as mankind does. And thus was inspired a generation of astronauts, scientists, engineers and space-followers.
Space travel remains, for now, beyond the reach of most, but near-space has never been more accessible. All you need is a big balloon, some helium, a suitable tracking device and some knowledge about how to fly it legally and successfully. So if, like me, you were born too late and too not-American to be part of NASA’s efforts to put a man on the moon, you can have your own near-space programme to at least partially relieve those frustrations!
This week it’s the 50th anniversary of that first moon landing, with the launch on 16th July, landing on 20th July, return launch on 21st July, and splashdown in the Pacific on 24th July. To commemorate the achievement I looked for a suitable model to fly under a weather balloon. Both Revell and Lego have models that you can purchase and build, but not all of them are of practical size (or even weight!) for a high altitude balloon flight! I settled on the Revell Apollo 11 Columbia & Eagle kit, which is small and light, with pretty good detail (and there’s one in London’s Science Museum!)
It’s 60 parts, with glue, gold foil, paint and brush included (but not quite all the colours required!). You’ll also need a craft knife, tweezers, brush cleaner and ideally some different size brushes.
The kit includes parts for the command module (CM):
the service module:
and the lunar lander (LM). Here you can see the CSM (Command and Service Module), plus the LM ascent stage and descent stage.
Here’s an assembled model in the Science Museum in London:
The final step was to mount my model in front of a suitable camera. Commonly people use balsa wood or similar, but that’s just ugly, and I prefer to use either clear acrylic rod or sheet underneath the object, or (as in this case) thin carbon fibre rod above it. That meant carefully drilling a hole in my carefully built model, then inserting the rod and some glue …
Just as the forces from Saturn 5 culminated in accelerating the astronauts and their tiny home away from home towards the moon, the efforts of thousands of engineers converged in the iconic form of mission control. It’s definitely the place I would have wanted to be at the time if I’d had the opportunity.
So to support my little near-space Apollo 11 flight, I decided to build a virtual mission control console, making it as close to the original as I could with a reasonable amount of effort.
This provides several different functions:
Live video from the launch site
SSDV – live images from the flight itself
Live map showing balloon/parachute
3D simulation of the view from the balloon
Also, because I just cannot resist the temptation, I added a few gimmicks …
Various Apollo 11 playback sounds (launch sequence, landing and the 1202 error!)
During descent, the map shows 3 Apollo-style parachutes
Analogue TV interference when changing channels ….
Payload and Tracking
As with most of my flights I used a Raspberry Pi with a tracker board added, and had it send images and a landing prediction as well as the usual telemetry. I used a wide-angle lens on the Pi camera so the Apollo model could be placed close to it. The camera has manual focus, which I set to have the model in focus.
The Pi was placed in a foam polystyrene box, made up of 2 commercial (Hobbycraft) boxes glued together, along with an SJ4000 action camera and an extra battery.
As a backup I added a simple AVR tracker in a small foam egg.
This hobby is very dependent on the weather: if the launch winds are too high then filling and launching can be very difficult if not impossible, and if the winds in general, and the higher-altitude winds in particular, are too strong or going in the wrong direction, then it can be impossible to have the flight land somewhere safe (i.e. nowhere near a town or major road).
I’d hoped to launch on moon landing day, but the forecast winds were rather too high, especially with a delicate payload which, if it hit the ground at launch (that’s what tends to happen in strong winds), was likely to get broken, making the entire flight pointless. Further, the landing prediction was too far away for high definition images.
In contrast, the predictions for the Apollo 11 launch anniversary were perfect; next to no launch wind, and a very, very convenient flight path!
So I applied to the CAA for a NOTAM, and made the plans to fly on Apollo 11’s launch anniversary.
The launch day was lovely as expected. My NOTAM was from 9am to 3pm, which was just as well as someone else launched during the morning, on the same frequency as I was using and without announcing the flight to the community! Rather than reprogram my trackers to a different frequency, I waited for their flight to land and then got my payloads ready:
I set up a live video stream to YouTube, using a Pi Zero W and a wide-angle camera, aimed from an upstairs window down over the garden where I fill my balloons. Here’s a shot from the stream, as displayed in the Apollo dashboard:
The launch itself was very easy, and we watched the balloon quickly rise in the sky with the payload swinging below …
As the landing prediction was only a few miles from the launch site, I had plenty of time to follow the launch online via my dashboard:
I waited till shortly after the balloon burst (pretty much on schedule at an altitude of around 38.7km) before setting off to the landing area predicted by the Raspberry Pi tracker. It’s a hilly area, and I found somewhere to park that would have line-of-sight hopefully all the way to the ground, which it did. For a long while it looked like the flight wouldn’t actually reach the ground, but in the end it swung away from the trees just before landing.
As well as being hilly, the area was very rural with no roads closer than about 800m from the landing spot. I drove around the closest roads and found a track that would take me closer, then found a nearby resident to ask who owned the land. He sent me to the landowner’s estate office, and they very helpfully printed out an area map with footpaths shown. As it happened, there was a public footpath that would take me very close to the payload, so I got back in the car, parked up near the footpath, then set out on foot with my phone running my HAB Explora app connected to a small USB telemetry receiver. With the payload around 800m away, I’d expect to receive telemetry already, but the hilly terrain meant that there was no line-of-sight and I didn’t receive telemetry till around 300m from the payload.
It was a pleasant, if strenuous, walk, with this lovely sight when I got to the top!
Amazingly, the Apollo model was completely intact although the (tethered) support arm had broken away from the payload on landing:
And here’s the lovely view on the way back down …
Of course the main aims of the flight were photographic, and this is what I found when I got the memory cards back home …
This flight is planned for Tuesday 16th July and will commemorate the launch of Apollo 11 on this day in 1969, 50 years ago. I want to make it special, so the flight will have a few new things all designed to try and recapture the technology of the ’60s.
The primary payload is a Revell Apollo 11 kit, assembled in the configuration flown from the Earth to the Moon. For more details see my blog post on the build.
Mission Control was a central part of every NASA spaceflight, and everyone has seen the Apollo-era mission control room with its flickering monitors and flashing lights. I built an emulation of such a console as a web page back in my Telnet Flight a couple of years ago, so I decided to start with that and build a new web app for my Apollo flight. You can read about how this works in this blog post about the system.
Google map (not SNUS) with balloon, chase car and on-board landing prediction
3D visualisation of the view from the payload
Raw data screen
The current plan is for the flight to carry 3 trackers:
Pi tracker with wide-angle camera sending telemetry and monochrome (of course!) SSDV (869MHz band).
Pi tracker telemetry only (434MHz band LoRa)
AVR tracker (434MHz band LoRa)
Payload IDs and frequencies etc will be updated here when decided.
The launch will be streamed over YouTube, viewable within the dashboard.
I hope to have a lot of receivers for this flight. Now, because the dashboard works from a custom web server and not Habhub, you will need an up-to-date LoRa gateway to provide data for the dashboard. You will need V1.8.30 issued on 26th June, or later, and you need to add this line to gateway.txt:
With that done, telemetry will be uploaded to the hab.link server, as well as to Habitat if you have that enabled. SSDV is not affected and is sent to Habitat only.
Sometime last year I came across a Python library for controlling a GoPro camera, and was interested because this provides a means for a Raspberry Pi to capture photographs from a GoPro during a HAB flight, and then send those images to the ground via SSDV. I happened to have some GoPro cameras, one of which has the necessary WiFi ability. It didn’t take long to have a script running that would take images and transfer them to a Pi, and I then integrated this into my Pi In The Sky software. I even wrote a blog post about it, but held off publishing till I had a chance to test it in flight.
A week ago I noticed that predictions for this week looked good, with Wednesday having a good combination of nearby landing and low launch winds. So on Friday I quickly put in an application for Wednesday morning (I need to give the CAA 72 hours’ notice, so Monday would technically have been too late). I spent Tuesday preparing the payloads, receivers and the chase car, but when Wednesday morning came the launch winds were a bit high for a solo launch. Since I had 3 payloads to launch (camera, backup radio tracker, GSM tracker) and needed to use a large balloon to avoid landing in the Severn or the Forest of Dean, I decided it was best to wait for another day.
When I checked the predictions, I saw that predictions for the following morning were much better – half the launch winds, and I could use a small balloon to land safely. I quickly sent an email to the CAA asking for the NOTAM to be changed or reissued, if possible. Obviously this was within my 72 hour notice period, but that’s in place to stop people asking on a Sunday for a launch on Monday. Also, my NOTAMS are generally issued within a couple of hours or so, so I felt positive that the CAA would respond positively, which thankfully they did.
This was delayed for about an hour due to a couple of issues with my GoPro tracker. First, it couldn’t get a GPS lock. Often this kind of issue is to do with having a camera nearby, but this time it was actually a faulty GPS antenna; it looked fine – no breaks in the wire and the SMA plug was intact with no breaks or shorts – but it just didn’t work. Swiftly replaced with a spare.
Second problem was that the GoPro was showing a low battery. This was very odd, as I’d left it charging for a few hours the previous day and hadn’t used it since (but more on that later …). So I connected to a charger, added a powerbank to the payload to keep the camera charged in flight, and delayed the flight for an hour to allow the camera to partially charge.
The delay did mean that the flight would land a little further East, which gave me more margin in case the balloon burst late. I ran various prediction scenarios and was satisfied that even a very late burst would be fine. So with all trackers powered, transmitting and being decoded, I enlisted help from a neighbour, filled and launched the balloon.
With a flight time of well over 2 hours, there was no rush in chasing. I checked that all the receivers were running OK, enabled the packet-resend uplink (which requests that the payload re-sends any missing image packets), and then set off in the chase car.
The prediction was for a landing near Yate, so I took the M5 south, stopping at the Gloucester Services to check the live map which, unfortunately, was running with a huge time lag as the database is still full of radiosonde data (it’s being cleared, but not soon enough for my flight). Normally I wouldn’t be concerned, as my car has an Android head-unit with its own mapping, but a recent change to that and/or something specific with the tracker configuration meant that the app kept crashing.
Another problem was that the SSDV from the GoPro stopped at around 16km altitude. I wondered if the battery had discharged (though it shouldn’t have, as there was a power-bank connected also), but again more on that later. Anyway, the images so far were pretty good.
From the M5 I took the M4 east, and made the mistake of using the online map which showed an expected landing south of the M4. So I turned right towards Bath, and parked up to check more carefully. With my head unit app unusable, I connected my phone to a USB OTG receiver, and used my phone app to show the balloon, car and predicted landing spot. This worked great, and I saw that the landing prediction was north of the M4 (as I’d expected earlier), so I turned round and aimed for the prediction.
The closest road to the landing prediction had fields either side, with several places to park off the road. It was on a hill so initially I drove to the top, hoping to get a signal from there till the flight landed, but after a while the landing prediction moved to be quite close to the road itself. So I drove back down the hill and parked under the expected flight path. I moved a couple of times as the balloon, and the prediction, meandered.
Initially the flight was tricky to see, coming out of the sun, but once it was north of me I saw it easily.
And I even managed to photograph it as it landed – you can see one white payload box (GSM tracker) about to hit the ground, and another (white with pink tape and containing the GoPro) stuck in a tree to the right.
Coincidentally, the farmer was driving around the field. I beckoned him over, explained about the parachute in his field, and he said to jump over the gate and go get it.
On the way I passed a calf having a long afternoon nap. That was the reason the farmer was there – to find out where the calf was hiding!
So, That GoPro ….
When I got the payloads back to the car, I checked the GoPro and it was still powered up, so power wasn’t why it stopped taking photos. It was very very warm, as was the tracker (a Pi 3 A+), so maybe that was it. Anyway I left both running so I could check them when I got home.
Back home I hooked both the Pi and camera up to mains USB power adapters, connected the Pi to my LAN via USB-LAN adapter, and connected via ssh. The camera script had stopped. I ran the script again and it failed with a library error during the “take photo” function. Odd. I let the camera cool down and tried again, but same error.
So I took the SD card out to copy the contents, and noticed that the card was full. That was odd as I’d cleared it of all files the day before, when I charged the camera. So I checked the files, and the majority of the card capacity was taken up with video files. I opened them, and all just showed ….. video of my USB charger! So, the reason the card was full, and the reason the GoPro battery was nearly discharged before flight, was that I’d accidentally touched the record button when I connected the camera to the charger. Ooops!
So that’s something to be careful of next time, and a reminder to check the SD card capacity when the payload is prepared on flight day, not just before.
Although the Pi camera is a very good option for live imaging from a Raspberry Pi tracker, it does have its limitations – mainly due to its plastic lens, which is a moderate wide-angle and not as sharp as some. There are alternative lenses to try and some are very good, but they can mist up as the payload goes through clouds (but clear further up).
I’ve tried various other camera options with the Pi. Webcams aren’t any sharper than the Pi camera, and USB cameras (e.g. a Sony compact, Gitup Git2) aren’t reliable for long periods. So I was interested when a couple of other HABbers mentioned a Python library for connecting wirelessly to a GoPro camera.
The first task is to get a Pi connected to a GoPro. The GoPro (Hero 4 Silver in my case) is set up as a wireless access point, using the GoPro Android app to set the camera name (SSID) and password. With that done these details can be added to the wireless setup on the Pi. To do that, edit this file:
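On Raspberry Pi OS the wireless settings normally live in /etc/wpa_supplicant/wpa_supplicant.conf; a network entry for the camera looks something like this (the SSID and password shown are placeholders):

```
network={
    ssid="GoProHero4"
    psk="camera-password"
}
```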
Replacing the SSID and password with those you used in setting up the camera.
You can use a Pi Zero W which is small, light and has reliable built-in wireless networking, or a Pi 3 A+, but the latter uses more power and will get hotter. Remember to have the GoPro on, with wireless enabled, before the Pi boots.
Install this using the usual instructions, though you can leave the Pi camera disabled.
A sample script, gopro.py, is supplied. This replaces the usual camera script, so you should edit the startup script to run the Python script instead of the old one; i.e. replace this line:
sudo ./camera &
with this one:
sudo python3 gopro.py &
To test, run the camera script manually, with the tracker program running in another terminal window (e.g. another ssh session). To run the tracker:
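Assuming the standard PITS install layout (an assumption – adjust the path for your setup), the tracker is started from its own directory:

```
cd ~/pits/tracker
sudo ./tracker
```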
and to run the script:
sudo python3 gopro.py
If the script crashes with an error like “ValueError: unknown url type”, then the GoPro has no SD card – insert and format one – or the card is full.
If the script crashes with the error “FileNotFoundError: [Errno 2] No such file or directory: './process_image'”, then the tracker was previously configured for the Pi camera; delete those camera settings and try again.
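For reference, the periodic capture that gopro.py performs can be sketched as below. This is not the flight script – the camera calls (take_photo, downloadLastMedia) are assumptions based on KonradIT’s gopro-py-api library – but the loop structure is generic:

```python
import time

def capture_loop(camera, interval_s, max_shots=None):
    """Take a photo every interval_s seconds until max_shots (None = forever).
    Returns the number of photos taken."""
    taken = 0
    while max_shots is None or taken < max_shots:
        camera.take_photo()          # fire the shutter over the wifi link
        camera.downloadLastMedia()   # copy the new image back to the Pi
        taken += 1
        if max_shots is not None and taken >= max_shots:
            break
        time.sleep(interval_s)
    return taken

# In practice the camera object would come from the goprocam package,
# e.g. GoProCamera.GoPro() -- an assumption based on gopro-py-api.
```

Writing the loop against a duck-typed camera object also makes it easy to test on the bench without the camera switched on.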
I’ve flown a few different types of camera on my HAB flights, starting with Canon compact cameras in my early flights, and Raspberry Pi cameras in many of my more recent flights. GoPro and other “action” cameras are popular, and I’ve flown them a few times, though their rather extreme lens distortion does encourage comments such as “it’s so curved it must be flat” from the flat-earth contingent of the lunatic fringe.
Most of the best photographs on my flights were taken by the Canon compacts, so I wondered about what improvement there might be from using a better camera. SLRs are too heavy really, and I wouldn’t want to risk damaging something that expensive or delicate, so my attention turned to “mirrorless” cameras which use large sensors and interchangeable lenses but are smaller, lighter and less delicate than SLRs.
Canon’s first such camera was the EOS M, and I bought one soon after it came out based on its compatibility (via an included adapter) with my Canon SLR lenses. It came with a 22mm “pancake” lens and I soon added a couple of zoom lenses to the kit. Image quality was very good indeed, but focussing was rather suspect. Canon knew this and brought out improved firmware but it still focussed much more slowly, if at all, on difficult subjects. And by “difficult” I mean anything that isn’t sharp and stationary.
A couple of years ago I bought an EOS M100, which has much improved focussing and is a camera I now use a lot. The EOS M remained unused, as did (pretty much) that 22mm lens, so I had a camera with very good image quality that I wouldn’t be too distressed about if lost or damaged in a balloon flight …
With battery life measured in hundreds of shots, rather than the thousands needed for a balloon flight, I needed to arrange an external power supply for the camera. Canon sell a suitable device, and clones are available, where a USB connection is boosted to battery voltage and then fed to the camera through a dummy battery. So I bought one of those, plus an Anker PowerBank with 2A capacity (to handle peaks), and tested it continuously for several hours to ensure that it would comfortably last through a typical 2-3 hour flight.
I needed some way of having the camera take images throughout the flight. With the Canon compact cameras I used “CHDK” firmware, and the equivalent for Canon SLRs and M-series cameras is called “Magic Lantern”. It’s very easily installed, and very easily configured to take a photograph every few seconds.
I configured the camera to store photos as RAW files as well as JPG format (though somehow I managed to switch that off before the flight, or the intervalometer ignored the setting), and set the camera to manual focus (to include infinity) so it wouldn’t need to try (and probably fail) to focus on fluffy clouds.
I wanted to soften the landing to help prevent damage to the camera, so I built an internal sub-frame for the camera, with soft foam suspension underneath to reduce the forces on the camera. I included a slight tilt to the camera so that it is generally looking downwards, though the natural swinging of the payload results in many viewing angles through the flight anyway.
With predictions looking good for Saturday, including low launch winds, I applied for permission. I was originally hoping to do another flight with inter-balloon telemetry to a balloon in Northern Ireland, but there wasn’t time to get permission for that flight (the launcher doesn’t have permanent permission for his site, so it’s a longer process). That left me with permission for a launch, so I looked through my other planned flights to see what might be best to launch. Cloud cover was looking good (i.e. a lot less than 100%) so I settled on this photographic flight.
I also wanted to test some radio trackers. Testing isn’t a great idea if you want to actually get the flight back, so I included 3 trackers in case one failed. First was a simple and very reliable AVR tracker which I’d programmed to send regular LoRa transmissions plus Calling Mode transmissions. Second was a Pi Zero tracker switching between LoRa and RTTY (both via the same LoRa module). Third was a GSM tracker. All 3 had flown successfully before, though the radio trackers hadn’t used those radio settings.
The conditions were very pleasant, as expected, however 2 of the trackers decided to be awkward. First the Pi Zero couldn’t get a lock, and I had to rehouse it in a larger payload container to keep the GPS antenna away from the power supply. Secondly, the GSM tracker failed to send any positions. I flew it anyway, in case it was a local issue, but it didn’t send any positions when it landed either. Investigation after the flight revealed that the GPS antenna cable had broken internally (and invisibly).
With the landing area not far away there was no need to rush, so we left when the balloon was about 1/3rd of the way up. The flight path was typical for a summer launch, with the higher altitude winds bringing it west, against the lower altitude winds taking it to the east.
The Pi tracker includes software to predict the landing spot based on the wind measurements during ascent and the efficacy of the parachute, so I enabled that with a setting to make it appear as a large “X” on the live map. Onboard landing predictions are very useful when chasing especially if the online prediction isn’t available (e.g. no internet connection in the chase car). Here’s a video, made by Steve Randall, showing the two predictions plus the balloon position and chase car position; note that due to load on the server, the balloon and predictions are lagging 2-3 minutes behind the chase car position.
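The idea behind the on-board predictor can be sketched as follows – a minimal model, not the actual flight code: the wind drift measured in each altitude layer on the way up is replayed on the way down, with the descent rate scaled up at altitude where the air is thinner. The layer size, scale height and sea-level descent rate below are illustrative assumptions:

```python
import math

LAYER = 100.0           # metres per wind layer (assumed)
SCALE_HEIGHT = 7640.0   # atmospheric density scale height in metres (assumed)

def predict_landing(lat, lon, alt, wind_layers, sea_level_descent=5.0):
    """Replay per-layer ascent wind drift during a modelled descent.

    wind_layers: {layer_index: (dlat_per_s, dlon_per_s)} measured on ascent.
    Returns the predicted (lat, lon) landing position.
    """
    while alt > 0:
        layer = int(alt // LAYER)
        dlat, dlon = wind_layers.get(layer, (0.0, 0.0))
        # A parachute falls faster in thin air, roughly as 1/sqrt(density)
        rate = sea_level_descent * math.exp(alt / (2 * SCALE_HEIGHT))
        step = min(alt, LAYER)
        t = step / rate              # seconds spent crossing this layer
        lat += dlat * t
        lon += dlon * t
        alt -= step
    return lat, lon
```

With no wind data the prediction simply falls straight down; with measured drift it shifts the landing point layer by layer, which is why the on-board prediction improves steadily through the ascent.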
In the chase car, we saw that the online prediction was rather different from the on-board one. From experience, the latter is more likely to be correct, although it doesn’t take into account different winds at the landing site vs the launch site. I knew from running predictions earlier that the landing spot was moving north during the day, so figured that the actual landing point would be north of the on-board prediction, which was in the same direction as the online prediction. Checking on the map, we found a road that was close to this point, so we took that road and parked under the expected flight path. It took a while to spot the parachute, which I managed about a minute before it landed.
As it happens, the payload’s camera was busy photographing us as it flew over our heads!
With the landing position known, I chose the “Navigate to payload” function in my Android tablet app, and followed the instructions to a farmhouse about 250m from the landing spot. After a short chat with a resident we drove back to a public footpath and set out on foot with phone connected to a USB OTG LoRa receiver. The phone runs my HAB Explora app which gives directions to the payload. For a while it looked like the payload would be close to the path, but as we got closer we could see it was around 50 metres to our right. And to our right were …. trees.
So, back in the app, I chose the “Navigate off-road to payload” function. This loads up a separate mapping app (in this case BackCountry Navigator Pro) and tells it to provide directions to the payload. This showed us that the payload was actually at the far side of the trees, so we followed the path and edge of the fields round the trees to the payload.
The parachute was the first thing we saw, as it was up a tree! The balloon remnants and payloads were on the ground. The parachute was stuck fairly solidly but came free with a lot of force.
The camera payload obviously landed on one corner, and for next time I’ll add some extra foam to absorb the shock, because this is what happened …
And so, the aim of the flight, starting with a balloon selfie …