NASA have given us many iconic photographs, such as “EarthRise” and “Man On The Moon”, but there’s one which is more easily replicated if you’re not NASA or Elon Musk, and that is the image of Bruce McCandless floating freely in space during the first untethered spacewalk:
So several months ago I set about replicating this as best I could, under a high altitude balloon.
Revell Model
Revell USA made an astronaut/MMU kit back in 1984; examples are now rare and expensive, but eventually I found an unused kit in the UK on eBay. It was a good price, complete and original:
The kit arrived late last year, and I assembled it over the Xmas break, in preparation for a launch on (hopefully) February 7th, 34 years to the day after the original flight.
Cameras
I wanted to include several cameras for the flight, both video cameras and still cameras whose images would be downloaded live to the ground during the flight. Having tried several different action cameras in the past, my current favourite is the Gitup Git2 camera – reliable, inexpensive and with plenty of options in the firmware. I combined 2 of these (one Git2 with a wide-angle lens, and one Git2P with a normal lens) with some AA-powered powerbanks to extend the run time from about 90 minutes to several hours, using 64GB SD cards.
I also wanted live images, ideally from different viewpoints. The only reliable live-image cameras I’ve used are the Pi cameras, and those are limited to one per Pi. So I built a small network of 3 Pi boards, using the built-in wireless modules to pass image files between them; a Pi 3 acts as an access point, with Pi Zero Ws as clients. All live images were then downlinked in sequence by one Pi using LoRa.
In the end it wasn’t really feasible to set up vastly different viewpoints, as the astronaut model is quite large and the payload would then have been huge (and cumbersome, and delicate), so I had the cameras all quite close to each other.
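For anyone wanting to do something similar, here’s a rough sketch of how the image collection over the payload’s own WiFi network can work. It assumes, purely for illustration, that each Zero W serves its image folder over HTTP and always writes its newest photograph to latest.jpg; the hostnames, port and paths are made up, not the exact setup that flew.

```python
#!/usr/bin/env python3
# Sketch of the gateway Pi 3 pulling the newest still image from each Pi Zero W
# over the payload WiFi network, ready for SSDV/LoRa downlink.
# Hostnames, port, folders and the "latest.jpg" convention are assumptions only;
# each Zero is assumed to serve its image folder with e.g. `python3 -m http.server 8000`.

import os
import time
import urllib.request

CLIENTS = ["zero1.local", "zero2.local"]   # hypothetical Zero W hostnames
PORT = 8000
OUT_DIR = "/home/pi/ssdv_queue"            # folder the LoRa downlink code reads from
INTERVAL = 60                              # seconds between collection rounds

def fetch_latest(host):
    """Download the newest image from one client, or return None on failure."""
    url = f"http://{host}:{PORT}/latest.jpg"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.read()
    except OSError as err:
        print(f"{host}: fetch failed ({err})")
        return None

def main():
    os.makedirs(OUT_DIR, exist_ok=True)
    while True:
        for index, host in enumerate(CLIENTS):
            data = fetch_latest(host)
            if data:
                # Timestamped filename so images are queued, and downlinked, in order
                name = f"{OUT_DIR}/cam{index}_{int(time.time())}.jpg"
                with open(name, "wb") as f:
                    f.write(data)
                print(f"queued {name} ({len(data)} bytes)")
        time.sleep(INTERVAL)

if __name__ == "__main__":
    main()
```

The gateway just keeps dropping the newest image from each camera into a queue folder, and the downlink code works through that folder one SSDV packet at a time, so a camera or Zero failing mid-flight doesn’t stop the others.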
Payload Build
I decided to place each Pi in its own Hobbycraft box. The Zeroes are very small of course and even with an AA powerbank there was space to fit a video camera with its own powerbank inside the box:
Next came the main Pi, with its own camera, a 3G modem (later removed because the powerbank couldn’t supply enough power) and a u-blox USB GPS receiver, all inside 2 of the same boxes glued together:
Finally, I added a backup tracker in case the main one failed for any reason, and to provide a programmed cutdown to prevent the flight drifting too far:
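A programmed cutdown like this is usually just a resistor or nichrome wire across the cord, switched from a GPIO pin when the tracker decides the flight has gone far enough. Here’s a minimal sketch of that logic, assuming a cutdown fired from a GPIO pin; the pin number, longitude limit and burn time are illustrative values, not the ones actually flown.

```python
#!/usr/bin/env python3
# Minimal sketch of a programmed cutdown check, assuming a resistor/nichrome
# cutdown switched by a MOSFET on a GPIO pin, and a GPS fix supplied by the
# tracker code. Pin number, limits and burn time are illustrative only.

import time
import RPi.GPIO as GPIO

CUTDOWN_PIN = 17            # hypothetical GPIO driving the cutdown MOSFET
MAX_LONGITUDE = 0.5         # fire before drifting further east than this (degrees)
MAX_FLIGHT_SECONDS = 3 * 3600
BURN_SECONDS = 5

GPIO.setmode(GPIO.BCM)
GPIO.setup(CUTDOWN_PIN, GPIO.OUT, initial=GPIO.LOW)

def fire_cutdown():
    """Power the cutdown resistor long enough to melt the cord, then switch off."""
    GPIO.output(CUTDOWN_PIN, GPIO.HIGH)
    time.sleep(BURN_SECONDS)
    GPIO.output(CUTDOWN_PIN, GPIO.LOW)

def check_cutdown(longitude, seconds_since_launch):
    """Cut away if the flight has drifted too far east or flown too long."""
    if longitude > MAX_LONGITUDE or seconds_since_launch > MAX_FLIGHT_SECONDS:
        fire_cutdown()
        return True
    return False
```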
There was one last thing to do – my Revell model isn’t identical to the version that NASA flew, and most prominently it was missing a camera. That was easily fixed with some foam polystyrene and a plastic cap!
Bruce Junior was now ready for flight!
The Flight
The flight predictions for my chosen date were not ideal, but quite good for the time of year. Initially I was going to have help from another HABber but he couldn’t make it that day, so I launched alone. To make that task easier I removed some items from the flight, allowing for a smaller balloon and less gas. I also chose to launch later in the day than planned, which meant I didn’t need to overfill the balloon so much to keep it away from the sea. Here’s the predicted flight path:
Less gas means less lift, which makes it easier to tie off the balloon and handle it afterwards. Aside from the cold, it was a very nice day to launch – fairly clear skies and not too much ground wind. Here’s the partially-inflated balloon:
Meanwhile the payload cameras were recording and transmitting to the other HAB enthusiasts online:
With the balloon fully inflated, tied off and tied to the parachute and payloads, it was time to launch:
I then finished getting aerials set up for the flight, finished filling the car with kit, and then set off on the chase. I knew that the flight was going to land some time before I arrived, so I wasn’t in as much of a rush as usual. Meanwhile the payload continued to rise till, at over 30km and just under 100,000 feet, the balloon burst. Here’s what happens to an aerodynamically asymmetrical payload when a high altitude balloon bursts and gravity takes over!
The flight computer includes its own landing prediction which, as I’ve seen every time before, is more accurate than the one that the online map uses. Here “X” is the last prediction from the tracker, with “O” being the actual landing spot and the red line showing the online prediction:
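For the curious, an onboard prediction can be fairly simple: during ascent the tracker records how far the wind pushes it within each altitude band, and on the way down it replays that drift for the time it expects to spend falling through each band. Here’s a rough sketch of that idea; the layer size, descent model and data structures are illustrative, not the actual flight-computer code.

```python
# Rough sketch of a simple onboard landing prediction: learn the horizontal
# drift per altitude layer on the way up, then replay it on the way down.
# Layer size and descent model are illustrative assumptions.

import math

LAYER = 500.0          # metres per altitude layer
SEA_LEVEL_RATE = 5.0   # expected descent rate at ground level, m/s

# drift[layer_index] = (sum of dlat/sec, sum of dlon/sec, sample count)
drift = {}

def record_ascent(lat1, lon1, lat2, lon2, alt, dt):
    """Accumulate the drift rate seen in the layer this ascent segment crossed."""
    i = int(alt // LAYER)
    dlat, dlon, n = drift.get(i, (0.0, 0.0, 0))
    drift[i] = (dlat + (lat2 - lat1) / dt, dlon + (lon2 - lon1) / dt, n + 1)

def descent_rate(alt):
    """Parachute descent is faster in thin air; crude density-based scaling."""
    return SEA_LEVEL_RATE * math.exp(alt / 16000.0)

def predict_landing(lat, lon, alt):
    """Walk down through the layers, applying the drift learned on the way up."""
    while alt > 0:
        i = int(alt // LAYER)
        step = min(LAYER, alt)
        seconds = step / descent_rate(alt)        # time spent in this layer
        dlat, dlon, n = drift.get(i, (0.0, 0.0, 1))
        lat += (dlat / max(n, 1)) * seconds
        lon += (dlon / max(n, 1)) * seconds
        alt -= step
    return lat, lon
```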
That last position on the map was received by my chase car, which was still on the M5 and over an hour away at the time! Here’s what happened when I was trying to catch up:
Normally when I get close to a landed balloon the radio signal reappears and I can get the landing position easily. This time though, the landing was on a farm behind some metal cowsheds, which blocked the signal from the nearby roads. After driving past where I thought the signal should reappear, I found a hill, connected a Yagi aerial to my handheld receiver and got a position that way. Following that target, I still didn’t regain the position until I got to the farm, where I could see a row of sheds between me and the landing position. It’s always fun explaining to farmers why I’ve suddenly turned up, and this time one of them had actually seen it land. Retrieval was easy, though muddy and rather smelly …
Here you can see the balsa-wood frame (for lightness and deliberate fragility) with pivoting support (again, to help prevent damage to whatever it lands on), with the balsa painted matt black so it disappears against the black sky at altitude.
The Payback
The point of all this effort was to replicate, as closely as I could, those original NASA images, so once home I went through the camera footage to select these …
Great launch. Looks like a lot of fun.
One question I have had about some of your payloads is why you have a stick sticking away from the body for the antennae.
Why not have it directly below?
I often have multiple radios, requiring multiple aerials, and it’s better to space those out so they don’t detune each other.
Did you consider using one of the third-party Raspberry Pi camera multiplexer boards? I’ve heard they do work and can use up to 4 cameras on one Pi.
I did. I saw some comments about them being unreliable. Also I wanted to have different cameras in different places, which means long camera cables, which would also possibly make it less reliable and can increase radio interference to GPS and my radio uplink. WiFi seemed a better bet (and was completely reliable in testing and the flight).
Hi Dave.
I’m planning to implement my own LoRa and SSDV based HAB tracker. It’s a shame that RFM9x’s maximum LoRa packet size is 255 bytes and SSDV packet without FEC is 256 bytes. How are you handling this? And what software are you using for decoding SSDV packets?
One possibility is to send normal packets with FEC and simply ignore the last byte, since it’s used only for error correction. But then I’m sending unnecessary data, because LoRa already uses FEC of its own.
I skip the first byte of each SSDV packet and then put it back in at the receiving end. The first 2 bytes of an SSDV packet are constant.
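To make that concrete, here’s a minimal sketch of the packing and unpacking, assuming the standard SSDV sync byte of 0x55 as the dropped first byte; the actual RFM9x send/receive calls are left to whichever radio driver is in use.

```python
# Sketch of dropping the constant SSDV sync byte so a 256-byte SSDV packet
# fits in a single 255-byte LoRa frame, then restoring it at the receiver
# before the packet is passed to the SSDV decoder. Radio driver calls omitted.

SSDV_SYNC = 0x55          # constant first byte of every SSDV packet
SSDV_PACKET_SIZE = 256

def packet_for_lora(ssdv_packet: bytes) -> bytes:
    """Drop the sync byte so the packet fits in one 255-byte LoRa frame."""
    assert len(ssdv_packet) == SSDV_PACKET_SIZE
    assert ssdv_packet[0] == SSDV_SYNC
    return ssdv_packet[1:]

def packet_from_lora(lora_payload: bytes) -> bytes:
    """Re-insert the sync byte so a standard SSDV decoder accepts the packet."""
    assert len(lora_payload) == SSDV_PACKET_SIZE - 1
    return bytes([SSDV_SYNC]) + lora_payload
```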