This is to commemorate the very first untethered spacewalk by Bruce McCandless on 7th February 1984, as part of Space Shuttle mission STS-41B, when he used an MMU (Manned Maneuvering Unit) to fly up to 300′ away from the Space Shuttle Challenger, and to replicate as best I can the famous photograph of him floating in space …
Sadly, Mr McCandless died late last year.
Of course the “untethered” aspect can’t be repeated under a balloon, gravity being what it is, but careful use of black supports against a black sky will make it look untethered.
Fortunately, Revell USA made a combined astronaut and MMU kit in 1984. Unfortunately, they soon stopped making it, and examples are expensive and fairly rare. I watched listings on eBay for a few months, but all were in the USA with expensive postage, until one popped up in the UK. Not only was this the least expensive I’d seen, with reasonable postage, but it was also a completely original sample with the parts still in sealed plastic wrappers.
It’s probably 45 years or more since I assembled a plastic kit, so I had to buy the glue, paints and brushes before I started. Assembly wasn’t difficult though the plastic in general was much thinner than the small Airfix kits that I remember from my childhood.
The flight will include 2 LoRa downlinks – one in the 868MHz ISM band (more bandwidth, so better for larger images) and one in the 434MHz ISM band (better range).
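As a sketch of what that dual-downlink setup looks like, the PITS software is configured through a settings file; the key names and mode numbers below are from memory rather than the actual flight configuration, so check them against the PITS documentation:

```
# Hypothetical excerpt from the PITS LoRa settings - key names, frequencies
# and mode numbers are illustrative only; see the PITS documentation
LORA_Frequency_0=434.250   # 434MHz channel: slower, more robust, better range
LORA_Mode_0=1              # a slow bandwidth/spreading-factor preset
LORA_Frequency_1=869.850   # 868MHz channel: more bandwidth for larger images
LORA_Mode_1=3              # a faster preset to use that bandwidth
```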
I want to be able to take photographs from different viewpoints, ideally:
- Straight shot from distance
- Side shot
One option would be to move a single camera around with motors, but that would be delicate and likely to fail during the flight. Instead I’ve opted for 3 separate cameras. This could possibly be done from a single Pi using USB webcams, but in my testing those aren’t as reliable as a Pi camera, and the image quality isn’t as good either.
Another idea is to use a separate Pi for each camera. I could then build 3 separate trackers, but for them all to use the 868MHz band they would need to take turns transmitting. All doable but a bit messy, and there would be a lot of aerials!
So instead, I decided to have one central Pi with a camera, GPS and radio transmitters, plus 2 extra Pi boards just with cameras. Networking 3 Pis could be done with a network switch and cabling, but wireless is a lighter option. Recent Pis have built-in wireless networking (which saves a bit more weight, and in my experience is more reliable than a USB WiFi adapter), so I settled on a Pi 3 as the tracker and access point, and 2 Pi Zero W boards as clients. So that’s 3 Pi boards in total and 3 cameras.
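On the client side, joining the Pi Zero Ws to the Pi 3’s network is just the usual wpa_supplicant setup; a minimal sketch, with the SSID and passphrase invented for illustration:

```
# Hypothetical /etc/wpa_supplicant/wpa_supplicant.conf on each Pi Zero W
country=GB
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

network={
    ssid="payload-net"
    psk="not-the-real-one"
}
```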
Setting up a Pi 3 as an access point takes quite a few steps, especially when bridging the wireless LAN to the wired LAN, but there are clear instructions on the RPi web site.
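For reference, the heart of the access-point side is a hostapd configuration along these lines (SSID, channel and passphrase are placeholders, and the bridging of wlan0 into br0 is a separate step covered by the RPi instructions):

```
# Hypothetical /etc/hostapd/hostapd.conf for the Pi 3 access point
interface=wlan0
bridge=br0
ssid=payload-net
hw_mode=g
channel=7
wpa=2
wpa_passphrase=not-the-real-one
wpa_key_mgmt=WPA-PSK
rsn_pairwise=CCMP
```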
I needed to modify my PITS software to cope with 3 cameras. Normally the tracker program (the one transmitting image packets) requests new images periodically according to the schedule in the configuration file, then chooses the “best” image and requests a conversion from JPG to SSDV format shortly before it finishes transmitting the current image; a separate bash script actually takes the photos and does the conversions. One option was to modify this code to request 3 photos instead of 1, with 2 of those being taken on the Pi Zeros, and of course to cycle between the cameras for conversion and transmission.
I felt it would be simpler to remove some of this responsibility from the tracker, so that it just chooses which photo to send, choosing from a different camera each time. So all we need is a simple script to take photographs, and another simple script to do the conversion to SSDV. The first of these runs on every Pi, with the Pi Zero scripts also copying their photo files across to the Pi 3.
Here’s the SSDV page showing the result, on the 434MHz channel (smaller images), with the Pi 3 cycling through all 3 cameras:
Weather permitting, I’ll fly this on February 7th, 34 years to the day after the original flight.