The program has lots of cute animations which play back to back, both as the intro and also individually when a successful detection occurs. When this happens, the program asks “Are you there?” and the crew can press any button to confirm their presence. A picture is taken whether or not the crew confirm, though.
Eleven pictures were taken over the course of seven days. Nine of these show Tim Peake, one shows Russian cosmonaut Mikhail Kornienko, and one was empty. The results consist of the images and three log files, which everyone is welcome to analyse. The logs show that each spike in humidity successfully triggered a picture.
In early March 2016, the students of Cranmere Code Club presented their results to a packed lecture theatre at the 4th Raspberry Pi Birthday event in Cambridge, UK.
Over 60,000 rows of sensor measurements were collected, making a CSV file of about 26MB in size. This file can be loaded up and played back in Minecraft on any Raspberry Pi (you don’t need the Sense HAT). Martin O’Hanlon even produced a new educational resource to help people get started, called Exploring Space with Minecraft. Hannah’s father also wrote his own CSV analysis tool which can be found on GitHub.
Hannah was also invited to appear alongside Tim Peake on The One Show, aired on the 6th of November 2015, where she got to wear an actual space suit (see 8:40 onwards).
Tim Peake was a guest on BBC1’s The One Show on Friday the 6th of November, following on from his final pre-flight UK press conference. He talked about the Principia mission, the Raspberry Pis he’s taking with him and featured guest star Hannah Belshaw, one of the Astro Pi competition winners!
Tracks the location of the ISS and works out what country it’s above, then shows its flag on the LED matrix along with a short phrase in the local language.
The experiment uses two lines of stored Ephemeris telemetry (the path and speed of a space object defined by numbers), and extrapolates the path to get where the ISS should be at the time of the hardware clock on the Astro Pi. It then shows a flag on the LED matrix if the ISS is above land. Otherwise, it shows a twinkling blue-green pattern for the sea.
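The flag-or-sea decision can be sketched in a few lines of Python. To be clear, this is not the flight code: `iss_position` and `country_at` are hypothetical stubs standing in for the real ephemeris extrapolation and land lookup, and the phrase table entries are invented for illustration.

```python
# A minimal sketch of the display decision, not the flight code.
# `iss_position` and `country_at` are hypothetical stubs standing in for
# the real telemetry extrapolation and reverse-geocode steps.
FLAG_PHRASES = {"FR": "Bonjour!", "DE": "Hallo!"}  # illustrative entries only

def iss_position():
    """Stub: the real code extrapolated stored Ephemeris telemetry using
    the Astro Pi's hardware clock to get (latitude, longitude)."""
    return 48.8, 2.3

def country_at(lat, lon):
    """Stub reverse-geocode: a country code over land, None over the sea."""
    return "FR" if lat > 45 else None

def display_for(lat, lon):
    country = country_at(lat, lon)
    if country in FLAG_PHRASES:
        return ("flag", country, FLAG_PHRASES[country])
    return ("sea",)  # the real program shows a twinkling blue-green pattern

print(display_for(*iss_position()))   # flag plus local phrase over land
print(display_for(-30.0, -40.0))      # mid-ocean: sea pattern
```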
Initially, the stored telemetry (from September 2015, when we shipped the flight units to ESA) was out of date, so the flag didn't match the actual location of the ISS. This happens because the ISS continually makes slight course corrections to change orbital height or to avoid pieces of space debris; over time these changes add up, and stored telemetry from any particular date drifts out of sync with the current telemetry.
Tim manually updated it in his spare time, and afterwards confirmed it was showing the correct flag when he looked out through the window. He also reported enjoying all the local phrases for the different countries. There are no downloads for this experiment because it was display only, and didn’t produce any files for return to ground.
Kieran’s experiment was running during an ARISS amateur (ham) radio session with a school in Germany, and can clearly be seen in the bottom-right of the video above (look for the green, red, and blue flashing).
Over 17,000 rows of measurements were recorded into two CSV files which you can now examine. This experiment tackles a design issue with the Sense HAT, which causes thermal transfer from the Pi CPU to the two temperature sensors, meaning the sensors always read a few degrees above ambient.
Kieran solved this by using the nominal difference of CPU temperature above background ambient as a means to calibrate the Sense HAT sensors. There was a minor issue with the code in flight, though: it was developed on a Raspberry Pi 2, whereas the Astro Pi flight hardware contains a Raspberry Pi B+, which produces less heat from its CPU. This means that the calculated ambient temperature was always out by several degrees.
This issue was spotted before flight, but we didn’t have enough time to correct it before the Astro Pi flight units had to be shipped to ESA for launch. It should, however, be possible to post-process the CSV files to correct for this error and still make use of the data which came back from space.
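Such a post-processing correction could look like the sketch below. The `factor` here is a hypothetical tuning constant, not the value Kieran used; as the section above explains, the correct figure differs between a Pi 2 and the flight unit's B+, which is exactly why the in-flight numbers were off.

```python
def corrected_ambient(sensor_temp, cpu_temp, factor=1.5):
    """Estimate ambient temperature from a Sense HAT reading.

    The Pi's CPU heats the HAT's sensors, so a share of the difference
    between CPU and sensor temperature is subtracted back out.
    `factor` is a hypothetical constant for illustration only; the
    correct value depends on the board and enclosure.
    """
    return sensor_temp - (cpu_temp - sensor_temp) / factor

print(corrected_ambient(30.0, 45.0))  # 20.0 with the illustrative factor
```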
The special thing about Astro Pi IR (Izzy) is that she has the Pi NoIR camera that can see both visible and infrared light at the same time. The experiment aimed to study plant health over wide areas of land by looking at how much infrared light is reflected by green plants.
Chlorophyll is the molecule in plants which absorbs sunlight to perform photosynthesis (synthesise carbohydrates from carbon dioxide and water). It strongly absorbs visible light, but strongly reflects infrared light. So by comparing the amount of visible to infrared light reflected by plants, you can work out how much chlorophyll they have, how much photosynthesis is going on, and how healthy they are.
As noted in this post from when the Pi NoIR camera was introduced, a blue filter is needed to exclude red light, so that only infrared light goes into the red RGB channel of the image file. This then allows you to compare the level of visible to infrared light at a later stage.
There is a well-known algorithm that produces a special index called NDVI, which gives a value from +1 (most chlorophyll) down to -1 (least chlorophyll). Forests have a very high value, while places like deserts have a very low one. You then post-process each image to recolour each pixel based on its NDVI value; here's an example showing the British Isles. It's a technique pioneered by the NASA Landsat missions, and this experiment successfully implements it in Python.
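Per pixel, the standard NDVI calculation is just a normalised difference of the two channels. A minimal sketch, assuming channel values are ordinary 0–255 intensities:

```python
def ndvi(nir, visible):
    """Normalised Difference Vegetation Index for one pixel.

    Healthy vegetation reflects far more near-infrared than visible
    light, pushing the value towards +1; bare ground or water gives
    values near or below zero.
    """
    total = nir + visible
    if total == 0:
        return 0.0          # avoid dividing by zero on black pixels
    return (nir - visible) / total

print(ndvi(200, 50))   # strongly vegetated pixel: 0.6
print(ndvi(50, 200))   # barren pixel: -0.6
```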
A total of 4,301 Earth observation images were recorded. However, the blue lens filter was inadvertently not included in the Astro Pi payload. This was noticed just before the payload was due to be shipped to ESA for launch, and there wasn’t enough time to include it.
ESA staff were really keen for the experiment to go ahead, although the lack of a filter meant that the standard algorithm for calculating NDVI could not be used.
Undeterred, Oliver and Aidan set about working on an alternative algorithm for calculating NDVI so that the images could still be used. They enlisted the help of Principia education partner Catherine Fitzsimons, who runs the EO Detective programme. She suggested using a visible light-based index called GLI, which wouldn't be as good as NDVI but would still allow some level of analysis to be done.
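GLI (Green Leaf Index) needs only the visible RGB channels, which is why it works without the missing filter. A sketch of the per-pixel calculation:

```python
def gli(red, green, blue):
    """Green Leaf Index for one pixel: (2G - R - B) / (2G + R + B).

    Like NDVI it runs from -1 to +1, with healthy green vegetation
    towards +1, but it uses only visible light.
    """
    denom = 2 * green + red + blue
    if denom == 0:
        return 0.0
    return (2 * green - red - blue) / denom

print(gli(50, 200, 50))   # a green, vegetated pixel: 0.6
```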
If you’re looking through the ISS downloads for this experiment, here is how to look up where each image was taken, with a worked example.
Let’s start with EnviroPi_20160224_145546.jpg.
- Extract the timestamp from the file name: 2016-02-24 14:55:46.
- Subtract 130 seconds from this time.
- The corrected time is 2016-02-24 14:53:36.
- Go to the historical lookup website and input the time with +0000, for example 2016-02-24 14:53:36+0000.
When you click the Lookup button, the ISS will move to the location and you can zoom in on the map for a closer look.
We’re currently not sure why the 130-second correction is necessary. The Astro Pi hardware clocks were synchronised to GMT before departure. It may be a slight discrepancy in how the lookup website calculates the positions.
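The steps above are easy to script. This sketch parses the file name and applies the 130-second correction to produce the string the lookup site expects:

```python
from datetime import datetime, timedelta, timezone

def lookup_time(filename, offset_seconds=130):
    """Convert an EnviroPi image file name (EnviroPi_YYYYMMDD_HHMMSS.jpg)
    into the timestamp string to paste into the historical lookup site."""
    stamp = filename.rsplit(".", 1)[0].split("_", 1)[1]   # e.g. 20160224_145546
    taken = datetime.strptime(stamp, "%Y%m%d_%H%M%S").replace(tzinfo=timezone.utc)
    corrected = taken - timedelta(seconds=offset_seconds)
    return corrected.strftime("%Y-%m-%d %H:%M:%S%z")

print(lookup_time("EnviroPi_20160224_145546.jpg"))  # 2016-02-24 14:53:36+0000
```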
If anyone would like to write a script to tag the latitude and longitude of the images onto the metadata of each image file, this would be appreciated.
So proud to announce that @UTCLincoln Y10 students have won the Astro-Pi national competition for KS4 & Best Secondary School. AMAZING!!!
There were eight games in total from which Tim could choose.
This experiment was successful. Due to circumstances beyond his control, though, Tim was only able to play two of the games on one occasion. He played Arrow and Caterpillar.
A single sample of his reaction time is not enough data to draw any conclusions about how it changed throughout the mission. It may be possible to get Tim to play the same games now that he's back on Earth, though, and see what the differences are.
Radiation detector using the Raspberry Pi Camera Module. Uses image recognition software to count specks of light caused by radiation; these are seen by the camera sensor and the program produces a measurement of the radioactivity occurring.
High above the Earth, there’s a layer of energetic charged particles trapped by the Earth’s magnetic field. It’s known as the Van Allen radiation belt, and the levels of radiation inside it are hazardous to satellites, spacecraft, and especially to humans.
The South Atlantic Anomaly (SAA) is an area where this radiation belt dips down to an altitude of just 200 kilometres above the Earth's surface, meaning that, at certain times, the ISS has to fly right through it. Fortunately for the crew, the ISS has lots of hard radiation shielding to deal with this, but some types of radiation can still get through.
This experiment aims to measure the level of radiation occurring inside the ISS as it orbits the Earth, and to see if the SAA and other radiation anomalies are visible in the data being collected.
The Astro Pi is not equipped with any special radiation-detecting hardware, but, with a bit of ingenuity and some clever code, these students were able to repurpose the Raspberry Pi Camera Module to act as a Geiger counter.
Most digital cameras are based on a CCD sensor and a Bayer filter, which is basically a flat array of tiny charge receptacles or capacitors. Each capacitor equates to the red, green, and blue elements of the pixels in the final image. When light hits the sensor it charges up these capacitors, and the amount of charge held by each is measured to create the pixel data.
An interesting property of the CCD sensor is that it can also receive charge through kinetic energy transferred by particles of radiation whizzing through. So if you block all light from getting in by covering the camera lens, you’ll be able to see what looks like white noise when energetic radioactivity is occurring. This is shown in the still picture above.
What tends to happen, though, is that the radiation slowly damages the CCD sensor, and especially the Bayer filter. This often results in dead pixels: pixels that always report the same colour, no matter what.
So part of this experiment was to have Tim cover the Astro Pi camera aperture, located in the heat sink on the base, with some opaque tape to stop any light getting in.
When the experiment starts, it first takes a series of four calibration images. If there are any dead pixels from radiation damage, these will show in all four images and from then on those pixels are always ignored.
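The masking idea amounts to a set intersection over the calibration frames. In this sketch, `frames` is an assumed representation (a set of lit pixel coordinates per frame), not the program's actual data structure:

```python
def dead_pixels(frames):
    """Return the pixels lit in every calibration frame.

    With the lens covered, a pixel that is bright in all four frames is
    almost certainly damage rather than a radiation strike, so it can be
    ignored for the rest of the run.
    """
    mask = set(frames[0])
    for frame in frames[1:]:
        mask &= set(frame)
    return mask

frames = [{(1, 1), (2, 2)}, {(1, 1)}, {(1, 1), (3, 3)}, {(1, 1)}]
print(dead_pixels(frames))  # only (1, 1) is lit in all four frames
```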
A picture is taken using a long exposure, to allow some time for the radiation to hit the camera sensor, at which point the image is analysed using a library called OpenCV to count how many specks of light were captured. This is done using an edge detection algorithm, because sometimes the impacts are seen as linear streaks depending on the trajectory of the particle.
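The grouping step matters because a grazing particle leaves a streak of many bright pixels that should count as one event, not dozens. The sketch below is a simplified pure-Python stand-in for the OpenCV pipeline the students actually used: it groups adjacent bright pixels with a flood fill rather than edge detection.

```python
def count_specks(image, threshold=128):
    """Count bright blobs in a 2D greyscale image (a list of rows).

    A simplified stand-in for the OpenCV analysis: threshold the frame,
    then merge adjacent bright pixels into blobs so a streak from a
    grazing particle counts as a single event.
    """
    h, w = len(image), len(image[0])
    seen = set()
    blobs = 0
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and (x, y) not in seen:
                blobs += 1
                stack = [(x, y)]          # flood-fill this blob
                while stack:
                    cx, cy = stack.pop()
                    if (cx, cy) in seen:
                        continue
                    seen.add((cx, cy))
                    for nx, ny in ((cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)):
                        if 0 <= nx < w and 0 <= ny < h and image[ny][nx] >= threshold:
                            stack.append((nx, ny))
    return blobs

frame = [[0, 0, 0, 0],
         [0, 255, 255, 0],   # a two-pixel streak: one event
         [0, 0, 0, 0],
         [200, 0, 0, 0]]     # a lone bright pixel: a second event
print(count_specks(frame))   # 2
```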
The students took their Raspberry Pi and Camera Module to the Rutherford Appleton Laboratory and borrowed their neutron cannon to test it. The images in this folder were from that round of testing, proving that the code worked before flight.
They didn’t have access to one of our special flight cases when they did their testing and, sadly, it seems that the thickness of the case's aluminium kept out a lot of the radiation when the code was run in space. Just 140 discrete, seemingly random ionising radiation events were detected by the camera sensor during the week-long experiment. The students are continuing to analyse the data, though, and have found that the plot below raises a number of interesting questions.
Initially, the calibration routine was subtracting pixels (as it would for damage correction), but this then stopped after three days and there’s currently no explanation as to why this happened. It may be due to a number of things, including preexisting damage to the camera, light leaking in from the lens cover, or subsequent radiation damage. The students are continuing to study the results.