2015 Science Results

Here are the results from the ISS. Follow the ISS Downloads link under each experiment to access the data downloaded from space!

Crew Detector

Team name: Cranmere Code Club
Key Stage: 2
Teacher: Richard Hayler

ISS Downloads

Takes photographs of the crew, using the humidity sensor as the trigger for detecting their presence.

The program has lots of cute animations which play back to back, both as the intro and also individually when a successful detection occurs. When this happens, the program asks “Are you there?” and the crew can press any button to confirm their presence. A picture is taken whether or not the crew confirm, though.
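
To illustrate the trigger idea, here's a minimal sketch of a humidity-triggered camera, assuming a made-up THRESHOLD value and much-simplified confirmation logic; it is not the club's actual code:

```python
# A humidity-triggered camera sketch. THRESHOLD is hypothetical and
# would need tuning against the logged baseline humidity on the ISS.
import time
from picamera import PiCamera
from sense_hat import SenseHat

THRESHOLD = 2.0  # assumed rise in %RH that counts as a detection

sense = SenseHat()
camera = PiCamera()
baseline = sense.get_humidity()

while True:
    humidity = sense.get_humidity()
    if humidity - baseline > THRESHOLD:
        sense.show_message("Are you there?")              # invite confirmation
        camera.capture("crew_%d.jpg" % int(time.time()))  # photo taken regardless
        baseline = sense.get_humidity()                   # re-arm the trigger
    time.sleep(1)
```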

[Image iss046e050739: Tim manually triggered a detection by breathing on the Astro Pi. Image credit ESA.]

Eleven pictures were taken over the course of seven days. Nine of these show Tim Peake, one shows Russian cosmonaut Mikhail Kornienko, and one was empty. The results consist of the images and three log files, which everyone is welcome to analyse. The logs show that each spike in humidity successfully resulted in a picture.

In early March 2016, the students of Cranmere Code Club presented their results to a packed lecture theatre at the 4th Raspberry Pi Birthday event in Cambridge, UK.

SpaceCRAFT

Team name: Hannah Belshaw
Key Stage: 2
Teacher: Peter Kelly

ISS Downloads

Logs a variety of sensor measurements of the ISS interior to a CSV file, with the intention of using the popular computer game Minecraft back on Earth as a data visualisation tool.

Over 60,000 rows of sensor measurements were collected, making a CSV file of about 26MB in size. This file can be loaded up and played back in Minecraft on any Raspberry Pi (you don’t need the Sense HAT). Martin O’Hanlon even produced a new educational resource to help people get started, called Exploring Space with Minecraft. Hannah’s father also wrote his own CSV analysis tool which can be found on GitHub.
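
The logging side of an experiment like this can be sketched in a few lines with the Sense HAT API; the column names below are illustrative, and the real CSV contains many more fields:

```python
# Log Sense HAT readings to CSV for later playback in Minecraft.
import csv
import time
from datetime import datetime
from sense_hat import SenseHat

sense = SenseHat()

with open("spacecraft_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "temperature", "pressure", "humidity"])
    while True:
        writer.writerow([
            datetime.utcnow().isoformat(),
            round(sense.get_temperature(), 2),
            round(sense.get_pressure(), 2),
            round(sense.get_humidity(), 2),
        ])
        f.flush()       # flush each row so a power cut loses little data
        time.sleep(10)  # roughly 60,000 rows over a week at this rate
```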

[Image spacecraft5: The columns are gauges for temperature, pressure, and humidity.]

[Image spacecraft2: The blocky model moves and rotates to show how the real ISS was moving.]

[Image spacecraft3: The rocket lifts off, and the timestamp from the data is written in the air.]

Hannah was also invited to appear alongside Tim Peake on The One Show, aired on the 6th of November 2015, where she got to wear an actual space suit (see 8:40 onwards).

[Video: Tim Peake on The One Show, 06/11/2015 19:00]

Tim Peake was a guest on BBC1's The One Show on Friday the 6th of November, following his final pre-flight UK press conference. He talked about the Principia mission and the Raspberry Pis he was taking with him, and was joined by guest star Hannah Belshaw, one of the Astro Pi competition winners!

Flags

Team name: Space-Byrds
Key Stage: 3
Teacher: Dan Aldred

Tracks the location of the ISS and works out which country it's above, then shows that country's flag on the LED matrix along with a short phrase in the local language.

[Image iss046e042740: Astro Pi IR (Izzy) mounted on the Node 2 nadir hatch window.]

The experiment uses two lines of stored ephemeris telemetry, known as a two-line element set or TLE (numbers defining the path and speed of a space object), and extrapolates the path to work out where the ISS should be at the time given by the Astro Pi's hardware clock. It then shows a flag on the LED matrix if the ISS is above land; otherwise, it shows a twinkling blue-green pattern for the sea.
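
Here's a hedged sketch of the TLE approach using the pyephem library; the TLE below is the widely used historical sample from 2008, so, just like the September 2015 telemetry, it would give stale positions today:

```python
# Extrapolate the ISS position from a stored two-line element set (TLE)
# using pyephem. This sample TLE is deliberately old and illustrative.
import math
from datetime import datetime

import ephem

NAME = "ISS (ZARYA)"
LINE1 = "1 25544U 98067A   08264.51782528 -.00002182  00000-0 -11606-4 0  2927"
LINE2 = "2 25544  51.6416 247.4627 0006703 130.5360 325.0288 15.72125391563537"

iss = ephem.readtle(NAME, LINE1, LINE2)
iss.compute(datetime.utcnow())  # propagate the orbit to the clock's "now"

lat = math.degrees(iss.sublat)   # latitude of the point below the ISS
lon = math.degrees(iss.sublong)  # longitude of the point below the ISS
print("ISS is above %.2f N, %.2f E" % (lat, lon))

# country_for(lat, lon) would be a lookup from coordinates to a country
# (None for sea); it is hypothetical here, but the real program uses that
# result to choose either a flag for the LED matrix or the sea pattern.
```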

Initially, the stored telemetry (from September 2015, when we shipped the flight units to ESA) was out of date, so the displayed flag didn't match the location of the ISS. This happens because the ISS continually makes slight course corrections to change its orbital height or avoid pieces of space debris. Over time, these changes add up, and stored telemetry from any particular date drifts out of sync with the current telemetry.

Tim manually updated the telemetry in his spare time, and afterwards confirmed that the correct flag was showing when he looked out of the window. He also reported enjoying all the local phrases for the different countries. There are no downloads for this experiment because it was display-only and didn't produce any files for return to ground.

[Private video on Vimeo]

Watchdog

Team name: Kieran Wand
Key Stage: 3
Teacher: Christopher Butcher

ISS Downloads

Continually measures the temperature, pressure, and humidity, and displays them in a cycling, split-screen display. It raises visual alarms if these readings deviate from acceptable ranges.

[Video: ARISS contact with Gesamtschule Leverkusen Schlebusch, Leverkusen, Germany, 29/02/2016 12:02. Uploaded by Daniel Cussen on 2016-03-01.]

Kieran's experiment was running during an ARISS amateur (ham) radio session with a school in Germany, and can clearly be seen in the bottom right of the video above (look for the green, red, and blue flashing).

Over 17,000 rows of measurements were recorded into two CSV files, which you can now examine. The experiment also tackles a design issue with the Sense HAT: thermal transfer from the Pi's CPU warms the two temperature sensors, so they always read a few degrees above ambient.

Kieran solved this by using the nominal difference between CPU temperature and background ambient temperature to calibrate the Sense HAT sensors. There was a minor issue with the code in flight, though: it was developed on a Raspberry Pi 2, whereas the Astro Pi flight hardware contains a Raspberry Pi B+, whose CPU produces less heat. This meant that the calculated ambient temperature was always out by several degrees.

This issue was spotted before flight, but we didn’t have enough time to correct it before the Astro Pi flight units had to be shipped to ESA for launch. It should, however, be possible to post-process the CSV files to correct for this error and still make use of the data which came back from space.
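
A minimal sketch of the calibration idea is below; the FACTOR constant is an assumed value that has to be tuned empirically per Pi model, which is exactly why code tuned on a Pi 2 drifted on the flight unit's B+:

```python
# CPU-offset calibration sketch for the Sense HAT temperature reading.
# FACTOR is a hypothetical scaling between CPU excess heat and sensor bias.
from sense_hat import SenseHat

FACTOR = 5.5  # assumed value; must be found experimentally per Pi model

def cpu_temperature():
    """Return the CPU temperature in Celsius from the Linux thermal zone."""
    with open("/sys/class/thermal/thermal_zone0/temp") as f:
        return int(f.read()) / 1000.0

sense = SenseHat()
raw = sense.get_temperature()  # reads a few degrees above ambient
ambient = raw - (cpu_temperature() - raw) / FACTOR
print("Estimated ambient temperature: %.1f C" % ambient)
```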

Trees

Team name: EnviroPi
Key Stage: 4
Teacher: Sam Page

ISS Downloads

Takes near-infrared pictures of the ground with the intention of post-processing them, back on Earth, to measure plant health across wide areas of land.

[Image EnviroPi_20160224_145546: Southwest England as seen by Izzy. The image is intentionally flipped in the code as part of the analysis process. Image credit ESA.]

The special thing about Astro Pi IR (Izzy) is that she has the Pi NoIR camera that can see both visible and infrared light at the same time. The experiment aimed to study plant health over wide areas of land by looking at how much infrared light is reflected by green plants.

Scientific explanation

Chlorophyll is the molecule in plants which absorbs sunlight to perform photosynthesis (synthesising carbohydrates from carbon dioxide and water). It strongly absorbs visible light, but strongly reflects infrared light. So by comparing the amounts of visible and infrared light reflected by plants, you can work out how much chlorophyll they have, how much photosynthesis is going on, and how healthy they are.

[Image bluething: the blue filter used with the Pi NoIR camera]

As noted in this post from when the Pi NoIR camera was introduced, a blue filter is needed to exclude red light, so that only infrared light goes into the red RGB channel of the image file. This then allows you to compare the level of visible to infrared light at a later stage.

There is a well-known algorithm that produces a special index called NDVI (Normalised Difference Vegetation Index), which gives a value from +1 for the most chlorophyll to -1 for the least. Forests have a very high value, while places like deserts have a very low one. You then post-process each image, recolouring each pixel based on its NDVI value; here's an example showing the British Isles. It's a technique pioneered by the NASA Landsat missions, and this experiment successfully implements it in Python.
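
As a sketch of the standard calculation (not EnviroPi's exact code), NDVI can be computed with numpy and OpenCV, assuming a blue-filtered image in which the red channel carries infrared and the blue channel carries visible light:

```python
# Standard NDVI from a blue-filtered Pi NoIR image.
import cv2
import numpy as np

img = cv2.imread("EnviroPi_20160224_145546.jpg").astype(np.float64)
b, g, r = cv2.split(img)  # OpenCV orders channels blue, green, red

nir, vis = r, b            # infrared in red channel, visible in blue
total = nir + vis
total[total == 0] = 0.01   # avoid division by zero

ndvi = (nir - vis) / total # -1 (no chlorophyll) .. +1 (dense plants)

# Rescale to 0-255 and save a grayscale map for recolouring later.
out = ((ndvi + 1) / 2 * 255).astype(np.uint8)
cv2.imwrite("ndvi_map.png", out)
```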

A total of 4,301 Earth observation images were recorded. However, the blue lens filter was inadvertently not included in the Astro Pi payload. This was noticed just before the payload was due to be shipped to ESA for launch, and there wasn’t enough time to include it.

ESA staff were really keen for the experiment to go ahead, although the lack of a filter meant that the standard algorithm for calculating NDVI could not be used.

Undeterred, Oliver and Aidan set about working on an alternative vegetation index so that the images could still be used. They enlisted the help of Principia education partner Catherine Fitzsimons, who runs the EO Detective programme. She suggested using a visible-light-based index called GLI (Green Leaf Index), which isn't as good as NDVI but allows some level of analysis to be done.
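
For comparison, here's a hedged sketch of GLI, which uses only the visible channels; note that on unfiltered Pi NoIR images every channel still contains some infrared contamination, so the result is approximate:

```python
# GLI (Green Leaf Index): GLI = (2G - R - B) / (2G + R + B).
import cv2
import numpy as np

img = cv2.imread("EnviroPi_20160224_145546.jpg").astype(np.float64)
b, g, r = cv2.split(img)

total = 2 * g + r + b
total[total == 0] = 0.01       # avoid division by zero
gli = (2 * g - r - b) / total  # +1 greenest .. -1 least green
```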

Geolocation

If you're looking through the ISS downloads for this experiment, here's how to look up the location of each image, with a worked example.

Let’s start with EnviroPi_20160224_145546.jpg.

  • Extract the timestamp from the file name:
    2016-02-24 14:55:46.
  • Subtract 130 seconds, giving:
    2016-02-24 14:53:36.
  • Go to the historical lookup website and input the time with a +0000 suffix, for example 2016-02-24 14:53:36+0000.

When you click the Lookup button, the ISS will move to the location and you can zoom in on the map for a closer look.
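
The timestamp correction is easy to script; this sketch assumes the EnviroPi_YYYYMMDD_HHMMSS.jpg naming seen in the downloads:

```python
# Apply the 130-second correction to a timestamp taken from a file name.
from datetime import datetime, timedelta

filename = "EnviroPi_20160224_145546.jpg"
stamp = datetime.strptime(filename, "EnviroPi_%Y%m%d_%H%M%S.jpg")
corrected = stamp - timedelta(seconds=130)  # empirical offset
print(corrected.strftime("%Y-%m-%d %H:%M:%S+0000"))
# prints: 2016-02-24 14:53:36+0000
```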

We're currently not sure why the 130-second correction is necessary; the Astro Pi hardware clocks were synchronised to GMT before departure. It may be due to a slight discrepancy in how the lookup website calculates the positions.

If anyone would like to write a script that tags the latitude and longitude into the metadata (e.g. the EXIF tags) of each image file, that would be appreciated.

Reaction Games

School: Lincoln UTC
Team name: Team Terminal
Key Stage: 4
Teacher: Mark Hall

ISS Downloads

A suite of puzzles, memory games, and logic-based tests that record response times, with the goal of investigating how crew reaction time changes over the course of a long-term space flight.

[Tweet from Dr Rona Mackenzie: "So proud to announce that @UTCLincoln Y10 students have won the Astro-Pi national competition for KS4 & Best Secondary School. AMAZING!!!"]

[Image "Reaction Games BPW": The information Tim saw about the games on the ISS. Image credit ESA.]

There were eight games in total which Tim could choose from, including:

  • Arrow
    An arrow appears on the LED matrix after an unspecified amount of time, and you press the corresponding joystick direction as quickly as possible. The game measures the time taken for you to react (a minimal sketch of this game follows the list).
  • Caterpillar
    A straightforward implementation of the classic Snake game, where the objective is to eat as many apples as possible and not run into your own tail as it grows. The game records the number of apples eaten before you crash and die.
  • Lights Off
    A logic-based game where the objective is to turn off a random pattern of lights using a crosshair that inverts the state of the lights selected. The game records the time taken to complete the puzzle. We found it quite tricky!
  • Maze
    An escape game where you’re an LED in the centre of the LED matrix, and the layout of the maze moves around you as you manipulate the joystick. It records how long you take to find the exit.
  • Memory
    A colour sequence memory test with increasing levels of difficulty. A sequence of colours is displayed and you have to input that sequence correctly to progress. The game records the level of difficulty you reach.
  • Pong
    An 8×8 version of the classic arcade game. You move the paddle with the joystick and try to keep the ball in play. It measures how long you survive for.
  • Speed
    A game where you’re controlling a vehicle going along an obstacle course. The longer you survive, the faster you go. The game records your survival time.
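
Here's a hedged, minimal recreation of the Arrow game's core loop using the Sense HAT API; the team's real version had proper arrow sprites and score logging, which are omitted here:

```python
# Minimal Arrow reaction-time loop. A single letter stands in for an
# arrow sprite; nothing here is the team's actual code.
import random
import time
from sense_hat import SenseHat

sense = SenseHat()
target = random.choice(["up", "down", "left", "right"])

time.sleep(random.uniform(1, 5))      # unspecified wait before the prompt
sense.show_letter(target[0].upper())  # U, D, L or R instead of an arrow
start = time.time()

while True:
    event = sense.stick.wait_for_event(emptybuffer=True)
    if event.action == "pressed":
        reaction = time.time() - start
        correct = event.direction == target
        print("%s in %.3f s (%s)" %
              (event.direction, reaction, "correct" if correct else "wrong"))
        break

sense.clear()
```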

This experiment was successful. Due to circumstances beyond his control, though, Tim was only able to play two of the games, Arrow and Caterpillar, on a single occasion.

Having only one sample of his reaction time is not enough data to draw any conclusions about how it changed over the course of the mission. It may be possible to get Tim to play the same games now that he's back on Earth, though, and see what the differences are.

Radiation

Team name: Arthur, Alexander, and Kiran
Key Stage: 5
Teacher: Dr Jesse Petersen

ISS Downloads

A radiation detector using the Raspberry Pi Camera Module. Image recognition software counts the specks of light caused by radiation hitting the camera sensor, producing a measurement of the radioactivity occurring.

[Image frame_rate_13107over65536img013: An example of what the camera sees when it's being hit by high-energy radiation.]

High above the Earth, there’s a layer of energetic charged particles trapped by the Earth’s magnetic field. It’s known as the Van Allen radiation belt, and the levels of radiation inside it are hazardous to satellites, spacecraft, and especially to humans.

The South Atlantic Anomaly (SAA) is an area where this radiation belt dips down to an altitude of just 200 kilometres above the Earth's surface, meaning that, at certain times, the ISS has to fly right through it. Fortunately for the crew, the ISS has lots of hard radiation shielding to deal with this, but some types of radiation can still get through.

This experiment aims to measure the level of radiation occurring inside the ISS as it orbits the Earth, and to see if the SAA and other radiation anomalies are visible in the data being collected.

The Astro Pi is not equipped with any special radiation-detecting hardware, but, with a bit of ingenuity and some clever code, these students were able to repurpose the Raspberry Pi Camera Module to act as a Geiger counter.

Scientific explanation

Most digital cameras are based on a CCD sensor behind a Bayer filter; the sensor is basically a flat array of tiny charge receptacles, or capacitors. Each capacitor corresponds to a red, green, or blue element of a pixel in the final image. When light hits the sensor it charges up these capacitors, and the amount of charge held by each is measured to create the pixel data.

[Image: the Bayer pattern on a sensor]

An interesting property of the CCD sensor is that it can also receive charge through kinetic energy transferred by particles of radiation whizzing through. So if you block all light from getting in by covering the camera lens, you’ll be able to see what looks like white noise when energetic radioactivity is occurring. This is shown in the still picture above.

What tends to happen, though, is that the radiation slowly damages the CCD sensor, and especially the Bayer filter. This often results in dead pixels: pixels that always report the same colour, no matter what.

So part of this experiment was to have Tim cover the Astro Pi camera aperture, located in the heat sink on the base, with some opaque tape to stop any light getting in.

When the experiment starts, it first takes a series of four calibration images. If there are any dead pixels from radiation damage, these will show in all four images and from then on those pixels are always ignored.

A picture is taken using a long exposure, to allow some time for the radiation to hit the camera sensor, at which point the image is analysed using a library called OpenCV to count how many specks of light were captured. This is done using an edge detection algorithm, because sometimes the impacts are seen as linear streaks depending on the trajectory of the particle.
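
The capture-and-count step might look something like the sketch below; the exposure settings and thresholds are illustrative, and the four-image dead-pixel calibration is omitted:

```python
# Long-exposure capture with picamera, then OpenCV edge detection to
# count bright specks/streaks. Settings are illustrative; the real
# experiment also masked dead pixels found in the calibration images.
import cv2
from picamera import PiCamera

with PiCamera() as camera:
    camera.framerate = 0.2          # slow frame rate permits...
    camera.shutter_speed = 5000000  # ...a 5-second exposure (microseconds)
    camera.iso = 800                # high gain to amplify faint impacts
    camera.capture("exposure.jpg")

img = cv2.imread("exposure.jpg", cv2.IMREAD_GRAYSCALE)
edges = cv2.Canny(img, 50, 150)     # impacts show up as edges/streaks

# Group connected edge pixels so one streak counts as one event.
n_labels, _ = cv2.connectedComponents(edges)
print("Detected %d candidate radiation events" % (n_labels - 1))
```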

The students took their Raspberry Pi and Camera Module to the Rutherford Appleton Laboratory and borrowed their neutron cannon to test it. The images in this folder were from that round of testing, proving that the code worked before flight.

They didn't have access to one of our special flight cases when they did their testing and, sadly, it seems that the thickness of the aluminium kept out a lot of the radiation when the code ran in space. A total of 140 discrete, seemingly random ionising radiation events were detected by the camera sensor during the one-week experiment. The students are continuing to analyse the data, though, and have found that the plot below raises a number of interesting questions.

[Image pictogram: Radiation results.]

Initially, the calibration routine was subtracting pixels (as it would for damage correction), but this stopped after three days, and there's currently no explanation for why. It may be due to a number of things, including pre-existing damage to the camera, light leaking in past the lens cover, or subsequent radiation damage. The students are continuing to study the results.
