Mike Salter, STFC RAL Space
Developing and testing two cameras that will stream unprecedented images and video footage of planet Earth from space. The cameras are bolted to a prepared rig on the Zvezda module of the International Space Station (ISS). The ISS orbits the Earth at an altitude of around 400 km, circling the globe 16 times a day. The cameras will capture video and imagery below the Station’s orbit, where approximately 90% of the world’s population lives. The medium-resolution camera provides static imagery at a resolution of 5 m per pixel, covering a swath of some 50 km, whereas the high-resolution video camera (HRC) will provide a ground resolution of 1 m per pixel, which could allow observations of large crowds and moving vehicles. The objective is to give everyone the chance to see, in near-real time, an astronaut’s view of our planet by broadcasting the footage over the Internet via a commercial website. With live video tracking from above, we hope to unlock many new applications, such as providing moving pictures of major events, aiding agricultural efforts, and providing relief to regions of the Earth hit by natural disasters.
Using LabVIEW and the NI PXI platform to develop and test the cameras. FlexRIO FPGA technology provided the means to retrieve and reconstruct image data in real time from sensors inside the cameras.
RAL Space is a department at the Science and Technology Facilities Council’s (STFC) Rutherford Appleton Laboratory in Oxfordshire. The department, which works in space research and technology development, has been involved in more than 200 space missions and is renowned for designing, building, and testing satellite instrumentation. A commercial partner approached RAL Space to develop two cameras to attach to the International Space Station.
For space missions, dedicated electrical ground support equipment (EGSE) is used to test each part of the system in isolation. This mattered here because, whilst the HRC electronics were designed and built at RAL Space, the other parts of the system that the cameras would eventually interface with were being developed elsewhere.
Our goal was to develop the HRCs over a very short timeframe, so it was imperative that the EGSE was not only ready and available in time, but also did not divert too much engineering attention away from the flight hardware design.
The HRC electronics output the raw video data over a bespoke parallel interface at a total rate of more than 800 Mbps. The EGSE system needed to capture and record this raw data stream so that engineers could later analyse the images.
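For context, a quick back-of-the-envelope calculation (in Python, with a purely illustrative 60 s recording length) shows what that rate implies for the recording hardware:

```python
# Rough sizing of the capture requirement (figures for illustration only).
data_rate_mbps = 800                    # raw camera output, megabits per second
data_rate_mb_s = data_rate_mbps / 8     # = 100 MB/s sustained write throughput

recording_s = 60                        # a hypothetical one-minute test capture
recording_gb = data_rate_mb_s * recording_s / 1000   # ~6 GB of raw data
print(f"{data_rate_mb_s:.0f} MB/s sustained, ~{recording_gb:.0f} GB per minute")
```

Sustaining roughly 100 MB/s to disk is what drove the choice of hardware described below.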
The data from the camera is presented as a set of parallel differential pairs containing the pixel data and additional flags to indicate the beginning and end of lines within the image.
A FlexRIO PXI Express FPGA module, combined with an NI 6585 adapter module, captured the data, and we used the LabVIEW FPGA Module to process part of the incoming stream in real time. The FPGA code watched for the flags that delimit where image frames start and end, split the continuous stream into individual images, and discarded the meaningless data in between. All valid pixel values were placed into a direct memory access (DMA) first-in first-out (FIFO) buffer and transferred over the backplane of a PXIe-1082 chassis to the host virtual instrument (VI) running on a PXIe-8133 embedded controller. A second FIFO reported to the host VI the number of valid pixels contained in each image.
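The FPGA logic itself is a LabVIEW block diagram, but its frame-splitting behaviour can be modelled in a few lines of Python. In this sketch the flag bits and the (flags, pixel) word layout are illustrative assumptions, not the camera’s actual interface definition:

```python
# Software model of the FPGA frame-splitting logic (the real implementation
# is a LabVIEW FPGA block diagram). FRAME_START/FRAME_END flag bits and the
# (flags, pixel) word layout are illustrative assumptions.
FRAME_START, FRAME_END = 0x1, 0x2

def split_frames(stream):
    """Yield one list of pixels per complete frame; drop inter-frame filler."""
    frame, in_frame = [], False
    for flags, pixel in stream:            # each word: (flag bits, pixel value)
        if flags & FRAME_START:
            frame, in_frame = [], True     # start accumulating a new frame
        if in_frame:
            frame.append(pixel)
        if in_frame and flags & FRAME_END:
            yield frame                    # a complete image: to the DMA FIFO
            in_frame = False
        # words arriving outside a frame are meaningless padding: discarded
```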
The host VI retrieved each set of image data from the DMA FIFO and saved it to an NI 8260 high-speed data storage module (solid-state drive). The host VI also presented a graphical user interface through which the user could specify file paths and choose whether to save the complete raw data stream or to split valid images into individual files. This proved beneficial during testing, as engineers could simply command the EGSE to capture a single frame rather than recording a section of the data stream and manually extracting the individual image later. Core functionality, such as file path selection, was implemented very quickly in LabVIEW, so we could focus our efforts on the application-specific challenges.
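As a rough model of what the host VI does on each iteration, here is a minimal Python sketch, assuming hypothetical data_fifo and count_fifo objects in place of the real DMA FIFO reads, and 8-bit pixels for the file write:

```python
import pathlib

def host_loop(data_fifo, count_fifo, out_dir, split_into_frames=True):
    """Model of the host VI's capture loop (the real code is a LabVIEW VI).

    data_fifo and count_fifo are hypothetical stand-ins for the two DMA
    FIFOs: count_fifo.read() returns the pixel count of the next image and
    data_fifo.read(n) returns n pixel values. 8-bit pixels are assumed.
    """
    out_dir = pathlib.Path(out_dir)
    frame_idx = 0
    while True:
        n_pixels = count_fifo.read()        # how many valid pixels follow
        pixels = data_fifo.read(n_pixels)   # pull one image off the backplane
        if split_into_frames:
            # one file per image: lets engineers grab a single frame on demand
            path = out_dir / f"frame_{frame_idx:06d}.raw"
        else:
            path = out_dir / "stream.raw"   # everything appended to one file
        with open(path, "ab") as f:
            f.write(bytes(pixels))
        frame_idx += 1
```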
We used the LabVIEW Vision Development Module to implement features such as debayering and displaying the image data in real time, which gave immediate visual confirmation that the hardware was operating correctly.
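In LabVIEW those functions come straight from the Vision Development Module; as a rough equivalent, the preview step might look like the following Python/OpenCV sketch, assuming an 8-bit sensor with an RGGB Bayer layout (the real sensor’s bit depth and colour filter pattern may differ):

```python
import numpy as np
import cv2   # OpenCV stands in for the LabVIEW Vision Development Module

def show_frame(raw_pixels, width, height):
    """Debayer one raw frame and display it, mimicking the live preview."""
    mosaic = np.asarray(raw_pixels, dtype=np.uint8).reshape(height, width)
    colour = cv2.cvtColor(mosaic, cv2.COLOR_BayerRG2BGR)  # demosaic to BGR
    cv2.imshow("HRC preview", colour)
    cv2.waitKey(1)   # refresh the window without blocking the capture loop
```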
Having invested in other PXI modular instruments, we quickly found situations in which we could use them to solve problems. For example, the camera included a printed circuit board (PCB) that crossed over signals between multiple connectors. We were required to qualify this design by putting the box through dozens of thermal cycles whilst monitoring signal connectivity on the PCB. Using LabVIEW with a PXI digital multimeter (DMM) and a PXI switch module, we quickly automated this testing, so connectivity was checked throughout qualification rather than manually at the end.
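A sketch of that automation is shown below, with the switch and DMM driver calls replaced by hypothetical stand-in callables and a made-up net list; it is intended only to illustrate the sweep that ran during each thermal cycle:

```python
# Illustrative net list: each PCB net mapped to a pair of switch-matrix routes.
NETS = {"net_01": ("row0", "col3"), "net_02": ("row1", "col5")}
OPEN_THRESHOLD_OHMS = 10.0    # above this, treat the trace as broken

def check_connectivity(switch_connect, switch_disconnect, dmm_read_ohms):
    """One connectivity sweep over every net; run repeatedly during cycling.

    The three callables are hypothetical stand-ins for the PXI switch and
    DMM driver calls; they are not real NI API signatures.
    """
    failures = []
    for net, (a, b) in NETS.items():
        switch_connect(a, b)       # route the net through the switch matrix
        ohms = dmm_read_ohms()     # two-wire resistance via the PXI DMM
        switch_disconnect(a, b)
        if ohms > OPEN_THRESHOLD_OHMS:
            failures.append((net, ohms))
    return failures
```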
The combination of LabVIEW software with FlexRIO FPGA technology provided the throughput the test hardware needed to capture images at the required rates and to decide in real time what to keep and what to discard. By adopting the NI platform, we completed the test development work more rapidly than if we had built bespoke validation and verification equipment. We were also able to adapt quickly to changing requirements and to accommodate more than we initially set out to test.
It was important that we kept to the scheduled launch date for the space mission and ensured that the camera technology was available to our commercial partner on time. We were pleased to complete camera development for this programme in only two years; the NI platform helped the engineering team reduce the time and effort needed to deliver these results.
Mike Salter
STFC RAL Space
STFC RAL Space, STFC Rutherford Appleton Laboratory
Didcot OX11 0QX
mike.salter@stfc.ac.uk