
Digital assistants: a new team member joins manned flights

Space is, almost by definition, the most technological sector there is. While the machines of the early space age, including the first manned capsules, did not even carry on-board computers, the possibilities offered by today’s digital assistants seem limitless.

As discussed in a previous article, the success of the first manned lunar landing in history, Apollo 11 with Neil Armstrong and Buzz Aldrin on July 20, 1969, was a feat that owed more to the piloting skills of the two astronauts than to the precision of the avionics on board the craft. Let’s take a moment to remind ourselves of the performance of the computers on board the lunar module (LM) designed by Grumman.

Developed by the Massachusetts Institute of Technology (MIT) and built by Raytheon, the Apollo Guidance Computer (AGC) had just four kilobytes of random-access memory (and about 72 kilobytes of read-only memory; it had no hard disk). First flown on August 25, 1966, during the Apollo capsule’s AS-202 suborbital test flight, it weighed 32 kilograms, gargantuan by the standards of today’s desktop computers. Two units were installed: one in the Apollo spacecraft and the other in the LM. The AGC, the first digital assistant in the history of space flight, reached saturation during the final descent to the lunar surface.

Two alarms sounded in the small lunar cockpit, “1202” and “1201”, at altitudes of 1,830 meters and 600 meters, respectively. They indicated that the on-board computer was being asked to process more data than it could handle, but the overload nevertheless remained within acceptable limits and the descent continued.

New bases

The conditions in which the Artemis program to return to the Moon is being prepared are quite different from those in the 1960s. If we think about it, we really aren’t far from Stanley Kubrick and Arthur C. Clarke’s vision in 2001: A Space Odyssey, released in 1968. Its view of the conquest of space imagined a 21st century filled with digital tools of all kinds.

In addition to super-powerful computers, we now have digital tools and 3D simulations that show us exactly how a satellite or the inside of a space station will look when finished before it is even physically assembled. In this field, Thales Alenia Space (TAS) has developed a spectacular visualization of the environment of the future Gateway lunar station, whose first components are due to be sent into lunar orbit in 2025.

For the record, TAS is to supply the ESPRIT logistics module and the I-Hab habitation module for humanity’s new outpost. Astronauts use the evolving 3D simulation to check the Gateway’s interior configuration, which helps them prepare for life and work in the pressurized modules they will board during the planned Artemis missions to the Moon.

A breathtaking view

The result gives the impression of being in lunar orbit. The simulation is designed so that users can move around inside the various modules as well as outside, with a breathtaking view of the Moon and Earth. While it is obviously spectacular, augmented reality is not just being used to design the Gateway. It can also be used to present images of satellites in production, showing operators exactly where to place the various components during assembly.

More specifically, the Gateway components in which TAS is involved could benefit from internal home automation assistants currently in development, such as DOMAI (DOMotic, internal, communication system & AI), a wireless communication device, or SIDISSI (Systema Domotico per Stazione Spaziale) for activating station lights and sensors.

Such examples were already part of Kubrick’s vision, the most famous being HAL 9000, the sentient supercomputer that manages all the internal systems of the Discovery spacecraft on its way to Jupiter. “We’re not there yet, but it would certainly be useful in some cases,” jokes Léopold Summerer, head of advanced systems at the European Space Agency (ESA). And while HAL 9000 clearly isn’t coming any time soon, we can already produce something close to it.

A pathfinder

On November 16, 2022, the first version of NASA’s SLS lunar rocket lifted off from the Kennedy Space Center. The objective was to test the first version of the Orion capsule, developed by Lockheed Martin, together with its ESM (European Service Module) propulsion module assembled by Airbus DS.* This first mission of the program went off without a hitch and was extended until December 11. Although there were no passengers, a dummy named “Commander Moonikin Campos”, dressed in a flight suit, was seated in the commander’s seat.

Equipped with sensors, the dummy measured the vibrations and accelerations that astronauts would experience during the critical phases of flight. But “Orion-Artemis-1” also carried a tool on its dashboard that Apollo did not: Callisto. Designed for Lockheed Martin in partnership with Amazon and Cisco (maker of Webex), this interface, based on Alexa, aimed to demonstrate how voice and visual assistant technology can be used in space. Such technology is now widespread in everyday life, with voice assistants like Siri built into most personal computers and phones.

This kind of device could well become an important tool for astronauts on future human missions to the Moon, or even the farthest reaches of outer space. Several videos on the Lockheed Martin website show how voice activation works with the interface installed on the Orion spacecraft’s dashboard while on the way to the Moon more than 200,000 km from Earth. This cockpit system combines space-quality equipment with Amazon’s audio and acoustic processing software.

The manned Artemis-2 mission, scheduled for late 2024, is expected to feature a similar interface. According to Lockheed Martin, the future custom-designed Alexa hardware and software will be able to access on-board telemetry in real time to answer mission-specific questions. On its website, the company states that the interface will be able to answer thousands of such questions, for example: “Alexa, how fast is Orion traveling? What is the temperature in the cabin?”
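For readers curious how such a telemetry-backed assistant works in principle, the basic pattern is simple: recognized phrases in a question are mapped to live telemetry fields. The sketch below is purely illustrative, with hypothetical field names and made-up values; it does not reflect Callisto’s actual implementation.

```python
# Illustrative sketch only: mapping spoken questions to telemetry fields.
# The field names and values below are invented, not Callisto's real API.

TELEMETRY = {
    "velocity_kmh": 39000,   # spacecraft speed in km/h (made-up value)
    "cabin_temp_c": 22.5,    # cabin temperature in °C (made-up value)
}

# Each keyword is paired with a responder that formats a telemetry value.
INTENTS = {
    "how fast": lambda t: f"Orion is traveling at {t['velocity_kmh']} km/h.",
    "temperature": lambda t: f"The cabin temperature is {t['cabin_temp_c']} °C.",
}

def answer(question: str, telemetry: dict) -> str:
    """Return a spoken-style answer for the first matching intent."""
    q = question.lower()
    for keyword, responder in INTENTS.items():
        if keyword in q:
            return responder(telemetry)
    return "Sorry, I don't have that information."
```

A real system would, of course, use trained intent recognition rather than keyword matching, but the principle of resolving a question against live telemetry is the same.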

Ultimately, this payload could control connected devices in the spacecraft and even its orientation in space. But how far can it go? “This is a difficult question to answer, because in space exploration, the answer will not necessarily be the same as in other sectors, such as medicine,” says Léopold Summerer. One thing is certain, though: AI’s possibilities never stop surprising us.

*Although Airbus is heavily involved in manned flight, it did not respond to our numerous inquiries about digital assistance in this area.
