The first American astronauts set to touch down on the Moon this millennium will not blast off for another 700 days, but they can already begin exploring potential landing sites by jacking into a Matrix-style simulation of its cratered South Pole region.
Simultaneous revolutions in space-based imagers and laser scanners, and in virtual reality modelling toolkits, are giving rise to VR doubles of real lunar scenes that are all but indistinguishable from the originals.
“We are building very high resolution, photorealistic digital twins of potential Artemis III landing sites,” says Kip Hodges, founding director of Arizona State University’s School of Earth and Space Exploration, who helped build the school into one of the top American centers for space studies. His team is developing these twins into cutting-edge virtual replicas of the Moon’s ancient plains and impact formations.
“The best available imagery from lunar orbit today is provided by NASA’s Lunar Reconnaissance Orbiter,” which also fires laser pulses to chart the topography of the terrain, he told me in an interview. “We are now using images and laser scans from the orbiter” to create geometrically precise VR models of the lunar surface.
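As a rough illustration of the idea, not the team’s actual pipeline, gridded elevation data of the kind a laser altimeter produces can be converted into a simple triangle mesh of the terrain, the basic building block of any VR scene. The grid values and spacing below are invented for the example:

```python
# Illustrative sketch only: turning a gridded elevation map (like laser
# altimetry from lunar orbit) into a triangle mesh a VR engine could render.
# The elevation values and grid spacing here are made up.

def heightmap_to_mesh(heights, spacing_m):
    """Convert a 2D grid of elevations into vertices and triangles."""
    rows, cols = len(heights), len(heights[0])
    # One vertex per grid cell: (x, y, z) in meters.
    vertices = [(c * spacing_m, r * spacing_m, heights[r][c])
                for r in range(rows) for c in range(cols)]
    # Two triangles per grid square, as indices into the vertex list.
    triangles = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c
            triangles.append((i, i + 1, i + cols))
            triangles.append((i + 1, i + cols + 1, i + cols))
    return vertices, triangles

# A tiny 3x3 elevation patch (meters above a reference surface).
demo_heights = [[0.0, 1.2, 0.8],
                [0.5, 2.1, 1.0],
                [0.3, 0.9, 0.4]]
verts, tris = heightmap_to_mesh(demo_heights, spacing_m=5.0)
print(len(verts), len(tris))  # 9 vertices, 8 triangles
```

Real pipelines add photographic textures and far denser grids, but the geometry step works the same way at any scale.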
“One application of virtual planetary environments that is attracting attention is in mission design and planning. Mission planners can use them to design the safest and most efficient extravehicular activities,” Professor Hodges says. “They are excellent training tools for astronauts preparing for landings and surface exploration.”
Astronauts selected for the elite Artemis missions can use these simulations to plan their sorties and memorize their surroundings, Hodges says, and in the process limit their time outside their SpaceX spacecraft and their exposure to hazardous solar and cosmic radiation.
“There is a limit to how long they can stay outside of the hab – the Starship – a finite amount of time even with the new generation space suits,” says Professor Hodges, a one-time member of NASA’s Space Advisory Council who has helped NASA train American, Canadian, and Japanese astronauts on surface operations.
His team’s digital dioramas are part of a Big Bang-like burst of space-tech demos being launched in the run-up to astronauts returning to the black and silver globe, featuring inventions by independent space outfits that could change the course of exploration, Professor Hodges says.
Intuitive Machines, the first commercial spaceflight group to orchestrate a soft landing on the Moon earlier this year, aims to loft a second remotely piloted spacecraft this December, carrying a robotic photographer developed by Lunar Outpost that will survey the periphery of the pole and transmit the images back to Earth.
Professor Hodges says his Digital Discovery Initiative lab is joining forces with Lunar Outpost, which is despatching the rover-photographer on its shooting assignment some 380,000 kilometers away, to sculpt the new images into an immersive virtual guide to the sector.
His team’s digital doppelgängers, rich with close-up imagery captured by Lunar Outpost, could help astronauts and Mission Control at NASA survey the site in exquisite detail and sketch out future expeditions.
Trekking through looking-glass doubles of these lunar landscapes, Professor Hodges says, is likely to explode in the run-up to the spacefarer sojourns that will start in 2026.
Astronauts from NASA and the European and Japanese space agencies could, via their avatars, roam the Pole’s alien geography together and collectively plan their follow-up lunar expeditions, he says.
“I see this as an important step toward democratizing planetary science,” he says of the world-spanning reach of his Virtual Moon platform. “Virtual reality is such a global phenomenon that scientists in Australia, Asia, Europe, and Africa could conduct virtual field geology together at the same time as part of the same team.”
Each time a colossal SpaceX Starship alights on the Moon, its pilots can quickly deploy a squad of rovers equipped with cameras and lasers to map their extraterrestrial environs. Upon receiving these images and point clouds, Professor Hodges says, his group could speedily model a virtual mirror of the scene and beam it back to the astronauts at the speed of light.
The same platform could be used to connect scientists involved in the Artemis program with the astronauts as they explore the surface of the Moon, with a slight delay as communications travel the 1.3 light-seconds in each direction between the Earth and its age-old companion.
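That delay follows directly from the Earth–Moon distance. A back-of-the-envelope check, assuming the average separation of roughly 384,400 kilometers (a figure not stated in the article):

```python
# Back-of-the-envelope check of the Earth-Moon communication delay.
# 384,400 km is the approximate average distance; the true figure
# varies over the Moon's elliptical orbit.
SPEED_OF_LIGHT_KM_S = 299_792.458  # kilometers per second
EARTH_MOON_KM = 384_400            # assumed average distance

one_way_s = EARTH_MOON_KM / SPEED_OF_LIGHT_KM_S
round_trip_s = 2 * one_way_s
print(f"one way: {one_way_s:.2f} s, round trip: {round_trip_s:.2f} s")
```

So a question from Mission Control and the astronaut’s first possible reply are separated by about two and a half seconds, noticeable in conversation but far more workable than the many-minute lag to Mars.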
Professor Hodges adds that his Digital Discovery team and other ASU scholars have also begun creating virtual models for another corner of the solar system slated for human exploration in the years ahead.
“We are already hard at work building digital twins using Mastcam-Z imagery from Mars,” he says, referring to the vast array of photographs produced by the Perseverance rover as it charts the ghosts of rivers and lakes that once animated the now-frigid world.
Perseverance has proved to be the most prolific photographer ever to roam the Red Planet’s dunes, and its twin Mastcam-Z cameras have captured a fantastical stream of images.
An ever-expanding cosmos of these simulations of the Moon and of Mars, Professor Hodges predicts, will be tapped to train the next waves of American and allied astronauts, along with future generations of space scientists and engineers spread out across the continents.