A NASA engineer is developing software for LunaNet that will allow a machine to navigate the lunar surface using features on the Moon’s horizon, much as people use GPS on their smartphones to determine where they are going.
LunaNet will provide “internet-like” services on the Moon, including location services.
Alvin Yew, a research engineer at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, was the driving force behind this revolutionary initiative. Yew began by analyzing data from NASA’s Lunar Reconnaissance Orbiter’s Lunar Orbiter Laser Altimeter (LOLA).
LOLA generates comprehensive topographic maps of the Moon and measures surface roughness and slopes. Using LOLA’s digital elevation models, Yew is training artificial intelligence software to recreate what an astronaut standing on the lunar surface would see along the horizon.
By matching known boulders and ridges against those visible in images taken by a rover or astronaut, these computerized panoramas enable precise identification of the observer’s position at any given location.
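The matching idea can be illustrated with a small sketch. The snippet below is an illustrative example only, not NASA’s actual software: the function names, the grid-based terrain model, and the sum-of-squared-differences comparison are all assumptions made for the demonstration. It renders a horizon elevation-angle profile from a digital elevation model and then picks the candidate position whose rendered profile best matches an observed one:

```python
import numpy as np

def horizon_profile(dem, x0, y0, observer_height=2.0, n_az=360, max_range=50):
    """Render the horizon elevation-angle profile seen from (x0, y0).

    dem: 2D array of terrain heights (one grid cell = one distance unit).
    Returns, for each of n_az azimuths, the maximum elevation angle
    (radians) subtended by terrain out to max_range cells.
    """
    h0 = dem[y0, x0] + observer_height
    profile = np.zeros(n_az)
    for i, az in enumerate(np.linspace(0, 2 * np.pi, n_az, endpoint=False)):
        best = -np.pi / 2  # start looking straight down
        for r in range(1, max_range):
            x = int(round(x0 + r * np.cos(az)))
            y = int(round(y0 + r * np.sin(az)))
            if not (0 <= x < dem.shape[1] and 0 <= y < dem.shape[0]):
                break  # ray left the map
            # elevation angle of this terrain cell above the observer
            ang = np.arctan2(dem[y, x] - h0, r)
            best = max(best, ang)
        profile[i] = best
    return profile

def locate(dem, observed, candidates):
    """Return the candidate (x, y) whose rendered horizon best matches
    the observed profile, by sum of squared differences."""
    errs = [np.sum((horizon_profile(dem, x, y) - observed) ** 2)
            for x, y in candidates]
    return candidates[int(np.argmin(errs))]
```

In a real system the candidate search would be seeded by a coarse prior position and the comparison would need to handle unknown camera heading, but the core loop — render a synthetic horizon, compare it to the camera’s view — is the same.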
Yew noted that it’s comparable to stepping outside and attempting to establish your location by looking at the horizon and nearby landmarks.
“While a ballpark location estimate might be easy for a person, we want to demonstrate accuracy on the ground down to less than 30 feet (9 meters). This accuracy opens the door to a broad range of mission concepts for future exploration,” Yew said in a statement.
According to a study published by Goddard researcher Erwan Mazarico, a lunar explorer can only see up to 180 miles (300 kilometers) from any place on the Moon.
Yew’s location technology may even be helpful to Earth-bound explorers if their path takes them through a place where GPS signals are obstructed or interfered with.
Yew’s geolocation technology will also leverage the capabilities of the Goddard Image Analysis and Navigation Tool (GIANT).
Unlike radar or laser ranging instruments, which pulse radio waves or light at a target and analyze the reflected signals, GIANT analyzes photographs to measure the distances to, and separation between, visible landmarks as quickly and precisely as possible.
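As a rough illustration of image-based ranging, the distance to a landmark of known physical size can be estimated from how many pixels it spans. This is a toy pinhole-camera sketch, not GIANT’s actual algorithm; the function name and parameters are assumptions for the example:

```python
import math

def range_to_landmark(landmark_size_m, span_px, focal_length_px):
    """Estimate the distance to a landmark of known physical size from
    the number of pixels it spans, using a simple pinhole-camera model.

    The landmark subtends an angle theta where
        tan(theta / 2) = (span_px / 2) / focal_length_px,
    and the range is then landmark_size / (2 * tan(theta / 2)).
    """
    theta = 2.0 * math.atan((span_px / 2.0) / focal_length_px)
    return landmark_size_m / (2.0 * math.tan(theta / 2.0))
```

For example, a 10-meter boulder spanning 50 pixels in a camera with a 500-pixel focal length would be estimated at about 100 meters away.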
GIANT is a portable version of Goddard’s autonomous Navigation Guidance and Control system (autoGNC), which provides mission autonomy capabilities for all phases of spacecraft and rover activities.
Future Moon explorers would benefit significantly from this initiative: one day they could carry an AI-powered tool to help them navigate the lunar surface.