
Northeastern team uses innovative VR technology to develop controls for moon rover in NASA challenge

The team’s technology will be tested at night at the Johnson Space Center Rock Yard.

Paige Patterson, who studies electrical and computer engineering, works on a project to compete in NASA’s SUITS competition. Photo by Matthew Modoono/Northeastern University

Mixed reality may be a key enabling technology for NASA astronauts as they work to develop long-term settlements on the moon. 

At Northeastern University, a group of students has spent the past school year developing a mixed reality system to help the space agency do just that. 

Later this month, those students will test that technology as they compete against nine other universities across the country as part of NASA’s Spacesuit User Interface Technologies for Students (SUITS) competition. The competition will be held at the Johnson Space Center in Houston from May 18 to 22. 

The students worked in tandem with a team from Michigan State University on the project, which has two main components. 

The Michigan State team was tasked with developing technologies to enhance the space suits NASA astronauts will use during extravehicular activities. Northeastern’s team has been tasked with developing software for a pressurized rover designed to accompany those astronauts on the mission.     

Northeastern’s team decided to take advantage of virtual reality for the task, utilizing Meta’s line of headsets, explains Paige Patterson, an incoming fourth-year electrical and computer engineering student and the project lead of Northeastern’s SUITS team. 

“We’ve created a virtual reality interface to be able to control and interact with the rover and also interact with its autonomous driving controls,” Patterson says.

The students will also be supporting Michigan State’s EVA team with a “virtual workbench,” a holographic UI display that sits in front of them and provides up-to-date information on the current mission. 

The team’s technology will be tested at night at the Johnson Space Center Rock Yard for the competition, which “consists of a pulverized granite base with scoria boulders up to 1 meter (3.2 feet) and craters up to 2 meters (6 feet) deep and 20 meters (65.6 feet) in diameter,” according to NASA.  

Brandon Petersen, a graduate student pursuing his master’s degree in robotics and co-lead of Northeastern’s SUITS team, says NASA needs this technology because it will support the agency’s plans for robotic operations on the moon. 

“If you look at NASA’s plan for actually developing a colony on the moon, it’s entirely robotic,” says Petersen. “You have all these robots doing construction and probably doing exploration.

“How do humans control all these robots? A big thesis among NASA right now is maybe we do that through virtual reality.” 

Using a virtual reality headset, a user could control multiple robots at once to construct a lunar base or direct them on scientific missions involving material collection. 

“That’s where I see the future of this technology,” he says. “We refer to this as tele-operation. These robots do have a level of autonomy, but you basically assign the goals and you monitor the robots.” 

Patterson adds that virtual reality headsets are also much lighter and cheaper to send into space than full command centers. 

“We can’t bring everything we want because of weight restrictions,” she says. “We couldn’t bring a whole mission control setup, but what weighs a lot less is a virtual reality headset, which you can use to create as many monitors as you need. It’s the most flexible workspace you could ask for.”