We summarized the overall goals, needs, pain points, and success metrics for our two stakeholders.

Research

User Needs

👩‍🚀 Astronauts

Goals & Needs

Hands-free operation, intuitive commands, minimal cognitive strain, and reliability in hazardous terrain.

Success Metrics

Reduced cognitive load, task completion efficiency, and improved situational awareness.

👷 Mission Control Engineers

SOLUTION

Solution

MoonBuddy is a voice-based AR EVA interface to aid astronauts in communication, navigation, and lunar exploration, developed in Unity3D for Microsoft HoloLens.


Our final solution is an AR program that assists astronauts with lunar tasks including navigation, science sampling, and rescue. For operational control, we provided voice interaction interfaces that give astronauts more autonomy and efficiency than before.


Navigation

Real-time long-range routing and short-range hazard avoidance using terrain mapping + edge detection shaders.


Scientific Sampling

Step-by-step guidance for geology documentation, adaptable for both novice and expert astronauts.


Search & Rescue Emergencies (LSAR)

Emergency distress beacons with real-time location + guided navigation to assist teammates.


Vitals & Tasks

Always-available collapsible panels for suit vitals, instructions, and EVA checklists.

ideation

Lo-Fi Mockups

After sketching out wireframes to organize our thinking, our team drew up lo-fi versions of our envisioned setup.

Design

Information Architecture

To better understand the organizational layout of our design, our team outlined its information architecture.

Finalized Information Architecture Design Guide

Design

Team Sketching & Brainstorming

One of the most important insights that we got from Jay was an understanding of the difficulties that arise from being in an environment like the Moon. For instance, pressurized suits make it incredibly difficult to move, which rules out most gestural control systems. Additionally, LED displays were often unreadable under drastic lighting conditions.


These first-hand user insights inspired us to integrate mixed-reality experiences and voice technology into our interface to make lunar exploration tasks easier for the users. With the development of AR headsets, we now have viable alternatives that can provide a usable, clear, and helpful interface — one that helps an astronaut orient themselves, verify that their suit is safe, and clearly document geological or navigational tasks.

Team information architecture outline and UI brainstorm

ideation

MVP Prototype for MoonBuddy


Our team designed the user interface to lie in the periphery of the user. Many of these elements can be expanded to reveal more information, and hidden away to get back to the task at hand.


Quotation marks indicate voice commands you can say, like "Open Vitals", "Next Task", and "Go to Station A".
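Conceptually, the voice layer maps a small set of registered phrases to UI actions. The sketch below is a hypothetical Python illustration, not the actual MoonBuddy implementation (which uses the HoloLens speech APIs in Unity); the handler names and responses are invented for the example.

```python
# Hypothetical sketch of a phrase-to-action voice command dispatcher.
# On the HoloLens, registered keywords are matched by the speech API;
# here we simulate that step with a simple normalized lookup.

def make_dispatcher(handlers):
    """Map normalized spoken phrases to handler functions."""
    normalized = {phrase.lower(): fn for phrase, fn in handlers.items()}

    def dispatch(utterance):
        key = utterance.strip().lower()
        if key in normalized:
            return normalized[key]()
        return "unrecognized command"

    return dispatch

# Illustrative handlers (invented responses, not real MoonBuddy actions)
dispatch = make_dispatcher({
    "Open Vitals": lambda: "vitals panel expanded",
    "Next Task": lambda: "advanced to next checklist item",
    "Go to Station A": lambda: "routing to Station A",
})
```

Matching case-insensitively and trimming whitespace keeps the commands forgiving of how the recognizer transcribes speech.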

Living in the upper-left corner, the task list lets an astronaut follow onscreen instructions for the current task throughout the mission, reducing cognitive load.


The expanded version of this list shows voice command suggestions in quotation marks, reminding an astronaut how to frame and photograph a rock during a time-sensitive geology task.

The heads-up display includes vitals monitors that show EVA suit status from the Telemetry Server, with essential consumables and remaining time always visible. When expanded, it displays meters and color-coded warnings when vitals are out of safe range.
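The color-coded warning logic described above amounts to range checks against safe thresholds. Here is a minimal Python sketch of that idea; the vital names and ranges are illustrative assumptions, not actual NASA telemetry values.

```python
# Illustrative safe ranges for suit vitals (hypothetical values,
# not real telemetry thresholds).
SAFE_RANGES = {
    "suit_pressure_psi": (3.5, 4.5),
    "o2_remaining_pct": (20.0, 100.0),
    "heart_rate_bpm": (50.0, 140.0),
}

def classify_vitals(readings):
    """Return a color code per vital: green in range, red otherwise."""
    status = {}
    for name, value in readings.items():
        low, high = SAFE_RANGES[name]
        status[name] = "green" if low <= value <= high else "red"
    return status
```

The UI would then render each meter in its status color and surface red vitals as warnings.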



MoonBuddy provides a linear compass that displays and easily directs you to your assigned station: a marker appears within your forward view, and as you move toward it, the overhead map tracks your position, showing your marker closing in on the destination on the lunar surface.
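The linear compass boils down to a signed angle between the astronaut's heading and the direction of the station. A minimal sketch, assuming positions in a local flat (east, north) frame and a heading measured clockwise from north — these conventions are assumptions for illustration, not taken from the project:

```python
import math

def relative_bearing(pos, heading_deg, station):
    """Signed angle in degrees from the current heading to the station
    (positive = station is to the right, negative = to the left)."""
    dx = station[0] - pos[0]   # east offset
    dy = station[1] - pos[1]   # north offset
    # Bearing of the station measured clockwise from north.
    target = math.degrees(math.atan2(dx, dy))
    # Wrap the difference into (-180, 180] so the compass points the short way.
    return (target - heading_deg + 180.0) % 360.0 - 180.0
```

A linear compass strip then places the station tick at an offset proportional to this angle.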


To keep you safe during the EVA journey, MoonBuddy uses its spatial-mapping capabilities with a custom edge-highlighting shader that reveals the edges of protrusions and drops, helping you stay aware of hidden hazards as you navigate to the next destination.
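The edge-highlighting effect itself runs as a Unity shader over the HoloLens spatial mesh; purely as an illustration of the underlying idea, the following Python sketch flags locations where the terrain height changes sharply between neighboring cells — the spots where protrusions and drops would be highlighted.

```python
# Illustrative finite-difference edge detection on a heightmap grid.
# (The real system works on shader depth/mesh data, not a Python grid.)

def edge_mask(heightmap, threshold):
    """Mark cells whose height jumps sharply relative to a neighbor."""
    rows, cols = len(heightmap), len(heightmap[0])
    mask = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Compare against the right and lower neighbors.
            for dr, dc in ((0, 1), (1, 0)):
                nr, nc = r + dr, c + dc
                if nr < rows and nc < cols:
                    if abs(heightmap[r][c] - heightmap[nr][nc]) > threshold:
                        mask[r][c] = True
                        mask[nr][nc] = True
    return mask
```

Flat terrain produces no marks; a sudden drop or ridge marks both cells on either side of the discontinuity, which is exactly the contour a hazard outline should trace.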


When you reach your destination, MoonBuddy displays the steps needed to complete the geology sampling process. Simply follow the instructions for describing a rock or regolith. When it's time to document that space rock, say "Record photo" and it captures everything hands-free.

MoonBuddy is equipped to receive Lunar Search and Rescue messages. When a LunarSAR message interrupts, the distressed crew member's coordinates are displayed on the Compass and Map so you can start navigating immediately. Meanwhile, you can respond to the LunarSAR message to let them know you're on the way.

1 - Static State

2 - Keep Track of Missions with Task List

3 - Monitor Vitals & Environmental Conditions

4 - Provide Long-Range Navigation Assistance

5 - Provide Short-Range Navigation Assistance

6 - Guide Science Sampling Procedures

7 - Facilitate Lunar Search & Rescue

Our Innovative Navigational Shader!


ideation

Human in the Loop Testing

Our team tested a mockup version on the Microsoft HoloLens 2 and asked participants to evaluate the prototype accordingly.



14 User Testing Sessions

26 design improvements


6 Bug Fixes

“I like how the voice interaction is very simple”

“The basic functionality is so intuitive, i.e. the tasklist, minimap and compass”

indoor & good light conditions

outdoor & bad light conditions

ideation

Mid-Fi Updates


Research

Usability Testing @ Johnson Space Center


Design

Version #1 - Prototype

Our first night testing at NASA Johnson Space Center was a success, but led us to our main insight:

  • Orange UI was too bright

Design

Version #2 - MVP Prototype

Our final version, tested on the second night at NASA Johnson Space Center, went incredibly well. Participants found MoonBuddy intuitive to use, from the well-worded voice controls to the layout of the UI to the comfort of having personal health metrics available on the main navigation bar. Evaluating engineers and astronauts were most excited by our most innovative feature: the navigational shader, which was received with overwhelming acceptance because it clearly outlined the short-distance walkway, helping users safely navigate unfamiliar rocky and uneven terrain.

HOW MIGHT WE

How might we design an AR/voice-based EVA interface that empowers astronauts to navigate, communicate, and complete mission-critical tasks autonomously — without increasing cognitive strain?

Research

How Might We

Consolidating our insights led us to our guiding question.

1 - Operable while users are wearing EVA gloves and executing dexterous tasks.

2 - Unobtrusive, easy to use, and decreases the cognitive load of users.

3 - Astronauts must be able to access EVA task instructions and suit vitals at any time.

4 - Flexible enough to handle astronaut requests at any time.

5 - Provide real-time navigation between any given points or crew members.

6 - Must be able to communicate with EVA system crews without interference.

7 - Provide environmental awareness of hazardous regions.

8 - Clearly display warning messages for important anomalies.

Research

Design Requirements

From our pain points, our team translated the following necessities into an outline of what our design must provide.

High Cognitive Load

Astronauts juggle intense mental workloads with limited time; the system design must reduce their mental strain.

Compromised Situational Awareness

Looking down at wrist displays results in blind spots, compromising astronauts' awareness of their surroundings.

Time-Sensitive & Critical Mission Tasks

Geological sampling involves time-sensitive, multi-step tasks that must be completed without error and without direct MCC guidance.

Cognitive Overload & Mission Demands

Astronaut Autonomy & Critical Information

Physical & Environmental Conditions

Synthesizing our insights from expert interviews and archival analysis, our team identified the following pain points to design for within our final solution. Our team had to find a way to counteract these difficulties.

Bulky Extra-Vehicular Activity Suit Limits Dexterity

Bulky spacesuit gloves make pressing buttons impossible, requiring a completely hands-free system for astronauts to operate.

Harsh Lunar Lighting Conditions

Moon’s blinding glare and shadows require an interface that stays easy to read while revealing dangers in the terrain.

Unfamiliar & Hazardous Terrain

Moon's repetitive landscape makes it easy to get lost, requiring a system that provides constant, safe navigation.

Fragmented Data Streams

Key info like EVA task instructions or suit vitals are spread across separate displays, requiring a unified interface for all mission data.

Mission Control Center Latency

Communication delays hinder real-time help from MCC; astronauts may need to handle all situations autonomously.

Search & Rescue Emergencies (LSAR)

In an emergency, there is no system to automatically broadcast an astronaut's location; rescue currently relies on verbal communication with MCC.

Research

Pain Points

Our early research combined primary expert interviews and archival analysis.

Research

Preliminary Research


1 - Expert Interviews

We held semi-structured interviews with former astronaut Jay Apt, NASA engineers, and geology experts to understand operational realities and technical constraints.

"Moving your finger in space is like picking up a gallon of milk" - Jay Apt

2 - Review Audio of Past Missions

We analyzed publicly accessible audio files from Apollo missions to identify common pain points, communication breakdowns, and workflow inefficiencies experienced by astronauts in the field.

NASA IS ASKING FOR…

A context-aware interface that consolidates navigation, vitals, and documentation — reducing cognitive overhead while enabling safe, autonomous operations.

Overview

Challenge


Future Artemis missions will push astronauts into more autonomous operations than ever before.


Unlike past Apollo missions where Mission Control Center provided constant, step-by-step mission guidance, significant communication delays are imminent as we explore farther from Earth (a 1.3-second delay to the Moon, 20 minutes to Mars). This reality necessitates a fundamental redesign of astronaut tools.
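The delay figures follow directly from distance divided by the speed of light. A quick check — the Earth–Mars distance varies enormously, so the value below is a typical mid-range figure chosen purely for illustration:

```python
# One-way light delays from distance / c.
MOON_KM = 384_400          # mean Earth–Moon distance
MARS_KM = 360_000_000      # illustrative Earth–Mars distance (varies widely)
C_KM_S = 299_792.458       # speed of light

moon_delay_s = MOON_KM / C_KM_S          # roughly 1.3 seconds
mars_delay_min = MARS_KM / C_KM_S / 60   # on the order of 20 minutes
```

Even the 1.3-second lunar delay is enough to disrupt the tight verbal call-and-response loop that Apollo crews relied on.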


Previous solutions, like paper cuff checklists and wrist-mounted displays, are cognitively taxing and interrupt situational awareness, especially when astronauts are constrained by bulky Extra-Vehicular Activity (EVA) suits.

Past Solution: Paper Cuff Checklists Photo: NASA

Example of an Extra-Vehicular Activity (Lunar Surface) Mission

Astronaut wearing paper cuff on left arm Photo: NASA

Photo: NASA

Preview

MoonBuddy at NASA

Helping future Artemis Astronauts navigate the lunar surface with ease.

In early 2022, our team collaborated with NASA SUITS and Carnegie Mellon to design an AR/voice-based headset interface for astronauts on upcoming Artemis lunar missions. Astronauts must navigate hazardous terrain, conduct time-sensitive geology research, and coordinate rescue operations, all while constrained by bulky Extra-Vehicular Activity (EVA) suits and limited resources.

I led research and communications, working directly with astronauts and engineers to better understand their needs and pain points and to test usability under simulated lunar conditions.


Our solution, powered by the Microsoft HoloLens 2, introduced a novel navigational shader that reduced astronaut cognitive strain and earned “Most Innovative Feature” recognition at NASA Johnson Space Center. Our research was published at UIST.

Year

Jan 2022 - Sep 2022

Role

Design & Outreach

Team

3 Design Researchers, 3 Engineers

Skills

Qualitative Research

Information Architecture

Usability Testing

Field Studies

Lo & Hi-Fi Prototyping

Thank you for browsing :)

Interested in learning more? Please feel free to reach out on LinkedIn or my email!

Final Thoughts

Reflection

MoonBuddy was my first in-depth exploration into human-computer interaction under extreme environmental constraints.


The project taught me how human factors and technical limitations can catalyze design innovation, especially since every decision, from shader visibility to command syntax, impacted astronaut safety and cognition.


If expanded today, I’d explore:

  • Adaptive multimodal feedback (audio spatialization + haptic alerts).

  • Dynamic UI adaptation based on real-time cognitive load.

  • Context-aware AR landmarks for spatial orientation in low-visibility terrain.


This project reaffirmed my belief that well-designed systems are empathy in action, empowering users not just to perform tasks, but to perform them confidently under pressure.