Design of an AR Visor Display System for Extravehicular Activity Operations
Neil Mchenry, Leah Davis, Israel Gomez, Noemi Coute, Natalie Roehrs, Celest Villagran, G. Chamitoff, A. Diaz-Artiles
2020 IEEE Aerospace Conference, March 2020. DOI: 10.1109/AERO47225.2020.9172268
Citations: 8
Abstract
An Extra-Vehicular Activity (EVA) is one of the most challenging operations during spaceflight. The current technology used by an astronaut crewmember during a spacewalk includes real-time voice loops and physical cuff checklists containing the EVA procedures. Recent advancements in electronics allow for miniaturized optical displays that fit within a helmet and provide an alternative way for a crewmember to access mission data. Additionally, helmet-mounted cameras provide several of the EV astronauts' points of view (POVs) to the Mission Control Center (MCC) and Intra-Vehicular (IV) astronauts. These technologies provide greater awareness, helping to protect astronauts in space. This paper outlines the design and development of a custom augmented reality (AR) visor display to assist with human spaceflight operations, particularly EVAs. The system can render floating text checklists, real-time voice transcripts, and waypoint information within the astronaut's Field of View (FOV). These visual components aim to reduce the limitations of how tasks are currently communicated. In addition, voice commands allow the crewmember to control the location of the augmented display or modify how the information is presented. The team used the Microsoft HoloLens 1 Head Mounted Display (HMD) to create an Augmented Reality Environment (ARE) that receives and displays information for EVA personnel. The ARE displays the astronaut's vitals, spacesuit telemetry, and procedures. The MCC and other astronauts can collaborate with the EVA crewmember through a 3D telepresence whiteboard, which enables two-way visual communication. This capability allows interaction with the EV astronaut's environment without actually having to be outside the spacecraft, or even onboard. Specifically, mission personnel wearing a Virtual Reality (VR) Oculus Rift head-mounted display could draw shapes in the EV crewmember's view to guide them toward a particular objective. To test the system, volunteers were asked to proceed through a mission scenario and evaluate the user interface. Testing occurred both in a laboratory setting and in an analog mockup at the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC), using both the Microsoft HoloLens and the Oculus Rift, in coordination with the NASA Spacesuit User Interface Technologies for Students (SUITS) Competition. The major goal of testing the User Interface (UI) was to determine which features minimize cognitive workload and improve the efficiency of task completion. AR technology has the potential to dramatically improve EVA performance for future manned missions. With the HoloLens, the team implemented an efficient and elegant design that can be individualized by the user. The system provides as much functionality as possible while remaining simple, to promote user-friendliness.
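The abstract describes an ARE that receives the astronaut's vitals, spacesuit telemetry, procedure steps, and waypoint data and renders them in the FOV, but it does not specify a transport or message format. As a minimal sketch only, the Python snippet below assumes a simple JSON-over-UDP feed with hypothetical field names and a placeholder headset address; the actual HoloLens 1 application would typically be built in Unity/C#, so this sender merely stands in for the data-producing side of such a system.

```python
# Minimal sketch (not from the paper) of streaming suit telemetry and checklist
# state to an AR headset as JSON over UDP. All field names and values are
# illustrative assumptions; the paper does not describe its wire format.
import json
import socket
import time

HOLOLENS_ADDR = ("192.168.1.50", 9000)  # hypothetical headset IP and port


def make_telemetry_packet(step_index: int) -> bytes:
    """Bundle vitals, suit telemetry, the current procedure step, and a waypoint."""
    packet = {
        "timestamp": time.time(),
        "vitals": {"heart_rate_bpm": 82, "o2_pct": 97.5},          # placeholder values
        "suit": {"suit_pressure_psi": 4.3, "battery_pct": 88.0},    # placeholder values
        "procedure": {
            "task": "Replace battery on truss segment",  # illustrative checklist item
            "step": step_index,
        },
        "waypoint": {"x": 1.2, "y": 0.4, "z": -3.0},  # target position in a local frame
    }
    return json.dumps(packet).encode("utf-8")


def stream_telemetry(rate_hz: float = 1.0) -> None:
    """Send periodic updates; the headset app would parse and render them in the FOV."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for step in range(3):  # short demo loop
            sock.sendto(make_telemetry_packet(step), HOLOLENS_ADDR)
            time.sleep(1.0 / rate_hz)
    finally:
        sock.close()


if __name__ == "__main__":
    stream_telemetry()
```

A connectionless feed like this is only one plausible design choice; a production system would more likely use an authenticated, acknowledged channel so that procedure updates and whiteboard strokes from the MCC or IV crew are never silently dropped.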