NASA Ames Intelligent Robotics Group (IRG)
See the latest news, photos, and videos on our blog.
The Intelligent Robotics Group (IRG) explores extreme environments, remote locations, and uncharted worlds. We conduct applied research in computer vision, geospatial data systems, human-robot interaction, planetary mapping, and robot software. (IRG brochure)
IRG operates the "K-REX" and "K10" series of planetary rovers (slideshow, fact sheet). We conduct robotic field tests each year at planetary analog sites, such as Black Point Lava Flow and Haughton Crater.
We are committed to collaboration. Read about our Google partnership in the Mt. View Voice and our Microsoft partnership. If you are interested in working together, please contact us today.
IRG at the 2017 Consumer Electronics Show (CES) (GeekWire, 2017)
CNET's CES 2017 Robotics panel: Are they ready to help? (CNET, 2017)
Technology Transfer: Human-Robot Teaming (NASA Ames (YouTube), 2017)
Designing Robots For Future Space Exploration (NASA Ames (YouTube), 2017)
A Conversation with Terry Fong (NASA in Silicon Valley Podcast, 2017)
Getting the Buzz on Astrobee (Space Station Live (YouTube), 2016)
Rover Searches California Desert for Water to Simulate Future Lunar Missions (NASA Ames (YouTube), 2015)
Smart SPHERES (Space Station Live, NASA Television, 2014)
Technology Demonstration Missions (K10 and ISS) (NASA Edge, 2014)
Controlling a robot on Earth from space (Space Station Live, NASA Television, 2013)
NASA tests next-gen rovers to explore the moon and Mars (CNET, 2013)
Controlling a rover from space (ISS Update, NASA Television, 2013)
Smartphone powers Star Wars-inspired NASA robot (CNET, 2013)
Robots and Humans in Space (Forum, KQED-FM, 2013)
Exploration Ground Data System (xGDS) (Google Tech Talk, 2013)
Planetary Mapping (Google Tech Talk, 2013)
Smart SPHERES test on the International Space Station (ISS Update, NASA Television, 2012)
Smart SPHERES (Google Tech Talk, 2012)
Human Exploration Telerobotics (Destination Innovation, 2012)
Human Exploration Telerobotics (NASA, 2012)
Android on SPHERES (Google, 2011)
GigaPan Time Machine (Drive to Discover, KGO-TV, 2011)
Open Source at NASA (NASA Open Source Summit, 2011)
Robotic Recon and Follow-Up for Human Exploration of the Moon (NASA Lunar Science Institute Seminar, 2011)
GeoCam Disaster Response Project (2011)
WWT Mars (Microsoft WorldWideTelescope, 2010) Courtesy Microsoft. Copyright 2010. All rights reserved.
Robotic Follow-up Field Test, Haughton Crater (Haughton-Mars Project, 2010)
Planetary Robotics for Human Exploration (Robotics Summit, 2010)
Planetary Exploration REBOOTED: New ways of exploring the Moon, Mars and beyond (PARC Forum, 2010)
Robotic Recon Field Test, Black Point Lava Flow (This Week @ NASA, 2009)
GigaPan robot camera (NASA 360, 2009)
K10 Robots at Moses Lake Sand Dunes (NASA Launchpad, 2008)
Human-Robot Site Survey Field Test, Haughton Crater (Discovery Channel Canada, 2007)
Human-Robot Site Survey Field Test, Haughton Crater (NASA Ames, 2007)
K10 Robots (The Next Step, 2007)
Peer-to-Peer Human-Robot Interaction (Talking Robots, 2006)
Peer-to-Peer Human-Robot Interaction (NASA Ames, 2006)
IRG Open-Source Software
From July 26 to August 8, 2010, IRG conducted a field test at Haughton Crater in the Canadian Arctic to study how human field work can be augmented with subsequent robot activity.
Exploration Ground Data System (xGDS)
Project Leads: Tamar Cohen and Matt Deans
IRG’s Exploration Ground Data System (xGDS) provides software tools to plan robot activities, monitor task execution, log robot telemetry, archive science data, and visualize a wide range of information. xGDS makes use of the NASA Ensemble framework, Web interfaces, and geospatial data browsers.
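To make the telemetry-logging role concrete, here is a minimal sketch of timestamped records queried by channel and time window. This is illustrative only: the class, field names, and sample values are hypothetical, not the xGDS data model.

```python
# Illustrative sketch (hypothetical names, not the xGDS API): timestamped
# telemetry records are appended to a log and queried by channel and window.
from datetime import datetime, timedelta

class TelemetryLog:
    def __init__(self):
        self.records = []

    def append(self, channel, value, timestamp):
        # Each record pairs a named channel with a value and a timestamp.
        self.records.append({"channel": channel, "value": value, "t": timestamp})

    def query(self, channel, start, end):
        # Return all records for one channel inside [start, end].
        return [r for r in self.records
                if r["channel"] == channel and start <= r["t"] <= end]

log = TelemetryLog()
t0 = datetime(2010, 7, 26, 12, 0, 0)
log.append("battery_v", 24.1, t0)
log.append("battery_v", 23.9, t0 + timedelta(minutes=5))
log.append("gps_fix", True, t0)
print(len(log.query("battery_v", t0, t0 + timedelta(hours=1))))  # -> 2
```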
GeoCam
Project Lead: Trey Smith
The GeoCam project helps people better understand and respond to disasters. GeoCam consists of a GPS-enabled camera (or cell-phone) and a “live” geospatial workflow. Disaster responders have used GeoCam for urban search and rescue exercises and on-site at major wildfires.
+ Visit GeoCam
GigaPan
Project Lead: Randy Sargent
GigaPan makes it easy to create billion-pixel panoramas. The low-cost GigaPan robotic camera mount captures hundreds of images, which are combined into interactive panoramas. These panoramas have many uses, including discovery, education, entertainment, and science. Read about the Google-NASA partnership.
+ Visit GigaPan
Human Exploration Telerobotics (HET)
Project Lead: Terry Fong
The Human Exploration Telerobotics (HET) project explores how advanced telerobotics can improve the productivity of human explorers and increase the performance of human missions. HET is conducting tests and demonstrations of robot systems remotely operated by crew in space and by ground controllers on Earth. HET makes use of the International Space Station (ISS) and a wide variety of robots (Robonaut 2, SPHERES, K10, Centaur 2, and ATHLETE).
Human-Robotic Systems (HRS)
Project Lead: Matt Deans
We are field-testing robots to understand how they can be used to improve human productivity and science return. When humans return to the Moon, crews will initially be on the surface less than 10% of the time. Robots can perform work even when humans are not present, including reconnaissance, survey, and inspection. Field tests: 2007
Lunar Mapping and Modeling
Project Lead: Ara Nefian
NASA’s goal of returning humans to the Moon has led to renewed interest in lunar data sets and lunar science. Our work involves processing on an unprecedented scale, using terabytes of data acquired by the Lunar Reconnaissance Orbiter and newly processed Apollo camera images.
Project Leads: Ross Beyer and Ara Nefian
We focus on advanced computer vision techniques for planetary mapping, planetary geospatial data, and research aspects of photogrammetry and stereogrammetry.
+ Visit IRG's MapMakers
NASA Vision Workbench
The NASA Vision Workbench (VW) is an open-source C++ framework for efficient computer vision. Vision Workbench has been used to create gigapixel panoramas, 3D terrain models from satellite images, and high-dynamic range images for visual inspection.
+ Visit NASA Vision Workbench
Neo-Geography Toolkit (NGT)
Project Lead: Ted Scharff
The Neo-Geography Toolkit (NGT) is a suite of automated processing tools for geospatial data. NGT can transform raster/vector data, metadata, and geo-tagged data into a variety of formats, including KML and WTML. NGT is highly scalable and can process multi-terabyte data sets. NGT was used to help create WorldWideTelescope | Mars.
+ Visit Neo-Geography Toolkit
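As a small illustration of the kind of KML output NGT targets, the sketch below emits a minimal KML Placemark for one geo-tagged point. The function name and sample coordinates are hypothetical; NGT's actual interfaces are not shown here.

```python
# Illustrative only: a minimal KML Placemark writer, not the NGT API.
import xml.etree.ElementTree as ET

def point_to_kml(name, lon, lat, alt=0.0):
    """Return a KML document string containing a single Placemark."""
    kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
    doc = ET.SubElement(kml, "Document")
    pm = ET.SubElement(doc, "Placemark")
    ET.SubElement(pm, "name").text = name
    point = ET.SubElement(pm, "Point")
    # KML coordinate order is longitude,latitude,altitude.
    ET.SubElement(point, "coordinates").text = f"{lon},{lat},{alt}"
    return ET.tostring(kml, encoding="unicode")

# Hypothetical point near a field site, for illustration.
print(point_to_kml("Field site", -89.56, 75.43))
```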
Pipeline Threat Detection
Project Lead: Xavier Bouyssounouse
The Pipeline Threat Detection project develops algorithms to automatically detect heavy digging equipment near oil and gas pipelines. Data from multiple sensors are analyzed with computer vision and machine learning. Improving threat detection will help reduce the risk and cost of pipeline damage.
Planetary Content
The Planetary Content project makes NASA's vast stores of planetary data more accessible and useful through Web-enabled tools. In collaboration with Google and other NASA centers, Planetary Content has produced The Moon in Google Earth, Google Mars 3D, Google Moon (2D maps), and the NASA Gallery in Google Earth. Read about the Google-NASA partnership.
+ Visit Planetary Content
Rover Software (RoverSW)
Project Lead: Hans Utz
IRG’s Rover Software (RoverSW) is a service-oriented architecture that encapsulates robot functions (locomotion, navigation, localization, instrument control, etc.) as self-contained computing services. With this approach, robotic applications can be built as a collection of on-demand services.
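The service-oriented idea behind this architecture can be sketched as a registry of named services that applications compose on demand. All class and method names below are hypothetical illustrations, not the RoverSW API.

```python
# Illustrative sketch of a service-oriented robot architecture: robot
# functions are registered as named, self-contained services.
# Hypothetical names throughout; not the RoverSW interfaces.

class ServiceRegistry:
    def __init__(self):
        self._services = {}

    def register(self, name, service):
        self._services[name] = service

    def lookup(self, name):
        return self._services[name]

class Locomotion:
    def drive(self, meters):
        return f"drove {meters} m"

class Navigation:
    def __init__(self, locomotion):
        # Navigation is built on top of the locomotion service.
        self.locomotion = locomotion

    def go_to(self, waypoint):
        # A real navigator would plan a path; here we just drive the distance.
        return self.locomotion.drive(waypoint["distance"])

registry = ServiceRegistry()
registry.register("locomotion", Locomotion())
registry.register("navigation", Navigation(registry.lookup("locomotion")))

print(registry.lookup("navigation").go_to({"distance": 5}))  # -> drove 5 m
```

The point of the pattern is that an application asks the registry for a capability by name rather than binding to a concrete implementation, so services can be swapped without changing callers.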
Science Operations for Robots
Project Lead: Matthew Deans
To understand how best to integrate ground-based “backroom” teams into lunar surface operations, we are developing a new ground control team structure and operational protocols. We are testing this ground control approach with planetary scientists and the NASA JSC Mission Operations Directorate.
Ames Stereo Pipeline
Project Lead: Zachary Moratto
Digital terrain models have long been essential for science analysis, mission planning, and mission operations. We have developed the Ames Stereo Pipeline to automatically generate and mosaic high-quality topographic models from 3D range data (stereo images, LIDAR scans, etc.).
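The core geometric relation behind stereo terrain reconstruction is the textbook one: for a rectified stereo pair, depth = focal length × baseline / disparity. The sketch below shows only this relation with made-up numbers; the Ames Stereo Pipeline layers correlation, subpixel refinement, and triangulation on top of it.

```python
# Textbook depth-from-disparity for a rectified stereo pair.
# Not ASP code; the camera parameters below are illustrative.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """depth (m) = focal length (px) * baseline (m) / disparity (px)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a valid match")
    return focal_px * baseline_m / disparity_px

# Example: 1000 px focal length, 0.3 m baseline, 6 px disparity -> 50.0 m
print(depth_from_disparity(1000.0, 0.3, 6.0))  # -> 50.0
```

Note the inverse relationship: small disparities correspond to distant terrain, which is why disparity measurement error dominates depth error at long range.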
VERVE
Project Leads: Susan Lee and Mark Allan
VERVE is a high-performance robot user interface that provides scientists, robot operators, and mission planners with powerful, interactive 3D displays of remote environments. VERVE derives from the earlier Viz system, which was developed for Mars Polar Lander (1999) and used for the Mars Exploration Rover (2003) and Phoenix Lander (2008) missions.
3D Surface Reconstruction for the Context Camera (CTX)
Project Lead: Laurence Edwards
Martian surface models created from imagery acquired by the Mars Reconnaissance Orbiter Context Camera (CTX).
Project Lead: Vytas SunSpiral
Tools and techniques for mobile and non-prehensile manipulation.
Astrobiology Machine Vision Toolkit (AMVT)
Project Lead: Matthew Deans
AMVT was a set of computer vision tools used in geological and exobiological research.
ATHLETE Footfall Planning
We developed a planning system to enable JPL’s ATHLETE robot to walk on natural terrain. Our system processes images from the robot’s cameras into 3D terrain models, which are analyzed to find safe footfalls. A 3D user interface and motion planner enable remote operators to plan and visualize robot steps.
+ Visit ATHLETE Footfall Planning
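A toy version of the terrain-analysis step can be sketched as a flatness check on a small height map: a cell is a candidate footfall only if local height variation stays under a tolerance. The grid values and tolerance below are made up, and real footfall planning also considers slope, reachability, and stability.

```python
# Illustrative footfall flatness check; not the ATHLETE planner.

def is_safe_footfall(heights, row, col, tolerance=0.05):
    """heights: 2D list of terrain heights in meters. Checks the 3x3
    patch centered on (row, col) for near-flatness."""
    patch = [heights[r][c]
             for r in range(row - 1, row + 2)
             for c in range(col - 1, col + 2)]
    return max(patch) - min(patch) <= tolerance

# Hypothetical 3x3 terrain patch with a 0.3 m step on the right edge.
terrain = [
    [0.00, 0.01, 0.02],
    [0.01, 0.02, 0.30],
    [0.00, 0.01, 0.02],
]
print(is_safe_footfall(terrain, 1, 1))  # -> False (patch spans the step)
```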
Communications Relay Deployment
Project Lead: Estrellina Pacis
Automatic deployment of Wi-Fi relays from a mobile robot to extend a data network. Collaboration with SPAWAR.
Project Lead: Liam Pedersen
This system enabled safeguarded remote driving under conditions relevant to lunar craters.
Haughton Crater Site Survey Field Test
Project Lead: Terry Fong
A field test of a robotic survey system at a lunar analog site.
+ Visit Haughton Crater Site Survey Field Test
Hydrogen Resource Prospecting
Project Lead: Linda Kobayashi
Search for subsurface hydrates using the HYDRA neutron spectrometer on a K10 rover. Collaboration with Los Alamos National Laboratory.
Project Lead: Leslie Keely
Interactive 3D user interface for exploration visualization. Used for the Mars Phoenix mission.
Multi-robot Site Survey
Project Lead: Terry Fong
The goal of this project was to develop robust human-robot techniques for site surveying and sampling.
Peer-to-Peer Human-Robot Interaction
Project Lead: Terry Fong
Development of techniques for improving task coordination and collaboration between human and robot partners.
Percussive Dynamic Cone Penetrometer (PDCP)
Project Lead: Susan Lee
Collaboration with Honeybee Robotics to develop a robot-mounted instrument for making geotechnical measurements.
Project Lead: Mark Allan
Real-time 3D display of health, status, and kinematic configuration of the JSC Robonaut robot.
Single-Cycle Instrument Placement
Project Lead: Liam Pedersen
Techniques enabling a rover to visit and examine multiple targets in a single command cycle without supervision from mission control.
Wireless Communications Mapping
Project Lead: Vinh To
Systematically map Wi-Fi network coverage using a mobile robot to identify dead zones and coverage quality at an outdoor site.
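The coverage-mapping idea reduces to sampling signal strength at surveyed positions and flagging cells below a usability threshold. The sketch below uses made-up survey data and a made-up threshold purely for illustration; the actual project's sampling strategy and metrics are not shown here.

```python
# Toy coverage map: grid cells with measured RSSI (dBm) are flagged as
# dead zones below a threshold. Data and threshold are illustrative.

DEAD_ZONE_DBM = -80  # assumed usability threshold, not a project value

def find_dead_zones(samples, threshold=DEAD_ZONE_DBM):
    """samples: dict mapping (x, y) grid cell -> measured RSSI in dBm.
    Returns the sorted list of cells weaker than the threshold."""
    return sorted(cell for cell, rssi in samples.items() if rssi < threshold)

survey = {(0, 0): -45, (0, 1): -62, (1, 0): -85, (1, 1): -90}
print(find_dead_zones(survey))  # -> [(1, 0), (1, 1)]
```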