Earth support coming to ASP

February 23 2012

For those not familiar, Ames Stereo Pipeline (ASP) is an open source toolset from IRG. We built it on top of another fantastic collection of tools called ISIS, provided by the USGS. Together they allow our group and the public to produce 3D models of planets and moons across our solar system. ASP played an essential role in our work for the Lunar Mapping and Modeling Project, where we produced high-resolution elevation models from images taken during the Apollo missions of the 1970s.

DEM of the Rocky Mountains produced by ASP.

Until recently, ASP had never been used with satellite images of the Earth. That changes in our next binary release, which adds support for stereo images captured by DigitalGlobe. We recently pushed a change to our source code on GitHub that allows ASP to read camera information from DigitalGlobe’s XML format, and with it we produced our first colorized height model of the Rockies, shown at right.

NASA’s Earth Sciences directorate has graciously funded our recent developments in ASP. Our immediate hope is that these tools will allow researchers to model ice-sheet flows in both Antarctica and Greenland. Because ASP runs autonomously, scientists will be able to get weekly ice flow updates that were previously impractical due to the cost of manual processing, potentially giving researchers new insight into our changing climate.

(7) comments.

Robots Aboard the International Space Station [video]

February 10 2012

There’s a short article with a video about our Human Exploration Telerobotics project on the IEEE Spectrum website.

(1) comment.

NASA ‘Smart SPHERES’ Tested Successfully on International Space Station

December 19 2011

Technology Demonstration Missions has put up a feature about our smartphone work. View the article here.

(8) comments.

Planetary Lake Lander Blog

November 28 2011

The Planetary Lake Lander (PLL) team, headed by PI Nathalie Cabrol of SETI, is studying the effects of melting glaciers on lakes in the Central Andes. The first goal of the research is to learn about the changes in Earth’s glacial lake ecosystems, but the findings will also be relevant to the study of life on Mars during comparable times in Mars history. In addition, PLL must be operated remotely with limited bandwidth and power; the technologies PLL matures to address these challenges will be relevant to future missions to the seas of Titan.

Two members of IRG are currently at Laguna Negra in Chile with the Planetary Lake Lander team. They’ll post more details about what they’re doing when they get back, but in the meantime you can follow their progress on the Planetary Lake Lander blog.

(38) comments.

Moon Maps from the Apollo Metric Camera

November 21 2011

Not everyone at IRG plays with robots. Some play with Eclipse and will never be understood. Others, like the Mapmakers, play with satellite imagery. IRG is happy to announce that those oddballs just completed a 3-year project to produce the “Apollo Zone” Digital Image Mosaic (DIM) and Digital Elevation Model (DEM). These maps cover approximately 18% of the Lunar surface at a resolution of 1024 pixels per degree (or about 40 m/pixel).

To preview the “Apollo Zone” maps, download the following KML file for viewing in Google Earth.

http://byss.ndc.nasa.gov/stereopipeline/dataviz/apollo_metric.kml

Once you open the file in Google Earth, you will have radio-button options to view both maps overlaid on Google Earth’s Moon mode. These maps have also been uploaded to the Lunar Mapping and Modeling Project (LMMP) portal and will soon be available for visualization and download via that site.

These maps were created from images taken by the Apollo Metric (Mapping) Camera, which flew aboard Apollo 15, 16, and 17 in the early 1970s. Decades later, ASU’s Apollo Image Archive scanned the original film into high-resolution digital form, making it much easier for scientists and the public alike to explore the dataset. From there, IRG aligned 4000 of the images and processed them into 3D data on the Pleiades supercomputer. IRG’s open source software, Vision Workbench and the Ames Stereo Pipeline, performed all the work. Those same tools can be used to process other datasets, such as imagery from the Lunar and Mars Reconnaissance Orbiters.

This work was funded by the Lunar Mapping and Modeling Project (LMMP). We gratefully acknowledge the support of our collaborators at NASA MSFC, NASA GSFC, JPL and USGS. Our special thanks go to Ray French and Mark Nall for their support and leadership of LMMP.

 

(8) comments.

HET Smartphone Operated in Space

November 18 2011

Mike Fossum shows off the Smartphone attached to a SPHERES satellite.

On November 1, 2011, IRG hardware and software were operated in space for the first time. During SPHERES Test Session #29, commander Mike Fossum unpacked the Smartphone and attached it to the SPHERES satellite. During the session, the satellite performed several simple maneuvers while the phone ran the Sensor Data Logger (a free app for Android, available here). The test ran smoothly, and we are now busy analyzing the log files from the phone. These logs will tell us how the phone’s sensors behave in space and help us design our ground interface for teleoperating the SPHERES satellite through the Smartphone.
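To give a rough sense of the kind of record a logger like this might write (purely a hypothetical sketch, not the actual Sensor Data Logger code; the class and field names are illustrative), each sensor sample can be flattened into a timestamped CSV line for later analysis on the ground:

```java
import java.util.Locale;

// Hypothetical sketch of how a sensor logger might format samples.
// Not the actual Sensor Data Logger source; names are illustrative.
public class SensorCsv {
    // Flatten one sensor reading into a timestamped CSV line:
    // timestamp in nanoseconds, sensor name, then the x/y/z values.
    static String toCsvLine(long timestampNanos, String sensor, float[] values) {
        StringBuilder sb = new StringBuilder();
        sb.append(timestampNanos).append(',').append(sensor);
        for (float v : values) {
            sb.append(',').append(String.format(Locale.US, "%.6f", v));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // A near-zero accelerometer reading, as a free-flying
        // SPHERES satellite might report in microgravity.
        String line = toCsvLine(123456789L, "accelerometer",
                                new float[] {0.0f, 0.0f, 0.0f});
        System.out.println(line);
    }
}
```

Lines in this shape are trivial to import into analysis tools, which is one reason plain CSV is a common choice for sensor logs.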

Commander Mike Fossum operates the HET Smartphone on a SPHERES satellite.

Video: Our experiment is mentioned in the ISS Update for 11/1/11 (watch here, SPHERES portion starts at 0:46).

Leave a comment.

Article: Nexus S takes to space aboard Atlantis [video]

September 2 2011

Here’s an Engadget article about our smartphone, including a video. Enjoy!

(26) comments.

Article: Exploration Ground Data System at Pavilion Lake

August 30 2011

Click here to read about the work of our Exploration Ground Data System (xGDS) team at the Pavilion Lake Research Project.

Leave a comment.

Gigapan Voyage Remotely Captures First Underwater Panorama

August 17 2011

IRG recently captured its first underwater panorama in Key Largo, Florida, for the NEEMO project. IRG was using Gigapan Voyage (vo-YAH-je), a system that provides high-resolution imagery for planetary analog field tests. Gigapan Voyage uses the same technology that was spun off by the GigaPan Project to create and explore photographic panoramas. However, while a regular GigaPan requires the user to set up the hardware, Gigapan Voyage can be configured remotely through a web interface. This change allows for mounting on mobile platforms such as robots, and the system has already proven its usefulness during several robotic field tests.

The upcoming NEEMO 15 experiment will test equipment and operational concepts for exploration of near-Earth asteroids. Gigapan Voyage will provide panoramas to the scientists, mission operators, and astronaut crews involved in the two-week test. This is the first time the Gigapan Voyage camera will be used underwater, so the team has developed a new prototype underwater-rated unit especially for NEEMO. The new unit has a Sanyo 5400 HD Pan-Tilt-Zoom unit in a glass dome enclosure and custom control software to remotely set and capture panoramas. We’ll post more (including pictures!) when NEEMO 15 gets underway on October 17.

This is the prototype Gigapan Voyage that will be used underwater during NEEMO 15.

In the meantime, Gigapan Voyage units are mounted on the ARC K10 rovers and two Space Exploration Vehicles (SEVs) used during the Desert Research and Technology Studies (Desert RATS) summer field tests at Black Point Lava Flow, Arizona. Desert RATS takes place this September; check back here for updates on IRG’s role in this year’s field test!

Here are a few panoramas we took during our tests from August 12th–16th:

Link to full gigapan: http://www.gigapan.org/gigapans/84298/

Link to full gigapan: http://www.gigapan.org/gigapans/84344/

Link to full gigapan: http://www.gigapan.org/gigapans/84381/

(7) comments.

HET Smartphone In Space

July 12 2011

A key question for future deep-space human missions is this: under what operational conditions and scenarios is it advantageous for the crew to operate a robot “locally,” rather than having it controlled from mission control on Earth? Future human missions to the Moon, Mars, and other destinations offer many new opportunities for exploration. But crew time will always be limited, and some work will not be feasible for astronauts to do manually. Robots, however, can complement human explorers, performing work under remote control from a crew vehicle or even from Earth.

The HET Smartphone project will explore how we can use free-flying robots to augment and support crew activities. It uses the MIT SPHERES satellite, a self-contained free flyer with its own power, propulsion, computing, and navigation equipment. For our experiments, a Samsung Nexus S smartphone is connected to the SPHERES free flyer through a communication cable, and a wireless network connection to the ISS provides the data path to the ground. The Nexus S acts as an embedded computing platform and integrated sensor package: it has gyroscopes (rotation), accelerometers (acceleration), magnetometers (compass heading), cameras, and microphones. The size and integration of these components make it an ideal platform for experiments like this.

Is this the first smartphone to communicate from the ISS?

Yes. This Google Android Nexus S will be the first smartphone to communicate from the ISS to the ground.

How do you plan to control the Smartphone? Is it like a radio-controlled toy?

Communications to and from the ISS can be delayed by as much as 6 seconds, and there are regular communication outages due to gaps in ground station coverage. As such, we plan to use very high-level control for the Smartphone: “Navigate to this location,” “Look over here,” and “Follow this path” are examples of the kinds of command series we might send to the Smartphone from Earth. The significant delay prevents us from flying the Smartphone the way a person might “joystick” a radio-controlled toy on Earth.
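To illustrate the idea (a hypothetical sketch only, not flight software; all names are made up for illustration): instead of streaming joystick inputs, the ground can uplink a short plan of high-level commands and let the robot work through them on its own, so the round-trip delay is paid once per plan rather than once per motion:

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Hypothetical sketch of delay-tolerant, high-level commanding.
// Not actual SPHERES/HET flight software; names are illustrative.
public class CommandQueue {
    // The onboard plan: high-level commands executed in order.
    private final Queue<String> plan = new ArrayDeque<>();

    // Ground side: enqueue a whole command series in one uplink,
    // so the communication delay applies to the plan, not each motion.
    void uplink(String... commands) {
        for (String c : commands) plan.add(c);
    }

    // Robot side: take the next command and carry it out autonomously.
    String executeNext() {
        return plan.isEmpty() ? "IDLE" : plan.poll();
    }

    public static void main(String[] args) {
        CommandQueue q = new CommandQueue();
        q.uplink("NAVIGATE_TO dock", "LOOK_AT window", "FOLLOW_PATH survey-1");
        System.out.println(q.executeNext()); // NAVIGATE_TO dock
        System.out.println(q.executeNext()); // LOOK_AT window
    }
}
```

The same pattern shows up in many remote-robotics systems: the higher the latency, the more autonomy is pushed onto the vehicle and the coarser the commands become.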

Will the SPHERES and Smartphone go outside of the ISS?

No. The equipment is currently designed to operate within the atmosphere of the ISS. That said, the Smartphone experiments are exploring how this technology might be used both inside and outside spacecraft on future missions.

What hardware modifications did you make to the phone?

To make it through flight qualification, the team made two major modifications to the phone. First, we removed the phone’s GSM chip so that the cellular transmitter could not interfere with ISS equipment. This modification can be thought of as a hardware “airplane mode” and is done for the same reason that you turn off your cell phone on airplanes.

The second modification was removing the lithium batteries. Rechargeable lithium batteries are difficult to certify for flight, so we opted for safer alkaline AA cells instead: the phone is powered by 6 AA batteries connected to a DC regulator that makes the voltage compatible with the phone.

What software modifications did you make to the Android operating system?

We needed to load Android programs onto the phone via its internal SD card and download movies and telemetry data from that card. The majority of laptops on the ISS run Windows XP SP3, and since the Nexus S is a very new phone, we found that XP had problems recognizing Android’s USB mass storage functionality.

We worked around this by having the phone report the USB hardware ID of a Nexus One (which XP recognized correctly), and by letting the user enable mass storage mode before plugging in the USB cable.

We also modified the wireless tethering mode. In the event that we need to connect a laptop to the phone directly, the phone can act as an access point (AP) but route its traffic through the connecting laptop. This is backwards from the normal mode of operation, in which the connecting laptop routes its traffic through the phone.

Were there other safety modifications that you made to the phones?

There was a concern that the glass screen might break and eject material into the ISS. We discovered that the capacitive touch screen still worked with acrylic and Teflon tape covering its surface. So, during flight preparation, we applied Teflon tape to the front of the phone and trimmed it to the phone’s outside edge. If the glass were to break, the Teflon tape would contain it and ensure that no material escapes.

Is the software available to download on my Android phone?

For general data collection in our first experiments, we used the Cellbots logger by Google, available on the Android Market. It is an open source sensor data logger, and the version running on the ISS changes only the icons, to comply with NASA interface guidelines.

(36) comments.
