NASA’s cloud push is motivated by data constraints

The agency is moving satellite ground stations to its DAPHNE service and wants TMF funds for a FISMA High enclave.
A NASA ground station in Antarctica for tracking and communicating with satellites in space. (Getty Images)

Editor’s Note: This story has been updated with information on the first missions to use the DAPHNE service, the clouds that are part of the project and a timeframe for mission launches.


NASA’s move to the cloud is driven largely by the need to accommodate the data it’s receiving from space and to accelerate the innovations that data makes possible, rather than simply to migrate applications to a more secure environment.

The agency operates satellite ground stations worldwide that are part of its IT footprint, and its Data Acquisition Processing and Handling Network Environment (DAPHNE) project aims to connect them to commercial clouds.

NASA has spent billions of dollars to date addressing the growing challenge of moving data from next-generation satellites and overseas ground stations to where scientific discoveries can be made, and the global scale of the major cloud providers is hard to replicate in-house.

“We’re going to be beaming back data from space and transporting it back to the United States via the cloud network,” said Joe Foster, cloud computing program manager at the Goddard Space Flight Center, during a recent ATARC event. “It’s saved us money, in terms of investing in infrastructure, and it’s allowed us to refocus those dollars into things like upgrading the actual satellite antennae themselves.”

The first missions to adopt the DAPHNE service will be the NASA-Indian Space Research Organization (ISRO) Synthetic Aperture Radar (NISAR), being codeveloped to monitor Earth hazards and environmental change, and the Plankton, Aerosol, Cloud, Ocean Ecosystem (PACE), a satellite that will help understand how the ocean and atmosphere exchange carbon dioxide. NISAR will produce 41 terabits of earth science data per day — which for comparison would take thousands of cellphones to hold — and PACE 4 terabits per day.
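For a rough sense of those volumes, here is a back-of-the-envelope conversion. The per-day figures come from the missions above; the decimal byte conversion, the 128 GB handset capacity and the one-year accumulation window are illustrative assumptions, not figures from NASA:

```python
# Back-of-the-envelope conversion of the daily data volumes cited above.
# Assumptions (illustrative, not from NASA): 1 terabit = 1e12 bits,
# and a typical 128 GB smartphone for the storage comparison.

TERABIT_TO_TERABYTE = 1 / 8     # 8 bits per byte
PHONE_CAPACITY_TB = 0.128       # assumed 128 GB handset

for mission, terabits_per_day in {"NISAR": 41, "PACE": 4}.items():
    tb_per_day = terabits_per_day * TERABIT_TO_TERABYTE
    tb_per_year = tb_per_day * 365
    phones_per_year = tb_per_year / PHONE_CAPACITY_TB
    print(f"{mission}: ~{tb_per_day:.1f} TB/day, ~{tb_per_year:.0f} TB/year, "
          f"roughly {phones_per_year:,.0f} phones' worth of storage per year")
```

Under those assumptions, NISAR alone works out to roughly 5 TB per day, or well over ten thousand phones of storage per year.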

DAPHNE will allow that data to be processed at a fraction of the cost, with cloud charges scaling with each mission’s data volume. Exactly how missions will use DAPHNE depends on their launch schedules, which have shifted due to the COVID-19 pandemic.

NASA has onboarded several commercial cloud providers and is building operational maturity with them. Although DAPHNE currently runs only on Amazon Web Services, it’s designed to be interoperable across cloud service providers, with services chosen to fit each mission’s requirements.

DAPHNE is part of the Near Space Network Initiative for Ka-Band Advancement (NIKA), which has a readiness review scheduled for the fall in advance of mission launches. Future missions that will also use DAPHNE include the Nancy Grace Roman Space Telescope, as well as the Spectro-Photometer for the History of the Universe, Epoch of Reionization, and Ices Explorer (SPHEREx).

NASA is also part of an open-source project, Pangeo, a managed JupyterHub and data-science platform designed to be deployed in any cloud environment. Along with the National Oceanic and Atmospheric Administration and the U.S. Geological Survey, the agency is using Pangeo to host machine-learning algorithms that perform earth science analysis.
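To illustrate the kind of workflow the Pangeo stack (JupyterHub plus Python tools such as xarray, dask and zarr) enables, here is a minimal sketch; the bucket path and variable names are placeholders, not an actual NASA dataset:

```python
import fsspec
import xarray as xr

# Hypothetical cloud-hosted zarr store, for illustration only.
mapper = fsspec.get_mapper("s3://example-bucket/sst.zarr", anon=True)
ds = xr.open_zarr(mapper, consolidated=True)

# Build a monthly climatology lazily with dask-backed arrays, so only
# the chunks that are needed are read from object storage.
monthly_mean = ds["sst"].groupby("time.month").mean(dim="time")
result = monthly_mean.compute()
print(result)
```

The appeal of this pattern is that the analysis runs next to the data in the cloud, rather than requiring terabytes to be downloaded to a local workstation.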

In recent years, Goddard has placed legacy microscopes and other instruments whose firmware is too old to be upgraded into a Secure Lab Enclave, but data from those devices can still be analyzed in the cloud.

“We basically firewall off and put on a private network all of these legacy pieces of equipment that can’t get upgraded anymore,” Foster said. “That doesn’t mean that you can’t take telemetry data or files off of the piece of equipment and try to run diagnostics or things like that in a more modern cloud environment.”
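As a loose illustration of that pattern, telemetry files exported from a firewalled enclave could be pushed to object storage with the standard AWS SDK and analyzed there. This is a sketch, not NASA’s actual tooling: the bucket, file layout and sensor column are hypothetical.

```python
import boto3
import pandas as pd  # reading s3:// paths also requires s3fs

# Hypothetical names for illustration only.
BUCKET = "example-legacy-telemetry"
LOCAL_EXPORT = "/exports/microscope-042/telemetry_2021-08-01.csv"
KEY = "microscope-042/telemetry_2021-08-01.csv"

# Push a telemetry export from a staging host into object storage...
s3 = boto3.client("s3")
s3.upload_file(LOCAL_EXPORT, BUCKET, KEY)

# ...where diagnostics can run without touching the firewalled
# instrument itself, e.g. flagging out-of-range readings.
telemetry = pd.read_csv(f"s3://{BUCKET}/{KEY}")
out_of_range = telemetry[telemetry["sensor_temp_c"] > 80]
print(f"{len(out_of_range)} readings exceed the 80 °C threshold")
```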

NASA is standing up cloud-enabled projects at a faster rate by departing from its old contract task-order approach, which took as long as six months. Now projects are launched in 45 minutes, and the agency wants to replicate the National Geospatial-Intelligence Agency’s authority to operate (ATO)-in-a-day effort, where core services are approved at the enterprise level.

While NASA’s cloud platform is currently accredited only at a FISMA Moderate level because its science is largely public-facing, the agency is in the process of creating a FISMA High enclave for more sensitive data dealing with flight and launch capabilities and human space exploration. The enclave is part of NASA’s effort to implement the recent cybersecurity executive order and move from legacy, on-premises systems to a zero-trust, cloud-based model, and the Technology Modernization Fund could cover the cost.

“With the new executive order we wrote a proposal to the TMF [Board] to upgrade and create a FISMA High enclave,” Foster said.

This story was featured in the FedScoop Special Report: Government Powered by Data.
