Effective application rationalization eludes agencies

Not enough agencies are rationalizing applications effectively before migrating them to the cloud.

App rationalization involves agencies deciding which apps to keep, replace, retire or consolidate — but too often their chief information officers (CIOs) lack the data they need to make those choices, said Thomas Santucci, the director of the Data Center & Cloud Optimization Initiative project management office (PMO) within the General Services Administration.

While the federal Cloud Smart strategy mandated app rationalization, not enough agencies have a good handle on their inventory — where their application programming interfaces are and what data is being transferred.

“Right now there are too many enterprise architects using Excel spreadsheets, collecting moment-in-time instances of all of their data collections,” Santucci said during a Digital Government Institute event Wednesday. “Application rationalization takes a little bit more holistic approach in incorporating it into the acquisition process.”

The Federal Information Technology Acquisition Reform Act (FITARA) encourages agencies to have their CIOs approve major IT investments, but that process occurs too late and with too little data for them to make informed decisions, he added.

Agencies should collect that data in real time. Instead, many financial management systems only deal at the investment level — like one small agency that had three investments, five security boundaries and 150 applications in one bundle, Santucci said.

App rationalization done right evaluates the total cost of ownership. And labor costs, not licensing costs, are paramount.

“If we start looking at the labor costs, we may save more money than we’ve done in the billions of dollars that we’ve saved closing data centers,” Santucci said.
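To see why labor can dominate, consider a back-of-the-envelope total-cost-of-ownership comparison. The sketch below is purely illustrative; every figure in it is hypothetical rather than drawn from GSA data.

```python
# Illustrative TCO comparison for an application portfolio.
# All figures are hypothetical and for demonstration only.

def total_cost_of_ownership(licensing, infrastructure, fte_count, fte_cost):
    """Annual TCO: direct costs plus the labor spent operating the app."""
    return licensing + infrastructure + fte_count * fte_cost

# Hypothetical legacy app: modest license fees, heavy operations staffing.
legacy = total_cost_of_ownership(licensing=50_000, infrastructure=120_000,
                                 fte_count=4, fte_cost=150_000)

# Hypothetical consolidated replacement: higher license fees, far less labor.
replacement = total_cost_of_ownership(licensing=200_000, infrastructure=20_000,
                                      fte_count=1, fte_cost=150_000)

print(f"Legacy TCO:      ${legacy:,}")       # $770,000; labor is $600,000 of it
print(f"Replacement TCO: ${replacement:,}")  # $370,000
```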

The good news is the Cloud and Infrastructure Community of Practice (CoP) that Santucci co-chairs has grown to about 2,000 members, two and a half times its size 18 months ago. The CoP has 25 trainings planned, an IPv6 summit in the works and continues to share use cases.

Meanwhile, the CIO Council had Santucci’s PMO, which resides within GSA’s Office of Government-wide Policy, release a playbook on app rationalization for agencies. “We continue to concentrate on data center consolidation first and foremost,” Santucci said. “We urge agencies to close data centers, especially inefficient ones.”

Army finalizing new plan for ‘unified network’

The Army’s plan for how it will redesign its global network is slated to be finished in the coming months, the top uniformed officer overseeing the Army’s IT said Tuesday.

The service will create a “unified network” that links its enterprise IT architecture with tactical networks used by warfighters in the field, a change from the current segmented system, said Lt. Gen. John Morrison, deputy chief of staff and G-6. Creating a unified network is critical to achieving the Army’s goal of using data from the field to create a multi-domain operational system where soldiers on land can work seamlessly with fellow service members in the domains of sea, air, space and cyberspace, Morrison said.

“We are finalizing what we are calling the Army Network Plan,” he said during AFCEA’s TechNet Augusta virtual event Tuesday. Morrison is the Army’s top uniformed IT official, a role that was created last summer after the service split the traditional CIO role. He set the unification of the Army’s network as his office’s top guiding pillar.

Creating a network that will encompass enterprise and tactical workloads will require significant assistance from commercial industry, Morrison said. Once the plan is done, the Army intends to engage industry to help it build new network architecture.

In the “summer and fall we will have the architecture discussion,” he said.

One of the major challenges the Army faces in achieving the unified network is balancing access, speed and security. The whole point of unifying the network is to allow data to transfer more smoothly between systems and machines and not require humans in the loop to make connections that could easily be automated. But with more places for data to go, and presumably more endpoints using that data, opportunities ripen for hackers.

Morrison has previously said he wants to beef up the security of both the network itself and the tech operating on it with periodic reviews. He has not specified what new systems he wants to put in place but said the Army’s current security posture for its network is not up to the task.

“This is one of those effective drills that I think will allow us to apply our resources in a more efficient manner but brings a level of security to the network that, quite frankly, I don’t think we have right now,” he said.

The U.S. government needs access to commercial technologies to drive innovation

The U.S. government has been a world leader in technology innovation, making the government work better on behalf of our citizens, and funding many of the breakthroughs in commercial technology that underpin our daily lives. However, as a growing share of technology investment is coming from the private sector, the federal government is failing to adopt the best technology available.

Taxpayers pay dearly when the government builds technology products from scratch that already exist off the shelf, and the lengthy government procurement process means that those charged with keeping this country safe and delivering essential services are using obsolete tools to do so.

That is why we are launching a new organization that will advocate on behalf of the nation’s most innovative technology companies and startups who are looking to do business with the federal government.

The Alliance for Commercial Technology in Government will advocate for its members in Washington, D.C., to ensure that the United States leverages commercial technologies to accelerate progress and enhance the lives of all Americans.

Many of the best, most innovative technologies are developed in the private sector with private capital. But entering the federal market is time-consuming and costly, and rules requiring consideration of existing off-the-shelf products are routinely ignored: our own government acquisition system is a barrier to progress and innovation. Given the timelines and scale of many government acquisition programs, a small technology startup has no chance of entering the government marketplace on its own.

These barriers mean the best available technology is purchased only by the private sector, where it remains available to our adversaries, while the U.S. government is left with outdated and expensive products designed specifically for government use, often incompatible with the technology products that tech-savvy employees use every day outside of government.

I saw this personally as a government employee at the Pentagon in tasks as simple as collaborating with colleagues on documents. Our solutions were limited to emailing Microsoft Word documents with a naming convention for version control and typing messages on BlackBerries. We were using outdated technology to make policies such as the Third Offset Strategy, which was intended to give the U.S. military a long-term strategic advantage over our adversaries through the adoption of cutting-edge technology. Fortunately, those policies generated some success. New organizations were born out of that effort, such as the Defense Innovation Unit, which is tasked with breaking down acquisition barriers to cutting-edge technology. We will work to amplify the successes the government has achieved and make easy access to government contracts the norm, not the exception.

The Alliance aims to help everyone by solving these challenges and transforming the federal government into an accessible marketplace for all technology companies, especially small companies and startups.

The Alliance has four main policy priorities it intends to tackle.

Significant progress on these four priorities would revolutionize the entire federal marketplace to be more accessible to startups and the entire commercial technology ecosystem, which will lead to much-needed modernization of our government’s technology infrastructure.

America was built on startups and small businesses; it is time the federal government created a more accessible marketplace for commercial technology. Advocating for policies that can improve our government services and maintain our nation’s competitive advantage is long overdue. The Alliance will be the new voice to help Washington bring the best technology to the government.

David Vorland is the Executive Director of The Alliance for Commercial Technology in Government, a non-profit advocacy organization. Previously, he worked in the Office of the Secretary of Defense from 2009 to 2017.

New DARPA initiative gives contractors access to cutting-edge commercial tech

Contractors working on emerging technology for the Defense Advanced Research Projects Agency could get access to the latest and greatest tech from other commercial companies through a new partnership the agency is forming with industry.

The Toolbox Initiative is a framework in which DARPA facilitates agreements between “providers” of computing tech and the agency’s contracted “performer” companies, giving those performers access to tools that could advance their groundbreaking work.

The program is currently recruiting providers to offer their tech, from chips to front-end compilers, to the DARPA performers working on everything from artificial intelligence to communications tech.

“We want the latest and greatest tech to come easy,” Serge Leef, the program manager in the Microsystems Technology Office leading the initiative, told FedScoop. Any time a performer spends haggling with another company for access to tech that could help advance science is “time wasted,” he added.

The Toolbox framework allows companies to access non-production licenses of intellectual property usually out of reach without lengthy contract negotiations on production terms, compensation and legal protections. It could help save the agency — which eventually would need to pick up the bill on costs associated with performers’ research — tens of millions of dollars, Leef said.

In the commercial world, if a chipmaker sells its tech to an autonomous car company, the manufacturer will put legal protections into the contract to shield its tech from legal liabilities resulting from its use. Settling those negotiations can take months, said Leef, who worked in private industry before coming to DARPA.

Since DARPA performers do not take their inventions to market or scale productions, they can use the non-production licenses for things like ARM processors or Rambus Inc.’s security interface controllers.

“I want the DARPA performers to have the same benefits as commercial industry,” Leef said.

Providers in the program get essentially free marketing and access to the companies working on the cutting edge of science and technology. Leef described it as a win-win for both performers and providers.

The program is still waiting to fully launch, with an internal marketing campaign planned for the fall when DARPA plans to start hosting industry days. Leef said he has already heard from roughly a dozen other program managers interested in taking advantage of the framework to help their performers.

Customized agency networks are the ‘real benefit’ of 5G

Government agencies should model their 5G rollouts after the Department of Defense, which customized its network to meet mission needs rather than making it widely available, according to a National Science Foundation official.

NSF similarly has invested in foundational 5G research for the last decade and started the Platforms for Advanced Wireless Research program in 2016 to provide researchers with city-scale testbeds for such customizations, said Thyaga Nandagopal, deputy division director with NSF.

The initiative plans to launch a fourth testbed focused on affordable rural broadband access in the next month or so as the government continues to refine programmable network standards.

“The approach that the DOD has taken is very much tapping into the real benefit of 5G,” Nandagopal said, during the Federal Mobility Group and ATARC 5G Government Symposium on Tuesday. “Which is you can create a custom network that suits your needs in a very localized instance.”

DOD opted to test and evaluate 5G technologies in an initial tranche consisting of five military installations before expanding to a second tranche, still underway, of seven additional sites.

Meanwhile, the Cybersecurity and Infrastructure Security Agency has focused on managing the risks to 5G networks among civilian, state, local, tribal and territorial agencies. Of primary concern are telehealth, telecommunications, sensitive government facilities and mass transit, said Serena Reynolds, chief of CISA’s Initiative Management Branch.

“We know that with 5G largely operating on a non-standalone network, it’ll really rely on that existing 4G infrastructure to provide speed and connectivity,” Reynolds said. “And certainly as 5G is largely deployed and moves to its own standalone network, government agencies are really being able to experience more of the advanced benefits.”

Until then, legacy 4G infrastructure presents vulnerabilities that CISA is looking into.

CISA is attempting to address these challenges through threat briefings and engagement with government agencies, the Federal Mobility Group and DOD research and development, Reynolds said.

The U.S. remains “among the leaders” in 5G technology, but competitors like China have kept pace or are catching up, Nandagopal said.

“We are still ahead,” he said. “But there’s nothing like somebody right behind you to keep you running faster.”

Space Command to launch Joint Cyber Center

The unified combatant command overseeing the military’s joint operations in space is working to stand up a Joint Cyber Center, its commander told senators Tuesday.

U.S. military branches are directing resources to the cyber center, which will look to ensure the cybersecurity of satellites and space-based communications, said Gen. James Dickinson, the Army general in charge of Space Command. Dickinson said the center will be a critical part of the command’s mission and act as a central unit that can help it integrate with other cyber-focused commands, like U.S. Cyber Command.

“We are in competition each day, both in space and cyber,” he told the Senate Armed Services Committee during a hearing on the fiscal 2022 defense budget request.

While the larger command is focused on space operations, it already has three general officers focused on cyberspace, Dickinson said. Responding to repeated questions about the security of satellite-based communications, he said he has plenty of cyber capabilities to protect them but stressed the importance of integrating operations across the military. The joint center will serve as a key part of that integration.

Dickinson said he has the support of Gen. Paul Nakasone, commander of U.S. Cyber Command. The 16th Air Force — the Air Force’s cyber and information warfare center — is also assisting the center along with service members from the Navy and Marine Corps already stationed at the command.

Gen. Dickinson also said a critical part of defending space-based communications is building constellations of mesh networks that can ensure the resilience of the overall system in the event one part of the network is attacked.
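A minimal sketch can illustrate the idea. The topologies below are hypothetical, not an actual Space Command architecture; the point is only that a mesh stays connected after losing a node, while a hub-and-spoke design does not.

```python
# Why mesh topologies aid resilience: survivors of a node loss can still
# reach each other. Topologies are hypothetical illustrations.
from collections import deque

def connected(adjacency, down):
    """BFS over surviving nodes; True if they all still reach each other."""
    alive = set(adjacency) - {down}
    if not alive:
        return True
    start = next(iter(alive))
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for peer in adjacency[node]:
            if peer in alive and peer not in seen:
                seen.add(peer)
                queue.append(peer)
    return seen == alive

hub_and_spoke = {"hub": ["a", "b", "c"], "a": ["hub"], "b": ["hub"], "c": ["hub"]}
mesh = {"a": ["b", "c"], "b": ["a", "c", "d"], "c": ["a", "b", "d"], "d": ["b", "c"]}

print(connected(hub_and_spoke, down="hub"))  # False: the spokes are isolated
print(connected(mesh, down="b"))             # True: traffic reroutes through c
```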

“I have got the resources that I need right now and I am confident in our ability to protect,” he said.

EIS team developing network-as-a-service offering

The Enterprise Infrastructure Solutions program team is developing a network-as-a-service (NaaS) offering to free agencies from the “vicious lifecycle” of replacing outdated technologies, said Allen Hill, the deputy assistant commissioner of category management at the General Services Administration.

NaaS will move agencies to a cloud business model allowing for continuous modernization of network infrastructure, similar to what’s been done for email and collaboration tools, Hill said during an ACT-IAC event Monday.

Agencies will be able to adopt emerging technologies faster from startups and build out and manage their networks without worrying about integrating new security components and software.

“There are some agencies that are going to be challenged,” Hill said. “They have a large amount of legacy inventory, and we’ve provided a number of options for them to consider to ensure there’s no break in services.”

The federal government is in a “better place” with its transition to the $50 billion network and telecommunications modernization contract, EIS, than it’s been previously, he added.

Out of 212 forecasted EIS task orders, 164 have been released to industry. A total of 93 task orders have been awarded, with 55 completed and 48 more awards expected soon.

Of the 17 large federal agencies, nine have awarded all task orders, and 11 of 25 midsize agencies awarded all theirs.

The EIS team has begun its transition closeout project focused on limited and authorized users of the Networx, Washington Interagency Telecommunications System (WITS) 3, and Local Service Agreement (LSA) contracts that expire May 31, 2023.

Agencies’ EIS transitions consist of two parts: transitioning off the legacy contracts and moving all network services onto appropriate EIS task orders.

About 40% of the National Oceanic and Atmospheric Administration’s spend is already on commercial contracts.

“Once we get through the legacy contracts, we’ll be moving those commercial contracts over as appropriate onto EIS task orders,” said Jeff Flick, deputy director of the Service Delivery Division within NOAA’s Office of the Chief Information Officer.

NOAA is “pushing hard” to meet GSA’s 18-month deadline for transitioning but needs risk-mitigation contracts in place as stopgaps for any services still lagging behind, Flick added.

While agencies structure their own task orders and requirements, GSA is willing to help however it can when vendors fail to deliver on time.

“The one thing at GSA, we don’t necessarily have insight into what those specific task order requirements an agency has,” Hill said. “The agencies can certainly reach out to us, and we’ll work with them and help them to facilitate any type of challenges they may be having with the vendors.”

New bottleneck emerges in DOD’s contractor cybersecurity program, concerning assessors

Companies in line to become certified assessors for the Department of Defense’s supply chain cybersecurity program are facing a new roadblock: getting and passing an assessment of their own.

There’s a bottleneck in licensing assessors under the DOD’s Cybersecurity Maturity Model Certification (CMMC), according to multiple organizations waiting to go through the process. It not only frustrates these companies that are waiting to enter a potentially lucrative market but also threatens to complicate the timeline for implementing a critical DOD cybersecurity program.

The CMMC program requires every contractor in the defense industrial base to hire a licensed assessor to inspect its networks, something that cannot be done if there are no fully licensed assessors to hire.

“There is … a little bit of a logjam,” Johann Dettweiler, director of operations for TalaTek, a prospective Certified Third-Party Assessor Organization (C3PAO), said in an interview.

TalaTek is slated to get its required assessment from the DOD’s Defense Industrial Base Cybersecurity Assessment Center (DIBCAC) this spring if all goes well with those in line before it. But Dettweiler was concerned that might not be the case after learning on a call with other C3PAOs that they were having difficulty meeting CMMC level three, the mid-tier level of security required in the DIBCAC assessment.

Four people familiar with the matter who asked not to be named said they also were told initial audits were difficult and taking longer than expected; one source directly familiar with the matter pointed to the maturity documentation associated with level three as what was tripping some up.

“You have to be able to show that you have the policies and that you have been living the policies, and that last part is really tricky,” Jim Goepel, a former CMMC Accreditation Body member and the CEO of Fathom Cyber, said in an interview.

The CMMC Accreditation Body, the organization that issues licenses to C3PAOs and oversees other parts of the CMMC ecosystem, announced in March that one assessment had been completed but did not share the results or name the company. More than 100 companies are cleared to get their assessment and hundreds of others are awaiting their initial background check and training from the AB.

The AB had little comment on the bottleneck besides expressing its steadfast support for the current requirements and saying the “CMMC-AB is on target for projections” to meet demand. DOD and DIBCAC did not return multiple requests for comment.

In public comments, Katie Arrington, the chief information security officer for acquisition and sustainment, has defended the need for the level three requirement for assessors, saying it’s a security imperative.

“Why? Because they’re going to be the ones processing your company’s information and inputting that into the only place that your company’s information will be stored,” in DOD’s own database called eMASS, Arrington said last week at an Amazon Web Services summit.

Cascades of bottlenecks

A further complication is that the C3PAO rollout is a critical piece of a multi-step process the CMMC program depends on. The end goal of having a third-party verification of the cyber standards of the roughly 300,000 contractors in the defense industrial base relies on having those third parties available to do the verification. Without enough assessors and C3PAOs, the entire ecosystem could fall short of its stated goals.

The DOD has given the program five years to get its feet under it — after that, CMMC will be a requirement in all defense contracts. The AB, which is largely responsible for rolling out the program, says it remains on track to meet the DOD’s timeline. But Dettweiler and others have concerns about the potential downstream effects from the current pace of C3PAO accreditation.

“If you do the math on that…how is that feasible?” Dettweiler said of getting the number of companies certified by C3PAOs on time.
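His math is easy to reproduce. The back-of-the-envelope sketch below uses the roughly 300,000-contractor figure and the five-year window cited in this article; the assessor count and annual throughput are assumptions for illustration.

```python
# Back-of-the-envelope capacity check behind "do the math on that".
contractors = 300_000            # defense industrial base, per the article
years = 5                        # DOD's rollout window
c3paos = 100                     # companies currently cleared for assessment
assessments_per_c3pao_year = 50  # assumed throughput per assessor organization

capacity = c3paos * assessments_per_c3pao_year * years
print(f"Assessments possible in {years} years: {capacity:,}")      # 25,000
print(f"Contractors left unassessed: {contractors - capacity:,}")  # 275,000
```

Even under these generous assumptions, capacity covers less than a tenth of the industrial base.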

Matt Titcombe, CEO of Peak InfoSec and chief information security officer of its parent company Gigit, added a strong “no” on whether he thinks the program is on track to meet the eventual demand for CMMC assessments.

“I don’t know if we are even going to get one done this year,” he said, adding that he thinks the current timeline is based on a “perfect world” scenario.

Many offered potential fixes, such as making DIBCAC’s initial assessment of C3PAOs less stringent and allowing assessors to submit plans for achieving compliance instead of demonstrating mature compliance upfront. Others have urged DOD and the CMMC AB to issue a clearer written policy on the scope of assessments.

“You’ve got to make this upfront huge investment without having any potential business from the DOD at all,” Max Aulakh, CEO of Ignyte Assurance Platform, said of the current state of security requirements.

Small assessors, bigger concerns

For small assessors, the concerns are even more acute. With only five people on his team, Steven Senz, CEO of Ascertis Solutions, knows he would have to hire more people just to meet the level three security maturity requirements. He said in an interview he is hoping to subcontract with larger C3PAOs to be able to do assessments in the future.

Senz said he wished there was more regular, official communication from DOD and the AB about the requirements and policies of the program. If he had known more when the initial application payment, a $1,000 fee, was due, he might have chosen a different path for his new company.

“Pay $1,000 in, then another $3,000 to show you can be CMMC level three, but you never really put a disclaimer in that unless your company is of a certain size, don’t bother to apply,” he said of the information provided by the AB when he initially applied. “You took my money and I’m not certain under the criteria now you are imposing on C3PAOs I am actually going to be able to get through all the gates.”

Sustaining telehealth services with data intelligence

The use of electronic telehealth platforms by public and private health care organizations has become essential to delivering care remotely during the pandemic. These platforms have also demonstrated a promising way to deliver health care at scale — especially as telehealth adoption gains wider acceptance by patients and doctors.

However, the underlying technology required to make telehealth a sustainable, long-term solution depends on both data intelligence tools and security controls being properly deployed, according to Ann Mehra, strategic healthcare programs leader at Splunk and a former associate director at Massachusetts General Hospital. Telehealth goes well beyond videoconferencing: it requires assembling real-time data — at scale — to facilitate end-to-end communications and workflow.

Read the full report.

“Health care presents a special challenge in gathering information, both because of compliance regulations as well as the stakes involved in making clinical decisions — all of which demand that IT systems and security controls be properly deployed, up to date and monitored continuously,” she says in a new FedScoop report, underwritten by Splunk.

Mehra describes how health care organizations initially raced to scale up virtual appointments, only to realize swiftly that their systems weren’t configured to handle the added volume — in some cases, services ground to a halt.

The report cites an example of the new volume demands from the U.S. Department of Veterans Affairs, which launched a live-video consultation platform in the summer of 2017. In fiscal year 2019, the VA conducted a little over 2 million “episodes” of telehealth care. Weekly volume then grew from roughly 10,000 episodes in February 2020 to 120,000 by that May, according to VA figures cited in the report.

This level of behind-the-scenes technical support demonstrates the need for a modern and secure platform to manage capacity, interoperability, security and risk, the report argues.

Mehra shares how Splunk is working with its health care partners to more fully assess the security and reliability of their IT ecosystems.

Through Splunk IT Service Intelligence, the company provides organizations of all sizes the ability to “attain end-to-end visibility of the services operating on their networks in real time; identify abnormalities and perform root cause analysis; and perform automated remedial actions to streamline incident resolution,” she says.

The report describes Splunk’s “data-to-everything” platform, which allows organizations to ingest machine data, non-machine data, and structured and unstructured data, and turn it all into actionable insights. “As we ingest more of this historical data, we then can start to apply AI and machine learning… and predict over time how the system will perform,” says Mehra.
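As a rough illustration of the kind of analysis described, and not Splunk’s actual implementation, the sketch below flags abnormal readings in a stream of operational metrics using a trailing mean and standard deviation.

```python
# Flag anomalous readings in an operational metric stream.
# Illustrative only; not how Splunk implements its analytics.
import statistics

def find_anomalies(readings, window=12, threshold=3.0):
    """Flag readings more than `threshold` std devs from the trailing mean."""
    anomalies = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mean = statistics.mean(history)
        stdev = statistics.stdev(history)
        if stdev and abs(readings[i] - mean) > threshold * stdev:
            anomalies.append((i, readings[i]))
    return anomalies

# Hypothetical per-minute teleconference call-failure counts with one spike.
failures = [4, 5, 3, 6, 4, 5, 4, 6, 5, 4, 5, 6, 48, 5, 4]
print(find_anomalies(failures))  # [(12, 48)]
```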

The platform ultimately helped one of Splunk’s customers decrease teleconference call failures from thousands per day to less than 10. It also improved bandwidth and call capacity from 50% to nearly 100%. And it exposed unknown interoperability gaps and extended visibility into remote network user access to improve communications.

That enhanced level of assurance will become more essential as health care agencies and providers continue to embrace telehealth as a model for delivering care more efficiently.

“It becomes very important to not just have telehealth for the sake of telehealth; but having all of the backend applications and technologies necessary to have a productive encounter, just as you would if you were in a clinic,” she says.

Learn more about how Splunk can help your agency better prepare for the data demands of today’s modernized health care systems.

This article was produced by FedScoop and StateScoop and sponsored by Splunk.

Enabling new air, space and satellite capabilities through the cloud

When the Department of Defense was tasked with creating the new U.S. Space Force, few people offered more experience to lead the planning and implementation effort than Maj. Gen. Clint Crosier, a 33-year Air Force veteran. Crosier had headed numerous high-profile assignments, including as director of Space and Intelligence Programs in the Office of the Under Secretary of Defense and chief of long-range strategic planning for the Air Force. Now he is taking that experience to AWS’s Aerospace and Satellite business.

Crosier has built a team of aerospace and satellite experts to bring the power of cloud computing to an industry undergoing its own revolution. In this exclusive interview, Crosier talks about how the aerospace and satellite industry is leveraging cloud computing, and how AWS is helping organizations get “to the stars, through the cloud.”

FedScoop: What are some of the broad challenges you’ve seen where the scale and power of cloud computing and analytics proved critical to advancing the mission?

Clint Crosier, Director, Aerospace and Satellite Solutions, AWS

Crosier:  In my experience, teams often encounter some common challenges that can make it difficult to successfully complete their mission. In aerospace, these challenges or barriers might include anything from the upfront costs associated with building infrastructure, to the time and cost associated with engineering work, modeling, simulation, and testing for new satellites, launch vehicles, or designs.

As we look to the future, moving necessary processes like these to the cloud is going to be a game changer for defense as well as commercial aerospace applications because it has the potential to save significant time and expense. Digital engineering, digital testing, and digital modeling and simulation using the cloud will allow you to do all of this in a much more efficient way.

In the satellite industry, a huge challenge continues to be the ability to rapidly and reliably downlink, store and manage the vast amounts of data that operators are capturing every day. Consider that operators are capturing high-resolution satellite imagery that amounts to petabytes every single day. There is simply no way to analyze and share such massive volumes of raw data quickly and efficiently without the help of the cloud. And as satellite operators continue to grow their constellations, the amount of data will grow, too.

And then there are the challenges associated with space exploration. NASA JPL, for instance, is using the AWS cloud for mission-critical communication and transfer of telemetry data in support of its Perseverance rover mission on Mars.  The Mars Rover team is receiving hundreds of images from Mars each day from a record number of cameras, resulting in thousands of images over Perseverance’s time on the planet. By using AWS, NASA JPL is able to process data from Mars, on Earth, faster than ever before. The increased processing speed is helping NASA JPL scientists to plan the rover’s next day activities. The increased efficiency will allow Mars 2020 to accomplish its ambitious goal of collecting more samples and driving longer distances during the prime mission, compared to previous rovers.

Simply put, the cloud is removing barriers that traditionally have held back the space industry and is helping to redefine the art of the possible.

FedScoop: There has been something of a renaissance in how the commercial aerospace industry has brought more affordable and technically advanced solutions to launching payloads, including satellites, into space. How are advances in cloud computing enabling those gains?

Crosier: We’re probably at the most exciting and significant inflection point in the space industry since the original Apollo days, back in the 1960s. Back then, the U.S. government would design, acquire, build, launch, operate and sustain all of its own systems. This was because one: nobody else in the world could do it. And two: security needed to be built in to protect those systems.

The industry growth that we are seeing today is creating enormous opportunities for companies of all sizes. AWS has extensive experience helping commercial and government customers and partners design satellites, and conduct space and launch operations. Our AWS Aerospace and Satellite team was established last year to directly support these customers and their long-term goals.

In order to support these customers and their space missions, we know that a flexible and secure cloud computing environment is essential. At AWS, security is a top priority and has been from day one. We did a lot of listening in the early days to really understand the challenges of our federal customers and show that we deliver security that is second to none.  AWS has been a proven partner to the federal government for years, and government agencies trust us to handle their most sensitive workloads.

FedScoop: How do you see cloud computing playing a larger role in supporting space infrastructure and the growing mesh of satellites in orbit?

Crosier: AWS Ground Station is a fully managed service that allows customers to downlink data and send commands to satellites across multiple regions with speed and agility — and at a low cost, since customers pay only for the satellite time they use. We’ve found that companies can save up to 80% of their ground station infrastructure costs by using AWS Ground Station.
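For a sense of what the service looks like to a developer, here is a hedged sketch using the boto3 Ground Station client. The ARNs and site name below are placeholders, and the details should be checked against the current AWS documentation.

```python
# Sketch: discover antenna sites and reserve a satellite contact with the
# AWS Ground Station API. ARNs and the site name are placeholders.
from datetime import datetime, timedelta, timezone

import boto3

gs = boto3.client("groundstation", region_name="us-east-2")

# List the antenna sites available to this account.
for station in gs.list_ground_stations()["groundStationList"]:
    print(station["groundStationName"], station["region"])

# Reserve a ten-minute downlink window six hours from now.
start = datetime.now(timezone.utc) + timedelta(hours=6)
contact = gs.reserve_contact(
    missionProfileArn="arn:aws:groundstation:...",  # placeholder mission profile
    satelliteArn="arn:aws:groundstation:...",       # placeholder satellite
    groundStation="Ohio 1",                         # placeholder site name
    startTime=start,
    endTime=start + timedelta(minutes=10),
)
print("Reserved contact:", contact["contactId"])
```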

As the number of satellites in orbit continues to grow, operators will need to be able to increase the rates at which they deliver high-precision data to the people who need it most. The more we make cloud-based solutions available in near real-time, the more we will see companies develop new and exciting ways to use that data.

I think we will also continue to see the cloud playing a larger, more valuable role by supporting autonomous activities, allowing certain tasks to be done without a human in the loop. Artificial intelligence and machine learning can help to automate many tasks including data analysis, space traffic management and collision avoidance. With so many satellites in orbit, automating capabilities like these and delivering results to end users more rapidly are things that can only be done using the cloud.
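As a toy illustration of that kind of autonomy, the sketch below screens two hypothetical objects for a close approach. It propagates state vectors in a straight line, a crude stand-in for real orbit propagation, and does not represent any actual AWS capability.

```python
# Toy conjunction screen: find the closest approach between two objects
# over a look-ahead window, assuming straight-line motion.

def min_separation(p1, v1, p2, v2, horizon_s=900, step_s=1):
    """Smallest distance (km) between two objects over the window."""
    closest = float("inf")
    for t in range(0, horizon_s + 1, step_s):
        d = sum(((a + va * t) - (b + vb * t)) ** 2
                for a, va, b, vb in zip(p1, v1, p2, v2)) ** 0.5
        closest = min(closest, d)
    return closest

# Hypothetical state vectors: position (km) and velocity (km/s).
sat = ((7000.0, 0.0, 0.0), (0.0, 7.5, 0.0))
debris = ((7005.0, 4400.0, 0.0), (0.0, 0.2, 0.0))

gap = min_separation(*sat, *debris)
if gap < 10.0:  # example screening threshold in km
    print(f"Conjunction alert: closest approach {gap:.1f} km")
```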

FedScoop: Moving a bit back down to Earth, how do you see cloud computing playing a larger role in suborbital space?

Crosier:  That’s a great question. People talk about the air domain and the space domain as though they’re interconnected; that it’s a singular domain where you just move from one to the other. But it’s not easy. We in the space industry are always cognizant that they behave in two very different ways.  However, we have to transit between those domains interchangeably. For instance, we have aerospace operations — drones, doing intelligence in different parts of the world — that are taking their cues and information from satellites in space.

Boom Supersonic is a wonderful example. Boom is developing a new generation of supersonic aircraft and is using the cloud to perform digital design and engineering, stress tests, high-performance compute modeling and simulation. They estimate achieving a six-times increase in productivity operating on the AWS cloud versus running these simulations in an on-premises environment.

Here’s the amazing piece: Boom tells us that they have consumed 53 million compute hours on the AWS cloud, and they will double that over the next two years to complete design and testing of their aircraft. Boom is demonstrating how you can build an aircraft entirely on the cloud. It’s just a powerful example of where the aerospace industry can really benefit.

FedScoop: Looking ahead, what’s on your radar that excites you most about how the public is likely to benefit from these cloud-enabled advances in the aerospace industry? And what’s perhaps the biggest challenge you see?

Crosier: We see a number of exciting things that our customers are doing to advance the global good that really can only be executed on the cloud – like environmental protection, climate change monitoring, or disaster response activity.  I’ll give you two examples.

Fireball International, one of our customers located in Australia, is using space data in the infrared spectrum to monitor and detect new wildfire breakouts within three minutes of ignition. Being able to respond in this way can only be achieved by using AWS cloud capabilities and our global infrastructure.

Another example is a company called Digital Earth Africa. They use high resolution imagery from space to focus relief efforts on the continent of Africa. There are ways you can detect what’s happening in patterns of life from space, such as where crops are not getting enough water and where there’s the risk of famine. They estimate that as satellite resolution improves, they’ll also be able to improve decision-level intelligence 800 times faster than what they could achieve before they moved to the cloud.

Exciting capabilities like these are possible because of the global infrastructure that AWS provides, coupled with the ability of the cloud to process large data sets more quickly and efficiently. I believe the cloud will become an indispensable foundation to the aerospace industry. Government and commercial customers alike will only stay competitive and relevant if they operate in the cloud.

Watch this story featuring Major General Crosier and Astronaut Peggy Whitson on how AWS is helping astronauts, scientists and everyday heroes make the future of space a reality.

Read more insights from AWS leaders on how agencies are using the power of the cloud to innovate.

This article was produced by FedScoop and underwritten by AWS.