Army cloud agency expanding its team
The Army’s newly dubbed Enterprise Cloud Management Agency (ECMA) is growing its cloud operations team and forging new partnerships as the service works to implement cloud-based technology.
The growth of the cloud team comes a year into ECMA’s operation, shortly after the organization gained new authorities as a field operating agency. It’s unclear exactly how many more cloud operators the agency hopes to add, but doing so will play a key role in supporting the deployment of a new tactical cloud network and other modernization initiatives, Director Paul Puckett said during an AFCEA webinar Wednesday.
“We are leaning in to expand our cloud operations team and really try to turn that into the new normal,” Puckett said, adding that the team will expand work on things like security and tactical deployments.
The ECMA has also expanded its partnerships across the Army, working closely with regional cyber centers, program executive offices and support commands, like the Army Network Enterprise Technology Command. Puckett said he meets weekly with other tech leaders across the service to work cohesively under the Army’s cloud modernization strategy.
“There is nothing that one does that the other is not involved in,” he said of the partnerships ECMA has formed.
Having a larger team for cloud operations means that the Army can take a more central approach to its cloud modernization. Before ECMA was stood up as the Enterprise Cloud Management Office in 2019, Army offices faced the daunting process of migrating their data to the cloud on their own, Puckett said. The shift is from the “thousand flowers blooming” approach to a more centralized push that can orchestrate a more common cloud architecture for the Army to work within.
Other impacts of ECMA’s growth will be seen in expanded environments for tech-related initiatives. The Army’s new software factory has soldiers code within a cloud-based environment supported by the ECMA, for example.
Other projects that straddle the line between enterprise technology and tactical use are also moving to the cloud. Along with partners like the Program Executive Office for Command, Control and Communications-Tactical (PEO C3T), the ECMA helped launch the recent “Tactical Cloud Infrastructure.” It’s the cloud successor to the “Tactical Server Infrastructure,” which relied on on-premises physical server stacks to deliver compute at the edge.
But not everything has moved to the cloud as bandwidth in austere environments is limited. Puckett said the Army is working to “figure out what data needs to be local” and what can be stored in the cloud.
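The trade-off Puckett describes, deciding which data stays local at the edge and which is stored in the cloud, can be illustrated with a toy placement rule. This is purely a hypothetical sketch: the field names, thresholds, and logic below are invented for illustration and are not the Army’s actual policy.

```python
# Hypothetical sketch: deciding whether a data item stays local at the
# tactical edge or syncs to the cloud, based on latency requirements and
# available bandwidth. All fields and thresholds are invented.
from dataclasses import dataclass

@dataclass
class DataItem:
    name: str
    size_mb: float          # payload size in megabytes
    max_latency_ms: int     # how quickly consumers need this data

def placement(item: DataItem, uplink_mbps: float) -> str:
    """Return 'local' or 'cloud' for a single data item."""
    # Data needed in near-real time must stay at the edge.
    if item.max_latency_ms < 100:
        return "local"
    # Rough transfer time over the available uplink, in seconds.
    transfer_s = (item.size_mb * 8) / max(uplink_mbps, 0.001)
    # Large payloads over a thin austere-environment link also stay local.
    return "local" if transfer_s > 60 else "cloud"

print(placement(DataItem("fire-control track", 0.5, 20), uplink_mbps=2.0))         # local
print(placement(DataItem("daily logistics report", 10, 60_000), uplink_mbps=2.0))  # cloud
```

In practice the decision would weigh many more factors (classification, disconnection tolerance, replication policy), but the core tension is the same: latency-sensitive and bandwidth-heavy data stays local; everything else can ride the cloud.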
IT Insights: Interview with AWS federal director Brett McMillen
Brett McMillen has devoted most of his career to helping government harness information technology and tackle innovative initiatives. Since joining Amazon Web Services 10 years ago, he’s also contributed to the rise of cloud computing in government.
Among other projects, he’s helped make the 1000 Genomes Project available as public datasets. He helped the Department of Veterans Affairs integrate more than 200 previously distinct websites and services to launch the Vets.gov portal. He was part of the team that helped develop a facial recognition program that Customs and Border Protection uses to improve airport security. And he worked with federal officials to obtain FedRAMP certification for AWS’s government cloud services.
Today, as Director of U.S. Federal at Amazon Web Services, McMillen sees AWS’s experience helping federal agencies take advantage of the cloud as every bit as important as the technology itself.
In this exclusive FedScoop interview, McMillen talks about how the U.S. Census Bureau offers an example of ways that government is taking advantage of recent advances in the capabilities of the cloud:
FedScoop: Where are you seeing noteworthy progress or success in the way government is taking advantage of technology advances, like those offered by your company?
FedScoop: What critical steps did the Census Bureau take to address those issues?
FedScoop: What were the major outcomes and lessons gleaned from the Census Bureau’s efforts that other agencies could learn from?
Learn how AWS can help your agency capitalize on today’s cloud or contact AWS.
Read more insights from AWS leaders on how agencies are using the power of the cloud to innovate.
This video interview was produced by FedScoop and underwritten by AWS.
‘Significant deficiency’ risks security of sensitive federal debt data
The agency responsible for managing the $26.9 trillion federal debt needs to improve its information system controls or risk the security of sensitive financial data, according to the Government Accountability Office.
While the Bureau of the Fiscal Service addressed five previous recommendations, 16 related to security management, access controls and configuration management deficiencies remain unresolved — on top of eight new ones in areas like segregation of duties, GAO found in its annual audit.
Details on the deficiencies were deemed “sensitive information” by BFS and not publicly disclosed, but the agency said it’s drafting a comprehensive audit remediation plan.
“These new and continuing information system control deficiencies, which collectively represent a significant deficiency, increase the risk of unauthorized access to, modification of, or disclosure of sensitive data and programs and disruption of critical operations,” reads GAO’s public report.
BFS managed to maintain “effective internal control” of federal debt reporting by strengthening access and monitoring controls around data sets that can only be altered with its mainframe change-management tool, reads the report. The agency also improved its monitoring of compliance with baseline security requirements.
But GAO found that mainframe security controls weren’t used in accordance with the concept of least privilege, and that mainframe security architecture documents needed improvement.
Security and configuration management controls remain inadequate and responsibilities unclear, with one person sometimes in charge of activities better split between two or more people or units to catch errors and suspicious activity, according to the report.
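The segregation-of-duties problem GAO describes, one person controlling activities that should be split so a second party can catch errors or suspicious activity, can be expressed as a simple conflict check. The duty pairs and staff assignments below are invented for illustration; the report’s actual deficiency details were withheld as sensitive.

```python
# Hypothetical segregation-of-duties check: flag any person who holds two
# duties that should be split across individuals so one can catch the
# other's errors. Duty pairs and assignments are invented examples.

# Pairs of duties that should never belong to the same individual.
CONFLICTING_DUTIES = {
    frozenset({"submit_change", "approve_change"}),
    frozenset({"initiate_payment", "reconcile_accounts"}),
}

def sod_violations(assignments: dict[str, set[str]]) -> list[tuple[str, frozenset]]:
    """Return a (person, conflicting duty pair) tuple for every violation found."""
    violations = []
    for person, duties in assignments.items():
        for pair in CONFLICTING_DUTIES:
            if pair <= duties:  # this person holds both duties in the pair
                violations.append((person, pair))
    return violations

staff = {
    "alice": {"submit_change", "approve_change"},     # violation: submits and approves
    "bob": {"approve_change", "reconcile_accounts"},  # fine: no conflicting pair
}
print(sod_violations(staff))
```

Real control frameworks run checks like this continuously against access-control data, which is why GAO flags both unclear responsibilities and the access controls that enforce them.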
The head of BFS has 180 days to formally respond to the report with actions taken or planned.
CMMC is under an internal DOD review
One of the most consequential programs in defense contracting is getting a second look by the Biden administration.
The Cybersecurity Maturity Model Certification (CMMC) — the new cyber standards all defense contractors will need to meet in order to bid on contracts — is under an ongoing “internal assessment,” according to a Department of Defense spokeswoman.
The DOD did not provide details on the review but said it was routine for a high-impact program like CMMC.
“As is done in the early stages of many programs, the DoD is reviewing the current approach to CMMC to ensure that it is achieving stated goals as effectively as possible while not creating barriers to participation in the DoD acquisition process,” spokeswoman Jessica Maxwell said in a statement to FedScoop.
While the program is over a year into development, new brass within the Pentagon could choose to make big changes to what has been a program dogged by controversy since its inception. Many companies have expressed concern over the cost of adhering to the new CMMC standards, which require them to pay third-party assessors to inspect their networks against a five-tiered set of controls. If a contractor doesn’t meet the CMMC level required in a contract, it won’t be eligible to bid on it.
“It is now timely to consider what we might want to do differently in the implementation of CMMC,” said Robert Metzger, the head of the Washington, D.C. offices of the Rogers Joseph O’Donnell law firm and co-author of several reports on supply chain cyber threats.
While there is uniform agreement on the need to increase the overall cybersecurity of the defense industrial base, the program has been criticized in its rollout. The initial decision to push much of the implementation responsibility of CMMC to a third-party volunteer organization — the CMMC Accreditation Body — caused some backlash. Eventually, two leaders on the board resigned over a perceived “pay-to-play” marketing scheme.
Metzger suggested the new administration could make changes to the relationship the government has with the CMMC Accreditation Body and what responsibilities it gives to the third-party group. He also anticipates the review could take a look at other issues like staffing of the program management office for CMMC, the interim final rule amending the Defense Federal Acquisition Regulation Supplement for CMMC, and funding for the program’s implementation.
“It would not surprise me at all if the new administration would want to consider very carefully how best to get this objective achieved,” Metzger said.
FDA undertaking ‘unprecedented’ data infrastructure modernization during the pandemic
The COVID-19 pandemic has given the Food and Drug Administration an “unprecedented” opportunity to modernize IT systems and data infrastructure that was just “chugging along,” according to one senior agency official.
FDA went from having an “antiquated” system that could only process low volumes of low-complexity COVID-19 reporting to developing a core diagnostic data set with clinical, graphics, testing and results data, said Dr. Sara Brenner, the associate director for medical affairs in the Center for Devices and Radiological Health.
New technologies like rapid antigen tests are being untethered from laboratories so people can self-administer them, but that requires new wireless infrastructure to harmonize data at the source and get it to the FDA and other health agencies tracking COVID-19’s spread, Brenner said.
“This rapid expansion of volume has really just completely blown the wheels off of the conventional data collection and reporting system, which was never really designed for pandemic-scale data transmission,” she said during AFCEA Bethesda’s Health IT Summit on Tuesday.
COVID-19 tests were the first diagnostic used to track the virus’ spread so FDA could intervene and stop transmission, but now it’s working with other agencies, states, laboratories, clinicians, and device makers to expand the data coming in. The Data Standards and Execution Work Group within the Department of Health and Human Services has begun working with IT infrastructure offices at other agencies to improve data quality and flow.
“We believe that if we’d had a better data infrastructure, we would’ve been able to answer many obvious questions that people needed to know — in terms of supply chain, why do we run out of certain products and goods — better,” said Vid Desai, chief technology officer at FDA. “And that’s certainly going to be a focus for what we’re going to be looking at going forward.”
FDA published a technology modernization action plan last September and quickly followed that with an accelerated data plan. The agency also hired a chief data officer during the pandemic and is forming a data team to address infrastructure issues, Desai said.
One issue the FDA was able to get ahead of is cybersecurity, raising threat levels in mid-March of 2020 when it became clear the agency would play a critical role in COVID-19 therapeutics, diagnostics and vaccinations.
“We also knew that would attract a lot of nefarious, rogue characters who would try and stop our work, and I think our predictions were true,” Desai said. “If you think about all these supply chain attacks that have occurred in the therapeutic and vaccine distribution mechanism, literally every week we see something new there.”
What government needs to know about accelerators
Bringing cutting-edge emerging technology from the private sector into the U.S. government is critical to better serving Americans and strengthening our competitive advantage globally. And accelerators are an important tool for bridging the gap between the tech industry and federal agencies.
There’s a misconception within federal agencies that running a typical, early-stage accelerator will drive instant, innovative, lasting results for government missions. That confusion stems from a lack of clarity on what the government’s needs and goals are.
More than early-stage ideas and introductions, the government needs emerging tech companies that are fully vetted for the federal market and ready to scale their solutions into programs of record.
In her recent testimony before Congress, Christine Fox, the former head of the Department of Defense’s powerful Cost Assessment & Program Evaluation (CAPE) unit, said: “The principal challenge DOD faces is not a lack of innovation. The tougher task is how to adopt all this new innovation more rapidly into DOD programs… We have lots of prototypes, but what we need is sustainable programs.”
So, if the need is bringing proven tech onto contract fast to improve mission outcomes in the long run, accelerators must be specialized and tailored to address that requirement. After all, accelerators are not one-size-fits-all.
First, understand the existing market
When the mandate is to innovate, the government does not need to reinvent the wheel. Rather, look to the emerging tech landscape first to assess what is available and can accelerate mission outcomes.
The National Geospatial-Intelligence Agency (NGA) launched a new tech accelerator in St. Louis to help startups develop new geospatial tech, and the first cohort will focus on early-stage startups in advanced analytics and modeling, data integrity and security, data management, and artificial intelligence. While the NGA Accelerator will provide a helpful cash infusion to spur more development of the St. Louis startup ecosystem, the reality is that the early-stage focus misses the opportunity to take advantage of existing, funded ventures that are already well-positioned to solve the same government problems.
High-growth, venture-backed companies like Fraym, Unearth, Uptake, and Hyperscience have tech solutions that are already in action in the private sector, backed by hundreds of millions of dollars of private investment, vetted for the federal market, and ready to improve mission outcomes at scale.
Of course, not all commercial tech companies are equipped to support government missions, and forgoing rigorous evaluation is even more harmful than ignoring the existing commercial tech landscape altogether. Fortunately, specialized training and partnership opportunities exist to help the U.S. government navigate the tech industry and make sure companies are fully vetted and equipped to win in the federal market.
If the tech that the government needs already exists in the commercial market, government should work with those vetted, later-stage companies that can move fast. Where technology gaps exist, government should look to earlier-stage companies, but expect that adoption will take much longer and the risk is much higher. Too often, the government spends time and money to run an accelerator to help companies develop tech that already exists, which the government could just evaluate and buy if it’s the right fit.
Go beyond ideas and introductions
From what we’ve seen at Dcode, when the government calls for an accelerator, it’s really calling for a way to work with emerging tech companies and accelerate their solutions onto contracts and into missions.
Not synonymous with incubators, angel investors, or co-working spaces, typical accelerators provide education, mentorship, and financing to early-stage companies in cohorts. Over the course of a few months, typical accelerators help companies establish themselves as corporate entities, develop products, and secure funding.
An accelerator can be a way to get commercial tech mission-ready, but there is so much more to it than generating ideas and making introductions if you want to drive real, lasting tech modernization in the government. What the government needs is a “scalerator”: a next-level model that accelerates proven, government-viable tech from the private sector into the federal market to improve mission outcomes fast in a meaningful, sustainable way.
Finding emerging tech is easy; equipping it to succeed in government is hard. Government agencies should look to venture capital firms and specialized accelerators with the expertise to guide tech companies through the government contracting process and equip them to succeed. To get past the “valley of death,” rather than just winning a single contract award, these tech companies must have a strong grasp of government use cases, federal contracting, and operational processes to scale into programs of record.
Don’t stop at educating tech
Showing tech companies all the ins and outs of working with the U.S. government through a typical accelerator is only half the battle. If there’s no contract and plan to pull the right tech in, then you won’t be able to advance missions with commercial technology.
In addition to accelerating tech companies, federal agencies must better define problem sets, know the emerging tech landscape, employ innovative procurement, align innovation hubs with their mission-focused offices, and connect with leaders across other agencies to share lessons learned. It’s critical that government teams know how to scale solutions to advance their mission. Then they will be ready to work with cutting-edge tech companies that make sense for the mission.
Accelerators that provide education only for tech companies won’t cut it. There’s a reason government leaders have been requesting training from Dcode for years on how to innovate like a startup, evaluate like an investor, and apply agile procurement. Forward-leaning leaders recognize the need to shift culture and processes on the government side too.
Given this evident need for complementary education, if an accelerator does not also equip the government side, the effort will fall short.
We already know that if the U.S. does not bring new, cutting-edge, commercial emerging tech into the government, our country will continue to fall behind globally in the competition for economic prosperity and national security. The time is now to work smartly and swiftly with organizations that can bring the most promising, innovative tech from the private sector onto public sector contracts.
Meagan Metzger is the founder and CEO of Dcode, a privately-owned company focused on connecting tech and government to bring commercial solutions to critical challenges.
Army’s cloud office gets upgraded to an agency
Goodbye Enterprise Cloud Management Office; hello Enterprise Cloud Management Agency.
The team tasked with laying the foundation for the Army’s enterprise technology modernization has a new name and upgraded status, the Army’s Office of the CIO announced Monday.
The boost in status to a field operating agency of the CIO instead of just an office comes a year after the erstwhile ECMO became functionally operational. The agency will retain its director, Paul Puckett, and continue to report to Army CIO Raj Iyer. The change gives “new responsibilities and authorities to orchestrate and synchronize enterprise-wide cloud activities,” Iyer said in a statement to FedScoop.
“The formation of the ECMA as a new field operating agency represents the Army’s commitment to centralized acceleration to the cloud and adopting new digital technologies to implement the Army’s Digital Modernization Strategy,” Iyer said.
ECMO was originally stood up in November 2019 and reached “functional” operating capability in March 2020 — a status short of fully operational, as the office was understaffed due to the pandemic. The office was set up to lay the foundation for much of the modernization the Army hopes to achieve in migrating to the cloud. On top of this core mission, last March the then-ECMO helped support the rapid shift to teleworking on the new cloud-based Commercial Virtual Remote Environment.
Since that initial pivot to telework support, the agency has worked on building out core enterprise technology modernization capabilities. One example of its work is cARMY, the Army’s enterprise cloud environment that leaders have said is important in supporting broader programs like Project Convergence, which aims to fuse battlefield data in multi-domain operations.
Matthew Travis hired as CMMC Accreditation Body CEO
The third-party board implementing the Department of Defense’s new cybersecurity standards for contractors finally has a CEO after months of searching.
Matthew Travis, a former deputy director of the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency, has been tapped to lead the Cybersecurity Maturity Model Certification Accreditation Body (CMMC-AB) — the organization tasked with overseeing the ecosystem of assessors who will inspect the IT networks of the 300,000 companies in the defense industrial base.
Travis will lead day-to-day operations of the CMMC-AB, a job that has largely been filled by the AB’s board of directors in the nearly 15 months since it was incorporated.
“We are extremely thrilled to have someone as respected and accomplished as Mr. Travis lead the Accreditation Body,” Board Chair Karlton Johnson said in a statement released Monday. “His organizational development skills as well as in-depth understanding of security and the federal government will enable us to continue to quickly ramp up AB operations and execute against our mission in service of the nation’s defense.”
Travis joined DHS in 2018, supporting what was then the National Protection and Programs Directorate — CISA’s precursor. His work to transition and stand up CISA within DHS was an attractive experience for the AB, which is also a rapidly growing organization steeped in government work. Travis resigned from CISA in November 2020 after then-Director Christopher Krebs was fired.
A former naval officer, Travis also served as a White House liaison from the Office of the Secretary of the Navy in the late ’90s. His public resume shows work experience focused primarily on homeland security and counter-terrorism technology.
The AB had been searching for a CEO since the summer of 2020.
“Joining and leading the CMMC-AB is a tremendous opportunity. I look forward to using my collective experiences of running a security company start-up as well as my time at CISA, where I focused on supply chain risk, to ensure we mitigate risks as they relate to both the DoD and the contractor community,” Travis said in a statement. “There is no more important cyber mission right now than building a trusted, verified, and resilient cybersecurity ecosystem within the Defense Industrial Base.”
CMMC is the new requirement the DOD is phasing into contracts to certify companies’ cybersecurity to shore up its supply chain. The new model is a tiered system where contractors will need to pay for an assessment from a CMMC-AB-certified assessor, which will inspect the company’s networks and give it a 1-to-5 score based on the ability to meet the security controls laid out in the CMMC model.
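The tiered model described above can be illustrated with a toy scoring function: each CMMC level adds controls on top of the level below, and a contractor’s score is the highest tier whose cumulative controls it fully meets. The control names and counts here are invented stand-ins, not the actual CMMC practice set, which comprises far more practices per level.

```python
# Hypothetical sketch of CMMC's cumulative 1-to-5 tiering: a contractor's
# level is the highest tier for which it meets every required control.
# Control identifiers are invented placeholders, not real CMMC practices.

# Controls newly required at each level (cumulative with lower levels).
LEVEL_CONTROLS = {
    1: {"basic_av", "password_policy"},
    2: {"access_reviews", "backup_testing"},
    3: {"incident_response", "log_monitoring"},
    4: {"threat_hunting"},
    5: {"custom_detections"},
}

def cmmc_level(controls_met: set[str]) -> int:
    """Return the highest level whose cumulative controls are all met (0 if none)."""
    achieved = 0
    required = set()
    for level in sorted(LEVEL_CONTROLS):
        required |= LEVEL_CONTROLS[level]
        if required <= controls_met:
            achieved = level
        else:
            break  # levels are cumulative, so a miss here blocks all higher tiers
    return achieved

print(cmmc_level({"basic_av", "password_policy", "access_reviews", "backup_testing"}))  # 2
```

The cumulative structure is the key point for contractors: a single unmet lower-level practice caps the achievable certification regardless of how many advanced controls are in place.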
Since CMMC’s introduction in 2019, supply chain security has become an increasingly important topic in defense and government contracting, particularly following the SolarWinds supply chain breach that impacted a multitude of government networks.
“When we look at where true cyber risk currently resides, the CMMC mission is a critical component of the safety and security of our nation and its citizens,” Travis said.
DOD officials have spoken previously about their hopes that DHS, Travis’s former employer, will adopt the CMMC model or something similar to it for the supply chain of civilian agencies.
‘Small number’ of DHS email accounts accessed during SolarWinds breach
Hackers accessed a “small number” of Department of Homeland Security employees’ email accounts during the SolarWinds breach, according to an agency spokesperson.
The suspected Russian operatives successfully targeted then-acting Secretary Chad Wolf and cybersecurity threat hunters, forcing top DHS officials to get new phones with the encrypted messaging application Signal on them, the Associated Press first reported.
Still, the new details on the breach aren’t a good look for the department that’s home to the Cybersecurity and Infrastructure Security Agency, which just received a $650 million appropriation in the American Rescue Plan Act, in part, for SolarWinds recovery.
“The department no longer sees indicators of compromise on our networks and remains focused on further securing our networks against future attacks, integrating lessons learned from this incident,” the spokesperson said. “However, this widespread intrusion campaign has again shown that our strategic adversaries are sophisticated, persistent and have increasing capabilities.”
DHS activated response teams from CISA and the private sector to respond to the SolarWinds breach, which is known to have compromised at least nine agencies by the time it was discovered in December. The department remains in contact with employees affected by the breach for guidance and services, as the response is ongoing.
CISA shared best practices on the SolarWinds Orion eviction guidance it sent affected agencies, as well as detection and forensics tools, with the White House as it considers how to modernize cybersecurity.
The Biden administration has yet to issue a promised executive order on cyber modernization, or name a national cyber director, and continues to mull its options for retaliating against Russia.
Meanwhile, CISA has shifted its focus to developing capabilities for monitoring the insides of networks for anomalous activity to defend against future supply chain attacks like the SolarWinds hack. And the tech industry called on the Office of Management and Budget and General Services Administration to proactively fund urgent cybersecurity projects with Technology Modernization Fund dollars.
AWS’ U.S. federal director talks about rapid response at agencies using the cloud
Brett McMillen has been at the forefront of the federal government’s move to cloud computing for the last decade. An engineer by training, McMillen has devoted most of his career to helping government agencies harness innovative IT solutions, having worked for a number of leading telecommunications companies before joining AWS in 2011, where he now serves as Director of U.S. Federal at Amazon Web Services.

Brett McMillen, Director, U.S. Federal, AWS
McMillen has been recognized for his work helping the federal government make important information, such as the 1000 Genomes Project, available as public datasets on AWS and working with federal officials to obtain FedRAMP certification for AWS.
FedScoop caught up with McMillen recently to get his current take on how agencies are moving forward with the cloud.
FedScoop: Government agencies are constantly being asked to do more with less. In many ways, the pandemic revealed that agencies were able to do more than many thought possible. Where did you see some great examples of that among federal agencies?
Brett McMillen: If you would have asked me where I thought we would spend most of our time and effort during the pandemic, I would have guessed places like FDA, CDC, and NIH. And while we have done a lot of work to support those agencies, there have been other areas that we didn’t expect. One example is the U.S. Small Business Administration, which was tasked with administering the Paycheck Protection Program under the CARES Act. We assisted the SBA with providing an additional portal to process loans via SBA’s E-Tran system. We stepped in to help, and with the support of our partners, SBA was able to offer a portal to allow more lenders to submit their loans to the agency for the federal guarantee. These loans were then used to provide financial support to small businesses to keep workers on payroll and open for business.
Another example was the U.S. Census Bureau. While they’ve been conducting the census since 1790, it’s really been done with pen and paper. 2020 was the first year that people could participate in the decennial count online. While AWS started working with them years ago to bring the decennial census online, the pandemic made the modernization even more critical. At a time when the Census Bureau has scaled back some of its in-person data collection, the ability to collect responses online enabled Census operations to continue safely and efficiently.
FedScoop: In some cases, the economic shutdown that followed the pandemic resulted in a big fall off in fees for some governmental agencies. Where did you see technology helping agencies innovate or optimize operations in response?
McMillen: While some organizations had to scale up, other organizations were experiencing less demand and needed to scale down. One example is U.S. Citizenship and Immigration Services. Since fewer people were looking for visas and passports during the pandemic, the agency had a large drop in the fees it was receiving and needed to scale down. Scaling down allowed USCIS to serve the people that needed it at a smaller expense. We saw that at multiple agencies.
The ability to scale both up and down is one of the benefits agencies see by moving to the cloud. Agencies can’t be sure when the next crisis will occur or what resources they’ll need. Having every system ready to either scale up or scale down is essential, so that they can meet those changing needs.
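The scale-up/scale-down elasticity McMillen describes can be sketched as a toy target-tracking rule: compare observed utilization to a target and adjust capacity proportionally. This is an illustrative simplification, not AWS’s actual Auto Scaling implementation; the target value and bounds below are arbitrary.

```python
# Illustrative sketch of target-tracking autoscaling: resize capacity so
# that observed load lands near a target utilization. Not AWS's real
# algorithm; the 60% target and instance bounds are arbitrary examples.
import math

def desired_capacity(current_instances: int, utilization_pct: float,
                     target_pct: float = 60.0,
                     min_instances: int = 1, max_instances: int = 100) -> int:
    """Scale capacity proportionally so utilization approaches the target."""
    raw = current_instances * (utilization_pct / target_pct)
    # Round up so capacity is never undershot, then clamp to the allowed range.
    return max(min_instances, min(max_instances, math.ceil(raw)))

print(desired_capacity(10, utilization_pct=90))  # 15 -> scale up under load
print(desired_capacity(10, utilization_pct=30))  # 5  -> scale down when quiet
```

The same proportional rule handles both directions, which is the property McMillen highlights: an agency facing a demand surge and an agency facing a fee shortfall use the identical mechanism, just moving opposite ways.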
FedScoop: Where else did you and your support teams at AWS see the innovative or progressive use of cloud services?
McMillen: When the federal government first introduced its “Cloud First” policy and “25 Point Implementation Plan,” most agencies would choose a specific workload to try moving to the cloud. They saw success with these workloads, and began moving more. As agencies transitioned more workloads to the cloud, they saw a need to move into enterprise-wide engagements.
Now, rather than looking at each workload separately, agencies are asking, “How would you solve this mission problem with the latest and greatest technologies?” With AWS, agencies have access to our technologies and services, as well as technologies available from our partners that build on top of AWS. These technologies can help build new systems or improve existing applications.
For example, the Centers for Medicare & Medicaid Services helps distribute money to states to help people in need through programs like the Children’s Health Insurance Program. However, CMS didn’t have a good feedback mechanism to determine who was getting those benefits; they weren’t able to get really granular with the data and understand it. Now, we work with CMS and help them build applications to answer questions like, “In this zip code, are below-poverty-income mothers getting the prenatal care that they need?” With these improvements, they’re able to better understand the funding that goes out and how it’s serving citizens.
FedScoop: What in your estimation made the difference in the way agencies were able to mobilize so quickly and productively through the pandemic?
McMillen: The agencies that were able to pivot quickly had already made some fundamental IT improvements before the pandemic began. When a pandemic or any unforeseen event happens, you don’t know where you’re going to have to scale up and scale down. Making those investments and improvements in advance helped position them to move quickly to respond.
One example of an improvement that can be made in advance is setting up an acquisition vehicle for general operations, which allows anyone within that department to quickly and easily access modern cloud technologies. Agencies can also set up standard operating procedures and an authority to operate that any organization within the agency can leverage.
Having some of those fundamentals in place makes it easier to respond to unforeseen events and stand up a new system in the cloud. For example, AWS is working with agencies to build what we call landing zones — a compliant environment that meets the agency’s security and acquisition needs.
It’s also helpful if government agencies continue to focus on workforce modernization and train employees on modern technologies on an ongoing basis. Then, when unforeseen events occur, their employees are already trained to quickly pivot and respond.
FedScoop: What lessons from the pandemic suggest ways agencies can still accelerate their modernization efforts even with limited budgets?
McMillen: Public service can be unpredictable, and transitions in leadership and administrations can bring about frequent change. So agencies should look at each of their systems and ask whether they are scalable and responsive to change. Unfortunately, too many of these systems were built a long time ago and don’t have that scaling capability.
However, IT modernization doesn’t have to mean boiling the ocean. Agencies can inject modern technologies into existing systems. Over time, this results in continuous modernization or continuous improvement of those systems. Agencies that adopt that approach are the ones that are able to pivot very quickly.
As we’ve already seen, this pandemic isn’t going to end overnight. It’s still going to be a long process. Agencies should use this process to modernize and make long-term improvements to their systems, which will improve the delivery of services to the American people.