The new urgency for federal financial authorities to leverage the cloud

Olivia Peterson leads the U.S. Federal Financial Services business at Amazon Web Services. She previously served as Senior Vice President of Client Services at SS&C Primatics and Senior Business Initiatives Director at Freddie Mac.


By most measures, the world’s financial institutions have made significant strides in the past few years in digitizing their operations and service offerings.

The pressure to keep up with consumers and investors — now accustomed to switching institutions with a few clicks on their smartphones — as well as an emerging cadre of technologically disruptive competitors, among other factors, has driven most financial services firms to invest heavily in a variety of digital transformation strategies.

However, the continuing speed and impact of technology changes underway at banks, investment firms, insurers and other financial institutions are also putting enormous pressures on government monetary officials and financial regulators to keep up.

The need for greater agility at scale by federal agencies to monitor, examine, regulate and support financial markets clearly took on new urgency this past year in the face of the pandemic and news of the SolarWinds cyberattack, which impacted the departments of Treasury, Commerce, Justice and Homeland Security, as well as the U.S. court system and a number of corporations.

The good news is: The cloud computing capabilities propelling technological innovation among financial institutions are also available to federal agencies and regulators.

Moreover, the wide range of high-performance data processing, analytics and AI capabilities available from AWS today offers federal financial agencies and regulators more than a smarter way to procure state-of-the-art infrastructure. Those capabilities also provide altogether new and pivotal opportunities to:

Expand mission-focused capabilities — The critical mass of computing resources and engineering talent assembled by the leading cloud providers has led to a vast and expanding array of secure, turnkey and AI-assisted tools and IT services, many of which have become essential to operating in today’s digitally connected world. These tools offer federal financial authorities far more powerful, flexible and automated mechanisms to help monitor and regulate the U.S. economy and its participants than what’s commonly available on most existing government IT systems.

Reduce mounting risks and costs — Financial services agencies face three converging technology challenges: aging technology platforms that will only grow more expensive to maintain; a declining number of people who know how to program and maintain them; and a widening gap in agility and speed in responding to changing market conditions compared to financial sector leaders and malicious state actors. Modernization doesn’t just mean improving platforms and adopting emerging technologies like machine learning; it also means investing in infrastructure that can flex and scale at a moment’s notice, which only the cloud can achieve.

Enable advanced data strategies and analysis — The growth of digital transactions globally has put tremendous burdens on both regulators and regulated commercial entities to gather, process and analyze massive data sets for timely insights. The cloud makes it easier to collect, ingest, store and analyze data — and to do so faster, more cost-effectively and more securely. That alleviates burdens for both examiners and the regulated, and it helps equip understaffed agencies to leverage that data and respond more quickly to market risks, fraud and abuse.

Certainly, a number of regulatory organizations have already begun capitalizing on the scalability and capabilities of the cloud.

For instance, the Financial Industry Regulatory Authority (FINRA), the nation’s securities industry self-regulator, built a petabyte-scale data lake on AWS. It then took advantage of open-source technologies and cloud-native analytics tools to enable 1,500-plus analysts and business partners to securely query financial trading data — involving terabytes of data updated daily — across the U.S. securities market. This kind of performance could not be achieved on-premises.
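For a concrete sense of what that looks like in practice, here is a minimal, hypothetical sketch of the kind of serverless query an analyst might run against such a data lake using Amazon Athena through boto3. The database, table and results-bucket names are illustrative assumptions, not FINRA’s actual environment.

```python
import time
import boto3  # AWS SDK for Python; assumes credentials are configured

athena = boto3.client("athena", region_name="us-east-1")

# Hypothetical schema -- not FINRA's real tables.
QUERY = """
    SELECT symbol, COUNT(*) AS trade_count, SUM(quantity) AS shares
    FROM trades
    WHERE trade_date = DATE '2021-02-01'
    GROUP BY symbol
    ORDER BY trade_count DESC
    LIMIT 25
"""

# Kick off the query against the data lake's catalog.
execution = athena.start_query_execution(
    QueryString=QUERY,
    QueryExecutionContext={"Database": "market_data"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes, then read back the result rows.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    results = athena.get_query_results(QueryExecutionId=query_id)
    for row in results["ResultSet"]["Rows"][1:]:  # first row holds column headers
        print([col.get("VarCharValue") for col in row["Data"]])
```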

The Federal Deposit Insurance Corporation (FDIC), meanwhile, has communicated its plans to modernize regulatory reporting processes and requirements to obtain more detailed and frequent data on banks’ loan portfolios. Currently, banks collect between 1,400 and 2,400 data fields and transfer them to the agency for aggregation and analysis each quarter. The goal, according to FDIC Chairman Jelena McWilliams, is to develop a “modernized and automated data system [that] would improve the ability of supervisors to identify bank-specific and systemwide risks sooner and more efficiently, while reducing the compliance burdens on individual institutions.”

And other agencies, such as the National Credit Union Administration, are also taking advantage of the cloud. NCUA has been piloting a web-based platform aimed at streamlining the examination process for credit unions and examiners. The new platform — the Modern Examination & Risk Identification Tool (MERIT) — is expected to be available this year and ultimately replace a 25-year-old legacy application called the Automated Integrated Regulatory Examination System (AIRES).

There’s one other compelling reason why federal financial agencies and regulators should start capitalizing more fully on the cloud now: Today’s cloud services have made modernizing IT systems easier to procure and maintain for the future.

When AWS first launched the Amazon Elastic Compute Cloud in 2006, it also recognized the importance of making computing services easier to acquire. That led to the concept of “infrastructure as code,” which lays the foundation for applications that can launch and scale in seconds to minutes through code implementations instead of long procurement cycles.
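As a rough illustration of the idea (a sketch, not AWS’s or any agency’s actual configuration), the snippet below provisions a virtual server entirely from code using boto3. The machine image ID and instance type are placeholder assumptions, and in practice agencies would often use declarative templates such as AWS CloudFormation rather than imperative calls.

```python
import boto3  # AWS SDK for Python; assumes credentials and permissions are configured

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch a small virtual server from code instead of a hardware procurement.
# The image ID below is a placeholder, not a real AMI.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # hypothetical machine image
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Project", "Value": "modernization-pilot"}],
    }],
)

# The instance exists seconds later; it can be terminated just as quickly
# (ec2.terminate_instances) when the workload is done, so costs track usage.
instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}")
```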

What this means is that innovation and modernization are no longer dependent on or stalled by technology refreshes and lengthy acquisition cycles; they can happen in minutes. Fast-forward to 2021: Given all of the FedRAMP-authorized cloud solutions available through AWS and its partners, and the unique experience AWS offers the federal government, the ability to innovate and modernize in new and powerful ways and to harness the power of data is at your fingertips.

Learn how AWS can help your agency capitalize on today’s cloud, or contact us at USFedFin@amazon.com.

Read more insights from AWS leaders on how agencies are using the power of the cloud to innovate.

VPNs pose challenges for agencies sustaining remote work

Virtual private networks (VPNs) are presenting some agencies with added challenges as they increase remote work during the COVID-19 pandemic.

Some agencies had to make emergency acquisitions for more VPN licenses and are now looking to segment their data because the technology provides more internet exposure than advocates of models like zero-trust security are comfortable with. Infrastructure, not cloud, remains the focus as agencies attempt to remotely connect employees to network assets that may still be on-premises, and zero-trust security architectures are preferable, said Dan Jacobs, director of cloud adoption and cybersecurity within the General Services Administration’s Centers of Excellence.

“I know several organizations went through some crippling issues when COVID first happened,” Jacobs said during an AFCEA Bethesda event Tuesday. “They simply didn’t have enough licenses, and the ones that did have enough licenses didn’t necessarily have the throughput. And their VPN failed them.”

The Nuclear Regulatory Commission is considering segmenting its data as part of its VPN approach and changing the way it handles authentication and provides permissions due to security concerns, said Jonathan Feibus, the agency’s chief information security officer.

According to a Zscaler risk report released this month that surveyed 357 IT and cybersecurity professionals — 25 of them in government — 93% said their organization had deployed VPN services, even though 94% acknowledged that cybercriminals exploit VPN vulnerabilities to access network assets. Social engineering, ransomware and malware are the most common ways to compromise VPNs.

“Right now VPN just throws open the fire hose and gives me access to everything I had when I was in the building,” Feibus said. “Do I necessarily need that when I’m remote?”

Of the professionals Zscaler surveyed, 67% were considering remote access alternatives to traditional VPNs and 72% were prioritizing zero-trust security. And 59% were accelerating those efforts because of increased remote work.

“It’s encouraging to see that enterprises understand that zero-trust architectures present one of the most effective ways of providing secure access to business resources,” Chris Hines, director of zero-trust solutions at Zscaler, said in a statement. “As organizations continue on their journey to cloud and look to support a new hybrid workforce, they should rethink their security strategy and evaluate the rising cybersecurity threats that are actively exploiting legacy remote access solutions, like VPN.”

A cloud-delivered, zero-trust service that brokers all user-to-app connections is the best approach, Hines said.

But agencies aren’t so sure. Maintaining ownership of infrastructure is often easier than using cloud services, because with the cloud you have to work with the provider to adjust for efficiencies, Feibus said.

The Air Force is taking another approach with its software factories using DevSecOps to embed security into service mesh architectures from the outset. But that isn’t a “panacea” for VPN woes either, said Ron Ross, a fellow at the National Institute of Standards and Technology.

NIST wrote the Federal Information Processing Standard 199 back in 2004 to ensure all data in federal systems was categorized as high, moderate or low impact.

“We understood then that complexity was going to overwhelm us at some point and that making sure we could identify the things that were most important; we can separate those, isolate those resources and give them better protection,” Ross said. “That concept is still very much in play today.”

But even DevSecOps developers are reliant on code libraries imported from a variety of sources without much transparency or trust. Broad-based policies and strategies are needed to address that “systemic” problem, Ross said.

“How much trust do we have in those code libraries? Who manages those libraries?” he asked. “What’s in the libraries?”

Citing JEDI, a top Microsoft executive calls for reform of contract protests

The president of Microsoft told lawmakers Tuesday that legal reforms are needed to shorten the timeline for federal contract award protests — a process the company is all too familiar with, as it has been tied up in disputes over the Pentagon’s multibillion-dollar cloud contract for more than a year now.

Microsoft President Brad Smith didn’t offer specific recommendations on what the federal government could change to speed up protests, but he broadly suggested there should be a more efficient adjudication process without sacrificing the chance for companies to make their voices heard. Microsoft has not been able to start work with the Department of Defense under the $10 billion Joint Enterprise Defense Infrastructure (JEDI) cloud contract that it won in October 2019 because of a grueling ongoing award protest Amazon Web Services filed that November.

Reforming procurement protests would allow federal agencies like the DOD to acquire and use technology more quickly without needing to wait for outdated acquisition processes to play out, Smith said during a Senate Armed Services Committee hearing on emerging technology development in the DOD.

“How do you move quickly when the protest process moves so slowly?” he said.

Others testifying with Smith on Tuesday concurred that the DOD needs to find ways to streamline acquisition to take better advantage of emerging tech. Hawk Carlisle, the retired Air Force general who now leads the National Defense Industrial Association, agreed with Smith on the costs of a lengthy bid protest, saying there is no disincentive for companies to file a suit. And ultimately, he said, this hurts the military customers who have to wait for protests to wind through the claims process.

“Don’t disadvantage the person that is waiting for the equipment,” Carlisle said.

The longer Amazon’s protest stretches on, the more the future of the JEDI contract comes into question. Smith acknowledged Microsoft may never get paid or be able to move forward with work under the contract. The DOD also said in a recent letter to Congress that it may have to consider an alternative if the protest continues much longer.

“We have literally been frozen by a federal court on our performance on the JEDI contract for more than 12 months,” Smith said.

Federal Data Service looks to borrow from success of agency dashboard projects

The Federal Data Service, still under development, is set to focus on accelerating agencies’ data maturity with dashboard projects like those the U.S. Department of Agriculture has undertaken over the past few years.

USDA has developed more than 500 dashboards across its 29 agencies in the nearly three years since Ted Kaouk was named chief data officer, an office that at first had no staff.

The enterprise analytics program is a testament to how quickly CDO shops can mature when they provide agency leaders with the best data available via dashboards, Kaouk said during a meeting of the Federal Advisory Committee on Data for Evidence Building on Friday.

“It was a real big culture change,” Kaouk said. “It was about making visible some of the data quality issues.”

Kaouk’s team and those of USDA’s eight assistant CDOs do that by providing confidence levels with each dataset.

Now the 25-member advisory committee is working to launch the Federal Data Service for improving how agencies access, link and protect data — at least initially through pilot projects.

And USDA and the Census Bureau could provide models for how agencies determine what data is fit to use in dashboards, said Dominic Mancini, acting federal chief statistician. The bureau’s “pulse” surveys and COVID-19 data hub, for instance, have proven quick ways to visualize general pandemic trends concerning households and small businesses despite added uncertainty in the data, Mancini said.

Dashboard projects don’t have to break the bank either. USDA invested a “modest” $2 million across 19 agencies to begin with, Kaouk said.

“Largely it was about consolidating resources we already had,” Kaouk said. “We had Tableau servers across multiple mission areas, and we were able to bring them into a common platform and I think in many cases actually save money as agencies were looking to move their own cloud platforms.”

USDA worked with five of the General Services Administration‘s Centers of Excellence — including the Data & Analytics CoE — which helped accelerate some of the dashboarding work and performed data maturity assessments, Kaouk said.

A question that remains unresolved is just how expansive the Federal Data Service’s scope will be. Will it be a federal service helping federal agencies stand up a lot of dashboards revealing potential efficiencies, or will that offering be extended to states and localities, asked Amy O’Hara, director of the Georgetown Federal Statistical Research Data Center.

“If we want this to be a national secure data service that supports the measurement needs of state and local, we’ve got to address this value proposition,” O’Hara said. “Are we thinking about resourcing this appropriately to achieve both of those goals?”

VA issues new data ethics principles

The Department of Veterans Affairs issued new ethics principles for accessing and handling veterans’ data, it announced Monday.

The nine principles are designed to ensure the safe and responsible use of data, especially personally identifiable information like medical data. With the increased use of data, particularly during COVID-19 response, comes thorny issues of how that data is used, protected and accessed, which the new principles aim to address.

The principles were developed by VA’s Data Ethics Group.

“VA’s principle-based ethics framework takes a proactive approach to data management and privacy by setting standards for our partners to follow,” acting VA Under Secretary for Health Richard Stone said in a statement. “VA is applying this framework to all data interoperability initiatives, including those tied to our COVID-19 response and modernization efforts.”

The set of principles comes as the VA is undertaking a massive modernization of its legacy electronic health record system, migrating to a new cloud-based Cerner EHR platform that the department hopes will bring its health care into the digital age. The Government Accountability Office recently recommended that the VA pause the rollout of the program to do more testing and fix known issues, but the department said it won’t heed that advice.

The VA has expanded data access and usage in other ways with a host of new applications and APIs, some of which allow veterans to access their own data from iPhones. That expansion has raised questions from lawmakers in the past over the VA’s security practices and its work to rid its networks of Chinese technology. The VA has also dabbled in AI projects to detect veterans at risk of suicide.

The principles follow similar moves by other agencies to lay out broad guidelines for ethical data use. In 2020, the DOD adopted ethical principles for the use of artificial intelligence, which is heavily data-reliant.

The VA said it wants to have all its policies reflect the new principles by the end of 2022.

Marines piloting 5G to improve warehouse logistics

The Marine Corps Logistics Command in Georgia is piloting the use of 5G to enhance operations in new “smart warehouses.”

Federated Wireless and a group of other technology companies have kicked off work with the Marines to develop a high-bandwidth wireless network that can handle more data. Together, they will work to expand the capability of warehouse operations and improve the new network’s security with a zero-trust architecture, according to a news release.

The project, based out of the command’s headquarters in Albany, is one of several in which the military is hosting 5G “test beds” on bases, offering private companies the opportunity to test their tech in less regulated environments while boosting connectivity for the services.

The new network is hosted on the Citizens Broadband Radio Service (CBRS), a part of the electromagnetic spectrum in the 3.5-3.7 GHz band that the Federal Communications Commission has set aside for both federal and non-federal use.

The upshot for the Marines is the new network could improve the logging of receipts, storage, inventory control and auditing of supplies that support global operations, according to the release. The hope is the 5G network will also be able to support warehouse robotics and holographic, augmented and virtual reality applications.

The Department of Defense’s 5G strategy aims to improve base connectivity and eventually transition the technology onto the battlefield as a means to improve overall command, control and communications.

Cisco has partnered on the effort to provide a security architecture for the 4G and 5G networks that follows a zero-trust model, in which every point of the network requires continuous security checks rather than checks only at the perimeter. With the added bandwidth of 5G and more data flowing, added layers of security are needed to ensure malicious actors are not corrupting information.
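As a toy illustration of that shift (not a description of Cisco’s actual architecture), the sketch below verifies every request’s identity, permissions and device posture before granting access, with no implicit trust based on network location. All names here are hypothetical.

```python
from dataclasses import dataclass

# Toy zero-trust gate: every request is verified, regardless of where it
# originates. All identifiers are illustrative, not a real product's API.

TRUSTED_ISSUER = "https://idp.example.mil"   # hypothetical identity provider
REQUIRED_SCOPES = {"warehouse.read"}

@dataclass
class Request:
    token_issuer: str
    token_scopes: set
    device_compliant: bool   # e.g., patched OS on a managed endpoint
    resource: str

def authorize(req: Request) -> bool:
    """Return True only if identity, scope and device posture all check out."""
    if req.token_issuer != TRUSTED_ISSUER:
        return False                      # unknown identity provider
    if not REQUIRED_SCOPES.issubset(req.token_scopes):
        return False                      # token lacks the needed permission
    if not req.device_compliant:
        return False                      # endpoint fails posture check
    return True                           # no implicit trust from network location

# Every call goes through the same gate, even from "inside" the network.
print(authorize(Request(TRUSTED_ISSUER, {"warehouse.read"}, True, "/inventory")))   # True
print(authorize(Request("https://rogue.example", {"warehouse.read"}, True, "/inventory")))  # False
```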

Other partners on the pilot include Amazon Web Services, Perspecta Labs, Vectrus, Capstone Partners and JMA.


Secret Service wants 2,000 body-worn cameras

Secret Service members may soon wear body cameras in the line of duty, according to a request for information from vendors.

The Secret Service is part of the Department of Homeland Security, which wants to award a firm-fixed-price contract for 2,000 body-worn cameras but first needs a better sense of vendor capabilities to plan the procurement.

Federal interest in body-worn cameras has increased since the Department of Justice started permitting their use on federal task forces in October.

The desired cameras will store at least 12 hours of video at a high-definition resolution of 1080p, and ideally 24 hours at 8K. They’re to have at least a 150-degree field of view and ideally 180 degrees, as well as night vision and vehicle adaptability.

Cameras will have pre-event recording that can be adjusted to between 30 seconds and 2 minutes prior to their activation. Activation and deactivation can be manual but will automatically occur whenever the camera, a weapon or a taser is withdrawn from its holster.
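Pre-event recording is typically implemented as a rolling buffer that continuously overwrites its oldest footage until the camera is activated. The sketch below is a simplified, hypothetical illustration of that mechanism, not language from the request for information.

```python
from collections import deque

FRAMES_PER_SECOND = 30
PRE_EVENT_SECONDS = 120          # adjustable between 30 seconds and 2 minutes

class PreEventBuffer:
    """Keep only the most recent N seconds of frames until activation."""

    def __init__(self, seconds: int = PRE_EVENT_SECONDS):
        self.frames = deque(maxlen=seconds * FRAMES_PER_SECOND)

    def add_frame(self, frame: bytes) -> None:
        # Older frames silently fall off the left end once the buffer is full.
        self.frames.append(frame)

    def on_activation(self) -> list:
        # When the camera (or a weapon) is drawn, the buffered frames are
        # prepended to the new recording so the lead-up is preserved.
        return list(self.frames)

buffer = PreEventBuffer(seconds=30)
for i in range(10_000):                      # simulated continuous capture
    buffer.add_frame(f"frame-{i}".encode())
clip_start = buffer.on_activation()
print(len(clip_start), "pre-event frames captured")   # 30 s * 30 fps = 900
```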

Live feeds can be remotely activated and monitored, and all videos will be timestamped and dated. GPS can also be tied to the video and manually disabled or remotely activated.

The Secret Service wants the ability to categorize, label and redact videos captured. The cameras will be able to have their internal memory remotely monitored, and uploads will be both wireless and wired.

Cameras will wirelessly upload their contents to on-premises storage or to a Federal Risk and Authorization Management Program-authorized cloud platform within 10 minutes of activation, and ideally within five. The device will purge that data once it’s uploaded.

Non-evidentiary video files will be stored at least 30 and ideally 90 days, while the desired range for evidentiary files is 3 to 32 years. The Secret Service also wants the ability to label videos “permanent” to automate their perpetual storage.
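A minimal sketch of how those retention rules might be encoded, purely as an illustration and not the actual system:

```python
from datetime import date, timedelta

# Hypothetical encoding of the retention ranges described in the RFI.
RETENTION = {
    "non_evidentiary": timedelta(days=90),       # at least 30, ideally 90 days
    "evidentiary": timedelta(days=32 * 365),     # desired range: 3 to 32 years (upper bound used)
}

def purge_date(category: str, recorded_on: date, permanent: bool = False):
    """Return the date a video becomes eligible for deletion, or None if kept forever."""
    if permanent:
        return None                              # "permanent" label: never purge
    return recorded_on + RETENTION[category]

print(purge_date("non_evidentiary", date(2021, 2, 1)))              # 2021-05-02
print(purge_date("evidentiary", date(2021, 2, 1)))                  # roughly 32 years later
print(purge_date("evidentiary", date(2021, 2, 1), permanent=True))  # None
```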

Logs must be auditable and the cloud platform must have access controls capable of resisting denial-of-service attacks and exfiltration attempts.

The Secret Service seeks a number of other camera features as well.

Vendors have until noon on March 1 to respond to the request for information.

Air Force’s Operation Flamethrower aims to torch outdated IT policy

The Air Force‘s effort to burn down old IT policies that are holding back network modernization has a name befitting the type of change it seeks to spark: Operation Flamethrower.

The program has existed for several months to modernize IT policy, and now it’s keying in on nixing any policies that stand in the way of the Air Force’s move to an enterprise IT-as-a-service model.

In doing this, the Air Force faces challenges of budgetary stress — operating multiple networks boosts the IT price tag during a transition — and the complexity of gutting the decades-old policies designed around outdated technology, Brig. Gen. Chad Raduege said Thursday at a virtual AFCEA St. Louis meeting.

“Operation Flamethrower is all about creating offsets,” Raduege said. The “offsets” the program is looking to create are policy changes that would use automation to reduce the bloat of network operations and shore up endpoint weaknesses.

Central to that is the “shrinking of the AFnet,” the Air Force’s enterprise network, he said.

“This is a challenge that we are going through right now,” Raduege said of shrinking the legacy network and reducing the cyberattack surface.

The Air Force has been on a multiyear network transformation journey in which it is championing enterprise IT-as-a-service to replace outdated legacy systems that are less secure and limit connectivity. Operation Flamethrower is also looking to reduce the redundancies created in the transition, burning out old systems that are no longer needed as new services come online.

“We are trying to figure out how to get from the legacy network where we are today into the future,” Raduege said.

The project has backing from senior leaders in the Air Force, Raduege said. It also has the support of the Cyberspace Capabilities Center at Scott Air Force Base.

Top Air Force general champions power of code during software factory visit

The chief of staff of the Air Force and other senior leaders paid a visit Tuesday to Kessel Run, the Air Force’s software factory and tech hub, where the service’s top officer underscored a commitment to a software-driven transformation of the department.

During his first visit to Kessel Run‘s Boston headquarters, Gen. Charles “CQ” Brown, the top officer in the Air Force, emphasized how he wants the Air Force to achieve some of its biggest modernization goals with rapid software developments. New programs at the heart of the Air Force’s evolution to a more digital branch will rely on software that is developed with the end-user in mind, Brown said during his visit to the coding factory. Kessel Run started as a pilot program for agile software development — where code is developed iteratively and refined to user needs — and has grown into a hub to rapidly and securely write and buy software for systems across the Air Force.

Brown has adopted a guiding mantra of “accelerate change or lose” as he leads the force. Much of that change will come from divesting from legacy platforms and dated aircraft programs in favor of modern software-defined platforms.

A big part of that, Brown said, is prioritizing the education of airmen on software development and using code as a tool to modernize systems in the fleet.

“Those are the areas that are going to be important,” he said during a Wednesday follow-up media roundtable in which he talked about the visit. “Someone is going to have to write code.”

Brown later added how he sees software as one of the most critical ways to evolve the Air Force’s current weapons systems and platforms. He pointed to the F-16 fighter jet — an aircraft that Brown himself has flown and even taught others to fly — as a system that can be transformed with more frequent software updates.

“The airplane actually changes because you are able to push more information to it,” he said.

Kessel Run helped develop programs that allow for more seamless software updates to the F-16, as opposed to a previous multi-day update process. Brown said that allows the plane, and the force writ large, to be more nimble and to ingest more information from sensors.

“You are using software to change our approach,” he said.

JADC2 on the calendar

Software will also play a pivotal role in the Joint All Domain Command and Control (JADC2) operational concept: the futuristic strategy for defense driven by the coordination of an “internet of military things.” Brown told reporters Wednesday that he recently held talks on JADC2 with his counterpart in the Navy, Chief of Naval Operations Adm. Mike Gilday.

His discussions with Gilday come just before Lt. Gen. Dennis Crall, chief information officer of the Joint Staff, is set to hand over a document outlining the data standards and JADC2 strategy to the Chairman of the Joint Chiefs of Staff Gen. Mark Milley.

The Joint Staff is “laying out some level of standards and how we do data and digital architecture,” Brown told reporters. “That’s the lifeline.”

The Air Force has a memorandum of understanding with the Army to collaborate on JADC2 efforts, as the entire concept relies on sharing data across all domains and all services. Brown said he anticipates holding another meeting with Army Chief of Staff Gen. James McConville soon. Brown added that the publication of the DOD data strategy has helped increase collaboration.

“What we are seeing is a lot more dialog between the services,” Brown said.