DevSecOps is fueling agencies’ cloud migrations

Agencies have increasingly migrated to the cloud to expand their DevSecOps efforts over the last few years, according to federal officials.

Both the Bureau of Information Resource Management within the State Department and the Navy moved to the cloud to automate processes, integrate security into the software development process and deploy updates faster.

The embrace of the cloud as an enabler of DevSecOps and cybersecurity more broadly represents an evolution in agencies’ approaches to the technology.

“Two years ago the biggest driver was ‘my boss told me to,'” said Tom Santucci, director of IT modernization within the Office of Government-wide Policy within the General Services Administration, during an ATARC event. “Now people are starting to see the benefits of this.”

IRM functions as the main provider of IT resources, from infrastructure to messaging, for the State Department, and its Systems Development Division deploys software solutions for any domestic or overseas office with a business need. The bureau moved its previously separate development and production environments to the cloud over the last few years to bridge the two.

Now IRM can not only spin up a build agent in the development environment to automate tests or scans but actually create a pipeline to push builds to the staging or production environments for users.

“It was possible before the cloud, but it was a larger tactical effort and a larger security effort because you had differences between environments,” said David Vergano, systems development division chief of IRM, during a FedInsider event. “So the cloud backbone is helping to make things smoother, and now we can really try to change how we do things because we have the tooling.”
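As an illustrative sketch only, the kind of promotion pipeline Vergano describes, in which a build agent runs automated tests and security scans and then pushes the same artifact through staging into production, might be orchestrated like this. The stage names and commands below are hypothetical, not IRM’s actual tooling:

```python
# Illustrative pipeline runner: each stage must pass before the artifact
# is promoted to the next environment. Commands are placeholders.
import subprocess
import sys

STAGES = [
    ("build", ["make", "build"]),
    ("unit-tests", ["make", "test"]),
    ("security-scan", ["make", "scan"]),
    ("deploy-staging", ["make", "deploy", "ENV=staging"]),
    ("deploy-production", ["make", "deploy", "ENV=production"]),
]

def run_pipeline() -> None:
    for name, cmd in STAGES:
        print(f"[pipeline] running stage: {name}")
        if subprocess.run(cmd).returncode != 0:
            # Fail fast: a failed test or scan stops promotion to later environments.
            sys.exit(f"[pipeline] stage '{name}' failed; halting promotion")
    print("[pipeline] artifact promoted through staging to production")

if __name__ == "__main__":
    run_pipeline()
```

Running the same sequence against identical cloud environments is what removes the environment-to-environment differences Vergano describes.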

The department’s other offices come to Vergano with the particular cloud products they want to use, and he advises them to use Federal Risk and Authorization Management Program-certified tools to smooth acquisition and ensure security.

Like IRM, the Naval Information Warfare Systems Command’s (NAVWAR’s) Program Executive Office for Digital and Enterprise Services (PEO Digital) is delivering environments that can be built once and adapted to multiple use cases. Cloud platforms that enable continuous integration/continuous deployment (CI/CD) and DevSecOps make that work easier, but migration isn’t always immediately affordable.

“In [the Department of Defense], nobody has buckets of funding laying around to dump into modernizing their architectures so that everyone in July 2021 can move toward containers and microservices,” said Taryn Gillison, program executive director of the digital platform application services portfolio.

Instead, PEO Digital is developing enabling capabilities like Naval Identity Services, infrastructure as code (IaC) and middleware.

If PEO Digital can automate testing and tools for the Navy’s various components, it allows them to shift focus to modernization, Gillison said.

Hurdles remain for NAVWAR, however — namely installation timelines. Gillison said she’s “impatient” to see cloud-to-ship software pushes and other policy changes happen more broadly.

In the prior six to 12 months, 52% of IT executives at public sector organizations said they’d chosen a new cloud provider, according to an Ensono report from June. That number jumped to 85% over the prior 24 months, with 78% of public sector respondents citing security as a concern.

With most public sector organizations managing their cloud environments centrally, a multi-cloud model prevails — largely due to the flexibility it affords agencies, Clint Dean, a vice president at Ensono, told FedScoop.

That jibes with DOD canceling its $10 billion Joint Enterprise Defense Infrastructure (JEDI) cloud procurement, citing its intent to launch the multi-cloud, multi-vendor Joint Warfighter Cloud Capability (JWCC) environment.

“As much as Amazon, Microsoft and Google would have us believe, maybe there’s not as much brand loyalty as folks think there is,” Dean said.

Education Department is getting ‘smarter’ about hiring, training data scientists

The Department of Education is crafting its first data-focused workforce plan with an emphasis on the professional development of both data professionals and general staff, according to Deputy Chief Data Officer for Analytics Sharon Boivin.

The Office of the CDO, established just under two years ago in response to the Foundations for Evidence-Based Policymaking Act, is supporting other parts of the department by creating standard position descriptions for data professionals and career ladders for data scientists.

OCDO had to staff up fairly quickly and hired several master’s- and PhD-level data scientists, some straight out of school or with industry experience, under the Office of Personnel Management’s 1530 statistics series last year. The office will continue to leverage such governmentwide opportunities, Boivin said.

“We’ve gotten smarter in what we ask for in vacancy announcements,” Boivin said during a Data Coalition event Thursday. “We now specify technical skills in the required specialized experience statements, and we’ve started asking for a code sample from each applicant.”

The latter keeps people who don’t have code samples from applying, which is a good thing, she added.

OCDO handles data governance, information collection, infrastructure, analytics, and data access including open data for the department, and its data professionals are involved in monitoring grants, performance measure analysis, distributing student financial aid, and planning for future budget and workforce needs.

New data scientist hires need to fit team needs, rather than being unicorns who can do it all, Boivin said.

Some prefer coding in Python, Stata, SAS or R and making data visualizations using Tableau, Power BI, R Shiny or Excel. But they also need domain knowledge and a strong foundation in statistics, Boivin said.
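As a purely hypothetical illustration, the code sample Boivin’s office now requests could be as modest as a few lines of Python that summarize and chart a dataset; the data and column names below are invented:

```python
# Hypothetical applicant code sample: summarize and visualize grant awards by state.
# The data here is synthetic; a real sample would work against an actual dataset.
import pandas as pd
import matplotlib.pyplot as plt

awards = pd.DataFrame({
    "state": ["CA", "TX", "NY", "CA", "TX", "NY"],
    "award_amount": [1.2e6, 0.8e6, 1.5e6, 0.9e6, 1.1e6, 0.7e6],
})

# Total and average award by state.
summary = awards.groupby("state")["award_amount"].agg(["sum", "mean"])
print(summary)

# Simple bar chart of total awards per state.
summary["sum"].plot(kind="bar", title="Total grant awards by state")
plt.ylabel("Dollars")
plt.tight_layout()
plt.savefig("awards_by_state.png")
```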

OCDO is considering university partnerships to better communicate hiring opportunities.

The office recently released a new Education Data Strategy, and one of its four goals is building human capacity to leverage data effectively for decision-making. That includes filling short- and long-term staffing needs, launching a data literacy program for general staff, and developing an emerging program for education data professionals.

OCDO is also rolling out a Data Professionals Community of Practice and is structuring its curriculum with learning pathways, short topics, presentations, a mentoring program, and career paths and competencies for every General Schedule level in areas like data analytics and data architecture.

A final endeavor the office is undertaking is sponsoring rotational assignments that allow education staff to come in, learn and return to their home office with new data skills and a better understanding of the government’s data priorities, Boivin said.

What makes a good TMF proposal? Maria Roat has some tips

If you want to score money under the Technology Modernization Fund, you’re going to need to catch the attention of the TMF Board. And the best way to do that, Deputy Federal CIO Maria Roat says, is to really key in on a solid business case and get to the point.

“It’s about a business case, right — the CIO, the CFO, the mission and the alignment,” Roat said Thursday during IBM’s Think Gov 2021 event, produced by FedScoop. “And the initial project proposals need to get to the point.”

While the mission of the TMF is, broadly, to fund technology modernization projects, most of the evidence the TMF Board wants to see in proposals relates to the business aspects, said Roat, who as deputy federal CIO sits on the TMF Board as an alternate member. The board that evaluates proposals and awards funding currently comprises seven voting members and six alternate members.

“When you look at the questions [in the project proposal template], one of them is specifically about technology,” she said. “The rest of the questions are broadly about the business of what this proposal is, and what are those measures and what are those outcomes you’re trying to achieve? So the board is looking for the mission alignment, that it’s mission-focused — the partnership with the CFO, with the business owner. It’s not about an IT thing … it’s about solving a hard business problem. And this is where a thoughtfully crafted business case comes in.”

Often, the board receives proposals filled with extraneous information that doesn’t detail the actual work an agency hopes to get funded, Roat said. “Too often people are going on about their agencies. We know who your agency is, we know who you are. And if we’ve never heard of you, heaven forbid, we’ll go look it up. But you have to get to the point.”

At the end of the day, the initial proposal is meant to be a “low burden” so that the board can “maximize the number of projects” it analyzes, she said.

The TMF Board introduced an expedited process after the fund received $1 billion from the American Rescue Plan earlier this year, prioritizing selecting and funding projects “that cut across agencies, address immediate security gaps, and improve the public’s ability to access government services,” leaders announced in May. Along with that, it introduced new flexibilities in the agency repayment process built around those priority areas.

While the June 2 deadline for that expedited process has passed, Roat said the board is continuing to accept proposals and is “making sure that as those proposals are coming in, we’re doing very quick reviews of those.” To support that, the board has added alternate members so it can meet more frequently to assess proposals.

Additionally, the TMF Program Management Office has expanded to work more closely with agencies as they propose projects, helping them to come to the board with good proposals — again, focused on a business case — to save everyone time.

“They’ve done a great job over the last three years working with agencies and prepping those proposals, making sure that they’re in good shape even before they come to the board,” Roat said. “Having a good proposal coming into the board helps move things along a little bit faster. It expedites the board discussion.”

Industry presses government to invest in more practical quantum computing projects

Quantum computing industry experts urged agencies Wednesday to invest more of their budgets in practical projects that address mission needs while advancing commercial products.

The government’s ongoing quantum projects tend to focus on esoteric fields and theories, like black hole edge conditions at NASA and high-energy physics at the Department of Energy, but that doesn’t help the Department of Commerce address more pressing issues like infrastructure and climate change, Christopher Savoie, CEO of Zapata Computing, said during a Center for Data Innovation event.

Moonshot-type projects are great in intention and often lead to unintended breakthroughs, Savoie said, like how the Apollo program’s goal of sending a semiconductor-based computer to the moon helped create Silicon Valley. But lately, foreign adversaries like China have been more successful at getting their industrial and academic bases to work on practical projects.

“They have a lot of near-term commercial outcomes for government and for industry that they’re putting in place that incentivize people to move to more near-term commercial things,” he said.

DOE provides researchers access to its testbeds — through the Quantum User Expansion for Science and Technology (QUEST) program — and its National Quantum Information Science (QIS) Research Centers conduct research and development. But generally, the goal is to address a problem that industry can’t, before gracefully bowing out to allow the industry to become viable, said Rima Oueid, commercialization executive at the Office of Technology Transitions within DOE.

That doesn’t mean DOE can’t do more practical work in the program space though.

“Within DOE there is a cohort that is looking at some of the more viable use cases, some of the shorter-term wins that are possible that fit within our mission space,” Oueid said.

With quantum likely to remain in a hybrid state with classical computing for another eight years, breakthroughs are still being made in drug discovery, autonomous vehicles communicating with each other and allocating resources for emergency response, said Allison Schwartz, global government relations and public affairs lead at D-Wave Systems.

Still, private sector products remain in the early stages, and it’s important that policymakers make targeted investments to ensure small companies can keep supplying industry with quantum-enabling technologies like lasers and cryogenic cooling, said Celia Merzbacher, executive director of the Quantum Economic Development Consortium.

Industry needs to be better about keeping government researchers abreast of practical applications, and researchers must similarly improve sharing the results of their work with the private sector, Merzbacher said.

“The different parts of the ecosystem, the innovation supply chain, need to be in communication so that the results of the research move efficiently to the people who can develop it and incorporate it into their products and services,” Merzbacher said.

VA reforming EHR deployment following 12-week review

The Department of Veterans Affairs will restructure how it implements its electronic health records modernization program following a 12-week strategic review of the $16 billion program, Secretary Denis McDonough told lawmakers Wednesday.

User training is a major theme of many of the changes that need to be made to improve the way clinicians and patients use the new system, McDonough said during a Senate Veterans’ Affairs Committee hearing.

“I think that there is just no doubt the training was wanting,” McDonough told the committee.

Included in those changes will be the introduction of new virtual testing environments — “sandbox” deployments of purely technical changes to improve underlying IT infrastructure that shouldn’t impact user experience. While the updates will bring fixes to some technical aspects, McDonough said they are mostly programmatic improvements.

Training and testing issues led to a delay in the initial go-live of the system in Spokane, Washington, in February and were the subject of multiple government watchdog reports, including one that sounded the alarm on potential system failure without adequate testing.

McDonough also reiterated the VA’s commitment to sticking with its prime contractor, Cerner, which has been working on developing cloud-based overhauls for both the Department of Defense’s and VA’s health IT systems.

The secretary said the continued rollout of the EHR across the nation would depend on testing and on facilities’ ability to make progress deploying the system, rather than on their location, as was the initial plan.

“You will see us pursue a surge of activity in the coming weeks and months, intently focused on veteran experience, patient safety and employee engagement. Specifically, VA will pursue technical-only (“sandbox”) deployment of Cerner technology at previously planned sites in Veterans Integrated Service Networks (VISNs) 10 and 20 – ensuring technical readiness without affecting veterans or frontline clinical employees,” McDonough said in his prepared opening statement.

There will also be new data teams that will work to integrate databases associated with the new system. The VA’s EHR system is eventually supposed to seamlessly connect with the DOD’s MHS Genesis modernization program.

“However, gaps remain in our ability to govern and manage data between the two EHRs and with DOD to ensure seamless veteran and employee-centric information sharing and provision of managed, trusted data,” he said, adding that a clinician told him that most of the data integration happens in workers’ heads, not on the computer systems they use.

By the end of the year, the VA will also publish a data strategy specific to making EHR data more interoperable, McDonough revealed.

New ‘chaos engineering’ tool shared between DOD software factories

The Air Force’s Kessel Run software factory is transitioning to the Navy a tool it has spent the past two years developing to emulate persistent enemy attacks on a system.

The Navy’s Black Pearl software factory will be the first group outside of Kessel Run to get the tech stack and list of best practices on implementing it. But eventually, the goal is for as many coders to get their hands on it as possible, lead engineer Omar Marrero told FedScoop.

The tech stack and Air Force team behind it are jointly known as Bowcaster, named after the weapon Star Wars character Chewbacca used in the film series. And the discipline behind their work is something referred to as chaos engineering.

“You have to constantly break the system to find where our weaknesses are,” said Marrero, whose official title is chaos and performance engineering lead. “That’s essentially what chaos engineering is.”

The idea behind chaos engineering is to emulate an enemy by unleashing unpredictable, persistent attacks whose targets within a system can still be controlled. Kessel Run launched the program in 2019 and ran its first internal attack using the system in the summer of 2020.

Marrero said the idea to put resources into chaos engineering came organically from the need to more thoroughly test systems. He said he attended several tech conferences to learn from others that had deployed similar systems, even though he already has a background in this type of cybersecurity testing.

“As part of my career in the Air Force I have always done some flavor of chaos,” Marrero said in an interview.

The lessons the Air Force learned from others, and from its own practice developing the tech stack, are part of what it will transition to Black Pearl as a chaos engineering “playbook.” It will also port the code into Platform One’s software repository, Iron Bank, for others to start experimenting with.

One of the biggest lessons Marrero and the team learned was to “control the blast radius,” meaning don’t let the code start unplugging too many things.
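A generic sketch of that principle, unrelated to Bowcaster’s actual code, might inject random faults only into services on an explicit allow list so the blast radius stays bounded; the service names and fault types below are invented:

```python
# Toy chaos-engineering loop: inject random faults, but only into services
# explicitly listed in the allowed blast radius. Services and faults are simulated.
import random
import time

SERVICES = {"auth", "mapping", "messaging", "logistics"}
BLAST_RADIUS = {"mapping", "messaging"}  # only these may be attacked
FAULTS = ["kill_instance", "add_latency", "drop_packets"]

def inject_fault(service: str, fault: str) -> None:
    # In a real system this would call infrastructure APIs; here we just log it.
    print(f"[chaos] injecting '{fault}' into '{service}'")

def run_experiment(rounds: int = 5, seed: int = 0) -> None:
    rng = random.Random(seed)
    for _ in range(rounds):
        target = rng.choice(sorted(BLAST_RADIUS & SERVICES))
        fault = rng.choice(FAULTS)
        inject_fault(target, fault)
        # Services outside the blast radius are never selected, by construction.
        assert target in BLAST_RADIUS
        time.sleep(0.1)  # pace the attacks so operators can watch recovery

if __name__ == "__main__":
    run_experiment(seed=42)
```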

Sharing tech stacks and tools like Bowcaster is a practice Kessel Run plans to continue. The Air Force and Navy are working on a new memorandum to share even more code between the two services.

Mike Brown backs out of nomination for top Pentagon contracting job

Mike Brown on Wednesday requested that his nomination to be the Pentagon’s top contracting officer be withdrawn amid an ongoing Department of Defense Inspector General investigation.

Brown, currently director of the Defense Innovation Unit, sent a letter to Department of Defense Secretary Lloyd Austin requesting that President Biden withdraw his nomination, saying that the investigation has roadblocked his confirmation to be DOD’s head of acquisition and sustainment and “our men and women in uniform deserve Senate-confirmed leadership as soon as possible.”

Brown came under scrutiny after one of his former employees alleged to the IG that he handed contracts and jobs to close allies and friends in ways that pushed ethical boundaries. The allegation came soon after Brown’s nomination was announced.

“Unfortunately, it appears that an ongoing investigation by the Department of Defense Office of Inspector General into personnel practices at the Defense Innovation Unit will delay consideration of my nomination by up to a year,” he wrote in the letter, which was obtained by FedScoop.

In his withdrawal request letter, Brown signals that he will continue leading DIU.

Brown would have brought a unique background to a role traditionally held by contract lawyers and former defense industry executives. He built his career in Silicon Valley, most notably as CEO of cybersecurity company Symantec, before serving as a Presidential Innovation Fellow and then taking over DIU. During his time leading DIU, he has championed the need for DOD to better scale innovative technologies through rapid contracting.

Scaling cloud training programs a major challenge for CIOs

Scaling up internal training programs for staff within federal agencies is one of the biggest challenges facing agency CIOs, technology recruitment consultants have told FedScoop.

Departments currently use a range of tools, including 90-day and 180-day interagency job rotations, as well as sessions sponsored by cloud service providers. But experts say a more programmatic approach could help retain existing staff and attract new employees with specialist skills.

Britaini Carroll, principal director of Accenture Federal Services’ human capital division, told FedScoop it’s been an even bigger issue during the remote work of the pandemic. “A lot of CIOs I’ve talked with about this past year, whether it’s cyber or cloud, have had trouble getting their training online and leveraging the broader tech skilling that is out there.”

“100-150 [trainees] at a time isn’t going to scale, so there needs to be more intentional programming that enables both hiring new folks, bringing them up to speed, and broad-based learning on prioritized cloud roles,” Carroll said.

Meghan Sullivan, a principal within Deloitte’s government and public sector consulting practice, told FedScoop that she has seen a similar focus from CIOs on concerns about the best way to implement best-in-class cloud training for all staff. “[They are asking] how do I do this at scale, so that I’m not just investing in five people, but 50 people, so that if five or 10 leave I still have enough there,” she said. “How do I do it in a holistic manner, looking at training programs?”

Advocates say that programmatic training allows agencies to keep staff engaged, and also can help to address skills gaps left by the departure of staff amid a still-tightening technology labor market. Federal agency technology leaders in recent months have told FedScoop of an increase in the pace of departures from the government for the private sector, while a report last year by research firm Global Knowledge found that cloud and IT security skills continue to be most in-demand and that IT decision-makers were “struggling” to hire in these areas.

Advocates say also that a broad agency-wide training program helps to ensure staff remain engaged and that they have the necessary knowledge of different cloud platforms to be able to work on multi-system procurement contracts.

This story is part of FedScoop’s Special Report — The Continued Push to the Cloud.

How the DOD plans to approach cloud differently outside of the U.S.

The Department of Defense wants cloud computing to support everything from back-office tasks to battlefield operations. But how it gets cloud in regions outside of the continental U.S. comes with significant extra barriers.

The DOD’s process for addressing those barriers was outlined in a new strategy published in May. The department shed new light to FedScoop on how exactly it will overcome the technical and resource-intensive hurdles involved in getting cloud to the so-called “tactical edge.”

“Cloud computing can help solve today’s national defense challenges, but its true potential is to solve tomorrow’s challenges,” the strategy states. “Collaboration across these domains, increasingly enabled by high-tech, software-driven solutions, must occur at the global point of need, at the tactical edge, and at the fight.”

The department wants more cloud and data storage capabilities in the areas where it operates to enable multi-domain operations: the ability to transmit data between airplanes, ground vehicles and any other platforms in battle. That will rely on expanding networks with cloud storage capabilities to turn data into actionable information. With cloud computing in the field, military operations have more of a technical backbone to support that kind of rapid data transfer and the computing power to analyze the data.

Physical challenges

The harsh environments the military often operates in present a range of challenges for computers that typically need carefully controlled setups. One solution the DOD says it is pursuing is smaller, less power-hungry machines.

“OCONUS Cloud Strategy acknowledges that space and power is limited because the locations will be hosted in U.S.-controlled military locations that are treated as U.S. soil. This approach is necessary to avoid all data sovereignty issues with a host nation,” a DOD spokesperson told FedScoop.

The hardware that forms the computing backbone also needs to be mobile as DOD is constantly shifting its operations.

“The rapid pace of advances in mobile cloud compute capabilities creates the belief that a mobile cloud could be managed like any other set of forward-deployed resources, such as planes, ships, or infantry battalions,” the spokesperson said.

Not only does the physical hardware need to be mobile, but the elasticity of data processing is critical. With unexpected surges in data possible in an environment where the military must respond rapidly, the ability for systems to expand to meet demand is one of the benefits of cloud the military wants to take advantage of.

“Similarly, the data capacity needs will be very elastic and linked to mission objectives. One of the core values of the commercial cloud is that it has freed commercial enterprises from having to plan for and purchase against the largest, but isolated, compute surges. Building on the above, as mission demands expand, and more resources are deployed in-theater, a planned expansion of data and compute capacity can accompany those units,” the DOD spokesperson said.
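The elasticity the spokesperson describes amounts to provisioning compute against current demand rather than against the worst-case peak. A toy autoscaling rule, with invented thresholds and node counts, illustrates the idea:

```python
# Toy autoscaler: grow or shrink compute nodes based on a work-queue backlog,
# instead of permanently provisioning for the largest possible surge.
# Thresholds, capacities and node counts are illustrative only.

def desired_nodes(queue_depth: int, per_node_capacity: int = 100,
                  min_nodes: int = 2, max_nodes: int = 50) -> int:
    needed = -(-queue_depth // per_node_capacity)  # ceiling division
    return max(min_nodes, min(max_nodes, needed))

print(desired_nodes(1750))  # surge: scale out to 18 nodes
print(desired_nodes(150))   # quiet period: fall back to the 2-node floor
```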

Cybersecurity challenges

Cybersecurity is also complicated when cloud hardware sits outside the borders of the U.S. DOD regulations require sensitive data to be kept on U.S. soil, as foreign internet connections and easier access to hardware present a ripe target for attackers. But the solution, DOD says, will be similar to how the department wants to protect its networks and cloud capabilities inside the U.S.

“The principles and pillars of the Department’s zero-trust strategy and reference architecture will guide the implementation of all Cloud-based services deployed by the Department regardless of physical location.  This applies to both CONUS and OCONUS deployed assets,” the spokesperson said.

Zero trust is a network architecture based on the principle that every user is granted “zero” trust and not given free rein to move about a network just because they have credentials. That framework is designed to segment networks to stop attackers who make it past the first line of defense, whether inside or outside of the U.S.
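In practice, that means every request is authenticated and authorized against the specific resource it targets, no matter where on the network it originates. A simplified sketch, with invented users, resources and policy rules, looks like this:

```python
# Simplified zero-trust check: possession of credentials alone grants nothing;
# every request is evaluated against identity, device posture and the specific
# resource being accessed. Policy contents are invented for illustration.
from dataclasses import dataclass

@dataclass
class Request:
    user: str
    device_compliant: bool
    resource: str

# Per-resource access policy: which users may reach which network segment.
POLICY = {
    "logistics-db": {"logistics_analyst"},
    "targeting-feed": {"ops_officer"},
}

def authorize(req: Request) -> bool:
    allowed_users = POLICY.get(req.resource, set())
    # Deny by default: unknown resources, unlisted users and non-compliant
    # devices are all rejected, even when credentials are otherwise valid.
    return req.device_compliant and req.user in allowed_users

print(authorize(Request("logistics_analyst", True, "logistics-db")))    # True
print(authorize(Request("logistics_analyst", True, "targeting-feed")))  # False: wrong segment
print(authorize(Request("ops_officer", False, "targeting-feed")))       # False: device not compliant
```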

All of the challenges OCONUS cloud operations face are also being met with human resources. The DOD plans to deploy teams of cloud engineers and experts into the field to set up and run the unique technology.

“As an illustration, instead of transporting hard drives back to the CONUS for processing because of bandwidth limitations, the strategy calls for deploying research teams to explore novel ways to process this data closer to the tactical edge. These focused research and development efforts are intended to lead to capability advancements that very well may necessitate a faster tech-refresh cycle,” the spokesperson said.

This story is part of FedScoop’s Special Report — The Continued Push to the Cloud.

Austin commits to $1.5B for DOD’s Joint AI Center over next 5 years

Secretary of Defense Lloyd Austin on Tuesday said he will commit $1.5 billion for the Department of Defense’s Joint Artificial Intelligence Center over the next five years.

While Congress ultimately decides what funding the JAIC will get, in recent years, it has shown willingness to appropriate funding to develop it as DOD’s AI “enabling force.” The most recent budget requests have yielded around $200 million annually for the JAIC, a number that would increase to about $300 million per year if Austin’s promised $1.5 billion is authorized by Congress.

“Done responsibly, leadership in AI can boost our future military tech advantage — from data-driven decisions to human-machine teaming. And that could make the Pentagon of the near future dramatically more effective, more agile, and more ready,” Austin said at the National Security Commission on AI conference Tuesday.

Austin said the department has more than 600 AI projects running, many more than the year prior. The JAIC has been at the center of the DOD’s AI push, at first working on individual projects but now focusing on assisting the DOD’s myriad AI offices. One of the programs the center continues to focus on is the Joint Common Foundation, an AI development platform that it eventually hopes will be the central tool developers across the department use to write code, work with data and advance AI projects of their own.

The vision for AI in the DOD revolves around “integrated deterrence,” Austin said. In essence, the idea is to weave AI tools and concepts of operation into everything the DOD does, from logistics to waging war in all domains. That includes the new framework of Joint All-Domain Command and Control, in which sensors from across the domains of battle are integrated and AI is used to make sense of data from the battlefield.

“AI and related technologies will give us both an information and an operational edge,” Austin said.

He also acknowledged that in order to achieve this type of integration, new acquisition methods will need to be used to field the rapidly changing technology. “But we know that truly successful adoption of AI isn’t just like, say, procuring a better tank,” he said.

One of the new tools in the DOD’s acquisition toolbox is the Rapid Defense Experimentation Reserve (RDER), which helps get promising tech across the so-called “valley of death” — the struggle for the Pentagon to transition and scale research-and-development efforts, particularly innovative technologies, to use on the battlefield.

“In today’s world, in today’s department, innovation cannot be an afterthought. It is the ballgame,” Austin said.