Agency leaders face a steep and costly road ahead in implementing zero-trust security
The federal government’s ambitious plans to implement zero-trust security across federal agencies, while well-intentioned, will require substantially more funding and foundational technical work than officials may have anticipated, according to agency IT executives in a new FedScoop survey.
As a result, the goal of establishing zero trust operating environments — and the ability to continuously verify users, devices, and applications accessing federal IT systems — could take much longer than federal officials envision.

Efforts to implement zero-trust security took on new urgency when increasingly sophisticated malicious cyber campaigns prompted the White House to issue an Executive Order on Improving the Nation’s Cybersecurity in May 2021. The EO, and subsequent directives from the Office of Management and Budget, tasked federal agencies with implementing a series of strategic security initiatives by the end of fiscal year 2024.
According to 177 pre-qualified federal agency IT decision-makers surveyed in May, a lack of dedicated funding, the complexity of federal IT systems, and the need to deploy solutions capable of working across siloed business units suggest that agencies will be hard-pressed to meet the administration’s timelines.
Among the survey’s key findings:
Funding squeeze – One in 4 IT decision-makers working at civilian agencies — and 1 in 5 at defense/intelligence agencies — predicted that implementing zero trust will consume 10% or more of their annual IT budgets in FY 2023 and 2024. For some agencies, that means redeploying hundreds of millions of IT dollars. Moreover, roughly two-thirds of respondents believe that at least 4% of their agency’s IT budget will need to be redirected from other initiatives to meet OMB’s zero-trust mandates, likely putting various IT modernization and maintenance efforts on hold.
Still in the starting blocks – Another significant factor constraining progress is the extent to which agencies are still actively modernizing the foundational systems required to support zero trust capabilities. A majority of respondents reported that their agency security tools were on par with standard federal security practices; however, roughly one-third revealed that their agencies remain in the early stages of supporting the five pillars of zero trust outlined by OMB around users, devices, networks, applications, and data.
Cloud’s role – The ability to deploy, automate and orchestrate the five pillars of zero trust will depend heavily on agencies’ experience using the cloud. However, 24% of respondents at civilian agencies and 34% at defense/intelligence agencies reported that only one-third or less of their agency’s mission and operations applications currently operate in the cloud. “Zero trust requires dynamic policy enforcement at scale. Suppose two-thirds of agency IT operations is still on-prem. In that case, the realities are, it will be very, very costly to enable zero trust,” said one federal agency CISO who reviewed the study’s preliminary results but wished to remain unnamed.
Key challenges ahead – Among the top technical challenges IT leaders said they still face in adopting zero trust, 4 in 10 respondents cited the interdependency and complexity of existing technology. Conflicting IT priorities and managing the growth of data pose additional challenges. Another third reported difficulty advancing security in one pillar without breaking things in another, and those challenges varied by agency size, as did recommendations for what measures would help agencies most in achieving the White House’s zero-trust goals.
Confidence in meeting deadlines varied – Just under half of civilian and defense/intelligence agency respondents voiced confidence that their agency would achieve OMB’s zero-trust security goals by the end of FY 2024. However, another 46% in both groups reported being skeptical or not confident about meeting OMB’s deadline. And roughly 1 in 4 respondents said they didn’t expect to have the underlying tools in place to manage, analyze, automate and orchestrate security controls across all five zero-trust pillars before the end of fiscal year 2025 or beyond.
The new FedScoop study, “The Quest for Zero Trust,” conducted by Scoop News Group and underwritten by Forcepoint, provides a snapshot of how agency executives describe the maturity of various components within their agencies that support the five pillars of zero-trust security. For instance:
- About 1 in 3 respondents said their agencies are still in the planning stages — or laying the foundations for — multi-factor authentication and centralized identity management capabilities.
- Another third said their agencies were at similar stages for deploying endpoint detection and response systems — which struck the CISO who reviewed the study as alarming given the importance of EDR as an essential security tool.
- Roughly 4 in 10 respondents at both civilian and defense/intelligence agencies are still in the early stages of deploying network micro-segmentation — and replacing VPNs with zero trust network access.
- While 6 in 10 respondents said their agencies have centralized access authorization capabilities for applications — and two-thirds had single sign-on — at least 3 in 10 are still getting started with dedicated app security testing and continuous authorization to operate tools.
- While a majority of respondents indicated having data encryption and data loss prevention tools on par with their peers, more than 1 in 3 said their agencies are still getting started with data tagging and tracking, data inventory and governance, and automated data flow mapping capabilities.
After reviewing the results, the federal CISO noted that although half to two-thirds of respondents reported various zero-trust security capabilities as being on par with, or superior to, those of their peers, most of those capabilities were in all likelihood only partially deployed or operational. Consequently, agency leaders may be overoptimistic and face a steeper road to implementing zero-trust security than many fully recognize.
An additional challenge agencies face has been the extended amount of time it takes to get existing security products, such as EDR and asset management tools, to operate properly at the scale many larger agencies require, according to a federal IT director who reviewed the results, but wished to remain unnamed.
“This transformation to better situational awareness and telemetry is not easy when you look at what is deployed today,” commented Dr. Nicholas Lessen, a solutions architect specializing in User, Entity and Behavior Analytics at Forcepoint’s Global Governments division. “Many agencies have started down the road of zero trust with a focus on identity security; however, further integrations and automated, dynamic responses will be necessary to deliver a comprehensive view of the relationship of who and what is on the network, associated actions, and the risks to organizations.”
Download the full report, “The Quest for Zero Trust,” for detailed findings.
This article was produced by Scoop News Group for FedScoop and sponsored by Forcepoint.
DOD, Air Force aim to ‘streamline acquisition’ with latest and forthcoming JADC2 awards
The Defense Department and Air Force recently approved more than two dozen companies to compete for opportunities to provide capabilities that will ultimately enable the military’s envisioned next-generation, sensor-driven command and control setup.
Last week, the department awarded 27 companies, nine of which are based in Virginia, spots on a multiple-award contract for capabilities across domains to enable Joint All-Domain Command and Control (JADC2).
In the past, each of the U.S. military services produced and relied on its own tactical network that was not widely compatible with those of the other services — but JADC2 is the Pentagon’s modern concept to change that. Through JADC2, DOD aims to connect sensors and shooters across all the military services and allow for quicker data-sharing to drive better decision-making across one single network as the conflict landscape evolves.
The 27 businesses mark the latest of a growing group to be selected for the Air Force’s Advanced Battle Management System (ABMS) vehicle, which is meant to underpin that overarching network to move information and fully implement command and control seamlessly across land, sea, air, space, cyberspace and the electromagnetic spectrum.
“The ABMS indefinite delivery/indefinite quantity is a contract vehicle established to compete efforts within seven different categories. The ‘up to’ ceiling amount of $950 million is established per company, and the minimum order guarantee is met upon award with a detailed company capabilities report deliverable,” Air Force Spokesperson Maj. Joshua Benedetti told FedScoop on Thursday.
Currently, 205 companies are eligible to compete for work under the larger JADC2 concept — each only in the specific categories for which they were brought on, Benedetti confirmed.
ABMS is viewed essentially as an open architecture family of systems that facilitates the operation of capabilities via multiple integrated platforms. The categories in which companies were selected to provide solutions include digital architecture, standards and concepts; sensor integration; data; secure processing; connectivity; applications; and effects integration.
Locations of performance will be determined down the line and the work is anticipated to be complete by late May 2025.
Benedetti noted that this new announcement is “not tied to a specific ABMS effort, and is not a specific task order to any of the companies.” It will add approved vendors to the overall ABMS IDIQ, he said, and all requirements are competed via the government’s fair opportunity process after the initial order.
“For example, ‘Applications’ is Category 5 on the IDIQ, so if there is a requirement for a new application team, then the Fair Opportunity process would occur to the teams on the ABMS IDIQ for Category 5,” Benedetti explained.
This structure is meant to help “streamline the acquisition process,” in the Pentagon’s view, and accelerate the maturation of innovative defense technologies.
“After the competition/evaluation period we can get a task order issued to the company faster, because they’re already approved,” Benedetti added. “This type of announcement will happen again in the future as we add additional companies to the IDIQ.”
Companies focusing on modular designs for final phase of Army’s TITAN program
With the final stage set for the Army to choose one of two contractors for its next-generation ground system to collect and disseminate sensor data, both companies are trying to develop a system that will be adaptable in a highly dynamic and unpredictable multi-domain environment of the future.
The Tactical Intelligence Targeting Access Node (TITAN) is considered a critical modernization component for the Army’s multi-domain operations (MDO) concept because it will integrate various types of data from numerous platforms to help commanders make sense of a fast-moving and complex battlefield. The Army recently awarded Raytheon Technologies and Palantir $36 million each to compete for the final phase of the contract, at which point there will be a downselect to one company.
There will be four opportunities for the companies to present their capabilities to soldiers during that 14-month process, company officials said, with troops providing critical feedback to further build upon.
“What’s different this time is that the Army is very engaged in a sense that from a teaming perspective, which is great, that when we go through these review demos every quarter that we’re going in together on this. And it’s more about developing and improving,” Scott McGleish, Raytheon Intelligence and Space’s lead on TITAN, told FedScoop in an interview. “I think you have to have that operational view as well … It’s very good the way it’s being done this way. We’re really happy with how the Army’s agreed to and moving forward.”
A senior engineer with Palantir, who was not authorized to speak on the record, indicated similar sentiments, noting the input from soldiers will help provide the Army something that will be ready to go for low-rate production and quick fielding at the end of the 14-month period.
To date, these soldier touchpoints have been both in lab-based environments — where the soldiers can see the laptops, interfaces and feeds — and exercises where they have the opportunity to validate the technology in an operational context.
In this final phase, McGleish said Raytheon will be working to mature the user interface, which ingests data and intelligence from various sensors across the Army and Department of Defense.
Palantir wants its system to be as modular as possible so the capability can evolve.
“We’re trying to make it as modular of an approach as possible and also to make sure that the system and the platform overall can mature alongside with the Army as advances in AI occur,” the engineer said. “It’s going to be very different considering what the fight might be in 2028 and what the current state of technology might be in 2028 or 2030, when the Army hopes to be MDO capable or MDO ready. We want to make sure that we’re going to be building this modularly as we go along.”
The next phase will also focus on further ingraining artificial intelligence and machine learning. This will help narrow down the right feeds of data from various systems and sensors at the right time for the processing, exploitation and dissemination of intelligence.
“You’re talking about, like six or seven ELINT [electronic intelligence], GEOINT [geospatial intelligence], SIGINT [signals intelligence], etc., and coming from space down to ground. All that — those feeds coming in and all the data that comes in on those — are labeled differently,” McGleish said. “The artificial intelligence and machine learning is going to help narrow that down, process it better so you don’t need an army of intel analysts. You need a few intel analysts in there. So this is what this is about. It speeds up the OODA [decision-making] loop and so the commander can make a decision to put some kind of effect” on targets.
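Abstracted away from any real intelligence system, the narrowing McGleish describes amounts to scoring and ranking labeled data streams so that analysts see only the most pertinent ones. A toy sketch of that idea follows — every feed name, label and relevance score here is invented purely for illustration, and a fielded system would derive relevance from a trained model rather than static numbers:

```python
# Toy illustration of prioritizing differently-labeled intelligence feeds.
# All feed names, types and relevance scores are hypothetical examples.

feeds = [
    {"name": "sat-elint-1", "type": "ELINT", "relevance": 0.91},
    {"name": "uav-geoint-3", "type": "GEOINT", "relevance": 0.42},
    {"name": "ground-sigint-7", "type": "SIGINT", "relevance": 0.78},
]

def prioritize(feeds, threshold=0.5):
    """Keep only feeds scoring above a relevance threshold, highest first,
    so a small number of analysts can focus on the most pertinent streams
    instead of triaging everything manually."""
    kept = [f for f in feeds if f["relevance"] >= threshold]
    return sorted(kept, key=lambda f: f["relevance"], reverse=True)
```

With the sample data above, `prioritize(feeds)` drops the low-relevance GEOINT stream and surfaces the other two in priority order — a crude stand-in for the model-driven filtering the quote envisions.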
Both Raytheon and Palantir officials explained the need to have open systems and work with others to ingest that data given the Army doesn’t own it.
“The Army again, has done a great job of communicating the overall concept of operations with our mission partners, then also supporting the technical exchange meetings or the TEMs with the partners as well too, so that they can keep tabs on the development of our program as it moves along,” the Palantir engineer said.
TITAN will also aid in the Army’s pursuit of long-range precision fires, which could also include electronic warfare or cyber weapons, McGleish said. In order to fire across thousands of miles, the Army needs to see across thousands of miles.
“You got to think about the peer-to-peer threats and the thousands of targets you’re going to have to deal with,” he said.
Top Marine expects unmanned technology to spur change in designs of amphibs
The development and fielding of new unmanned systems will likely lead to changes in how the Marine Corps’ next generation of amphibious ships are designed, the service’s top officer said Thursday.
Future amphibs will need to be able to launch and recover a variety of robotic platforms — including aerial drones and surface and subsurface vessels — that the U.S. military intends to field, Commandant Gen. David Berger noted at an event hosted by the Hudson Institute when asked for thoughts on what a future “LXX” amphib might look like.
“We’re going to use the amphibious ships we have right now in ways we have not used them in the past. Think unmanned. Okay, now think if you designed a ship that was designed with unmanned [platforms] in mind, what would that look like? Probably a little bit different,” he said.
If “you had a clean sheet of white paper here … what would that vessel look like? Probably different than what we have right now. So my point of departure is not the vessel we have right now [or] the next best version of it — it’s how do we think we’re going to need to operate in the future? What would that look like? My guess is more of them, [but] smaller” than today’s amphibs, he added.
Berger envisions amphibs as part of a larger network of launch and recovery sites for unmanned systems. But they don’t necessarily have to function as “motherships” for drones or other robotic platforms.
“To date, we primarily thought of amphibious ships somewhat like [aircraft] carriers where you leave the mothership and you come back to the mothership. We need to look at them as they’re part of a whole network of portable airfields, plus the fixed ones and harbors. So if we’re going to launch these unmanned platforms, they don’t need to come back to the same ship. They could go ashore, they could go to somebody else’s ship, they could go to an Australian ship,” he said.
Similarly, allied militaries could potentially land their drones on U.S. amphibs, he noted.
Not having to return to their launch sites could enable robotic systems to operate farther away from their point of departure.
“We can actually extend these ranges if we crack our minds open a bit,” Berger said.
Drones could also potentially refuel other drones, further extending their range. “Technology wise, we’re not far” from being able to do that, Berger said.
OMB working to develop system for real-time zero trust scoring
The Office of Management and Budget is working to develop a system that generates trust scores for users before allowing them to access its network or applications, according to the chief information security officer of its Management and Operations Division.
Speaking during an ATARC webinar Thursday, Dan Chandler said the idea is to use all the network information at OMB’s disposal to alert a user when their trust score isn’t high enough in real time — rather than simply reject their request.
The Cybersecurity Executive Order issued in May 2021 accelerated agencies’ efforts to implement zero-trust security architectures, but funding and expertise for systems like the one OMB envisions remain scarce.
“System may be too strong a word,” Chandler said. “This is an idea that we’re starting to develop.”
The comments came after Federal CIO Clare Martorana told FedScoop last month that OMB aspires to implement new trust measures as it works to improve security and customer experience.
Agencies use tools like Google Authenticator and others from Amazon Web Services and Microsoft Azure to authenticate users, but trust in them changes depending on current events. If a zero-day vulnerability is found in one of those services, trust in it may drop a certain percentage, Chandler said.
If implemented, OMB’s desired system would compare a session’s trust score to the trust requirement of a function or feature. If a user’s score is too low to grant access, a list of options for raising their score — like reauthenticating or inputting a personal identity verification card — might even be provided, Chandler said.
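In the abstract, the gating Chandler describes — combine the trust of the factors in a session, compare the result to a per-function requirement, and offer score-raising options instead of a flat denial — could be sketched as follows. This is a minimal illustrative sketch only; the factor names, weights and thresholds are hypothetical and are not details of OMB’s actual system:

```python
# Illustrative sketch of real-time trust-score gating as described in the
# article. All factor names, weights, penalties and thresholds are
# hypothetical assumptions, not OMB's real values.

BASE_TRUST = {"google_authenticator": 0.6, "piv_card": 0.9}

# Trust in an authenticator can be discounted when, say, a zero-day
# affecting it is disclosed — the "current events" adjustment Chandler cites.
PENALTY = {"google_authenticator": 0.0}

def session_score(factors):
    """Combine the trust of each factor used in the session, capped at 1.0."""
    score = sum(BASE_TRUST[f] - PENALTY.get(f, 0.0) for f in factors)
    return min(score, 1.0)

def evaluate(session_factors, required):
    """Compare the session's score to a function's trust requirement.

    Rather than simply rejecting the request, a denial returns the unused
    factors as step-up options the user could add to raise their score."""
    score = session_score(session_factors)
    if score >= required:
        return {"allowed": True, "score": score}
    step_ups = [f for f in BASE_TRUST if f not in session_factors]
    return {"allowed": False, "score": score, "step_up_options": step_ups}
```

Under these made-up weights, a session authenticated only with an authenticator app would fall short of a 0.8 requirement and be told that adding a PIV card could raise its score — mirroring the step-up behavior Chandler outlined.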
The Department of Commerce is also interested in evaluating the trust of users and devices, but network evidence isn’t feeding into and informing its zero-trust architecture yet.
“We’re just not there yet because the investments haven’t come through,” said Lawrence Anderson, deputy chief information officer at the Department of Commerce. “But at some point we’re going to need some advanced tools to get to that advanced level of zero trust that we want to get to.”
Meanwhile, the General Services Administration is working on another authentication solution that is expected to cost slightly less than Login.gov.
OMB has run the MAX.gov system, which performs authentication using PIV cards, for years. Agencies use MAX.gov for their budget systems and other use cases.
“MAX.gov is being transitioned to GSA,” Chandler said. “So by the end of next year GSA is supposed to have stood up an alternative solution which, as I understand it, is going to be based on Azure Active Directory.”
Pentagon’s IG contemplates move to the cloud via new solicitation
The Defense Department’s primary oversight arm is conducting market research to determine what exactly a move to the cloud would mean for — and bring to — its organization.
According to a two-page request for information (RFI) published this week, the DOD’s Office of the Inspector General is interested in commercial cloud solutions that could secure its particularly sensitive information and workloads, in an environment that’s isolated from the department’s various other cloud tenants.
Acting Principal Deputy Inspector General Steven Stebbins briefed FedScoop on the defense watchdog’s possible journey to the cloud and how the RFI should be viewed against the backdrop of the DOD’s ongoing Joint Warfighting Cloud Capability (JWCC) pursuit.
“Right now we haven’t made any major decision to move to the cloud. Our environment is currently housed in [Defense Information Systems Agency] data centers, but obviously, this is a fast-moving area. It’s evolving, and we want to have all the information that we need, so if we do decide to make a decision down the road — and potentially move to the cloud — we’re fully informed,” Stebbins explained.
The Army veteran, who also currently serves as the OIG’s chief of staff, noted that the office “is in a good place now” when it comes to its computing infrastructure, but needs to prepare for whatever the future might bring.
“So that’s really all that this [RFI] is about,” Stebbins said.
In order to host the Pentagon’s data, cloud service offerings must be compliant with a number of government regulations and categorized into one of several impact levels, which are based on the sensitivity of that data.
In the RFI, OIG officials note their intent to learn more about cloud solutions that are approved to house information at DOD impact level 5 (IL5) — or the most sensitive level of unclassified information — and connect to the Non-classified Internet Protocol Router Network (NIPRNet). They confirm in the document that “the OIG is anticipating to operate in a hybrid environment with a phased move of key applications to the cloud over time.”
Specifically, the office wants to hear from other federal entities that have transitioned to the cloud about cost savings and benefits they’ve realized in doing so. Other responders beyond the government are also invited to weigh in on how the technology could impact the IG.
“I think our requirements are kind of unique because of our position within the department, so it will just be interesting to see what others with similar requirements are doing,” Stebbins said.
Responses to this RFI are due by July 19.
As it’s only “just beginning to gather information,” Stebbins said the OIG does not currently have a timetable for next steps. Once insights are collected through the RFI, they’ll be shared with the Chief Information Office and other key technology players. And as with any RFI, there’s no obligation to move forward with a procurement.
“We don’t want to be reactive and find ourselves in a position where the department goes one direction and we have limited time to make a change to our current arrangements,” Stebbins said. “We want to anticipate that and then be prepared to make timely decisions.”
Notably, the OIG is pursuing this research as the broader DOD is preparing to award its highly anticipated multi-vendor cloud contract vehicle, JWCC. Google, Oracle, Microsoft and Amazon Web Services are currently competing for awards on that vehicle, which is worth billions and replaced the JEDI contract after the DOD’s original and years-long enterprise cloud competition was ultimately canceled.
“While we certainly are aware of what the department is doing, our requirements are our requirements — and I don’t know that what we’re doing here really is part of that conversation, at least at this point,” Stebbins said.
If the OIG does move forward with implementing its own cloud before JWCC comes to full fruition, it’s unclear right now whether it will consider other cloud vendors beyond those four competing for the bigger DOD contract mechanism.
“I really can’t say at this point. We’re gathering information. We’ll see what happens in the future,” Stebbins said. “So, we’re open to whatever options might be.”
Army looks to advanced tech to modernize its medical capabilities in new strategy
As the Army charts a course to modernize its force by 2035 with a focus on multi-domain operations, the service is also looking to fundamentally transform its medical capabilities to take advantage of and move at the pace of technological advancements.
Army Futures Command on Thursday published the Army Medical Modernization Strategy to guide the service’s transformation into a “semi-autonomous, integrated, networked capability” that will support the future of conflict envisioned in its 2019 Army Modernization Strategy.
The 22-page document is meant to guide the Army’s requirements, priorities and direction of medical modernization efforts for 2035 and beyond. A key assumption in the Army’s development of the strategy, among others, is that there will be vast developments over the next decade-plus in medical and non-medical technologies that will affect the Army’s future medical capabilities.
The strategy explains that the Army aims to transform its health system “through 2035 to be a more adaptable medical force capable of harnessing, integrating, and utilizing future technology on the battlefield to save Soldiers.”
The Army Health System’s (AHS) current acquisition and modernization processes are antiquated and unable to keep pace with the current threat environment, Lt. Gen. James M. Richardson, acting commanding general of Army Futures Command, says in an introductory message for the strategy.
“Since the last transformation with Air Land Battle over 40 years ago, Army medicine has continuously placed new technology on top of existing doctrine,” Richardson says. “This is no longer adequate. Modernization must be baked in, not bolted on; evolving doctrine to the pace of proven technologies and treatment modalities.”
The future envisioned in the strategy is one where Joint All-Domain Command and Control (JADC2) is pervasive and the formations, commanders and others operating in disparate warfare domains are widely connected. And not only are networks powering the widespread sharing of data to support things like advanced analytics and artificial intelligence for better situational awareness, but the Army also expects by then it will have comprehensively fielded autonomous systems and vehicles and other emerging capabilities, like the Integrated Visual Augmentation System (IVAS) platform, to aid in medical care.
“By 2035, the [Army Health System] will transform its organizations into modernized, tailorable and scalable [multi-domain operations, or MDO]-capable formations that are strategically positioned and able to leverage national-level capabilities and authorities,” says the strategy. “The AHS MDO force will combine tailored integrated formations of networked manned and unmanned platforms, sustainment, communications, intelligence, and protection capabilities from the individual to theater.”
The AHS must have the flexibility and capability to quickly adapt to novel injuries and threats in the future operating environment, the document notes.
“The ability to establish ‘care webs’ that allow for vertical, horizontal, and digital synchronization and integration for the care of the wounded, ill and injured will be critical to ensure the AHS provides the quickest, most efficient and appropriate care to our soldiers on the battlefield and beyond,” the strategy says.
While humans are at the center of the strategy, there is a great deal of emphasis on human-machine teaming. Medical formations will “leverage advanced robotics, AI, and optionally-manned systems with humans in- or on-the-loop to enable decision making to inform advanced clinical care and prioritize evacuation,” the document says. “These technologically advanced systems will move casualties to the medic, aid the medic in treatment and movement of casualties, or serve as an evacuation platform with autonomous or human-provided care.”
In anticipation of this future state, as well as the corresponding rapid evolution of medical threats, the Army says it must look beyond the 2030 time frame and begin investing in research on disruptive technologies that could drastically change how it operates and its treatment modalities. The strategy calls for the AHS to prioritize, develop and capitalize on rapid advancements in medical innovation and disruptive technologies.
The strategy points to six “disruptive research priority areas” that it will invest in through 2035: human intelligence, bio and human enhancement technology (BHET), data-AI-biotechnology, synthetic biology, additive manufacturing, and quantum technology.
“These areas directly nest with the [strategy] and address both required capabilities and capability gaps,” the document says. “We must realign and focus resources in these areas to ensure that the AHS will keep pace with operational advancements and expand treatment modalities to support the future force.”
Officials from Army Futures Command were unable to comment on the strategy prior to the publication of this story.
Department of Veterans Affairs working to end of July timeline for CISO appointment
The Department of Veterans Affairs is seeking to appoint a new permanent chief information security officer by the end of July, FedScoop understands.
Sources familiar with the hiring effort told FedScoop the agency is working to the timeline as it pushes ahead with a months-long search for an IT cybersecurity leader.
The VA has interviewed candidates for the job in recent weeks, who are understood to include acting CISO Lynette Sherrill.
The VA has sought to appoint a new chief information security officer following the departure of Paul Cunningham in February. Since his exit from the agency, the duties of CISO have been carried out on an acting basis by Enterprise Command Operations IT leader Lynette Sherrill.
Among the challenges facing the incoming CISO at the VA will be responding to concerns over the pace at which the department addresses cybersecurity concerns.
At a House committee hearing last month, the VA’s OIG highlighted that the VA’s fiscal year 2021 Federal Information Security Modernization Act (FISMA) audit showed “limited progress.”
Giving evidence to House lawmakers in the same hearing, VA CIO Kurt DelBene said his agency was working as quickly as possible to appoint a permanent CISO.
A VA spokesperson declined to comment.
Watchdog finds just two DOJ agencies adhering to supply chain risk requirements
Only two agencies within the Department of Justice have followed supply chain risk requirements over the last six years, according to an agency watchdog.
In a report published Thursday, the DOJ’s Office of the Inspector General found that only the Bureau of Alcohol, Tobacco, Firearms and Explosives (ATF) and the Drug Enforcement Administration (DEA) were compliant with cyber-supply chain risk management (C-SCRM) guidelines intended to ensure IT purchases do not introduce vulnerabilities into government networks.
“We assessed C-SCRM compliance by several of the largest non-FBI DOJ components … [and] concluded that only ATF and the DEA were compliant with the JMD C-SCRM requirements, including submitting applicable IT purchases for a C-SCRM review,” the watchdog said.
The IG’s audit generally covered the DOJ’s supply chain management activities from October 2016 through January 2022.
Supply chain risk within federal agencies’ IT procurement processes has received heightened scrutiny since the 2020 SolarWinds attack, in which compromised software supply chains were used to breach cybersecurity defenses and steal information across government and the private sector.
At the Department of Justice, that cyber breach resulted in the exposure and presumed theft of unclassified information from approximately 3% of email accounts across the agency.
In its report, the DOJ IG also found that the Justice Management Division had just one individual tasked with overseeing its supply chain risk management program.
“Overall, JMD lacked the personnel resources necessary to effectively manage this critical program. JMD needs to provide communication, outreach, and training to Department components and develop procedures to periodically assess their efforts,” the IG added. “Without such efforts, C-SCRM controls could be bypassed and high-risk IT could be installed without JMD authorization or a risk mitigation plan.”
In May, the National Institute of Standards and Technology published updated guidance meant to help agencies and organizations protect against cyberthreats in the supply chain, a major focus of the Biden administration’s cybersecurity executive order last year.
The revised publication on cybersecurity supply chain risk management gives acquirers and users of software and other technologies key practices, processes and controls to consider as they look to protect against threats that can emerge from the tangled web of global suppliers and manufacturers from which companies develop technology products.
DC-QNet consortium director shares new details about plans for quantum network testbed
Six government agencies based around Washington, D.C., and two out-of-region affiliates recently launched a new consortium to jointly create — and ultimately connect through — an ultramodern quantum network testbed. However, other organizations, including from the private sector, may also be given the opportunity to conduct innovative experiments with the technology down the line, the organization’s executive director told FedScoop.
Through the newly unveiled Washington Metropolitan Quantum Network Research Consortium (DC-QNet), eight federal entities will contribute to a host of scientific and technical pursuits necessary to implement a functional quantum network for the U.S. government and Department of Defense.
Quantum information science (QIS) is a buzzy, emerging field that the U.S. government and its competitors have been increasingly prioritizing. The discipline seeks to apply phenomena associated with quantum mechanics to process and transmit information. Quantum networks are elements of QIS envisioned to one day provide the ability to securely distribute and share data among quantum computers, clusters of quantum sensors and other devices.
Agencies involved in DC-QNet include the Army Research Laboratory, Naval Research Laboratory, Naval Observatory, National Institute of Standards and Technology (NIST), National Security Agency (NSA) and NASA. The Naval Information Warfare Center Pacific and Air Force Research Laboratory are also involved as out-of-region affiliates.
“This is an example of [federal] organizations with different missions working together on common scientific challenges that benefit all of their missions,” DC-QNet Executive Director Gerald Borsuk told FedScoop on Wednesday. “The synergism of their contributions will enable advances that none of them separately could achieve in an efficient manner.”
Principal investigators from the various government components will each steer tasks aligned with specific technical goals. Pursuits will include advancing metrology needed to operate a quantum network, infrastructure development, network simulation, and the implementation of novel capabilities and devices to enable this technology.
The door might later be opened to other organizations to participate in the initiative, Borsuk told FedScoop.
“Once the basic science and network configuration is established and experimental proof of key components and concepts are performed, other organizations including from [the public and private sectors and academia] may be given the opportunity to conduct innovative experiments on the DC-QNet,” he said.
Quantum networks will likely be essential to state-of-the-art secure communications and computing in the decades to come, he noted. They work by exploiting quantum-entangled particles, such as photons, to move information in the form of qubits — the basic unit of information in QIS technologies.
Quantum entanglement refers to a unique property of atomic and subatomic particles that isn’t completely explained by classical physics. It’s essentially the relationship between such tiny particles where the quantum state of each can’t be described independently of the state of the others — even though they are physically apart.
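That correlation can be illustrated with a toy calculation. The short Python sketch below (illustrative only, not related to DC-QNet’s actual work) models the simplest entangled pair, a two-qubit Bell state, and samples joint measurements: each shot returns 00 or 11 but never 01 or 10, so the two qubits always agree even though neither value is fixed in advance.

```python
import random

random.seed(0)

# Two-qubit Bell state |Phi+> = (|00> + |11>) / sqrt(2).
# Amplitudes for the four basis states |00>, |01>, |10>, |11>:
amps = {"00": 2 ** -0.5, "01": 0.0, "10": 0.0, "11": 2 ** -0.5}

# Born rule: the probability of observing a state is its squared amplitude.
probs = {state: a * a for state, a in amps.items()}

# Sample 1,000 joint measurements of both qubits.
shots = random.choices(list(probs), weights=probs.values(), k=1000)

# Only the perfectly correlated outcomes 00 and 11 ever occur.
print(sorted(set(shots)))  # ['00', '11']
```

Real quantum networking involves physically distributing such entangled states between distant nodes, which is precisely the hard engineering problem the consortium is tackling.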
“The successful demonstration of controlled quantum entanglement distribution amongst three physically separated nodes represents a key outcome” the DC-QNet aims to achieve, noted Borsuk, who also serves as the associate director of research for the Naval Research Lab’s Systems Directorate.
The consortium’s roots
Interest in quantum networking has blossomed in recent years, Borsuk said.
In February 2020, the Energy Department hosted a quantum networking workshop in its New York City offices, which many scientists and engineers attended — including Borsuk.
Shortly thereafter, it came to light at a Naval Research Lab meeting that dark fiber — or fiber not being used for traditional telecommunications and with no optical-electronic-optical interfaces — connected several government laboratories that had ongoing research in QIS, and specifically quantum networking. Those discussions eventually led to a virtual workshop in November 2020 organized by NIST.
“The workshop included participation by all the U.S. government laboratories performing quantum research in the Washington Metro area,” Borsuk noted.
Through the workshop, the experts involved defined their areas of mutual research interest that could be key to implementing a quantum network.
“Given the excitement around these activities, the executive technical leadership of these organizations started meeting monthly, and DC-QNet evolved into the current organization,” Borsuk said.