OMB to hold listening session with industry on software security self-attestation
The Office of Management and Budget will shortly take feedback from industry on some of the language it plans to use in new cybersecurity self-attestation requirements for software vendors, according to a senior official.
Speaking at the Fortinet Federal Security Transformation Summit hosted by FedScoop, Senior Director for Cybersecurity and Emerging Technology on the National Security Council Steve Kelly said the White House is focused on working collaboratively with software providers as it introduces the new standards.
He said: “OMB is working closely with agencies to ensure a consistent approach to implementation, and [we] plan to soon host a listening session with software makers and other interested parties to continue to take their feedback on some of the language.”
Kelly added that OMB is finalizing the details of minimum cybersecurity requirements for vendors, which will likely be published around January next year.
It comes after the White House in September issued a memo requiring federal agencies to obtain self-attestation from software providers before deploying their software on government systems.
According to OMB’s September memo, federal departments must ensure that all third-party IT software deployed adheres to National Institute of Standards and Technology supply chain security requirements and get proof of conformance from vendors.
Kelly stressed also that OMB wants to work closely with industry to ensure that the process for adopting the new standards runs smoothly.
He said: “For software makers unable to attest to one or more of the required security practices, they can submit a plan of action and let us know how they are working to meet the requirements.”
VA drops supply chain management IT system, hunts for new solution
The Department of Veterans Affairs said Tuesday that it will stop using a supply chain management IT system after Congress and the VA’s Office of Inspector General questioned the system’s effectiveness and cost.
The agency will end use of the Defense Medical Logistics Standard Support (DMLSS) system, which is a local server-based application that supports internal medical logistics at military hospitals or clinics, including in war zones.
In procurement documents on SAM.gov, the department said that it will now seek a new supply chain solution that must operate in the VA’s technical production environment, either in the VA cloud or in another FedRAMP-certified cloud.
“As the largest integrated healthcare system in the country, our supply chain logistics solution must meet the needs of the 1,298 medical facilities in our network and the millions of veterans that we serve—and this transition will help us do exactly that,” said Michael D. Parrish, VA’s chief acquisition officer.
In February under pressure from lawmakers, the VA said it would take a second look at the DMLSS contract to determine if it was the right fit for the agency, and said it was considering other options.
Pressure to drop the DMLSS contract has been building since the VA’s Office of Inspector General (OIG) released a report in November 2021 that found failures in VA’s pilot project to deploy the DMLSS system at the Captain James A. Lovell Federal Health Care Center in North Chicago, Illinois.
The OIG report found the DMLSS system did not meet 44% of the high-priority business requirements identified by Lovell hospital staff as essential to their operations.
To create a supply chain infrastructure that improves the veteran experience, VA told reporters Tuesday that it will cancel future DMLSS deployments. The agency said it will work with the Defense Health Agency (DHA) to modify the current agreement and allow the VA to continue to fund joint operations at Lovell hospital.
The VA said it will establish the new Office of Enterprise Supply Chain Modernization in the coming months to oversee its supply chain transformation effort. The agency expects a new supply chain logistics solution contract by 2023.
Machine-learning models predicted ignition in fusion breakthrough experiment
Lawrence Livermore National Laboratory’s machine-learning models predicted the historic achievement of fusion ignition a week before the successful Dec. 5 experiment.
The National Ignition Facility’s design team fed the experimental design to the Cognitive Simulation (CogSim) machine-learning team for analysis, which found the resulting fusion reactions would likely create more energy than was used to start the process, leading to ignition.
LLNL’s laser-based inertial confinement fusion research device, the size of three football fields, fired 192 laser beams that delivered 2.05 megajoules of ultraviolet energy to an almost perfectly round diamond fuel capsule, producing 3.15 megajoules of fusion energy and achieving ignition in a lab for the first time. The achievement strengthens U.S. energy independence and national security at a time when nuclear testing is prohibited, and CogSim machine-learning models helped ensure the experiment avoided the previous year’s pitfalls.
“Last week our pre-shot predictions, improved by machine learning and the wealth of data we’ve collected, indicated that we had a better than 50% chance of exceeding the target gain of 1,” said LLNL Director Kim Budil, during a press conference at the Department of Energy on Tuesday.
NIF’s design team benchmarks complex plasma physics simulations and analytical models against experimental data collected over 60 years to create designs that will reach the extreme conditions required for fusion ignition. The most recent experiment reached pressures two times greater than the Sun’s and a temperature of 150 million degrees.
CogSim may run thousands of machine-learning simulations of an experimental design in the lead-up to a shot.
“We have made quite a bit of advancements in our machine-learning models to kind of tie together our complex radiation hydrodynamics simulations of the experimental data and learning,” said Annie Kritcher, principal designer.
But because NIF’s Aug. 8, 2021, experiment reached the threshold for ignition and September’s experiment paved the way for a new laser capability, the design team relied on traditional methods for the latest experiment and used machine learning only for predictions.
For this experiment, the design team thickened the fuel capsule to widen the margin for success and burn more fuel. It also used improved models to increase the symmetry of the implosion, transferring more energy between laser beams in the second half of the laser pulse and readjusting the first half.
Kritcher credited those changes for the experiment’s success, though she called capsule defects, which are tougher to model and predict, the “main driver” in performance. While the diamond capsule is 100 times smoother than a mirror, X-ray tomography must be used to see, measure and count defects — generating a lot of data that software now helps analyze.
The robust capsule employed in the most recent experiment was not the most effective option, meaning future experiments should see improved performance, said Michael Stadermann, Target Fabrication program manager.
Firing the laser required an additional 300 megajoules of energy pulled from the power grid, which highlights an important point about the NIF: It’s a scientific demonstration facility, not an optimized one.
“The laser wasn’t designed to be efficient,” said Mark Herrmann, LLNL Weapons, Physics and Design program director. “The laser was designed to give us as much juice as possible to make these incredible conditions possible.”
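As a rough sanity check on the figures reported above (2.05 MJ of laser energy delivered, 3.15 MJ of fusion yield, and roughly 300 MJ drawn from the grid to fire the laser), the target gain and the end-to-end energy picture work out as follows:

```python
# Figures reported for the Dec. 5 NIF shot.
laser_energy_mj = 2.05   # ultraviolet energy delivered to the capsule
fusion_yield_mj = 3.15   # fusion energy produced
grid_energy_mj = 300.0   # approximate energy drawn from the grid to fire the laser

# Target gain: fusion yield relative to laser energy on target.
target_gain = fusion_yield_mj / laser_energy_mj
print(f"target gain: {target_gain:.2f}")  # 1.54, i.e. above the gain-of-1 threshold

# End-to-end, the yield is still a small fraction of the grid draw,
# underscoring that NIF is a demonstration facility, not an optimized one.
wall_plug_ratio = fusion_yield_mj / grid_energy_mj
print(f"yield vs. grid draw: roughly {wall_plug_ratio * 100:.0f}% of one percent short of... ")
```

The first ratio is the "gain of 1" threshold Budil referenced; the second shows why an efficient laser, not more physics, is the gap to practical power.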
The NIF is more than 20 years old, and some of its technology dates back to the 1980s.
New laser architectures, target fabrication methods, materials, computation and simulations, and machine learning have federal officials optimistic the U.S. can achieve President Biden’s goal of a commercial fusion reactor within the decade.
DOE invested $50 million in a public-private partnership in September around fusion pilot power plant designs, but Budil said such a plant is likely still four decades away without “concerted effort and investment” on the technology side.
The private sector invested $3 billion in fusion research last year, and DOE is partnering with the White House Office of Science and Technology Policy to map out a vision for commercial fusion with zero-carbon energy powering homes, cars and heavy industry.
To its credit, the Biden administration proposed the biggest research and development budget in U.S. history, and recent investments enabled LLNL’s latest achievement.
“I think this is an amazing example of the power of America’s research and development enterprise,” said OSTP Director Arati Prabhakar.
Government leaders tout big wins for their missions with AI, ML and cloud tools
Public sector organizations are making big strides supporting their missions by applying artificial intelligence, machine learning, analytics, security and collaboration tools to their initiatives.
That’s according to government executives from the U.S. Army, U.S. Postal Service and the State of New York who joined Google leaders on stage for the opening keynote at the Google Government Summit in Washington, D.C. on November 15.
From both a warfighter perspective and a user experience perspective, the U.S. Army “needs data for decision-making at the point of need” with “the right tools to get the job done” across a diverse set of working conditions, explained Dr. Raj Iyer, chief information officer of the Department of the Army.
During the event, Iyer shared that Google Workspace will be provisioned for 250,000 U.S. Army soldiers. The first 160,000 users migrated to Google Workspace in just two weeks, with plans for the remaining personnel to be up and running by mid-2023. Google Workspace was designed to be deployed quickly to soldiers across a variety of locations, jobs and skill levels.
Thomas Kurian, CEO for Google Cloud, also took the stage and expressed Google’s “deep commitment” to providing products and solutions that are mature, compliant and meet government’s mission goals.
“In the last four years, we’ve really heightened our work for the government…in the breadth of our products that are focused as solutions, and significantly ramped up our compliance certifications to serve agencies more fully. And we culminated that by launching Google Public Sector, the only division that Google has in the whole company dedicated to a single industry,” Kurian explained.
Though cloud was once viewed mainly as a way to buy elastic compute economically, what makes Google Cloud competitive against other providers is its ability to offer solutions for different needs as the nature of cloud computing evolves, said Kurian.
“Organizations want to get smarter to make decisions, combining both structured and unstructured data. And they want to be able to do analysis no matter where the data sits — whether it’s in our cloud or other clouds. We are the only cloud that lets you link data and analyze it across multiple clouds, structured and unstructured, without moving a single piece of data.”
Cybersecurity was also a key concern raised during the keynote, namely the need to simplify security analysis tools so cyber experts can detect threats faster.
“Protecting governments isn’t just something for extraordinary times. The business of government requires constant vigilance,” said Karen Dahut, CEO of Google Public Sector, the company’s independent division focused solely on the needs of federal, state and local government and the education sector.
She cited the success of the New York City Cyber Command, which works across city government to detect and prevent cyber threats. It is accomplishing this “by building a highly secure and scalable data pipeline on Google Cloud so their cybersecurity experts can detect threats faster.”
Google has also recently strengthened its ability to help its customers access data on known threats with the recent acquisition of Mandiant. Kevin Mandia, CEO and director for Mandiant, now a part of Google Cloud, took the stage to explain how the company has been uniquely positioned to “own that moment of incident response” and threat attribution. This has given the company an immense collection of data on cyber incidents and intrusion techniques.
“When Mandiant and Google combined,” he explained, “we took the security DNA of Mandiant…and joining — what I believe is the best AI on the planet, best machine learning on the planet, best big data on the planet — and we’re bringing what we know [about cybersecurity] to scale.”
The keynote featured several seasoned technology leaders who each shared how cloud, artificial intelligence and machine learning tools are helping their agencies achieve mission outcomes and keep pace with cybersecurity needs, including:
- Pritha Mehra, CIO and Executive VP, United States Postal Service
- Rajiv Rao, CTO and Deputy CIO, New York State
- Teddra Burgess, Managing Director, Federal Civilian, Google Public Sector
- Leigh Palmer, VP, Delivery and Operations, Google Public Sector
Watch the keynote in its entirety on the Government Summit On-Demand page. This article was produced by Scoop News Group for FedScoop and underwritten by Google Cloud.
Federal execs see long journey to achieve zero trust
Less than two years remain for government agencies to meet major milestones of the Office of Management and Budget’s zero-trust security mandates. A new survey of federal IT and program leaders, however, suggests that while agency executives are focused on zero-trust practices, more than half (55%) of those polled acknowledge their agencies are still “assessing” their zero-trust gaps or have only a “baseline” of capabilities in place.
And while 35% of respondents say their agencies have intermediate or advanced zero-trust capabilities in place — based on definitions outlined in a widely referenced maturity model issued by the Defense Department — agencies appear broadly under-equipped and underfinanced to meet the administration’s mandates, according to the findings.

The survey, completed by 191 prequalified federal CIOs, IT and security managers, and program officials in September and October, found that roughly two in three executives at small and mid-sized agencies — and a little over half at large agencies, based on employee counts — believe their agency “will receive incremental funding for zero-trust work in their fiscal year 2024 budget.”
However, six in 10 respondents also said they were moderately or highly concerned that “other high-priority IT projects will suffer in FY2024 due to the need to reallocate resources to meet OMB’s and/or DOD’s zero-trust objectives.”
The findings reflect the views of a broad base of federal IT leaders, with 63% from civilian agencies and 37% from defense agencies. One-third (32%) worked at agencies with fewer than 5,000 employees; 28% at agencies with 5,000 to 10,000 employees; and 39% at agencies with more than 10,000 employees.
Among the survey’s key findings:
Zero trust clearly has agencies’ attention. Four in 10 executives (39%) say they are “fully familiar” — and another 47% “generally familiar” — with the core objectives outlined in OMB’s M-22-09 memo or DOD’s latest zero-trust reference documents. Well over half say their agency has created a budget line for zero-trust work. And more than six in 10 say their department or agency has designated an individual to lead zero-trust implementation.
Visibility and skills gaps remain. Roughly half of the executives at small and large agencies — and about six in 10 at midsize agencies — say their agency’s senior executives have “full visibility of the gaps that must be closed to achieve zero-trust mandates.” However, based on FedScoop discussions with federal CISOs, agency executives may be over-optimistic about what is required to actually implement zero-trust practices. The survey, for instance, found that roughly a quarter of respondents were “not very” or “not at all” confident that their agency had the requisite skills to assess the security requirements associated with the seven key pillars of zero trust: enterprise-wide control over user identity, devices, network environments, applications, data, visibility/analytics, and automation and orchestration.
The value of assessments. At the same time, nine in 10 respondents acknowledged that a “comprehensive zero-trust assessment to identify gaps and key focus areas” would be highly or moderately valuable. And eight in 10 indicated that such an assessment and subsequent services from a third-party vendor or organization, similarly, would be highly or moderately valuable.
Priorities vary, but concurrent upgrades will be needed. Overall, when it comes to investment priorities, user identity and upgrades to network environments are getting the greatest attention, the study found. But resource priorities vary, depending on the size of agencies, the study found. A more detailed breakout of those priorities is available in the full report, “Achieving the Security Promise of Zero Trust,” produced by FedScoop and underwritten by Iron Bow.
One capability essential to achieving zero trust that remains underappreciated, according to the CISO at one large federal agency, is the need to dramatically scale up infrastructure and applications to collect, store and analyze log files. He estimated zero trust will ultimately result in a 40-fold increase in log files, plus the staff, to manage it all.
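The scale of that estimate is easy to illustrate. In the sketch below, only the 40-fold multiplier comes from the CISO’s estimate; the baseline daily log volume is a purely hypothetical figure chosen for illustration:

```python
# Hypothetical baseline: an agency ingesting 50 GB of logs per day today.
baseline_gb_per_day = 50

# The 40-fold increase the CISO estimated for zero-trust telemetry.
scale_factor = 40

zero_trust_gb_per_day = baseline_gb_per_day * scale_factor
annual_tb = zero_trust_gb_per_day * 365 / 1000  # decimal terabytes per year

print(f"{zero_trust_gb_per_day} GB/day, ~{annual_tb:.0f} TB/year")  # 2000 GB/day, ~730 TB/year
```

Even from a modest starting point, the multiplier pushes an agency into petabyte-class storage within a few years, which is the infrastructure gap the CISO was flagging.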
The study also suggests that agencies are still underestimating the work involved in educating agency managers and employees about zero-trust practices and the steps required to achieve them. The findings also reflect a probable disconnect between what senior executives believe to be true and what the “boots on the ground” are saying is true.
“Zero trust is not just a journey for security folks,” one CISO told FedScoop. “It’s a journey for the entire agency.”
Download the FedScoop report, “Achieving the Security Promise of Zero Trust,” for detailed findings on meeting federally mandated zero-trust goals.
This article was produced by Scoop News Group for FedScoop and sponsored by Iron Bow.
Quantum-ready workforce tops White House, scientists’ list of needs
Workforce was the topic on most quantum scientists’ minds when 30 of the country’s best met at the White House on Dec. 2 to discuss the global quantum race.
Leaders of the five National Quantum Information Science Research Centers (NQISRCs) were among the attendees assessing their success in accelerating QIS research and development, technology transfer, and workforce development since their launch in the middle of the pandemic.
The National Quantum Initiative Act of 2018 allotted the Department of Energy $625 million for the centers, which have begun integrating companies into the U.S. QIS ecosystem. Gone are the days when monopolies like Bell Labs and IBM funded basic science in-house, meaning U.S. investment in accessing the best quantum engineers is more important than ever to winning what has become a global race with China and Europe.
“This is the time to change the model for how you build a technology workforce,” David Awschalom, professor at the University of Chicago’s Pritzker School of Molecular Engineering and senior scientist at Argonne National Laboratory, told FedScoop. “This is an opportunity to build a very diverse, very inclusive and very equitable workforce.”
Awschalom participated in the Dec. 2 White House meeting, about half of which he said consisted of roundtable discussions on growing a quantum-ready workforce.
Companies participating in the Chicago Quantum Exchange, housed at the University of Chicago, are more concerned about the talent shortage than they are about achieving a more reliable qubit, Awschalom said.
Awschalom envisions an ecosystem where “quantum” is no longer an intimidating word for students, QIS is taught in high schools and community colleges, and the development of new technologies launches the careers of students at universities like Chicago State. The predominantly Black university regularly sees impressive students working full-time jobs to stay enrolled or making class sacrifices to care for children, he said.
Alleviating those complications could see tens of thousands more students enter the quantum workforce.
“We realized that to really address these questions properly we should probably have another meeting like this, where we bring in members from those communities to tell us what they need,” Awschalom said.
While no date was set, National Quantum Coordination Office Director Charles Tahan made clear that his door is open and that the White House wants to work more closely with the QIS ecosystem, Awschalom added.
The second major topic of discussion at the White House meeting was “big” delays in obtaining rare-earth elements and unusual materials that are found outside the U.S. but required for many quantum components, Awschalom said.
Helium-3 is needed for cryogenic experiments but became harder and more costly to obtain due to Russia’s war on Ukraine, while elements critical to quantum memories can only be mined in a few places globally.
Fortunately the U.S. is adept at nurturing startups, which could prove key to developing compact cryogenics and on-chip memories with silicon-compatible materials, Awschalom said.
The universities of Chicago and Illinois partner on the Duality Quantum Accelerator, which has hosted 11 quantum startups — including four from Europe — for research and development. At the NQISRC level, joint programs are forming between centers.
Since the birth of the qubit, rudimentary quantum processors have begun “remarkably fast” computing, and prototype networks are sending encrypted information around the world, Awschalom said. The University of Chicago’s 124-mile quantum link, which connects the university, the city of Chicago and the national labs, serves as a testbed for industry prototypes and will eventually extend into southern Illinois.
Whether the first beneficiary of quantum computing is precision GPS for microsurgery, improved telescope strength or some as-yet-unrealized application remains to be seen.
“The one thing we all know for sure in this field is that we don’t yet know the biggest impact,” Awschalom said. “So the United States and our centers have to be prepared; we need to be nimble.”
WH announces new members of National Quantum Advisory Committee
The White House Friday announced the appointment of 15 new members to the National Quantum Initiative Advisory Committee (NQIAC), which is tasked with coordinating how federal agencies research and deploy quantum information technologies.
The committee provides an independent assessment of the programs outlined in the National Quantum Initiative (NQI) Act of 2018, which gives the U.S. a plan for advancing quantum technology, particularly quantum computing.
President Trump signed the National Quantum Initiative Act into law in December 2018 with the goal of spending $1.2 billion on a framework advancing QIS technologies, and the committee will provide the program with expert evidence, data and perspectives.
The NQIAC committee was first established by executive order in August 2019 and subsequently enhanced by another executive order in May 2022, which elevated the committee to a Presidential Advisory Committee.
The committee makes recommendations for the President, Congress, the National Science and Technology Council (NSTC) Subcommittee on Quantum Information Science, and the NSTC Subcommittee on Economic and Security Implications of Quantum Science to consider when reviewing and revising the NQI Program.
The NQIAC committee consists of leaders in the field from industry, academia and the federal laboratories, with Dr. Kathryn Ann Moler and Dr. Charles Tahan serving as co-chairs of the 15-person committee.
Moler, who is Dean of Research at Stanford University, conducts research in magnetic imaging and develops tools that measure nanoscale magnetic fields to study quantum materials and devices.
Tahan is the Assistant Director for Quantum Information Science (QIS) and the Director of the National Quantum Coordination Office within the White House Office of Science and Technology Policy. He is on detail from the Laboratory for Physical Sciences, where he served as Technical Director and continues to serve as Chief Scientist and Chief of the QIS research office.
The other members of the board include: Dr. Jamil Abo-Shaeer, Dr. Fred Chong, Dr. James S. Clarke, Dr. Deborah Ann Frincke, Gilbert V. Herrera, Dr. Nadya Mason, Dr. William D. Oliver, Dr. John Preskill, Dr. Mark B. Ritter, Dr. Robert J. Schoelkopf, Dr. Krysta M. Svore, Dr. Jun Ye, and Dr. Jinliu Wang.
According to the Biden administration’s May 2022 executive order on quantum technology, the NQIAC may consist of up to 26 members. The committee is required to meet twice a year to carry out its duties.
Trade group calls for omnibus spending bill to include $100M for Technology Modernization Fund
A leading technology trade group has written to senior lawmakers in both chambers calling for the inclusion of fresh capital for the federal Technology Modernization Fund in the anticipated omnibus spending bill.
The Alliance for Digital Innovation (ADI) in a missive called for lawmakers to adopt language that would provide $100 million for the governmentwide technology working capital fund.
“Earlier this year, the Administration requested $300 million in its FY 2023 budget request. ADI notes that the House mark includes $100 million for the TMF while the Senate bill does not include additional funding,” says the letter addressed to leaders of the Senate and House appropriations committees. It’s referring to the funding the Biden administration requested for the TMF in its budget request for fiscal 2023.
ADI leaders added: “[G]iven the number of outstanding TMF proposals — 130 proposals from 60 agencies and components totaling over $2.5 billion, according to the director of the TMF program management office — we strongly urge the adoption of the House mark of $100 million in FY 2023.”
Democrats and Republicans are edging towards agreement on a roughly $1 trillion “omnibus” spending bill that would bundle 12 appropriations bills to provide governmentwide funding through the remainder of fiscal 2023.
“With this additional funding, ADI supports Congress’ efforts to provide additional oversight of the fund and the specific projects that are awarded,” the group added. “ADI believes that the priorities of the various agencies should align with the efforts of the administration and Congress to improve both the customer experience of citizen services and cybersecurity of the agencies.”
The Technology Modernization Fund received $1 billion for projects as part of the American Rescue Plan, which was signed into law by President Biden in March 2021.
The House version of the spending bill in its current form also funds the Federal Citizen Services Fund and supports cybersecurity improvements across government.
Both the House and Senate versions of the spending bill would provide CISA with $2.9 billion to carry out its objectives.
Post-quantum cryptography experts brace for long transition despite White House deadlines
The White House’s aggressive deadlines for agencies to develop post-quantum cryptography strategies make the U.S. the global leader on protection, but the transition will take at least a decade, experts say.
Canada led the Western world in considering a switch to post-quantum cryptography (PQC) prior to the Office of Management and Budget issuing its benchmark-setting memo on Nov. 18, which has agencies running to next-generation encryption companies with questions about next steps.
The memo gives agencies until May 4, 2023, to submit their first cryptographic system inventories identifying vulnerable systems, but they’ll find the number of systems reliant on public-key encryption — which experts predict forthcoming quantum computers will crack with ease — is in the hundreds or thousands. Individual pieces of software, servers and switches often have their own cryptography, and agencies don’t necessarily have the technical expertise on staff to understand the underlying math.
“This will be the largest upgrade cycle in all human history because every single device, 27 billion devices, every network and communication needs to upgrade to post-quantum resilience,” Skip Sanzeri, chief operating officer at quantum security-as-a-service company QuSecure, told FedScoop. “So it’s a massive upgrade, and we have to do it because these quantum systems should be online — we don’t know exactly when — but early estimates are three, four years for something strong enough.”
Bearish projections have the first quantum computer going live in about a decade, or never, with scientists still debating what the definition of a qubit — the quantum mechanical analogue to a bit — should even be.
QuSecure launched three years ago but became the first company to deploy PQC for the government this summer, when it proved to the U.S. Northern Command and North American Aerospace Defense Command that it could create a quantum channel for secure aerospace data transmissions at the Catalyst Campus in Colorado Springs, Colorado. The company used the CRYSTALS-KYBER cryptographic algorithm, one of four the National Institute of Standards and Technology announced it would standardize, but a quantum computer doesn’t yet exist to truly test the security.
The first quantum security-as-a-service company to be awarded a Phase III contract by the Small Business Innovation Research program, QuSecure can contract with all federal agencies immediately. Customers already include the Army, Navy, Marines and Air Force, and the State, Agriculture, Treasury and Justice departments have inquired about services, Sanzeri said.
QuSecure isn’t alone.
“We are having discussions right now with various federal agencies around what they should be doing, what they can be doing, in order to start today — whether it’s in building out the network architecture or looking at Internet of Things devices that are being sent into the field,” said Kaniah Konkoly-Thege, chief legal officer and senior vice president of government relations at Quantinuum, in an interview.
Defense and intelligence agencies are better funded and more familiar with classified programs requiring encryption services and therefore “probably in a much better position” to transition to PQC, Konkoly-Thege said.
Having served in the departments of the Interior and Energy, Konkoly-Thege said she’s “concerned” other agencies may struggle with migration.
“There are a lot of federal agencies that are underfunded and don’t have the resources, either in people or funding, to come and do what’s necessary,” she said. “And yet those agencies hold very important information.”
That information is already being exfiltrated in cyberattacks like the 2015 Office of Personnel Management hack, part of a harvest now, decrypt later (HNDL) strategy: adversaries such as China collect encrypted data today, expecting to decrypt it once fully realized quantum computers exist.
Post-Quantum CEO Andersen Cheng coined the term, and his company’s jointly developed NTS-KEM error-correcting code is in Round 4 of NIST’s PQC algorithm competition.
Cheng points to the fact that he was able to trademark his company’s name as proof that PQC wasn’t being taken seriously even in 2015, and certainly not the year prior, when he and two colleagues became the first to get a PQC algorithm working in a real-world situation: a WhatsApp messaging application downloadable from the app store.
They took it down within 12 months.
“One of my friends in the intelligence world called me one day saying, ‘You’re very well known.’ I said, ‘Why?’ He said, ‘Well, your tool is the recommended tool by ISIS,’” Cheng told FedScoop in an interview. “It was a wonderful endorsement from the wrong party.”
While there wasn't one moment that caused the U.S. government to take PQC seriously, Cheng said the "biggest" turning point was the release in May of National Security Memorandum 10, for which OMB's latest memo serves as implementation guidance. That's when the largest U.S. companies in network security infrastructure and finance began reaching out to Post-Quantum for consultation.
Post-Quantum now offers a portfolio of quantum-ready modules for not only secure messaging but identity, quorum sensing and key splitting.
Cheng said the Quantum Computing Cybersecurity Preparedness Act, sent to President Biden's desk Friday, should become law given PQC's momentum, but he has "slight" reservations about the OMB memo's aggressive deadlines for agencies to declare a migration lead and to conduct an inventory audit.
“People are probably underestimating the time it will take because the entire migration — I’ve spoken to some very top-end cryptographers like the head of crypto at Microsoft and so on — our consensus is this is a multi-year migration effort,” Cheng said. “It will take 10 years, at least, to migrate.”
That’s because public-key encryption protects everything from Zoom calls to cellphones, and the National Security Agency isn’t yet recommending hybridization, which would allow for interoperability among the various NIST-approved algorithms and also whichever ones other countries choose. Agencies and companies won’t want to swap PKE out for new PQC algorithms that won’t work with each other, Cheng said.
Complicating matters further, NIST is approving the math behind PQC algorithms, but the Internet Engineering Task Force generally winds up defining connectivity standards. Post-Quantum’s hybrid PQ virtual private network is still being standardized by IETF, and only then can it be added to systems and sold to agencies.
Cheng recommends agencies not wait until their inventory audits are complete to begin talking to consultants and software vendors about transitioning their mission-critical systems because PQC expertise is in short supply. Large consulting firms have been “quietly” building out their quantum consulting arms for months, he said.
OMB’s latest memo gives agencies 30 days after they submit their cryptographic system inventory to submit funding assessments, a sign it won’t be an unfunded mandate, Sanzeri said.
“This is showing that all of federal will be well into the upgrade process, certainly within 12 months,” he said.
Quantum cybersecurity legislation passes Senate
A bipartisan bill focused on improving the federal government’s protections against quantum computing-enabled data breaches has passed the Senate.
The Quantum Computing Cybersecurity Preparedness Act passed the House back in July and now heads to President Joe Biden's desk to be signed into law. It is co-sponsored by Sens. Rob Portman, R-Ohio, and Maggie Hassan, D-N.H.
Once enacted, the legislation will require the Office of Management and Budget to prioritize federal agencies’ acquisition of and migration to IT systems with post-quantum cryptography. It will also require the White House to create guidance for federal agencies to assess critical systems one year after the National Institute of Standards and Technology issues planned post-quantum cryptography standards.
In addition, the bill will require OMB to send an annual report to Congress that includes a strategy for how to address post-quantum cryptography risks from across the government.
In a Nov. 18 memo, the White House set out that deadline and said federal agencies would subsequently be expected to provide an annual vulnerability report until 2035.
At the time, OMB also said in its memo that agencies would be required to submit, within 30 days, an assessment of the extra funding needed for the adoption of post-quantum cryptography to both the Office of the National Cyber Director and the White House.
It added that a working group for post-quantum cryptographic systems will be established, which will be chaired by the federal chief information security officer.
The legislation passed the Senate as fears mount over significant leaps in quantum technology by the United States' strategic competitors, including China, which could allow existing forms of secure encryption to be cracked much more quickly.