Why agencies must look beyond ‘post-quantum crypto’ algorithms to secure their data

As NIST rolls out advanced PQC standards to address the ‘Harvest now, decrypt later’ threat, a new report suggests that a smarter and simpler approach involves switching to out-of-band key delivery and adaptable systems.

While the dawn of quantum computing may still be years away, federal agencies are facing a future threat that has already arrived. Adversaries are actively stealing encrypted government information today with the expectation of breaking the code later. That chilling reality, known as “Harvest now, decrypt later” (HNDL), makes the mandated transition to new federal encryption standards an immediate national security imperative.

This urgent call to action represents both a warning and an opportunity to systematically improve government data’s safety and security, say encryption experts in a new report, “Confronting a New Reality: Agencies need to adopt cryptographic agility with new quantum-ready encryption,” released by Scoop News Group and sponsored by cybersecurity firm Quantum Xchange.

Download the full report.

The report argues that federal agencies must move beyond simply swapping out old algorithms with new encryption standards issued in the past year by the National Institute of Standards and Technology (NIST) and instead adopt a more fundamental shift in cryptographic agility and architecture to secure the nation’s secrets against future threats.

The report details why federal leaders must replace encryption practices that date back nearly half a century, focusing on three areas:

Embracing new standards: The threat clock is ticking

The prospect of nation-state actors siphoning off and storing massive amounts of encrypted U.S. data is serious, turning every sensitive file with a long shelf life into a ticking time bomb. This has prompted an aggressive response from the federal government, culminating in NIST releasing a new suite of Post-Quantum Cryptography (PQC) algorithms. These new standards — FIPS 203 (ML-KEM) for key encapsulation, FIPS 204 (ML-DSA) and FIPS 205 (SLH-DSA) for digital signatures, and the forthcoming FIPS 206 — are built on different mathematical principles designed to resist attacks from both classical and quantum computers.

The release of these standards has transformed the quantum threat from an academic discussion into a pressing compliance issue. “The NIST announcement created a call to action because all of a sudden, folks have to actually begin implementing and complying with these standards,” states Eddy Zervigon, CEO of Quantum Xchange, in the report. This mandate forces agencies to begin the multi-year migration process immediately to protect both current and future data.

Beyond algorithms: Rethinking key delivery

A core argument of the report is that simply replacing old algorithms with new PQC ones is insufficient. The larger, more systemic vulnerability lies in the very architecture of modern encryption, which was designed 50 years ago. In most systems, the cryptographic keys are exchanged over the same channel as the data itself, a practice known as “in-band” key exchange. This creates a single point of failure; if an adversary can compromise the channel, they can often access both the keys and the data.

The report advocates for a new architectural approach: “out-of-band” key delivery. This method decouples key generation and delivery from the data transmission channel, forcing an attacker to compromise two separate, independently secured pathways to succeed.
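The architectural split described above can be sketched in a few lines. The sketch below is illustrative only: the class and function names are hypothetical, the XOR "cipher" is a toy stand-in for real encryption, and an actual deployment would use a dedicated key-delivery service over its own secured path rather than an in-process object. The point it shows is that the data channel carries only ciphertext and an opaque key identifier, while the key itself travels a separate route.

```python
import secrets

class KeyDeliveryService:
    """Stands in for the out-of-band control plane: keys live and travel
    here, never over the data channel."""
    def __init__(self):
        self._keys = {}

    def issue_key(self) -> tuple[str, bytes]:
        key_id = secrets.token_hex(8)              # opaque identifier
        self._keys[key_id] = secrets.token_bytes(32)
        return key_id, self._keys[key_id]

    def fetch_key(self, key_id: str) -> bytes:
        return self._keys[key_id]

def send(kds: KeyDeliveryService, plaintext: bytes) -> tuple[str, bytes]:
    """Sender: the data channel carries only ciphertext plus a key ID."""
    key_id, key = kds.issue_key()
    # Toy XOR stand-in, NOT real encryption -- for structure only.
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key_id, ciphertext

def receive(kds: KeyDeliveryService, key_id: str, ciphertext: bytes) -> bytes:
    """Receiver: fetches the key over the separate, out-of-band path."""
    key = kds.fetch_key(key_id)
    return bytes(c ^ k for c, k in zip(ciphertext, key))

kds = KeyDeliveryService()
key_id, ct = send(kds, b"classified payload")
# An eavesdropper on the data channel sees only (key_id, ct); compromising
# it yields no key material, so both pathways must be breached to succeed.
assert receive(kds, key_id, ct) == b"classified payload"
```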

“Decoupling key generation and delivery from the data transmission channel takes encryption out of the data plane and puts it into its control plane,” Zervigon explains in the report. “It’s a control element, not a data element, that needs to be controlled, audited, and automated.” This architectural change, the report argues, provides a more transformational and lasting security enhancement than new math alone.

The mandate for crypto-agility

The era of a single encryption standard lasting decades is over. The report predicts new vulnerabilities will be discovered more frequently, and cryptographic standards must be updated on a much shorter cycle. This requires "crypto-agility": the ability for an organization to dynamically update or change cryptographic methods without disrupting the entire network.

This new reality invalidates a “set it and forget it” approach to encryption. Agencies must build systems capable of evolving as threats evolve. “We had a long run with RSA, Diffie-Hellman and ECC for 40-plus years,” says Eric Hay, Field Engineer at Quantum Xchange, referring to long-standing cryptography algorithms. “Now, NIST has released four new algorithms… in part, because they expect that these things are going to break. You’re going to have to change them more frequently than we had to in the past.”

The report outlines actionable steps for agencies, including inventorying existing cryptographic systems, piloting new out-of-band architectures, and partnering with experts. It also highlights six central benefits agency IT departments can expect by moving beyond PQC algorithms and embracing a more modern architectural approach to encryption.

Doing so is easier than agencies might think and offers greater agility, auditability and control in the long run, according to the report. Encryption tools from Quantum Xchange can be integrated easily with a wide range of existing systems, allowing agencies to generate ephemeral, or temporary, encryption keys, sharply limiting the value of any key an attacker manages to steal.
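The value of ephemeral keys can be shown in a short sketch. This is an assumption-laden illustration, not Quantum Xchange's implementation: each message gets a freshly generated key that is used once and discarded, so there is no long-lived key store to harvest, and a key stolen from one exchange protects nothing else. The HMAC tag here stands in for whatever per-message protection a real system applies.

```python
import hashlib
import hmac
import secrets

def protect_message(message: bytes) -> tuple[bytes, bytes]:
    """Generate a fresh ephemeral key and authenticate one message with it.

    In an out-of-band design, the key would be delivered to the peer over
    the control plane and then destroyed; it is never written to disk.
    """
    key = secrets.token_bytes(32)   # ephemeral: created here, used once
    tag = hmac.new(key, message, hashlib.sha256).digest()
    return key, tag

# Each call yields an independent key, so compromise of one exchange
# reveals nothing about any other.
key1, tag1 = protect_message(b"report")
key2, tag2 = protect_message(b"report")
assert key1 != key2
assert hmac.compare_digest(tag1, hmac.new(key1, b"report", hashlib.sha256).digest())
```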

The report concludes that the NIST mandate is not a burden to be checked off but an opportunity to build a truly resilient and quantum-safe digital infrastructure for the nation.

Download and read the full report.

This article and the report were produced by Scoop News Group for FedScoop and sponsored by Quantum Xchange.
