
Quantum Won't Kill Encryption. It Never Has.

If you’ve spent any time on LinkedIn or at a cybersecurity conference in the last couple of years, you’ve seen the headlines. “Quantum computing will break all encryption.” “Your data is already at risk.” “The cryptographic apocalypse is coming.”

It makes for great conference talks and even better vendor marketing. But here’s the thing: encryption has always been broken. And every single time, we’ve replaced it with something stronger. The lifecycle of cryptographic algorithms isn’t a flaw in the system; it is the system. So why would quantum computing be any different?

I don’t think it will be. But I want to be precise about what I’m arguing, because 2026 is the year the industry has decided to stop debating and start migrating. That urgency is real, even if the apocalypse framing isn’t.

Where We Are Right Now

Let me set the stage before the history lesson, because the current moment matters.

The Quantum Insider designated 2026 the “Year of Quantum Security,” a coordinated global effort backed by NIST, the FBI, and government initiatives worldwide, focused specifically on post-quantum cryptography (PQC) adoption. In 2025, several major quantum computing vendors reported quantum-advantage demonstrations in controlled settings. A Chinese research group used a quantum computer to factor a small RSA key, which wasn’t a threat to real-world 2048-bit keys, but it was a meaningful proof of concept that the theoretical attack model actually works. Google’s Willow chip performed a specific quantum calculation in under five minutes that would take a classical supercomputer longer than the age of the universe.

And here’s the uncomfortable enterprise reality: a recent IBM survey of 500+ senior executives found organizations are at a “low level of quantum-safe readiness,” with executives estimating it will take their organizations 12 years to fully integrate quantum-safe standards. Twelve years. While the cryptographic community has largely solved the algorithmic problem, most organizations haven’t started the operational work.

So yes, the urgency is warranted. The timeline to a cryptographically relevant quantum computer (CRQC) has compressed. A widely cited 2019 estimate put the requirement at roughly 20 million physical qubits to break RSA-2048. Breakthroughs in quantum error correction in 2024 and 2025 have significantly reduced that estimate, and some credible researchers now put a nation-state-level CRQC within five years.

But “urgency” and “apocalypse” are different things. Let me explain why.

A Brief History of “Broken” Encryption

Let’s take a walk through the cryptographic graveyard and notice a pattern.

DES (1977 to 1999): The Data Encryption Standard was the backbone of encryption for over two decades. It was adopted as a federal standard in 1977, and for years it was considered unbreakable for any practical purpose. Then computing power caught up. In 1997, the DESCHALL Project coordinated thousands of machines across the internet to crack a DES-encrypted message in 96 days. A year later, the Electronic Frontier Foundation built a custom machine called “Deep Crack” for under $250,000 that broke DES in 56 hours. By January 1999, a combined effort cracked it in just over 22 hours. The response? NIST recommended Triple DES, and then replaced the whole thing with AES in 2001. The world kept turning.

MD5 (1991 to 2008): MD5 was the go-to hashing algorithm for years. Then researchers demonstrated practical collision attacks, proving you could generate two different inputs with the same hash. That’s a death sentence for any algorithm used in digital signatures and certificate verification. By 2008, researchers showed they could forge an MD5-signed intermediate certificate capable of impersonating any domain on the internet. MD5 was deprecated, and the ecosystem moved to SHA-1. Life went on.

SHA-1 (1995 to 2017): SHA-1 replaced MD5, and for a while it held up. But by 2005, theoretical weaknesses were being published. NIST formally deprecated it in 2011 and disallowed it for digital signatures in 2013. Browser vendors coordinated a sunset: Google Chrome, Mozilla Firefox, and Microsoft Edge all stopped trusting SHA-1 certificates by early 2017. In February of that year, Google and CWI Amsterdam published the SHAttered attack, a real-world collision. The industry had already moved to SHA-2 (SHA-256). Did the internet collapse? No. It adapted, just like it always does.

RSA key sizes: In RSA’s early deployments, 512-bit keys were common. Today, 2048-bit is the minimum acceptable, and 4096-bit is common for high-security applications. The algorithm didn’t “break”; we just kept scaling it as computational power grew.

Notice the pattern? A standard is adopted, it serves its purpose for years or decades, weaknesses emerge, the community develops a replacement, and the migration happens. Sometimes messily, sometimes slowly, but it happens. Every time.

What the Quantum Threat Actually Is

I’m not here to tell you quantum computing is irrelevant. It’s a real technological development with real implications for cryptography. But those implications are specific and bounded, not universal.

The concern boils down to two algorithms:

Shor’s algorithm can efficiently factor large integers and compute discrete logarithms. This is a direct threat to asymmetric cryptography, specifically RSA, ECC, and Diffie-Hellman, because the security of those systems depends on the difficulty of exactly those mathematical problems. A sufficiently powerful quantum computer running Shor’s algorithm could break these in polynomial time rather than the exponential time classical computers require.
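The structure of the attack is worth seeing up close. Everything in the sketch below is classical; the only step a quantum computer accelerates is order-finding, which is brute-forced here in exponential time. A minimal Python illustration of the reduction from factoring to order-finding:

```python
from math import gcd

def order(a, n):
    """Smallest r with a^r = 1 (mod n). This brute-force loop is the
    exponential-time step Shor's algorithm replaces with a
    polynomial-time quantum period-finding circuit."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    """Classical reduction: a factor of n falls out of the order of a."""
    g = gcd(a, n)
    if g != 1:
        return g                 # lucky guess: a already shares a factor
    r = order(a, n)
    if r % 2 == 1:
        return None              # odd order: retry with a different a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None              # trivial square root: retry
    return gcd(y - 1, n)         # nontrivial factor of n

# Factoring 15 with base 7: order(7, 15) = 4, so gcd(7**2 - 1, 15) = 3
```

With a real quantum computer, the same reduction works on 2048-bit moduli, because order-finding drops from exponential to polynomial time. That is the entire threat to RSA in one loop.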

Grover’s algorithm provides a quadratic speedup for unstructured search problems. For symmetric encryption like AES, this effectively halves the key strength. AES-256 becomes roughly equivalent to AES-128 against a quantum adversary. That’s a meaningful reduction, but it’s addressable. You double the key length.
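The arithmetic behind that claim is simple enough to show (illustrative only; real security estimates also weigh quantum circuit depth and error-correction overhead):

```python
def grover_effective_bits(key_bits: int) -> int:
    # Grover's algorithm finds a key among 2^n candidates in roughly
    # 2^(n/2) quantum evaluations, so effective security is halved.
    return key_bits // 2

# AES-128 drops to ~64-bit effective security against a quantum
# adversary; AES-256 still holds ~128 bits, which is why "double
# the key length" is the standard mitigation.
```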

Here’s what gets lost in the noise: quantum computers don’t threaten all encryption equally. Symmetric encryption is dented, not destroyed. The real vulnerability is in asymmetric systems, specifically because those systems rely on mathematical problems that quantum computers happen to be good at solving. That “happen to” part is critical, and we’ll come back to it.

We Already Have the Answer

Here’s where the “quantum kills encryption” narrative falls apart: we’ve already built the replacement algorithms. And they were specifically designed to resist quantum attacks.

In August 2024, NIST released its first three finalized post-quantum cryptography (PQC) standards after an eight-year international evaluation process that began in 2016 with 82 candidate algorithms:

  • ML-KEM (FIPS 203), formerly CRYSTALS-Kyber, a lattice-based key encapsulation mechanism for general encryption
  • ML-DSA (FIPS 204), formerly CRYSTALS-Dilithium, a lattice-based digital signature scheme
  • SLH-DSA (FIPS 205), formerly SPHINCS+, a stateless hash-based digital signature scheme

Additional standards are in progress. NIST selected HQC, a code-based key encapsulation mechanism, as its fifth PQC algorithm in March 2025, and FN-DSA (based on FALCON) is still being drafted.

These algorithms don’t rely on integer factorization or discrete logarithms. They’re built on entirely different mathematical foundations (problems in high-dimensional lattice geometry, error-correcting codes, and hash function properties) that quantum computers have no known advantage in solving.

The standards are finalized. The math has been publicly scrutinized for years. Google Chrome and Microsoft Edge already support hybrid key exchange using ML-KEM by default. This isn’t theoretical readiness; it’s early production deployment.
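Hybrid key exchange is simpler than it sounds: run both a classical and a post-quantum exchange, then derive the session key from both shared secrets. A minimal sketch using HMAC-SHA-256 as the extraction step (the random byte strings below stand in for real X25519 and ML-KEM outputs; real protocols use their own KDF schedules):

```python
import hashlib
import hmac
import os

def hybrid_secret(classical_ss: bytes, pq_ss: bytes,
                  label: bytes = b"hybrid-kex") -> bytes:
    # Derive one session secret from both exchanges. An attacker must
    # break BOTH the classical and the post-quantum exchange to recover it.
    return hmac.new(label, classical_ss + pq_ss, hashlib.sha256).digest()

# Placeholder shared secrets standing in for real key-exchange outputs
classical_ss = os.urandom(32)   # e.g. an X25519 shared secret
pq_ss = os.urandom(32)          # e.g. an ML-KEM-768 decapsulated secret
session_key = hybrid_secret(classical_ss, pq_ss)
```

The design appeal is that hybrid mode hedges in both directions: if a flaw is ever found in the new lattice math, the classical half still protects the session.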

The problem isn’t the algorithms. It’s the operational migration work that most organizations haven’t started yet.

You Can’t Pick Every Lock With the Same Pick

This is the core of the argument, and it’s surprisingly simple once you see it.

Quantum computers aren’t magic. They’re exceptionally good at solving specific types of mathematical problems. Current asymmetric encryption happens to be built on exactly those problem types, and that’s a historical coincidence, not a fundamental law of cryptography.

Encryption doesn’t have to be built on those foundations. Post-quantum algorithms intentionally choose mathematical problems that quantum computers struggle with:

Lattice-based cryptography relies on finding short vectors in high-dimensional lattice structures. Even with quantum computing, there’s no known efficient algorithm for these problems. This underpins both ML-KEM and ML-DSA.
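To make “lattice problem” less abstract, here is a toy, deliberately insecure Regev-style learning-with-errors (LWE) scheme. The parameters are illustrative only; real ML-KEM uses structured module lattices and carefully chosen parameter sets:

```python
import random

Q = 3329  # toy modulus (borrowed from Kyber for flavor)

def keygen(n=8, m=40):
    s = [random.randrange(Q) for _ in range(n)]                  # secret
    A = [[random.randrange(Q) for _ in range(n)] for _ in range(m)]
    e = [random.choice([-1, 0, 1]) for _ in range(m)]            # small noise
    b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % Q
         for i in range(m)]
    return (A, b), s    # the public key hides s behind noisy equations

def encrypt(pk, bit):
    A, b = pk
    rows = [i for i in range(len(A)) if random.random() < 0.5]   # random subset
    u = [sum(A[i][j] for i in rows) % Q for j in range(len(A[0]))]
    v = (sum(b[i] for i in rows) + bit * (Q // 2)) % Q
    return u, v

def decrypt(sk, ct):
    u, v = ct
    d = (v - sum(u[j] * sk[j] for j in range(len(sk)))) % Q
    # d is bit * Q/2 plus accumulated noise; decode by proximity
    return 1 if Q // 4 < d < 3 * Q // 4 else 0
```

Recovering the secret from the public key means solving a noisy linear system, which is exactly the LWE problem, and no efficient quantum algorithm for it is known.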

Hash-based signatures derive their security from the properties of hash functions. Grover’s algorithm offers only a modest speedup here, addressed by using longer hash outputs. SLH-DSA takes this approach.

Code-based cryptography, like Classic McEliece, is built on error-correcting code problems that have resisted cryptanalysis for over 45 years, classical and quantum alike.

Think of it this way: someone invents a master lockpick that can open every pin tumbler lock ever made. The headlines scream “locks are dead.” Meanwhile, locksmiths are already shipping magnetic locks, biometric systems, and mechanisms the pick was never designed to defeat. The lockpick isn’t magic; it just exploits a specific mechanical vulnerability. Change the mechanism, and the pick is useless.

That’s exactly what post-quantum cryptography does. It changes the mechanism.

What You Should Actually Worry About: Harvest Now, Decrypt Later

This is where the real urgency lives, and if there’s one thing I want practitioners to internalize, it’s this.

The “Harvest Now, Decrypt Later” (HNDL) attack model doesn’t require a cryptographically relevant quantum computer to exist today. It only requires one to exist eventually. And the adversarial infrastructure to execute it is already in operation.

Nation-state actors, and several are well-resourced enough to do this at scale, are intercepting and archiving encrypted traffic right now. The goal is simple: store it today, decrypt it once the hardware catches up.

I want to be precise here, especially for practitioners in PCI DSS environments, because this is a point worth getting right. Raw cardholder data like PANs is actually a weak example for HNDL. Card numbers expire and get reissued on a predictable cycle. By the time a CRQC exists, a harvested PAN is mostly a cold lead.

The real targets in a payment card context are the data types that don’t expire on a schedule:

Cardholder PII that travels with transactions. Names, billing addresses, email addresses, and phone numbers don’t rotate when a card does. That data has long-term value for identity theft, account takeover, and targeted fraud well beyond any reissuance cycle.

Merchant and behavioral transaction data. Aggregated spending patterns, purchase histories, and behavioral profiles tied to real identities are valuable for years. This is the kind of data that intelligence-grade adversaries actually care about.

Authentication infrastructure protecting the CDE. Any session that includes account credentials, API keys, or tokens for systems adjacent to the cardholder data environment has a sensitivity lifespan that outlasts the PAN by a wide margin.

The same logic applies more broadly. PHI under HIPAA, sensitive government data under FISMA, long-lived intellectual property, and any identifying data that doesn’t have an expiration date are all stronger HNDL candidates. The threat is real; it just needs to be aimed at the right data types to drive the right prioritization decisions.

The US government has set a quantum-resistance deadline of 2027 for new National Security System devices. That deadline wasn’t set arbitrarily. It reflects intelligence community assessments of adversarial capability timelines, and it’s a reasonable benchmark for any organization handling sensitive long-lived data.

Here’s what practitioners should be doing:

Inventory your cryptographic assets. Understand where you’re using vulnerable algorithms (RSA, ECC, Diffie-Hellman) across your infrastructure. This includes TLS configurations, VPNs, code signing, key management systems, and certificate infrastructure. If you’ve been through a TLS 1.0-to-1.2 migration or a SHA-1 sunset, you already know how sprawling this gets. Expect PQC to be messier.
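Even a spreadsheet-grade inventory beats nothing. A hypothetical sketch of the kind of record worth collecting (asset names and locations below are made up for illustration):

```python
from dataclasses import dataclass

# Asymmetric algorithms broken by Shor's algorithm, due for replacement
QUANTUM_VULNERABLE = {"RSA", "ECDSA", "ECDH", "DH"}

@dataclass
class CryptoAsset:
    name: str
    algorithm: str
    location: str

    @property
    def needs_pqc_migration(self) -> bool:
        return self.algorithm in QUANTUM_VULNERABLE

inventory = [
    CryptoAsset("edge TLS termination", "ECDH", "load balancers"),
    CryptoAsset("release signing", "RSA", "build pipeline"),
    CryptoAsset("backup encryption", "AES-256", "object storage"),
]

migration_backlog = [a.name for a in inventory if a.needs_pqc_migration]
# AES-256 stays off the backlog: symmetric keys survive with margin
```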

Classify your data by sensitivity lifespan, not just regulatory category. A PAN that rotates every two years is a different risk profile than a medical record, an employment file, or a decade of behavioral transaction data on a high-value customer. Let that drive your migration priority order.
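This prioritization logic has a name: Mosca’s inequality. Data is already exposed to harvest-now-decrypt-later if its required secrecy lifetime plus your migration time exceeds the time remaining before a CRQC arrives. A sketch with illustrative numbers (the year estimates are assumptions, not forecasts):

```python
def hndl_exposed(secrecy_years: float, migration_years: float,
                 years_to_crqc: float) -> bool:
    # Mosca's inequality: x + y > z means the data outlives its protection
    return secrecy_years + migration_years > years_to_crqc

# A PAN that rotates in ~2 years, given a 5-year migration and a CRQC
# assumed 10 years out, stays safe; a 25-year medical record does not.
pan_exposed = hndl_exposed(2, 5, 10)    # False
phi_exposed = hndl_exposed(25, 5, 10)   # True
```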

Start your PQC migration path now, not later. The NIST standards are final. Vendors are integrating them. Microsoft has already integrated ML-KEM and ML-DSA into SymCrypt, the cryptographic library underpinning Windows and Azure. The tooling exists. A multi-year migration that starts in 2026 is a planned transition. One that starts in 2029 is a scramble.

Follow NIST IR 8547 for transition guidance. It applies broadly, not just to federal systems.

The work is real. The urgency is warranted. But this is a migration program, not a crisis response.

Encryption Is Antifragile

Nassim Taleb coined the term “antifragile” to describe systems that get stronger under stress. Cryptography is one of the best examples in technology.

Every time an algorithm has been broken, the field hasn’t just recovered. It’s come back stronger. DES gave us AES. MD5 gave us SHA-2. The theoretical weaknesses of RSA are giving us lattice-based cryptography. Each generation of attackers has forced a corresponding evolution in defenses, and the defenses have consistently won the long game.

Quantum computing is the latest stressor. The timeline has compressed, the threat is more concrete than it was five years ago, and the operational work is genuinely significant. But the system is already responding. The standards exist. The migration paths are being defined. The math is sound. Vendors are shipping implementations.

In five or ten years, we’ll look back at the “quantum apocalypse” narrative the same way we look back at Y2K: a real problem that required real work, but never the civilization-ending event the headlines promised. The difference between organizations that come through in good shape and those that are scrambling will come down to whether they treated 2026 as the start of a migration, or kept waiting for the threat to become undeniable.

Every generation of cryptographers has faced an existential threat. Every generation has answered it. Quantum isn’t the exception; it’s the latest proof that the cycle works.

Start the inventory. Build the migration plan. The algorithms are ready when you are.


The views expressed in this article are my own, informed by hands-on experience in systems engineering, cybersecurity engineering, and PCI QSA assessments. Post-quantum migration planning touches every layer of a compliant environment. If you’re working through what that looks like for your organization, feel free to reach out.

When we talk about PCI DSS compliance, the conversation tends to stay clinical. Scoping exercises. Network diagrams. Encryption at rest. But compliance doesn’t exist in a vacuum. It exists because there’s a thriving, industrialized criminal economy on the other end waiting to monetize every gap you leave open. Rapid7 published a detailed piece of research this month that every QSA, security engineer, and compliance leader should read: their analysis of the carding-as-a-service (CaaS) ecosystem and the underground dump shops that power it. Having spent years on the assessor side of PCI, I want to connect what Rapid7 found directly back to what it means for your cardholder data environment and your scoping decisions.