Quantum Risk Just Changed Shape — Not Timeline
- Joel Van Dyk
- 4 hours ago
- 4 min read
In my last blog post I asked what would happen if 2035 was wrong. It seems I was ahead of even myself.
There’s a lot of noise right now around the recent paper by Madelyn Cain et al. (https://arxiv.org/abs/2603.28627) on reducing the quantum resources required to break modern cryptography.
Most of the commentary is predictable:
• “10,000 qubits changes everything”
• “Harvest now, decrypt later is here”
• “We need post-quantum crypto immediately”
None of that is wrong.
But it’s also not the most important takeaway.
⸻
This Isn’t a Physics Breakthrough
Coming from the physics world, what strikes me is that the headline number of ~10,000 qubits to break RSA or ECC is being treated as a breakthrough in quantum capability.
It isn’t. The underlying physics remains the same. No breakthroughs there.
What this paper actually represents is a compression of assumptions.
The result depends on a stack of conditions aligning:
• scalable quantum LDPC error correction
• high-fidelity operations sustained over long runtimes
• architectural parallelism working efficiently
• logical qubits behaving consistently under load
Individually, each of these is plausible.
Collectively, at scale? Still unproven.
So the right way to read this isn’t:
“We only need 10,000 qubits now.”
It’s:
“If everything works together as designed, the lower bound might look like 10,000 qubits.”
That’s a very different statement.
⸻
The Real Shift: Physics → Engineering
Historically, quantum risk has been constrained by physics:
• we couldn’t build enough qubits
• we couldn’t stabilize them
• we couldn’t run long enough computations
This paper subtly shifts the constraint: theoretically, all of that now seems possible.
Now the problem looks more like:
• scheduling and orchestration of quantum workloads
• memory vs compute partitioning
• teleportation and circuit decomposition
• parallel execution at scale
In other words, the bottleneck is moving into systems engineering.
And that matters.
Because physics problems tend to stall awaiting a fundamental discovery.
Engineering problems tend to get solved—iteratively, and often faster than expected.
⸻
We’re Still Talking About the Wrong Target
Most discussions continue to center on RSA.
That’s outdated.
What the paper makes clear—if you read past the headline—is that elliptic curve cryptography (ECC) is the more realistic early target.
• Smaller key sizes
• Lower computational complexity
• Ubiquitous in modern protocols
ECC underpins:
• TLS handshakes
• SSH
• large parts of PKI infrastructure
So the first meaningful break won’t be theoretical.
It will hit the live fabric of the internet.
⸻
Qubits Are Only Half the Story
The fixation on qubit count misses the harder problem: time.
Even under optimistic assumptions, these attacks require:
• long runtimes
• sustained fault tolerance
• stable operation across extended periods
We are compressing qubit requirements faster than we are solving for runtime stability.
And those two variables are not independent.
A system that works for seconds is not a system that works for days.
⸻
There’s Also a Quiet Platform Signal
This isn’t just a cryptography paper.
It’s an implicit argument about which quantum architectures might scale first.
The approach leans heavily on:
• connectivity
• reconfigurability
• parallel operations
All areas where neutral atom systems currently have structural advantages over superconducting approaches.
That doesn’t settle the platform debate.
But it does suggest that cryptographic relevance may arrive through architectures optimized for system-level flexibility, not just raw qubit counts.
⸻
The Real Problem for Security Leaders
The biggest shift here isn’t technical.
It’s how we think about timelines.
Security programs are built on linear forecasting:
• assess risk
• estimate horizon
• plan migration
Quantum risk no longer behaves that way.
It’s becoming a step-function problem.
Long periods of minimal progress, followed by sudden compression when multiple constraints are reduced at once.
This paper is an example of that compression.
Not a breakthrough.
But a reduction in the number of unknowns required for one.
⸻
What Actually Changed
Cryptography is not suddenly broken.
But the path to breaking it is becoming clearer.
That’s the real signal.
The significance isn’t that we can break RSA or ECC with 10,000 qubits.
It’s that we now have a more concrete picture of what would need to be true to do it.
And that list is getting shorter.
⸻
Final Thought — Where Does This Put Cybersecurity?
Quantum risk won’t arrive the way most cybersecurity organizations expect.
It won’t be a clean, well-telegraphed transition from “safe” to “unsafe” announced by FS-ISAC, CISA, or NIST.
It will look like this:
• incremental papers
• narrowing assumptions
• engineering barriers quietly falling
Until one day, the gap between theory and practice is small enough that it no longer matters.
The mistake is waiting for certainty.
By the time you have it, you’re already behind. You can’t afford to wait for the uncertainty to shrink, because by then the risk has been realized.
Instead you should:
• Start a cryptographic inventory right away.
• Architect a crypto-agile environment by treating cryptography as infrastructure, not an add-on to applications. It needs to be centralized, with the swap-in of quantum-resistant algorithms made repeatable over short time frames.
• Prioritize ECC exposure and the places where it is used.
• Take seriously the threat posed by harvest-now, decrypt-later attacks, and start on the items above to harden your organization.
• Work on data classification and lifecycles, as well as critical infrastructure (e.g. HSMs), to prioritize your key assets.
• Work with the emerging standards and don’t wait for perfection; it’s better to get practice now.
This is part of your overall security architecture and strategy, and it is a multi-year effort that cuts across IDAM, network security, data protection, third-party risk, and regulatory compliance.
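The crypto-agility point deserves one concrete illustration. A crypto-agile design makes the algorithm a configuration choice behind a stable interface, so migrating to a quantum-resistant scheme becomes a registry change rather than an application rewrite. The Python sketch below shows only the indirection pattern; the registry names are illustrative, and the HMAC constructions are stand-ins, not real post-quantum algorithms.

```python
import hashlib
import hmac
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass(frozen=True)
class Signer:
    """A named signing primitive behind a stable interface."""
    name: str
    sign: Callable[[bytes, bytes], bytes]


def _hmac_sha256(key: bytes, msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha256).digest()


def _hmac_sha3_512(key: bytes, msg: bytes) -> bytes:
    # Stand-in for a future quantum-resistant scheme: application
    # code below never changes when this implementation does.
    return hmac.new(key, msg, hashlib.sha3_512).digest()


# Central registry: the ONLY place an algorithm is named.
REGISTRY: Dict[str, Signer] = {
    "current": Signer("hmac-sha256", _hmac_sha256),
    "pq-candidate": Signer("hmac-sha3-512", _hmac_sha3_512),
}

ACTIVE = "current"  # flip this (or read it from config) to migrate


def sign(key: bytes, msg: bytes) -> bytes:
    """Application-facing call, oblivious to the active algorithm."""
    return REGISTRY[ACTIVE].sign(key, msg)
```

The design choice that matters is that applications import `sign`, never an algorithm by name; the switch-out then happens in one centralized place and can be rehearsed on a short cycle, which is exactly the repeatability argued for above.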