Quantum computing could potentially break many of the encryption algorithms and protocols that currently secure the internet and the computing industry.
Back in the 2000s, when I worked for PricewaterhouseCoopers (PwC), I was the director of technology and innovation. Part of that role was not only helping the company and some of our clients with tech innovation projects, but also looking out five, 10, 15 years and thinking about what technology was emerging and what it might mean for our industry. And so, all along the way, I’ve had a real interest in identifying interesting things that maybe a lot of people weren’t paying attention to yet.
One was the role of speech recognition. I was making the case that we would start to talk to our machines and that voice would become a more natural, more capable interface. Another was called social computing, which later became known as social media. Another one was virtual reality. I did a lot of work with what, at the time, were called immersive worlds, and with a particular platform called Second Life. Do you remember Second Life? It’s still popular today. It’s a world in which you can make your own persona. You can make friends, you can build things, there’s even a currency. I found blockchain to be similar at first. It was fascinating.
And then I stumbled upon quantum computing. Quantum mechanics, of course, is many decades old, but my radar, so to speak, detected that things were starting to happen. I would read, for example, about Google hitting a certain qubit milestone, as they call it. More companies, whether D-Wave or Microsoft, Fujitsu or others, were starting to publish their positions on quantum and where they think it’s going. And I think the real clincher for me was a book called “Hit Refresh,” in which Satya Nadella, the book’s author and the CEO of Microsoft, bets the future of Microsoft on three things: artificial intelligence, mixed reality and quantum computing.
A significant, influential, trillion-dollar business in the world is betting its future on this technology, as is Google. So I started to learn about it.
Related: Will Google’s Quantum Supremacy Break Bitcoin in 2020?
It was a much harder topic than I had anticipated, but I dug into the research, and then I proposed a course to LinkedIn Learning as an author, and they agreed.
The computing dilemma
All throughout my career, the microprocessor — the classical computer — has been the enabler of incredible positive outcomes for the most part. The microchip enabled the smartphone, the internet, better health care, better cities, better manufacturing, better products. And it continues to be very effective. Computers continue to get faster. Software continues to get better.
But there’s an interesting underlying trend that, for most of us, is only visible in the fact that chips keep getting smaller while packing in more performance. On a small microchip the size of a quarter, or an even smaller coin, there are millions, if not billions, of tiny switches. Being able to fit more and more of them onto this very small silicon semiconductor is about as close to magic as it gets.
What we’ve discovered is that we will soon get to a physical limitation that there are only so many of these switches that can be put into a certain amount of space before reaching the atomic level. And once at that level, we will start to confront a different layer of physics, which is called quantum mechanics.
Because our processors are getting smaller and smaller, our ability to build faster computers will start to slow down, because we can’t keep cramming more switches onto the same tiny piece of silicon. So here’s the dilemma we’re faced with.
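As a very rough illustration of that ceiling, the short Python sketch below projects how many more halvings of transistor feature size fit before features reach the scale of a single silicon atom. The starting size, atom diameter and two-year cadence are illustrative assumptions, not industry figures.

```python
# Rough sketch: how many more halvings of feature size fit before we
# reach the scale of a single silicon atom? All numbers are illustrative.
feature_nm = 5.0          # assumed current process feature size, in nanometers
atom_nm = 0.2             # approximate diameter of a silicon atom
years_per_halving = 2     # classic Moore's-law cadence (assumed)

generations = 0
while feature_nm / 2 > atom_nm:
    feature_nm /= 2
    generations += 1

print(f"Only ~{generations} more halvings (~{generations * years_per_halving} years) "
      f"before features approach atomic scale ({feature_nm:.2f} nm).")
```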
We will either accept that we can only get computers to a certain speed, or we will chain more computers together for higher performance. Or we ask: What is going to take us from this peak we’ve reached in computing power to the next generation, so that we can solve the biggest problems of the 21st century? It turns out we have a few options. One of them is to develop a completely new computing platform based on quantum mechanics.
Fast computing
Once we break through to quantum computing, there will be no limits. We will suddenly be in a whole new place where, what would take a classical computer a month to process (like weather computations), a quantum computer could initially do in minutes and eventually in seconds.
There are chemistry problems that have become so complex that we can’t even solve them today with classical computing. And if we do try, it will take 10 years for the computers to turn out a result. Quantum will do it in seconds. So, we’re going to go from what we think is fast computing today to really fast computing. It’s going to open up incredible opportunities for humanity. However, quantum computing also creates a lot of risks and some real issues, but that’s the case with everything, right? With any kind of technology comes someone who uses it with malicious intent. So there are also going to be consequences.
Social impact
The social impact of quantum computing is a big question, because we have to break it down bit by bit. One aspect is that with better computing power, we can perform better, deliver better services and solve problems that were once hard to solve, like making better predictions about health outcomes. When we go to a hospital today, we ride on the expert judgment of doctors, and they do quite well. Perhaps 30 years from now, we’ll look back and say, “Wow, we trusted the doctors.” When artificial intelligence is supported by inconceivably fast processing that can work through many permutations, we can hopefully get treatment plans that are much more accurate than they are today.
Quantum vs. blockchain
A lot of people have identified quantum and blockchain as something worth exploring. Personally, I don’t like to look too far out, because the future is always a little unpredictable. But the risk to blockchain from quantum in the short term is that there is no risk.
Related: Quantum Computing Vs. Blockchain: Impact on Cryptography
Now, what that really means is that we got a head start. Cryptography works because it’s hard to break the encryption scheme. It’s hard to decrypt because the number of computations it would take makes the effort infeasible. So when exceptionally fast processing is brought into the equation, that limitation suddenly becomes less of a barrier, because you can go through many more permutations to find a password or even to reverse a hash. If the very core of what makes blockchain work is suddenly not locked down as firmly as we thought, we should be concerned.
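To make the “infeasible number of computations” point concrete, here is a minimal, purely classical sketch of a brute-force preimage search against SHA-256 over a deliberately tiny, made-up key space. The security of real systems rests on the key space being astronomically larger; a quantum search in the spirit of Grover’s algorithm would roughly square-root the number of guesses needed, which is why that safety margin starts to shrink.

```python
# Why cryptography leans on infeasible search: brute-forcing a SHA-256
# preimage over a tiny three-letter key space. Real keys are 128-256 bits,
# so the search space is on the order of 2**128 or more.
import hashlib
import itertools
import string

secret = "key"                                          # hypothetical secret
target = hashlib.sha256(secret.encode()).hexdigest()    # what an attacker sees

tries = 0
for candidate in itertools.product(string.ascii_lowercase, repeat=3):
    tries += 1
    guess = "".join(candidate)
    if hashlib.sha256(guess.encode()).hexdigest() == target:
        print(f"Recovered '{guess}' after {tries:,} guesses; "
              f"a 128-bit key would take on the order of 2**128 guesses.")
        break
```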
I believe we have time. But in the long term, there is a risk. So we need to be very active in alternative mechanisms, and there is already some interesting work being done on them.
If blockchain can be broken by quantum, all cryptography can be broken. And that’s a much bigger societal risk. Think about your bank, your browser, your smartphone or your laptop: Computing has been integrated into every level of our lives. All those levels rely on existing cryptographic algorithms and are vulnerable in one way or another.
Related: Is Crypto Ready for the New Space Age?
One of the promising aspects here is that quantum actually enables excellent cryptography. There’s a field called quantum cryptography, and it’s very different from how classical cryptography works. Quantum cryptography is used today in a small number of applications. Some interesting research in this field shows that while quantum can break existing cryptography and render it useless, it also introduces a different form of cryptography that could be used to fix the problem. I think it’s fascinating that the very thing that breaks cryptography also fixes it.
There is actually a silver lining: Quantum cryptography is much more secure than the consumer-level cryptography in use across society today. We could see quantum cryptography become the answer and even be integrated into future versions of blockchain, making blockchain a quantum-based system itself. Some interesting theoretical experimentation is being done on this very topic.
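For a sense of how different quantum cryptography is, here is a toy, classical-only simulation of the idea behind BB84, the best-known quantum key distribution protocol. The sample size and basis labels are arbitrary, and the quantum behavior is only mimicked; it is a conceptual sketch, not a working cryptosystem.

```python
# Toy simulation of the BB84 idea: Alice encodes random bits in random bases,
# Bob measures in random bases, and they keep only the positions where the
# bases happened to match. Measuring in the wrong basis yields a random bit,
# which is also what lets eavesdropping be detected on real hardware.
import random

N = 32
alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice("+x") for _ in range(N)]   # rectilinear or diagonal
bob_bases   = [random.choice("+x") for _ in range(N)]

bob_results = []
for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
    if a_basis == b_basis:
        bob_results.append(bit)                    # same basis: correct readout
    else:
        bob_results.append(random.randint(0, 1))   # wrong basis: random outcome

# Bases (not bits) are compared publicly; matching positions become the key.
key = [bit for bit, a, b in zip(bob_results, alice_bases, bob_bases) if a == b]
print("shared key bits:", key)
```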
Related: The Future of Crypto: The Latest Cryptography Advances Set to Change Blockchain
Humans are awesome. We will come up with other ideas. It won’t just be quantum fixing the same problems it creates. And I’m also very confident that blockchain will evolve in the decades ahead.
Related: How the Crypto World Is Preparing for Quantum Computing, Explained
Limitations
We are in the early days of quantum computing, and that’s probably an understatement. We know it works. Part of the history of quantum in the last 10 to 15 years has been getting something to work only to find that it isn’t repeatable. If you can’t even run a math problem and get the same answer each time, you don’t have a viable pathway. But now we’re there. We’ve broken through that barrier, and we have good, solid, predictable quantum computing.
Because quantum computing is so completely different from what we once knew, we have to build everything from scratch: the hardware, the operating systems, the applications, the databases. Everything has either been remade or is undergoing an overhaul. We can’t port anything.
In my video series, I make the joke that you can’t run Microsoft Windows on a quantum computer. There is no such thing. You can’t just run Ethereum on a quantum computer. So, we have to build the entire computing stack and build an ecosystem. If a quantum computer is slower than the fastest classical, well, the obvious answer is to just use a classical computer because it does run Windows and Ethereum. So for quantum computing to be feasible, you have to have a quantum computer that is faster than a classical one from the get-go.
This is called quantum supremacy: the point at which a quantum computer is faster than the fastest classical computer. We have not reached quantum supremacy yet, or at least there is no consensus that we have. Some companies say they’ve done it, while others dispute it, but we have to have full confidence that a quantum computer can make a calculation much faster than we know how to do with even the fastest conventional supercomputers available. And even if that goal is achieved, there are serious limitations on quantum computer adoption and usability, starting with the difficulty of building the needed hardware as well as the machines’ enormous size (which seems like a step back from the mobility of the 21st century).
As quantum computers improve over time, they go through the same process as conventional computers: There is always a new classical algorithm to test against the quantum ones, which continually raises the bar for quantum supremacy. But there is a fundamental constraint on the road to quantum supremacy: building hardware that can control and stabilize processes at the subatomic level.
The equivalent of classical transistors, or processing capability, in a quantum computer is the quantum bit, or qubit. Qubits are basically the quantum equivalent of bits on a microchip. We have computers with 12 qubits, 14, 18, 22. In fact, the highest count we know of in the broad quantum space is Google’s most recent machine; the tech giant’s quantum AI lab says it has 72 qubits. And that means they’re able to process in a stable way, that is, recreate the same outcome for an algorithm.
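A quick back-of-the-envelope calculation shows why each additional qubit matters so much: simulating an n-qubit state on a classical machine means storing 2^n complex amplitudes. Assuming 16 bytes per amplitude (an assumption made for this sketch), the memory requirement explodes long before 72 qubits.

```python
# Why qubit counts matter: a classical simulation of n qubits must hold
# 2**n complex amplitudes. At an assumed 16 bytes per amplitude, memory
# requirements grow exponentially with every added qubit.
BYTES_PER_AMPLITUDE = 16

for n in (12, 22, 40, 53, 72):
    nbytes = (2 ** n) * BYTES_PER_AMPLITUDE
    print(f"{n:>2} qubits -> 2**{n} amplitudes (~{nbytes:,} bytes to store classically)")
```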
Why not just put lots of qubits together and scale up really quickly? I wish it were that easy. In order to actually run an algorithm against a qubit, we need a high level of stability. When you get down to the subatomic level, where things do not follow the traditional rules of physics, random motion makes individual atomic events unpredictable and uncontrollable. Those motions generate internal energy, and various external factors can also create movement and add energy. One way to make an atomic system stable is to cool it down: Reduced movement means less energy being expelled. In fact, the lower the temperature, the more stable the atomic world becomes. But that means you’re not going to have a quantum computer in your house anytime soon, because the refrigeration requirement is huge. Cooling the system down stabilizes it and thus reduces the chances of qubits incorrectly flipping between quantum states.
Decoherence is still the primary limitation of quantum computing today. And every bit of effort, all the big energy, is going into how to prevent premature decoherence. I say premature because you do need decoherence to measure the result; you just don’t want it while the processing is happening.
Finally, there’s really no such thing as fault tolerance yet. In classical computing, which is so reliable, there’s an awful lot of fault tolerance happening. Things fail all the time, but when a switch fails, another switch turns on and the system is able to account for the error. We don’t have that in quantum yet. It’s a space that needs a lot of innovation, because if we’re going to have reliable and fast processing, we’ll need redundancy and the ability to account for errors in the system while making them invisible to the end user.
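As a loose classical analogy for that kind of redundancy (not an actual quantum error-correcting code, since qubits cannot simply be copied), here is a sketch of the simplest scheme: store each bit three times and recover it by majority vote.

```python
# Classical analogy for fault tolerance: a three-way repetition code.
# Each bit is stored three times; a single flipped copy is outvoted, so the
# error stays invisible to whoever reads the decoded message.
import random

def encode(bit):
    return [bit, bit, bit]

def noisy_channel(codeword, flip_prob=0.1):
    return [b ^ 1 if random.random() < flip_prob else b for b in codeword]

def decode(codeword):
    return 1 if sum(codeword) >= 2 else 0   # majority vote

message = [1, 0, 1, 1, 0]
received = [decode(noisy_channel(encode(b))) for b in message]
print("sent:    ", message)
print("received:", received)   # usually identical despite random bit flips
```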
Energy consumption issues?
One of the things we need to recognize about quantum is that we probably won’t ever own a quantum computer. There won’t be a quantum chip in a smartphone or a laptop; quantum will flip the way most of us think about computing. The way most of the big innovators, the Microsofts and IBMs, see it, quantum computing is heading toward cloud provision.
Quantum becomes just another service that can be tapped into. So one could argue that maybe in the future there won’t be millions of quantum computers, just a lot of really big ones owned by governments and some prominent tech companies, in which case energy consumption won’t be such a big deal.
Digital future and crypto on qubits
One can imagine that 20 or maybe 30 years from now, the world will run on quantum computers. All our systems will run on quantum, with a small amount of classical computing plugged in to run our local devices, but generally, the performance of our systems, organizations and society will ride on quantum. We can’t yet say that with 100% certainty, as there will be lots of interesting twists and turns along the way, of course.
On the other hand, there is very deep government involvement in developing quantum-based infrastructure, including in most members of the G-20. There are a couple of ways to think about it: There are benefits to being first and being the innovator. The United States has typically pushed the envelope on the information age. One of the reasons it was able to do that is that it had a head start, developing and attracting the world’s brightest minds, supporting technological and engineering initiatives and funding fundamental scientific research while many other countries had other distractions.
Today, that’s not the case. China wants to be the first to have powerful quantum computers, while England remains a community-driven great power. In both cases, the government is driving investment in innovation.
Many national governments around the world are also funding quantum research because it’s a competition over who can break through first. No doubt there’s also an important military application, and it seems like a lot of the conflicts we might face in the future will involve lone actors and terrorism. We need really good software and analytics to try to predict and mitigate threats in that particular arena. So, I could definitely see quantum computing being very valuable there.
Many countries have official policies. If you look at China, the United States or the European Union, you’ll see quantum policies that are, if not articulated in operational terms, then certainly philosophical, like the EU saying things like, “We believe quantum is important and we want to research it at these universities.”
I don’t know what the upcoming quantum coin would entail. But you have yourself a glass of wine and think about what that might mean. That’s some big thinking right there.
This article is from an interview held by Kristina Lucrezia Cornèr with Dr. Jonathan Reichental. It has been condensed and edited.
Dr. Jonathan Reichental is the CEO of Human Future, a global business and technology education, advisory and investment firm. He is the former chief information officer for the City of Palo Alto, and is a multiple-award-winning technology leader whose 30-year career has spanned both the public and private sectors.