Quantum Computing 101

There’s been a lot of buzz around quantum computing over the last year or so, and there seems little doubt that it will provide the next major step forward in computing power. But it’s still largely theoretical – you can’t buy a quantum computer today. So, what does it really mean… and why should we care?

Today’s computers are binary. The transistors (tiny switches) contained in microchips are either off (0) or on (1) – just like a light switch. Quantum computing is based on entirely different principles. And quantum mechanics is difficult to understand – it’s counterintuitive – it’s weird. So let’s look at some of the basic concepts:

Superposition
Superposition is a concept whereby, instead of a state being on or off, it’s on and off. At the same time. And it’s everything in the middle as well. Think of it as a scale from 0 to 1 and all the numbers in between.
Qubit
A quantum bit (qubit) exploits superposition so that, instead of working through a problem’s possibilities one at a time, we can compute on many of them in parallel.
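To make the idea concrete, here’s a minimal sketch (assuming NumPy is available) of how a single qubit’s state is usually described: two complex amplitudes, one for 0 and one for 1, with superposition meaning both are non-zero at once.

```python
import numpy as np

# A qubit's state is a 2-element complex vector of amplitudes for |0> and |1>.
# This is the equal 50/50 superposition (the state a Hadamard gate produces):
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)

# On measurement, each outcome's probability is its amplitude's squared magnitude.
probs = np.abs(qubit) ** 2
print(probs)  # both outcomes equally likely, i.e. ~[0.5 0.5]
```

The “scale from 0 to 1” intuition shows up here as the weights: changing the two amplitudes shifts how likely each outcome is, while the probabilities always sum to 1.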

More qubits are not necessarily better (although a qubit race is playing out in the media)… the challenge is not creating more qubits but creating better qubits, with better error correction.

Error correction
Particles such as electrons have a charge and a spin, so they point in a particular direction. Noise from neighbouring electrons makes them wiggle, so the information in one leaks into others, which makes long calculations difficult. This is one of the reasons quantum computers run at very low temperatures.

Greek dancers hold their neighbours so that they move as one. One approach in quantum computing is to do the same with electrons, so that only those at the ends have freedom of motion – a concept called electron fractionalisation. This creates a robust building block for a qubit, one that is more like Lego (locking together) than a house of cards (loosely stacked).

Different teams of researchers are taking different approaches to the error correction problem, so not everyone’s qubits are equal! One approach is to use topological qubits for reliable computation, storage and scaling. Just like Inca quipus (a system of knots and braids used to encode information so it couldn’t be washed away, unlike chalk marks), topological qubits braid information into robust patterns.

Exponential scaling
Once the error correction issue is solved, scaling is where the massive power of quantum computing can be unleashed.

A 4-bit classical computer has 16 configurations of 0s and 1s but can only be in one of these states at any time. A quantum register of 4 qubits can be in all 16 states at the same time and compute on all of them at the same time!
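A quick sketch of that 4-qubit register (assuming NumPy is available): its state is a vector of 2⁴ = 16 complex amplitudes, and applying a Hadamard gate to each qubit spreads the amplitude equally across all 16 configurations at once.

```python
from functools import reduce
import numpy as np

n = 4
# An n-qubit register is described by 2**n complex amplitudes -- 16 for n = 4.
# Start in |0000>: all the amplitude sits on the first configuration.
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0

# A Hadamard gate on every qubit puts the register into an equal
# superposition of all 16 configurations simultaneously.
hadamard = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
h_all = reduce(np.kron, [hadamard] * n)  # tensor product of 4 Hadamards
state = h_all @ state

print(len(state))  # 16 amplitudes, one per classical configuration
```

After the Hadamards, each of the 16 outcomes is equally likely (probability 1/16) – the “all 16 states at the same time” claim made precise.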

Every n interacting qubits can handle 2^n bits of information in parallel, so:

  • 10 qubits = 2^10 = 1,024 classical bits (~1 kilobit)
  • 20 qubits ≈ 1 megabit
  • 30 qubits ≈ 1 gigabit
  • 40 qubits ≈ 1 terabit
  • etc.
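The doubling behind the list above is easy to check for yourself – each additional qubit doubles the number of classical configurations the register can hold at once:

```python
# 2**n grows explosively: each extra qubit doubles the state space.
for n in (10, 20, 30, 40):
    print(f"{n} qubits -> 2**{n} = {2 ** n:,} configurations")
# 10 qubits -> 2**10 = 1,024 configurations
# 20 qubits -> 2**20 = 1,048,576 configurations
# 30 qubits -> 2**30 = 1,073,741,824 configurations
# 40 qubits -> 2**40 = 1,099,511,627,776 configurations
```

This is also why simulating quantum computers classically hits a wall: just storing the state of a few dozen qubits already demands terabits of memory.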

This means that the computational power of a quantum computer is potentially huge.

What sort of problems need quantum computing?

We won’t be using quantum computers for general personal computing any time soon – Moore’s Law is doing just fine there – but there are a number of areas where quantum computing is better suited than classical computing approaches.

We can potentially use the massive quantum computing power to solve problems like:

  • Cryptography (a quantum computer could break the RSA-2048 encryption that underpins much of today’s online commerce in around 100 seconds – so we need new, quantum-safe models).
  • Quantum chemistry and materials science (nitrogen fixation, carbon capture, etc.).
  • Machine learning (faster training of models – quantum computing as a “co-processor” for AI).
  • and other intractable problems that are supercompute-constrained (improved medicines, etc.).

A universal programmable quantum computer

Microsoft is trying to create a universal programmable quantum computer – the whole stack – and they’re already well advanced.

Quantum computing may sound like the technology of tomorrow, but the tools to develop and test algorithms are available today, and some sources report that a quantum computing capability in Azure could be just 5 years away.
