A qubit (or quantum bit) is a bit that exploits the physical properties of matter at the subatomic scale. Its main advantage is that it can hold several pieces of information simultaneously, where a traditional bit stores only one. Its main drawback is that it is not very persistent.
A “classical” bit is determinate: we know whether it stores a 1 or a 0. A quantum bit, on the other hand, exploits the principle of superposition of states to store both a 1 and a 0 at the same time.
To understand this point, David Louapre (science popularizer and author of “But Who Caught the Higgs Bison?”) recalls that a register of 4 bits – the equivalent of 4 boxes, each holding a 1 or a 0 – generates 16 possible states (0000, 0001, 0010, 0011 … up to 1111), “which allows representing the numbers between 0 and 15”.
By comparison, “a quantum register of 4 qubits can be in a superposition of these 16 states at once […] Schematically, it can perform 16 calculations in parallel,” he explains.
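The contrast between the two registers can be sketched in a few lines of Python. This is an illustrative simulation, not from the article: a 4-qubit register is represented as a list of 16 amplitudes, one per basis state, prepared here in a uniform superposition.

```python
import math

# Classical 4-bit register: exactly one of the 16 states at any time.
classical_states = [format(i, "04b") for i in range(16)]  # '0000' ... '1111'

# Simulated quantum 4-qubit register: 16 complex amplitudes, one per
# basis state. A uniform superposition gives all 16 states equal weight.
amplitude = 1 / math.sqrt(16)
state = [amplitude] * 16

# The squared magnitudes are the probabilities of each measurement
# outcome, and they must sum to 1.
total_probability = sum(abs(a) ** 2 for a in state)
print(len(classical_states), len(state), round(total_probability, 10))
```

A single amplitude vector thus carries weight on all 16 states simultaneously, which is what “16 calculations in parallel” refers to schematically.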
Natural Exponential Power
The advantage of such a superposition is that each additional qubit doubles the size of the state space, so computational capacity grows exponentially, whereas adding classical bits increases it only linearly.
“Linking qubits together gives exponential computational capacity – in the true sense of the term: with N qubits, we have 2^N potential computational states – with a massively parallel system… and of course massively parallel,” stresses Thierry Breton, CEO of Atos. “Naturally, because we are using the laws of physics.”
For example, a 20-qubit quantum computer would have the power of a desktop computer. At 40 qubits, such a machine would be as powerful as today’s largest HPC systems.
Beyond that, we speak of “quantum supremacy”: the potential computing power would have no equivalent in the world of conventional computers.
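The 2^N scaling quoted above is easy to check numerically. A minimal sketch, just evaluating the formula for the qubit counts mentioned in the text:

```python
# N qubits span 2**N basis states, while N classical bits hold
# just one N-bit value at a time: exponential vs. linear growth.
for n in (4, 20, 40):
    print(f"{n} qubits -> {2**n:,} superposable states")
```

Going from 20 to 40 qubits multiplies the state space by about a million, which is why a few dozen qubits already rival large classical machines on paper.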
“This opens up huge opportunities… if all of this works,” quips Thierry Breton.
Limits of Quantum Measurement and Decoherence
The nuance is important, because if a qubit takes advantage of quantum laws, it is also limited by those same laws (decoherence, and determination at the moment of measurement).
Decoherence is the phenomenon by which quantum superposition ends through interactions with the environment. A particle sees its state become fixed (“projected” or “collapsed”), exactly as it would under a measurement. For a qubit, this means it turns into a simple classical bit.
In other words, a qubit persists only as long as the superposition of the particle that composes it lasts (its “quantum coherence”) before that superposition collapses. And decoherence can happen very quickly.
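As a rough illustration (not from the article), the loss of coherence is often modeled as an exponential decay with a characteristic time commonly written T2; the value used below is purely illustrative.

```python
import math

T2 = 100e-6  # assumed, illustrative coherence time: 100 microseconds

def coherence(t: float) -> float:
    """Fraction of the initial coherence remaining after time t,
    in a simple exponential-decay model."""
    return math.exp(-t / T2)

# After one T2, only ~37% of the coherence remains; after 5*T2,
# the superposition is essentially gone.
print(round(coherence(T2), 2), round(coherence(5 * T2), 3))
```

Any computation therefore has to finish well within this window, which is one reason persistence is the qubit’s weak point.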
To delay it as long as possible, qubits are cooled to very low temperatures to reduce “quantum noise.”
“To avoid the problems of interaction between these qubits and the outside world, you have to work at almost absolute zero, −273 degrees, or a few thousandths of a degree above it,” says Thierry Breton (who also notes that photonic qubits are an exception on this point). “It’s very complicated to do, even if we are starting to do it in the lab.”
Another limitation is that adding qubits increases the risk of interference and quantum noise, and therefore of decoherence.
Finally, the superposition state, by definition, disappears upon measurement (“the collapse of the wave packet”). For a qubit, this means that reading its content – the result of an operation, for example – freezes that content. “So even though a quantum computer can do a lot of calculations in parallel […] in the end you can only get one result,” says Louapre.
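The single-result limitation can be simulated directly. In this hypothetical sketch, a 4-qubit register in uniform superposition is “measured”: one outcome is drawn according to the amplitudes’ probabilities, and the state collapses to that single classical value.

```python
import math
import random

# A 4-qubit register in uniform superposition over 16 basis states.
amps = [1 / math.sqrt(16)] * 16
probs = [abs(a) ** 2 for a in amps]

# Measurement: despite the 16 "parallel" states, reading the register
# yields exactly one 4-bit outcome, drawn with these probabilities.
outcome = random.choices(range(16), weights=probs)[0]

# After measurement the superposition is gone: all the weight sits
# on the observed state, like an ordinary classical register.
collapsed = [0.0] * 16
collapsed[outcome] = 1.0
print(format(outcome, "04b"))
```

Running this repeatedly gives a different 4-bit string each time, but each individual run returns only one result, which is exactly Louapre’s point.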