The quantum computer is something of a Holy Grail for physicists and information scientists.
Such a computer, operating on the highly complex principles of quantum mechanics, would be capable of performing certain calculations far beyond the reach of even the most advanced modern supercomputers. It could be used for breaking computer security codes as well as for incredibly detailed, data-heavy simulations of quantum systems.
It could apply precise principles of physics to the minute details of molecular interactions in biological systems. It could also help physicists unravel some of the biggest mysteries of the universe by providing a way to test quantum mechanics itself.
Such a computer exists in theory, but not yet in practice: it would need to operate with circuitry at the scale of single atoms, a daunting challenge even for state-of-the-art experimental quantum science. To build a quantum computer, one needs to create and precisely control individual quantum memory units, called qubits, for information processing.
Qubits are similar to the regular memory “bits” in current digital computers, but far more fragile: they are microscopic constituents of matter and extremely difficult to isolate from their environment. The challenge is to scale up to a practical-size quantum register. In particular, qubits must be prepared in sets with precise, nonlocal physical correlations, called entangled states.
Olivier Pfister, a professor of physics in the University of Virginia’s College of Arts & Sciences, has just published findings in the journal Physical Review Letters demonstrating a breakthrough in the creation of massive numbers of entangled qubits, or, more precisely, of a multilevel variant of qubits called Qmodes.
Entanglement dwells outside our day-to-day experience. Imagine that two people, each tossing a coin on their own and keeping a record of the results, compared their data after a few tosses and found that their outcomes were always identical, even though each individual result, heads or tails, still occurred randomly from one toss to the next. Such correlations are now routinely observed between quantum systems in physics labs and form the operating core of a quantum computing processor.
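To make the analogy concrete, here is a toy classical sketch (our own illustration, not code from the study). A shared random record like this can mimic the "always identical" pattern, though genuine entanglement produces correlations that no classical shared record can reproduce:

```python
import random

def correlated_tosses(rounds):
    # One shared random outcome per round: each toss is individually
    # random (heads or tails with equal odds), yet the two records
    # always agree. This only illustrates the statistics described in
    # the analogy; real entanglement has no such classical explanation.
    shared = [random.choice("HT") for _ in range(rounds)]
    alice_record = list(shared)  # Alice's log of her own tosses
    bob_record = list(shared)    # Bob's log of his own tosses
    return alice_record, bob_record

alice, bob = correlated_tosses(10)
print("Alice:", "".join(alice))
print("Bob:  ", "".join(bob))
print("Identical every time:", alice == bob)
```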
Pfister and researchers in his lab used sophisticated lasers to engineer 15 groups of four entangled Qmodes each, for a total of 60 measurable Qmodes, the most ever created. They believe they may have created as many as 150 groups, or 600 Qmodes, but could measure only 60 with the techniques they used.
Each Qmode is a sharply defined color of the electromagnetic field. In lieu of heads or tails, the outcome of a Qmode measurement is the number of quantum particles of light (photons) present in the field. Hundreds to thousands of Qmodes would be needed to build a quantum computer, depending on the task.
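For readers with some quantum optics background, the textbook state exhibiting exactly this kind of photon-number correlation is the two-mode squeezed vacuum, given here in generic form rather than quoted from the paper:

```latex
% Textbook two-mode squeezed vacuum shared by Qmodes A and B (0 < \lambda < 1):
% measuring A gives a random photon number n; B then always gives the same n.
|\psi\rangle = \sqrt{1 - \lambda^2}\, \sum_{n=0}^{\infty} \lambda^n\, |n\rangle_A |n\rangle_B
```

Measuring Qmode A yields a random photon number n, but Qmode B is then guaranteed to yield the same n, mirroring the coin-toss analogy above.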
“With this result, we hope to move from this multitude of small-size quantum processors to a single, massively entangled quantum processor, a prerequisite for any quantum computer,” Pfister said.
Pfister’s group used an exotic laser called an optical parametric oscillator, which emitted entangled quantum electromagnetic fields (the Qmodes) over a rainbow of equally spaced colors called an “optical frequency comb.”
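Those "equally spaced colors" obey the standard frequency-comb relation, shown here in generic form rather than with this experiment's specific parameters:

```latex
% The n-th comb line sits at an offset frequency plus an integer multiple
% of the repetition rate f_rep, so neighboring Qmodes are equally spaced.
\nu_n = \nu_0 + n\, f_{\mathrm{rep}}, \qquad n = 0, 1, 2, \ldots
```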
Ultrastable lasers emitting over an optical frequency comb have revolutionized the science of precision measurements, called metrology, and paved the way to multiple technological breakthroughs. The inventors of the optical frequency comb, physicists John Hall of the National Institute of Standards and Technology and Theodor Hänsch of the Max Planck Institute of Quantum Optics, were awarded half of the 2005 Nobel Prize in Physics for their achievement. (The other half went to Roy Glauber, one of the founding fathers of quantum optics.)
With their experiments, Pfister’s group completed a major step toward confirming an earlier theoretical proof by Pfister and his collaborators that the quantum version of the optical frequency comb can be used to create a quantum computer.
“Some mathematical problems, such as factoring integers and solving the Schrödinger equation to model quantum physical systems, can be extremely hard to solve,” Pfister said. “In some cases the difficulty is exponential, meaning that computation time doubles for every finite increase of the size of the integer, or of the system.”
However, he said, this holds only for classical computing. Quantum computing was discovered to hold the revolutionary promise of exponentially speeding up such tasks, thereby turning them into tractable computations.
“This would have tremendous societal implications, such as making current data encryption methods obsolete, and also major scientific implications, by dramatically opening up the possibilities of first-principle calculations to extremely complex systems such as biological molecules,” Pfister said.
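One concrete, illustrative face of that exponential wall (our own sketch, not a calculation from the article): merely storing the full quantum state of n qubits on a classical machine requires about 2^n complex amplitudes, so the cost doubles with every added qubit.

```python
# Illustrative sketch (ours, not from the article): memory needed just to
# store a full n-qubit state vector on a classical computer. Each added
# qubit doubles the number of complex amplitudes, one concrete form of
# the exponential difficulty described above.
for n in range(10, 51, 10):
    amplitudes = 2 ** n                 # 2^n complex amplitudes
    gigabytes = amplitudes * 16 / 1e9   # 16 bytes per complex128 value
    print(f"{n:2d} qubits: {amplitudes:>20,} amplitudes (~{gigabytes:.3g} GB)")
```

By 50 qubits the state vector alone would occupy roughly 18 petabytes, one way to see why classical simulation of quantum systems becomes hopeless so quickly.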
Quantum computing can be summarized as qubit processing: computing with single elementary systems, such as atoms or monochromatic light waves, serving as memory units. Because qubits are inherently quantum systems, they obey the laws of quantum physics, which are more subtle than those of classical physics.
Randomness plays a greater role in quantum evolution than in classical evolution, Pfister said. Randomness is not an obstacle to deterministic predictions and control of quantum systems, but it does limit the way information can be encoded in and read out of qubits.
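In standard notation (a textbook convention, not anything specific to this work), those limits show up in how a qubit is written and read:

```latex
% A qubit holds a continuous superposition of 0 and 1 ...
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1
% ... but a readout returns only a single random bit (Born rule):
P(0) = |\alpha|^2, \qquad P(1) = |\beta|^2
```

The amplitudes α and β evolve deterministically, yet a readout reveals only one random bit drawn from the probabilities |α|² and |β|².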
“As quantum information became better understood, these limits were circumvented by the use of entanglement, deterministic quantum correlations between systems that behave randomly, individually,” he said. “As far as we know, entanglement is actually the ‘engine’ of the exponential speed up in quantum computing.”
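The canonical example of such deterministic correlations between individually random systems is a Bell pair, given here in standard textbook notation rather than taken from the paper:

```latex
% A Bell pair: each qubit alone reads out as a fair coin flip,
% yet the two outcomes always agree.
|\Phi^+\rangle = \tfrac{1}{\sqrt{2}}\left(|0\rangle_A|0\rangle_B + |1\rangle_A|1\rangle_B\right)
```

Each qubit measured on its own gives 0 or 1 with equal probability, yet the two results always match: individually random, jointly deterministic, just like the coins in the analogy above.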