A group of American physicists has succeeded in entangling pairs of atoms and ions over a greater distance than ever reached before. Because atoms can serve as relatively stable quantum memories for storing “qubits” of information, the generalization of the classical bits familiar from computing, this result is of real interest to every researcher working toward a powerful quantum computer able to outperform traditional ones.

Quantum information science has arisen in response to a variety of converging scientific challenges. One goal is to probe the foundations of the theory of computation. What limits are imposed on computation by the fundamental laws of physics, and how can computational power be enhanced by exploiting the structure of these laws? Another goal is to extend the theory of communication. What are the ultimate physical limits on the performance of a communication channel, and how might quantum phenomena be harnessed by new communication protocols? Yet another challenge is to understand and overcome the quantum effects that constrain how accurately we can monitor and manipulate physical systems. What new strategies can be devised to push back the frontier of quantum-limited measurements, or to control the behavior of intricate quantum systems?

While quantum information science is a broad and rapidly expanding field, there are a few underlying recurrent themes. The theory of classical information, computation, and communication developed extensively during the twentieth century. Though undeniably useful, this theory cannot fully characterize how information can be used and processed in the physical world — a quantum world. Some achievements of quantum information science can be described as generalizations or extensions of the classical theory that apply when information is represented as a quantum state rather than in terms of classical bits.

In the popular imagination, quantum computers would be almost magical devices, able to “solve impossible problems in an instant” by trying exponentially many solutions in parallel. But quantum computing also runs into fundamental limitations.

First, any quantum algorithm that decides whether a function f:[n]->[n] is one-to-one or two-to-one needs to query the function at least n^{1/5} times. This provides strong evidence that collision-resistant hash functions, and hence secure electronic commerce, would still be possible in a world with quantum computers.
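For contrast, the classical side of the collision problem is easy to sketch: a randomized “birthday” search finds a repeated output of a two-to-one function after only about sqrt(n) queries, which is what the quantum lower bound is measured against. The helper below is an illustrative sketch, not code from the work discussed.

```python
import random
from math import isqrt

def is_two_to_one(f, n, trials):
    """Classical 'birthday' test: sample random inputs of f and look for
    two different inputs mapping to the same output, which certifies
    that f is two-to-one rather than one-to-one."""
    seen = {}
    for _ in range(trials):
        x = random.randrange(n)
        y = f(x)
        if y in seen and seen[y] != x:
            return True   # collision found: f cannot be one-to-one
        seen[y] = x
    return False          # no collision seen: f is (very probably) one-to-one

n = 1 << 10
t = 8 * isqrt(n)   # ~sqrt(n) samples suffice classically (birthday bound)
print(is_two_to_one(lambda x: x // 2, n, t))  # True  (two-to-one)
print(is_two_to_one(lambda x: x, n, t))       # False (one-to-one)
```

With t ≈ 8·sqrt(n) samples the chance of missing a collision of a two-to-one function is astronomically small, so the test is reliable despite being randomized.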

Second, in the “black-box” or “oracle” model that we know how to analyze, quantum computers could not solve NP-complete problems in polynomial time, even with the help of nonuniform “quantum advice states.”

Third, quantum computers need exponential time to find local optima, and, surprisingly, the ideas used to prove this result also yield new classical lower bounds for the same problem.

Finally, “pretty-good quantum state tomography” is possible using a number of measurements that grows only linearly, not exponentially, with the number of qubits. This illustrates how one can sometimes turn the limitations of quantum computers on their head and use them to develop new techniques for experimentalists.

Chris Monroe of the University of Maryland and his colleagues at the University of Michigan have just published in Nature the article “Manipulating quantum entanglement with atoms and photons”. Remarkably, the entanglement between the two atoms is established via photons they have already emitted, and this works, in theory, whatever the distance separating them. The principle of the method is as follows.

One takes a pair of atoms or ions, here ytterbium ions, and traps them about one meter apart using a high-frequency quadrupolar electric field (a few MHz), as in Paul and Penning ion traps. Each ion is then subjected to a laser pulse that puts it into an excited state. When the ions de-excite, they emit photons that are entangled with the emitting atom.

The idea is then to capture these photons with lenses, inject them into optical fibers, and send them over some distance to a beam splitter. If the two photons have the same frequency, they can interfere as they leave the beam splitter. A joint detection then entangles the photons, and thereby the atoms themselves!
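This heralded step can be mimicked with a few lines of linear algebra. The sketch below is illustrative only, in plain Python, and assumes the simplest textbook model: each ion–photon pair starts in (|0⟩|ν0⟩ + |1⟩|ν1⟩)/√2, and a coincidence click behind the beam splitter projects the two photons onto the Bell state |ψ⁻⟩. The calculation shows the two ions are then left in a singlet state, with heralding probability 1/4.

```python
from itertools import product
from math import sqrt

# Joint amplitude over (ion1, photon1, ion2, photon2), each index 0 or 1.
# Each ion's photon frequency is perfectly correlated with the ion's state.
amp = {}
for i, p, j, q in product((0, 1), repeat=4):
    amp[(i, p, j, q)] = 0.5 if (p == i and q == j) else 0.0

# A coincidence click heralds the photons in |psi-> = (|01> - |10>)/sqrt(2);
# project the photon indices onto that Bell state.
ion = {}
for i, j in product((0, 1), repeat=2):
    ion[(i, j)] = (amp[(i, 0, j, 1)] - amp[(i, 1, j, 0)]) / sqrt(2)

p_success = sum(a * a for a in ion.values())   # heralding probability
norm = sqrt(p_success)
ion = {k: a / norm for k, a in ion.items()}    # normalized post-click state

print(p_success)                  # 0.25
print(ion[(0, 1)], ion[(1, 0)])   # +1/sqrt(2), -1/sqrt(2): an ion singlet
```

The point of the calculation is that the ions never interact directly: projecting the photons is enough to leave the distant ions entangled.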

**A test of EPR**

To test the reality of this entanglement, the two ions were again subjected to laser pulses to produce further photons by fluorescence. The physicists then reproduced an experiment of the kind based on the famous correlations of the EPR paradox, introduced by Einstein and his collaborators in 1935 and translated into an experiment on photons years later by David Bohm. As in Alain Aspect's 1982 experiments, designed to test the ideas of Einstein, Podolsky and Rosen through photon correlations violating the famous Bell inequalities, the strange “spooky action at a distance” implied by quantum entanglement was indeed there.
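The Bell-inequality violation at stake here can be checked numerically. For a singlet pair, quantum mechanics predicts the correlation E(a, b) = −cos(a − b) between measurements along angles a and b, and the CHSH combination of four such correlations reaches 2√2, beyond the bound of 2 that any local classical model must obey. A minimal sketch:

```python
from math import cos, pi, sqrt

def E(a, b):
    # Quantum correlation of a spin singlet measured along angles a and b
    return -cos(a - b)

# Measurement angles that maximize the CHSH combination
a1, a2 = 0.0, pi / 2
b1, b2 = pi / 4, 3 * pi / 4

S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
print(S)  # 2*sqrt(2) ~ 2.83, violating the classical bound |S| <= 2
```

Any local hidden-variable model, of the kind Einstein hoped for, keeps |S| ≤ 2; observing S ≈ 2.83 is what certifies genuine entanglement.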

For the moment, the experiment could only be carried out with photons in the near ultraviolet. Unfortunately, optical fibers strongly absorb at those wavelengths, and of a billion photons captured by the lenses, only one reaches the beam splitter. The physicists are therefore working on ways to produce this entanglement with photons closer to those usually employed in telecommunications.
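The quoted loss, one surviving photon per billion captured, corresponds to 90 dB of attenuation, which puts the problem in perspective: standard telecom fiber at 1550 nm loses only about 0.2 dB per kilometer. A back-of-the-envelope sketch (the 0.2 dB/km figure is a typical textbook value, not from the article):

```python
from math import log10

survival = 1 / 1e9                # fraction of UV photons reaching the splitter
loss_db = -10 * log10(survival)   # total attenuation in decibels
print(loss_db)                    # 90.0

telecom_db_per_km = 0.2           # typical telecom-fiber loss at 1550 nm
equivalent_km = loss_db / telecom_db_per_km
print(equivalent_km)              # 450.0: as lossy as ~450 km of telecom fiber
```

That is why moving the photons toward telecom wavelengths matters: the same 90 dB budget that is lost in meters of fiber in the UV would allow hundreds of kilometers at 1550 nm.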

The stakes are high because, in the race toward “macroscopic” quantum computers, the only machines able to outstrip traditional computers in speed and power, the use of stable quantum memories is crucial: memories able to hold entangled qubits of information long enough to carry out quantum computations before decoherence destroys them. Atoms are interesting candidates for this role, which is why this feat of entangling pairs of atoms matters so much.