David L. Chandler | MIT News Office
February 18, 2020

Labs around the world are racing to develop new computing and sensing devices that operate on the principles of quantum mechanics and could offer dramatic advantages over their classical counterparts. But these technologies still face several challenges, and one of the most significant is how to deal with “noise” — random fluctuations that can eradicate the data stored in such devices.

A new approach developed by researchers at MIT could provide a significant step forward in quantum error correction. The method involves fine-tuning the system to address the kinds of noise that are the most likely, rather than casting a broad net to try to catch all possible sources of disturbance.
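The idea of tailoring error correction to the dominant noise channel can be illustrated with a toy model (not the researchers' actual scheme). In the hypothetical sketch below, phase-flip (Z) errors are assumed to be far more likely than bit-flip (X) errors, so a simple three-qubit phase-flip repetition code — which corrects any single Z error by majority vote but offers no protection against X errors — still beats an unencoded qubit. All probabilities and names here are illustrative assumptions.

```python
import random

def trial(p_z, p_x, rng):
    """One Monte Carlo shot: does the three-qubit phase-flip code fail?

    Each qubit independently suffers a Z error with probability p_z
    and an X error with probability p_x (illustrative noise model).
    Majority vote corrects a single Z error; any X error, or two or
    more Z errors, causes a logical failure.
    """
    z_errors = sum(rng.random() < p_z for _ in range(3))
    x_errors = sum(rng.random() < p_x for _ in range(3))
    return z_errors >= 2 or x_errors >= 1

rng = random.Random(0)
p_z, p_x = 0.05, 0.001          # biased noise: dephasing dominates
n = 100_000

encoded = sum(trial(p_z, p_x, rng) for _ in range(n)) / n
bare = p_z + p_x                # an unencoded qubit fails on any error (approx.)

print(f"unencoded error rate ~ {bare:.4f}")
print(f"encoded error rate   ~ {encoded:.4f}")
```

With the assumed bias, the tailored code suppresses the dominant Z noise quadratically while paying only a small penalty from the rare X errors — the same trade-off, in spirit, as designing a code around the noise a device actually experiences.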

The analysis is described in the journal Physical Review Letters, in a paper by MIT graduate student David Layden, postdoc Mo Chen, and professor of nuclear science and engineering Paola Cappellaro.

“The main issues we now face in developing quantum technologies are that current systems are small and noisy,” says Layden. Noise, meaning unwanted disturbance of any kind, is especially vexing because many quantum systems are inherently highly sensitive, a feature underlying some of their potential applications.

Image caption: In a diamond crystal, three carbon atom nuclei (shown in blue) surround an empty spot called a nitrogen-vacancy center, which behaves much like a single electron (shown in red). The carbon nuclei act as quantum bits, or qubits, and it turns out the primary source of noise that disturbs them comes from the jittery "electron" in the middle. By understanding that single source of noise, it becomes easier to compensate for it, the researchers found.