World’s first scalable, networked photonic quantum computer prototype unveiled
Traditional computers rely on electrons to perform computational tasks, but electrical signals cannot match the speed and bandwidth offered by photons, tiny packets of light that travel at roughly 300,000 km/s.
Unfortunately, we cannot make classical computers run on photons because they need electric charges to process and store information. Photons, by contrast, carry no charge and do not interact easily with electronic components such as the transistors in classical circuits.
This is why scientists have been trying to develop photonic quantum computers, an advanced computing concept that uses mirrors, beam splitters, and optical fibers to manipulate photons.
However, “concept” may no longer be the right word: Xanadu, a Canada-based quantum computing company, has created the world’s first scalable, networked photonic quantum computer prototype.
“It is the very first time we—or anyone, for that matter—have combined all the subsystems necessary to implement universal and fault-tolerant quantum computation in a photonic architecture,” the Xanadu team said.
A practical photonic quantum system
Xanadu calls its photonic quantum computer Aurora. It is a 12-qubit system built from four independent modular server racks, which together comprise 35 photonic chips and 13 km (8 miles) of fiber optics. Notably, the entire system operates at room temperature.
Aurora’s makers claim that it is highly scalable and equipped with everything needed to perform fault-tolerant quantum computing operations.
It currently works like a miniature data center but, in principle, it could “be scaled up to thousands of server racks and millions of qubits today, realizing the ultimate goal of a big quantum data center,” the Xanadu team notes.
The study authors also used the four-rack machine to create and measure a large entangled state over a two-hour run: a cluster state made up of 86.4 billion modes in total, corresponding to roughly 7.2 billion temporal modes per qubit.
“We use this machine, which we name Aurora, to synthesize a cluster state entangled across separate chips with 86.4 billion modes, and demonstrate its capability of implementing the foliated distance-2 repetition code with real-time decoding,” authors of a peer-reviewed study on Aurora said.
In quantum computing, a repetition code is one of the simplest approaches to handling errors. It works by encoding a single logical qubit across several physical qubits, and this redundancy allows errors on individual physical qubits to be detected and corrected.
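The intuition behind a repetition code can be illustrated with a classical analogue (an actual quantum code measures parity checks rather than reading qubits directly, but the redundancy principle is the same). This sketch is purely illustrative and not Xanadu's implementation:

```python
import random

def encode(logical_bit):
    # Encode one logical bit into three physical bits (redundancy).
    return [logical_bit] * 3

def apply_noise(bits, flip_prob):
    # Independently flip each physical bit with probability flip_prob.
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits):
    # Majority vote: any single bit-flip error is corrected.
    return 1 if sum(bits) >= 2 else 0

# A logical error now requires 2 of the 3 bits to flip, so for a
# small physical error rate p the logical rate drops to ~3p^2.
random.seed(0)
trials, p = 100_000, 0.05
raw = sum(apply_noise([0], p)[0] != 0 for _ in range(trials))
enc = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
print(raw / trials)  # close to p = 0.05
print(enc / trials)  # close to 3p^2 - 2p^3, about 0.007
```

The distance-2 code Aurora demonstrated detects errors in a similar redundant fashion, with the added requirement that the error information be extracted and decoded in real time as the photonic state flows through the machine.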
The test successfully proved Aurora’s potential for performing complex and large computations in a fault-tolerant manner.
Aurora isn’t yet perfect
There’s no doubt that Aurora is a highly scalable and modular photonic computer, but its design still has some challenges to overcome.
For instance, when scaled to the size of a large data center, it is likely to encounter high signal-loss rates and will require far more, and more sophisticated, optical components to function smoothly.
“Loss rates will still have to come down by orders of magnitude as the resulting system will need to be the size of a conventional data center to operate,” Christoph Simon, a quantum computing expert at the University of Calgary, told The Globe and Mail. Simon wasn’t a part of the Xanadu team.
It will take some years before these challenges are overcome, but the good news is that photonic quantum computing is off to a solid start. In the coming months, we could see the development of more advanced systems with a larger number of servers.
A study on Aurora is published in the journal Nature.