Microsoft today announced its roadmap for building its own quantum supercomputer, using the topological qubits the company’s researchers have been working on for quite a few years now. There are still plenty of intermediary milestones to be reached, but Krysta Svore, Microsoft’s VP of advanced quantum development, told us that the company believes it will take fewer than ten years to build a quantum supercomputer using these qubits — one able to reliably perform one million quantum operations per second. That’s a new metric Microsoft is introducing as the industry aims to move beyond the current era of noisy intermediate-scale quantum (NISQ) computing.
“We think about our roadmap and the time to the quantum supercomputer in terms of years rather than decades,” Svore said.
Last year, Microsoft announced a major breakthrough when its team first demonstrated its ability to create Majorana-based qubits. Majorana qubits have the advantage of being very stable (especially compared to more traditional techniques), but they are also extremely difficult to create. Microsoft made an early bet on this technology and now, a year after first announcing this milestone, the team is publishing a new peer-reviewed paper (in the American Physical Society’s Physical Review B) that establishes it has indeed achieved this first milestone on its way to a quantum supercomputer. To get to this point, Microsoft showed results from more devices and far more data than a year ago, when it first announced this work.
“Today, we’re really at this foundational implementation level,” Svore said. “We have noisy intermediate-scale quantum machines. They’re built around physical qubits and they’re not yet reliable enough to do something practical and advantageous — something useful for science or for the commercial industry. The next level we need to get to as an industry is the resilient level. We need to be able to operate not just with physical qubits, but to take those physical qubits, put them into an error-correcting code and use them as a unit to serve as a logical qubit.” Svore argues that reaching this point will take a quantum computer that can perform those one million reliable quantum operations per second, with a failure rate of no more than one per trillion operations.
The next step now is to build hardware-protected qubits — and Svore said the team is making great strides in its work to build these. These qubits will be small (less than 10 microns on a side) and fast enough to perform a single qubit operation in less than a microsecond. After that, the team plans to work on entangling these qubits and operating them through a process called braiding, a concept that has been discussed (mostly in theory) since at least the early 2000s.
From there, it’s on to building a small multi-qubit system and then demonstrating a full quantum system.
That’s obviously an ambitious roadmap, and given how long it took Microsoft to achieve even this first milestone, we’ll have to wait and see how well the team can now execute. With IBM, IonQ and others aiming for similar results — but using more established methods for building their qubits — we’re in a bit of an arms race right now to move beyond the NISQ era.
In addition to sharing its roadmap, Microsoft today also announced Azure Quantum Elements, its platform for accelerating scientific discovery by combining high-performance computing, AI and quantum, as well as Copilot for Azure Quantum, a specially trained AI model that can help scientists (and students) generate quantum-related calculations and simulations.