IonQ, the leader in quantum computing, today unveiled its next-generation quantum computer system. The new hardware features 32 perfect qubits with low gate errors, giving it an expected quantum volume greater than 4,000,000.
The new system combines perfect atomic clock qubits with random-access, all-to-all gate operations for efficient software compilation of applications. It will first be available via private beta, and then commercially on Amazon Braket, where IonQ’s 11-qubit system is generally available to customers today, and on Microsoft’s Azure Quantum. Existing IonQ customers and partners, including 1QBit, Cambridge Quantum Computing, QC Ware, Zapata Computing and more, are excited to experience the benefits of the new system, enabling them to drive toward the first wave of quantum applications.
The company’s trapped-ion quantum computers have a proven track record of outperforming all other available quantum hardware. With this new iteration, IonQ continues to lead the quantum computing field into the future. IonQ is already working on its next two generations of quantum computers, with each new system expected to be both exponentially more powerful and smaller in size than the last.
“In a single generation of hardware, we went from 11 to 32 qubits, and more importantly, improved the fidelity required to use all 32 qubits,” said IonQ CEO & President Peter Chapman. “Depending on the application, customers will need somewhere between 80 and 150 very high fidelity qubits and logic gates to see quantum advantage. Our goal is to double or more the number of qubits each year. With two new generations of hardware already in the works, companies not working with quantum now are at risk of falling behind.”
“The technology underpinning IonQ’s new system is based on decades of proven research and advancements, and our unique architecture provides essential computational efficiencies as the system scales up,” said IonQ Co-Founder & CTO Jungsang Kim. “This cornerstone moment provides the foundation for IonQ to rapidly grow and continue to perfect our systems.”
“Demonstrating the first successful quantum logic gate in 1995 was almost an accident, but doing so opened a path forward towards deploying quantum computers on previously unsolvable problems,” said IonQ Co-Founder & Chief Scientist Chris Monroe. “The new system we’re deploying today is able to do things no other quantum computer has been able to achieve, and even more importantly, we know how to continue making these systems much more powerful moving forward.” One way is to fix errors through circuit encoding, capitalizing on a recent demonstration of quantum error correction in a nearly identical system. Monroe says “with our new IonQ system, we expect to be able to encode multiple qubits to tolerate errors, the holy grail for scaling quantum computers in the long haul.” This encoding requires just 13 qubits to make a near-perfect logical qubit, while in other hardware architectures it’s estimated to take more than 100,000.
“We design quantum machine learning algorithms to drive performance on near-term hardware,” said Iordanis Kerenidis, Head of Algorithms International, QC Ware. “We collaborated with IonQ in implementing QC Ware’s quantum classification algorithm on their system, and the excellent results attest to their unique approach and demonstrated performance.”
“IonQ and Zapata work together to create and implement quantum applications,” said Christopher Savoie, CEO & Founder, Zapata Computing. “We are excited to unlock new potential across industry verticals—and make IonQ’s latest generation of devices available to users of our software platform, Orquestra.”
“IonQ represents one of the most promising approaches to quantum computing that is both scalable and does not require any significant materials science or manufacturing breakthroughs,” said Francis Ho, Senior Vice President and Managing Director, Samsung Catalyst Fund. “The company’s unique combination of academic research and experience plus proven performance has led to their system demonstrating industry leading performance and helping break new ground in quantum computing.”
“We believe IonQ is the most promising and advanced technology for developing quantum computers at scale. This latest milestone represents decades of academic research and experience, proven performance, and superior technology,” said Alaa Halawa, Head of US Ventures, Mubadala Capital. “This latest breakthrough is also particularly exciting for industrial companies in areas of material science and petrochemicals, enabling new applications that are crucial for enhancing competitiveness in the market.”
“IonQ and Cambridge Quantum Computing are working together to create and implement applications for quantum computers, for the benefit of CQC’s customers, and are excited to see what new applications are possible with IonQ’s newest generation,” said Denise Ruffner, Chief Business Officer, Cambridge Quantum Computing.
“IonQ’s approach to quantum represents the most promising pathway to achieving commercial success with quantum computers – and this breakthrough in performance and fidelity further validates that approach,” said Hany Nada, Co-Founder & Partner, Acme Capital. “We are thrilled to continue working with the team to realize the full benefits of quantum.”
“IonQ and 1QBit are working together on applying quantum computers to solve previously intractable problems in a variety of industries and are excited to explore new possibilities resulting from the release of IonQ’s newest generation of devices,” said Arman Zaribafiyan, Head of Quantum Simulation, 1QBit.
IonQ has raised $84 million in funding, recently announcing new investment from Lockheed Martin, Robert Bosch Venture Capital GmbH (RBVC) and Cambium. Previous investors include Samsung Electronics, Mubadala Capital, GV, Amazon, and NEA. The company’s two co-founders were also recently named to the National Quantum Initiative Advisory Committee (NQIAC).
Quantum phenomena have puzzled and delighted scientists for over a century, revealing unique, counter-intuitive characteristics of matter like superposition and entanglement. For four decades, the U.S. National Science Foundation has worked to enable breakthroughs in quantum information science and engineering that harness what researchers have learned about quantum phenomena to develop technologies like quantum computers, sensors, and communications. These quantum technologies will have enormous consequences for the national and global economy. To unleash that potential, researchers must overcome several major, fundamental challenges in quantum information science and engineering.
With these unresolved questions in mind, NSF launched the Quantum Leap Challenges Institutes program. And today, NSF, in partnership with the White House Office of Science and Technology Policy, is announcing $75 million for three new institutes designed to have a tangible impact in solving these problems over the next five years.
These institutes are a central piece of NSF’s response to key federal initiatives to advance quantum information science, including the National Quantum Initiative Act of 2018 and the White House’s ongoing focus on American leadership in emerging technologies. Quantum Leap Challenge Institutes also form the centerpiece of NSF’s Quantum Leap, an ongoing, agency-wide effort to enable quantum systems research and development.
“Quantum information science has the potential to change the world. But to realize that potential, we must first answer some fundamental research questions,” said NSF Director Sethuraman Panchanathan. “Through the Quantum Leap Challenge Institutes, NSF is making targeted investments. Within five years, we are confident these institutes can make tangible advances to help carry us into a true quantum revolution.”
“America’s future depends on our continued leadership in the most cutting-edge industries of tomorrow. With the announcement of three new quantum institutes, the Trump Administration is making a bold statement that the United States will remain the global home for QIS research. Our new Quantum Leap Challenge Institutes will advance America’s long history of breakthrough discoveries and generate critical advancements for years to come,” said Michael Kratsios, U.S. Chief Technology Officer.
NSF Quantum Leap Challenge Institute for Present and Future Quantum Computing. Today’s quantum computing prototypes are rudimentary, error-prone, and small-scale. This institute, led by the University of California, Berkeley, plans to learn from today’s prototypes to design advanced, large-scale quantum computers, develop efficient algorithms for current and future quantum computing platforms, and ultimately demonstrate that quantum computers outperform even the best conceivable classical computers.
The institutes comprise an interconnected community of 16 core academic institutions, 8 national laboratories, and 22 industry partners. By integrating the perspectives and resources of multiple disciplines and sectors, they promote a sustainable ecosystem for innovation. In addition to their research, these centers will also make strides in training and educating a diverse, quantum-ready U.S. workforce. They will develop new in-person and online curricula for learners at all levels, from primary school students and teachers to working professionals.
Let’s say you’re interested in buying a car, so you visit a new car dealership. Imagine your disappointment if you learned that the only information the salesperson can give you is the number of seats in each vehicle.
How crazy would that be? Of course, to make sure a car fits your requirements, you need a lot more information.
What color is the car? Does it have the right accessories? And, what kind of gas mileage does it get? Ultimately, you would also like to know how it performs under different conditions, like driving in town and on the highway. Only when armed with more information could you make an informed decision about buying a car or making a comparison between different vehicles.
Until now, that’s pretty much how we have evaluated quantum computers. The focus has mainly been on the number of qubits in a quantum computer while ignoring many other important factors affecting its computational ability.
Before Google announced it had achieved Quantum Supremacy, the media speculated on how many qubits would be needed to outperform a classical computer. Setting aside the controversial technical aspects of the accomplishment, Google finally achieved the benchmark. Of course, most articles focused on the fact that Google used a 54-qubit quantum processor.
What’s next after quantum supremacy?
Beyond quantum supremacy, the next major benchmark, called quantum advantage, is on the distant horizon. Quantum advantage will exist when programmable NISQ gate-based (circuit-based) quantum computers reach a degree of technical maturity that allows them to solve many, though not necessarily all, significant real-world problems that classical computers either cannot solve or would require an exponential amount of time to solve.
World-changing applications will be possible once we reach quantum advantage. The major applications will likely include optimization, chemistry, machine learning, and organic simulations.
From Quantum Supremacy to Quantum Advantage
Although the number of qubits is essential, so is the number of operations that can be completed before the qubits lose their quantum states. Qubits decohere either due to noise or because of their inherent properties. For those reasons, building quantum computers capable of solving deeper, more complex problems is not simply a matter of increasing the number of qubits.
Before we can move from machines with a hundred qubits to ones with thousands, and eventually to machines with millions of qubits, many significant technical issues remain to be solved. Moreover, none of these problems has an overnight solution. It will likely take another five or ten years of incremental research, experimentation, and steady technical improvement to find answers.
Configuration changes – subtle and significant – can dramatically affect the performance of a quantum computer. Determining the optimum quantum computer configuration is much like solving a Rubik’s Cube. In the process of trying to align all the blue tiles on one surface, the previously solved red surface becomes disrupted.
The power of Quantum Volume
It would be helpful if quantum researchers had a tool that allowed them to systematically measure and understand how incremental technology, configuration and design changes affected a quantum computer’s overall power and performance. Corporate users also need a way to compare the relative power of one quantum computer to another.
IBM foresaw the need for such a metric in 2017 when its researchers developed a full-system performance measurement called Quantum Volume.
Quantum Volume’s numerical value indicates the relative complexity of a problem that can be solved by the quantum computer. The number of qubits and the number of operations that can be performed are called the width and depth of a quantum circuit. The deeper the circuit, the more complex an algorithm the computer can run. Circuit depth is influenced by such things as the number of qubits, how the qubits are interconnected, gate and measurement errors, device crosstalk, circuit compiler efficiency, and more.
That’s where Quantum Volume comes in. It analyzes the collective performance and efficiency of these factors and then produces a single, easy-to-understand Quantum Volume number. The larger the number, the more powerful the quantum computer.
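As a rough illustration, the arithmetic behind the metric can be sketched in a few lines. This is a simplified rendering of IBM’s published protocol, not their implementation: for each size n, random circuits of equal width and depth are run, a size passes if the measured heavy-output probability exceeds two-thirds, and Quantum Volume is 2^n for the largest consecutive passing size. The `results` values below are hypothetical.

```python
# Sketch of how a Quantum Volume figure is derived from test results,
# following the shape of IBM's protocol: square circuits (width n, depth n)
# pass if the heavy-output probability exceeds 2/3; QV = 2**n for the
# largest consecutive n that passes.

def quantum_volume(heavy_output_prob_by_size):
    """heavy_output_prob_by_size maps circuit size n -> measured probability."""
    qv = 1
    n = 2
    while n in heavy_output_prob_by_size and heavy_output_prob_by_size[n] > 2 / 3:
        qv = 2 ** n  # the machine handles an n-qubit, depth-n circuit
        n += 1
    return qv

# Hypothetical device results: sizes 2 through 4 pass, size 5 fails.
results = {2: 0.85, 3: 0.78, 4: 0.71, 5: 0.60}
print(quantum_volume(results))  # 2**4 = 16
```

Note that because both width and depth must grow together, a machine with many qubits but shallow achievable circuits scores no better than a smaller machine with cleaner gates, which is exactly the point of the metric.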
Demonstrated incremental improvements
Quantum Volume can be used to compare the power of one quantum computer to another. It can also play a significant role in ongoing development and research necessary to create bigger and better quantum computers to achieve a quantum advantage.
IBM tested Quantum Volume on three of its quantum computers: the five-qubit Tenerife computer released through the IBM Q Experience quantum cloud service in 2017, the 20-qubit Tokyo computer released in 2018, and the 20-qubit IBM Q System One released in 2019.
The Volume Growth Chart shows how Quantum Volume doubled each successive year.
Quantum Volume as a research tool
Jay Gambetta, IBM Vice President, Quantum Computing, posted graphs online showing progressive improvement in CNOT error rates as a result of various changes. In the post, he said: “With four revisions of our 20-qubit quantum system (chip name “Penguin”) plotted together, you can really see the progress in stability, scalability, and reduction of errors over the last two years.”
When asked for more detail on how Quantum Volume assisted in these improvements, the IBM research team said: “These plots are a great example of why qubit counts alone do not tell the whole story. Over several revisions of the architecture, the CNOT error rates improved significantly – which is shown in the plots; and that has a significant impact on improving quantum volume from revision to revision. There are many other performance metrics and architectural choices that impact the quantum volume of a system as well – for instance how the qubits are laid out and connected to one another. Maximizing quantum volume is about making trade-offs between these different parameters – and identifying new ways to push those boundaries. This is why the quantum volume metric is so important; otherwise it would be quite possible to drive a couple of parameters to report on publicly yet to the detriment of overall quantum volume as it can be measured in a single system.”
Optimizing with Quantum Volume
For the foreseeable future, quantum computers will use noisy qubits with relatively high error rates and short coherence times. We are still in the experimental stages of error correction. We know functional error correction will likely require a thousand or more error-correction qubits for every computational qubit. Decoherence of quantum states is a significant obstacle we will have to overcome to build scalable and reliable quantum computers.
Simple logic and the growth chart tell us that to reach quantum advantage by 2025, we need quantum computers with much higher Quantum Volumes, perhaps with a numerical value of 1,000 or more.
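The arithmetic behind that estimate is easy to check. Assuming the yearly doubling shown in the growth chart continues from IBM’s 2019 Quantum Volume of 16 (the Q System One value), six more doublings land just above 1,000 by 2025:

```python
# Projection behind the "1,000 or more by 2025" estimate, assuming
# Quantum Volume keeps doubling every year from a 2019 baseline of 16.
# The continued doubling is an extrapolation, not a guarantee.

def projected_qv(base_year, base_qv, target_year):
    return base_qv * 2 ** (target_year - base_year)

print(projected_qv(2019, 16, 2025))  # 16 * 2**6 = 1024
```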
Along these lines, in response to my earlier question, the IBM research team also emphasized that maximizing quantum volume means making trade-offs between these different parameters and identifying new ways to push those boundaries.
Quantum Volume and the quantum computing ecosystem
Most of today’s benchmarking is done at the component level for qubits and quantum logic gates. There have been other benchmarking methods investigated as described in this excerpt from IBM’s paper on Quantum Volume:
“Methods such as randomized benchmarking, state and process tomography, and gate set tomography are valued for measuring the performance of operations on a few qubits, yet they fail to account for errors arising from interactions with spectator qubits. Given a system such as this, whose individual gate operations have been independently calibrated and verified, how do we measure the degree to which the system performs as a general-purpose quantum computer? We address this question by introducing a single-number metric, the quantum volume, together with a concrete protocol for measuring it on near-term systems. Similar to how LINPACK and improved benchmarks are used for comparing diverse classical computers, this metric is not tailored to any particular system, requiring only the ability to implement a universal set of quantum gates.” (IBM Research)
Since quantum computing is now in a zone between quantum supremacy and quantum advantage, interim performance benchmarks are needed more than ever. The use of Quantum Volume offers many readily available operational and business benefits to gate-based and circuit-based quantum computer companies.
IBM’s published results demonstrate Quantum Volume’s benefits. It can facilitate research, help develop system roadmaps, and help to systematically optimize architectures needed to advance technology to quantum advantage and beyond.
Every researcher understands that funding for quantum computing research depends on the general public holding a favorable opinion of the technology. Unfortunately, because of quantum computing’s complexity and the long intervals between significant announcements, the general public understands very little about it. Wider use of Quantum Volume would give the public a better feel for progress and generate more public interest.
Quantum Volume would also benefit the CEO or investor who lacks the in-depth technical knowledge necessary to make confident investment decisions in the technology. Additionally, reported variations in Quantum Volume from company to company would likely stimulate more articles by the media, which would serve to educate the general public further.
There should be no question that the development and ongoing support of Quantum Volume demonstrates IBM’s long-term commitment to the overall quantum computing ecosystem.
Organizations like the IEEE need to assist in the long-term evolution and development of quantum computing benchmarks. Unfortunately, it will take years to complete investigations, documentation, and negotiation of a final product.
The bottom line – there are many benefits to using Quantum Volume. Moor Insights & Strategy believes it is a powerful tool that should be adopted as an interim benchmarking tool by other gate-based quantum computer companies. When the time comes, it should also be considered by IEEE as a permanent standard.
In a surprising new study, University of California, Berkeley, researchers show that heat energy can travel through a complete vacuum thanks to invisible quantum fluctuations. To conduct the challenging experiment, the team engineered extremely thin silicon nitride membranes, which they fabricated in a dust-free clean room, and then used optic and electronic components to precisely control and monitor the temperature of the membranes when they were locked inside a vacuum chamber. (UC Berkeley photo by Violet Carter)
If you use a vacuum-insulated thermos to help keep your coffee hot, you may know it’s a good insulator because heat energy has a hard time moving through empty space. Vibrations of atoms or molecules, which carry thermal energy, simply can’t travel if there are no atoms or molecules around.
But a new study by researchers at the University of California, Berkeley, shows how the weirdness of quantum mechanics can turn even this basic tenet of classical physics on its head.
The study, appearing this week in the journal Nature, shows that heat energy can leap across a few hundred nanometers of a vacuum, thanks to a quantum mechanical phenomenon called the Casimir interaction.
Though this interaction is only significant on very short length scales, it could have profound implications for the design of computer chips and other nanoscale electronic components where heat dissipation is key. It also upends what many of us learned about heat transfer in high school physics.
In the experiment, the team showed that heat energy, in the form of molecular vibrations, can flow from a hot membrane to a cold membrane even in a complete vacuum. This is possible because everything in the universe is connected by invisible quantum fluctuations. (UC Berkeley image courtesy of the Zhang lab)
“Heat is usually conducted in a solid through the vibrations of atoms or molecules, or so-called phonons — but in a vacuum, there is no physical medium. So, for many years, textbooks told us that phonons cannot travel through a vacuum,” said Xiang Zhang, a professor of mechanical engineering at UC Berkeley who guided the study. “What we discovered, surprisingly, is that phonons can indeed be transferred across a vacuum by invisible quantum fluctuations.”
In the experiment, Zhang’s team placed two gold-coated silicon nitride membranes a few hundred nanometers apart inside a vacuum chamber. When they heated up one of the membranes, the other warmed up, too — even though there was nothing connecting the two membranes and negligible light energy passing between them.
“This discovery of a new mechanism of heat transfer opens up unprecedented opportunities for thermal management at the nanoscale, which is important for high-speed computation and data storage,” said Hao-Kun Li, a former Ph.D. student in Zhang’s group and co-first author of the study. “Now, we can engineer the quantum vacuum to extract heat in integrated circuits.”
No such thing as empty space
The seemingly impossible feat of moving molecular vibrations across a vacuum can be accomplished because, according to quantum mechanics, there is no such thing as truly empty space, said King Yan Fong, a former postdoctoral scholar at UC Berkeley and the study’s other first author.
“Even if you have empty space — no matter, no light — quantum mechanics says it cannot be truly empty. There are still some quantum field fluctuations in a vacuum,” Fong said. “These fluctuations give rise to a force that connects two objects, which is called the Casimir interaction. So, when one object heats up and starts shaking and oscillating, that motion can actually be transmitted to the other object across the vacuum because of these quantum fluctuations.”
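For a sense of scale, the strength of the Casimir attraction can be estimated with the textbook formula for ideal parallel plates, P = π²ħc/(240d⁴). This idealized geometry is not the actual membrane setup of the Berkeley experiment; the back-of-the-envelope number only shows why the effect matters at a few hundred nanometers and fades rapidly beyond that.

```python
import math

# Back-of-the-envelope estimate of the Casimir effect's strength, using
# the textbook result for ideal parallel plates: P = pi^2 * hbar * c / (240 * d^4).
# Idealized geometry, not the experiment's actual membrane configuration.

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def casimir_pressure(d):
    """Attractive pressure in pascals between ideal plates separated by d meters."""
    return math.pi ** 2 * HBAR * C / (240 * d ** 4)

# Small but measurable at a few hundred nanometers; grows as 1/d^4 at shorter range.
print(f"{casimir_pressure(300e-9):.2f} Pa")  # ~0.16 Pa at 300 nm
```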
The team used highly sensitive optics to monitor the temperature of the silicon nitride membranes during the experiment. (UC Berkeley photo by Violet Carter)
Though theorists have long speculated that the Casimir interaction could help molecular vibrations travel through empty space, proving it experimentally has been a major challenge. To do so, the team engineered extremely thin silicon nitride membranes, which they fabricated in a dust-free clean room, and then devised a way to precisely control and monitor their temperature.
They found that, by carefully selecting the size and design of the membranes, they could transfer the heat energy over a few hundred nanometers of a vacuum. This distance was far enough that other possible modes of heat transfer were negligible — such as energy carried by electromagnetic radiation, which is how energy from the sun heats up Earth.
Because molecular vibrations are also the basis of the sounds we hear, this discovery hints that sound, too, can travel through a vacuum, Zhang said.
“Twenty-five years ago, during my Ph.D. qualifying exam at Berkeley, one professor asked me ‘Why can you hear my voice across this table?’ I answered that, ‘It is because your sound travels by vibrating molecules in the air.’ He further asked, ‘What if we suck all air molecules out of this room? Can you still hear me?’ I said, ‘No, because there is no medium to vibrate,’” Zhang said. “Today, what we discovered is a surprising new mode of heat conduction across a vacuum without a medium, which is achieved by the intriguing quantum vacuum fluctuations. So, I was wrong in my 1994 exam. Now, you can shout through a vacuum.”
Co-authors of the paper include Rongkuo Zhao, Sui Yang and Yuan Wang of UC Berkeley.
This research was funded in part by the National Science Foundation (NSF) under grant 1725335, the King Abdullah University of Science and Technology Office of Sponsored Research (OSR) (award OSR-2016-CRG5-2950-03; OSR-2016-CRG5-2996) and the Ernest S. Kuh Endowed Chair in Engineering.