This Palace simulation visualizes the electric field energy density for a metamaterial waveguide. (AWS Graphic)
Today’s news from the frontier of quantum computing includes Amazon Web Services’ release of cloud-based simulation software for modeling the electromagnetic properties of quantum hardware, Google’s latest technological advance aimed at lowering the error rate of quantum calculations, and new recommendations about the public sector’s role on the frontier.
The IonQ Forte quantum system is roughly the same size as a standard data center cabinet. (IonQ Photo)
Do full-fledged quantum computers already exist, or will it be a decade before they come into being? Will they have to be the size of a football field? A data center cabinet? A microwave oven?
It seems as if the more you talk to computer scientists involved in the quantum computing quest, the less certain the answers become. It’s the flip side of the classic case of Schrödinger’s Cat, which is both dead and alive until you open the box: Quantum computers could be regarded as already alive, or not yet born.
So are quantum computers ready for prime time? Researchers say that they’re not, and that the timeline for development is fuzzy. It all depends on how you define quantum computers and the kinds of problems you expect them to handle.
This cryostat creates ultra-cold temperatures for quantum experiments. (Microsoft Photo / John Brecher)
A newly issued report says Washington state provides one of America’s best settings for expanding the frontiers of quantum information science — but those frontiers are so strange and new that it’s hard to get a handle on their potential.
The Northwest Quantum Nexus was formed in 2019, and its membership shows why the region is well-suited to play a leading role in the quantum revolution.
NQN’s partners include Microsoft and Amazon Web Services, which have both rolled out cloud-based quantum computing platforms; Pacific Northwest National Laboratory, which is working on a range of quantum applications for national security purposes; and premier research institutions including UW and Washington State University.
“This report validates our thesis that Washington state has the right mix of organizations and capabilities — ranging from startups to legacy enterprises — to ensure Washington becomes a global leader in both quantum adoption and commercialization,” WTIA’s CEO, Michael Schutzler, said in a news release.
But the report also says the state’s tech ventures aren’t taking full advantage of homegrown talent.
IonQ CEO Peter Chapman shows off a quantum chip as University of Washington researcher Xinxin Tang looks on. (GeekWire Photo / Alan Boyle)
It’s been almost four years since Pacific Northwest leaders in the field of quantum computing gathered in Seattle for the first Northwest Quantum Nexus Summit, and since then, the scientific buzz over quantum has only gotten buzzier. So what’s next for the Nexus? A star-studded second summit.
Amazon Web Services and Boeing are joining this week’s gathering at the University of Washington, and nearly 300 academic, business and government representatives have signed up to attend. Some of the companies showing up at the second summit — such as the Seattle startup Moonbeam Exchange — didn’t even exist when the first summit took place in March 2019.
Over the past four years, UW has received about $45 million in federal funding to support research into quantum information science. Quantum computing has gotten fresh boosts from Congress and the Biden administration. The Pacific Northwest’s two cloud computing powerhouses, Amazon Web Services and Microsoft Azure, have both rolled out hybrid quantum platforms. And just last week, Maryland-based IonQ announced that it’s setting up a research and manufacturing facility for quantum computers in Bothell, a Seattle suburb.
Microsoft, UW and Pacific Northwest National Laboratory got the ball rolling for the Northwest Quantum Nexus in 2019. IonQ, Washington State University and the University of Oregon’s Center for Optical, Molecular and Quantum Science joined the team a couple of years later. Now the addition of Amazon and Boeing brings two of the region’s tech giants into the fold.
DeepMind’s AlphaCode program takes a data-driven approach to coding. (Google DeepMind Illustration)
Artificial intelligence software programs are becoming shockingly adept at carrying on conversations, winning board games and generating artwork — but what about creating software programs? In a newly published paper, researchers at Google DeepMind say their AlphaCode program can keep up with the average human coder in standardized programming contests.
There’s no need to sound the alarm about Skynet just yet: DeepMind’s code-generating system earned an estimated ranking in the top 54.3% of participants, on average, in simulated evaluations of recent programming competitions on the Codeforces platform — which is a very “average” average.
“Competitive programming is an extremely difficult challenge, and there’s a massive gap between where we are now (solving around 30% of problems in 10 submissions) and top programmers (solving >90% of problems in a single submission),” DeepMind research scientist Yujia Li, one of the Science paper’s principal authors, told me in an email. “The remaining problems are also significantly harder than the problems we’re currently solving.”
Nevertheless, the experiment points to a new frontier in AI applications. Microsoft is also exploring the frontier with a code-suggesting program called Copilot that’s offered through GitHub. Amazon has a similar software tool, called CodeWhisperer.
Oren Etzioni, the founding CEO of Seattle’s Allen Institute for Artificial Intelligence and technical director of the AI2 Incubator, told me that the newly published research highlights DeepMind’s status as a major player in the application of AI tools known as large language models, or LLMs.
“This is an impressive reminder that OpenAI and Microsoft don’t have a monopoly on the impressive feats of LLMs,” Etzioni said in an email. “Far from it, AlphaCode outperforms both GPT-3 and Microsoft’s Github Copilot.”
Calculating 100 trillion digits of pi is a feat worth celebrating with a pie. (Google Graphic / The Keyword)
Three years after Seattle software developer Emma Haruka Iwao and her teammates at Google set the world record for the most precisely calculated value of pi, they’ve done it again. Thanks to Iwao and Google Cloud, we now know pi to an incredible precision of 100 trillion digits.
Why pi?
Mathematicians have been working out the ratio of a circle’s circumference to its diameter for millennia, going back at least as far as the Babylonians (who figured it at 3.125). It’s important for scientists and engineers to know the irrational number’s value with a high degree of precision, but beyond a certain point, it’s really all about showing how well an algorithm or a computer network can handle more practical problems.
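For a miniature sense of how such calculations work, here’s a minimal Python sketch using the mpmath arbitrary-precision library. (Google’s record-setting run used dedicated pi-computing software on Google Cloud infrastructure; this sketch only illustrates the idea of dialing up precision.)

```python
# Minimal sketch: compute pi to a chosen number of decimal places
# using the mpmath arbitrary-precision library. Google's 100-trillion-digit
# run used far more specialized software and hardware than this.
from mpmath import mp

mp.dps = 100   # decimal places of working precision
print(mp.pi)   # pi evaluated to roughly 100 digits
```

Cranking `mp.dps` higher makes the tradeoff obvious: each order-of-magnitude jump in digits demands disproportionately more compute and memory, which is exactly why record attempts double as stress tests for cloud infrastructure.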
To back up that claim, IonQ is turning to a metric known as quantum volume. That’s a multidimensional yardstick that combines stats ranging from the number of quantum bits (a.k.a. qubits) in a computer to the system’s error rate and cross-qubit connectivity.
In today’s news release, IonQ says its next-generation system will feature 32 “perfect” qubits with low gate errors, penciling out to a quantum volume value in excess of 4 million.
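As a back-of-the-envelope check, quantum volume is commonly quoted as 2 raised to the power of the largest “square” circuit (equal width and depth) a machine can run reliably. Here’s a minimal sketch under that assumed convention, showing how IonQ’s “in excess of 4 million” figure pencils out:

```python
import math

# Quantum volume under the common convention QV = 2^n, where n is the
# width (and depth) of the largest square circuit a machine runs reliably.
# This is an illustrative assumption, not IonQ's published methodology.
def quantum_volume(n_effective_qubits: int) -> int:
    return 2 ** n_effective_qubits

# A quantum volume "in excess of 4 million" corresponds to an effective
# square-circuit size of about 22 qubits: 2^22 = 4,194,304.
print(quantum_volume(22))    # 4194304
print(math.log2(4_000_000))  # ~21.93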
The numbers game highlights the fact that the competition in quantum computing is just getting started, more than two decades after computer scientists laid out the theoretical foundations for the field.
Under the best of circumstances, quantum computing is hard to wrap your brain around. Rather than dealing with the cold, hard ones and zeroes of classical computing, the quantum paradigm relies on qubits that can represent multiple values at the same time.
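To make that concrete, here’s a minimal sketch — plain NumPy, no vendor toolkit — of a single qubit in an equal superposition and the measurement probabilities that fall out of it:

```python
import numpy as np

# A qubit's state is a two-component complex vector. The basis states
# |0> and |1> play the role of classical 0 and 1.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# An equal superposition: the qubit is "both" 0 and 1 until measured.
psi = (ket0 + ket1) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5] -- a 50/50 chance of reading out 0 or 1
```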
The approach is particularly well-suited for solving problems ranging from breaking (or protecting) cryptographic codes, to formulating the molecular structures for new materials and medicines, to optimizing complex systems such as traffic patterns and financial markets.
Players in the quantum computing game include heavy-hitters such as IBM, Google and Honeywell — as well as startups such as Maryland-based IonQ, California-based Rigetti and B.C.-based D-Wave Systems.
The important thing to keep in mind is that different technologies from different providers (including IonQ, Rigetti and D-Wave) are available through the quantum cloud platforms offered by Amazon and Microsoft. IBM and Google, meanwhile, provide their quantum tools as options on their own cloud computing platforms.
Developers who want to make use of quantum data processing aren’t likely to go out and buy a dedicated quantum computer. They’re more likely to choose from the cloud platforms’ offerings — just as a traveler who wants to rent a snazzy car from Hertz or from Avis can go with a Corvette or a Mustang.
That’s where metrics make the difference. If you can show potential customers that your quantum machine has more horsepower, you’re likely to do better in an increasingly competitive market.
In contrast, IonQ emphasizes qubit quality over quantity. “We’re not going to throw a million qubits on the table unless we can do millions of operations,” co-founder and chief scientist Chris Monroe told me last December.
Peter Chapman, the former Amazon exec who now serves as IonQ’s CEO and president, said quantum computing should prove its worth well before the million-qubit mark.
“In a single generation of hardware, we went from 11 to 32 qubits, and more importantly, improved the fidelity required to use all 32 qubits,” Chapman said in today’s news release.
“Depending on the application, customers will need somewhere between 80 and 150 very high-fidelity qubits and logic gates to see quantum advantage,” Chapman said. “Our goal is to double or more the number of qubits each year.”
IonQ’s 32-qubit hardware will be rolled out initially as a private beta, and then will be made commercially available via Amazon Braket and Microsoft Azure Quantum.
As we await the next raise in the numbers game, it might be a good idea to set up a trusted authority to take charge of the standards and benchmarking process for quantum computing — similar to how the TOP500 has the final word on which supercomputers lead the pack.
Such an authority could definitively determine who has the world’s most powerful quantum computer. Or would that violate the weird rules of quantum indeterminacy?
Update for 3:35 p.m. PT Oct. 5: We’ve added more precise language and links to describe the distinctions between different types of quantum computing technology.
In contrast to the rigid one-or-zero realm of classical computing, Braket and similar platforms take advantage of the fuzziness of quantum algorithms, in which quantum bits — or “qubits” — can represent multiple values simultaneously until the results are read out.
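For illustration, here’s the kind of “hello world” you can run with the Amazon Braket Python SDK’s bundled local simulator — a two-qubit Bell-pair circuit whose qubits stay in superposition until the shots are read out. (This is a sketch using the free local simulator; targeting real cloud-hosted hardware additionally requires AWS credentials and a device ARN.)

```python
# Minimal Braket sketch: build a Bell pair and sample it on the
# SDK's local simulator (no AWS account needed for this part).
from braket.circuits import Circuit
from braket.devices import LocalSimulator

device = LocalSimulator()

# Hadamard on qubit 0 creates a superposition; CNOT entangles qubit 1.
bell = Circuit().h(0).cnot(0, 1)

result = device.run(bell, shots=1000).result()
print(result.measurement_counts)  # roughly a 50/50 split of '00' and '11'
```

Until the readout step, each qubit has no definite value; the measurement counts splitting almost entirely between ‘00’ and ‘11’ is the entanglement showing through.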
Quantum computing is particularly well-suited for tackling challenges ranging from cryptography — which serves as the foundation of secure online commerce — to the development of new chemical compounds for industrial and medical use. Some of the first applications could well be in the realm of system optimization, including the optimization of your financial portfolio.
Among the high-performance computing resources that will be made available for coronavirus research is Oak Ridge National Laboratory’s Summit, the world’s fastest supercomputer. (ORNL Photo)
The COVID-19 High-Performance Computing Consortium, organized by OSTP and IBM, has the Seattle area’s cloud computing powerhouses, Amazon Web Services and Microsoft, on board. Google Cloud is in on the effort as well.
There are also academic partners (MIT and Rensselaer Polytechnic Institute), federal agency partners (NASA and the National Science Foundation) and five Department of Energy labs (Argonne, Lawrence Livermore, Los Alamos, Oak Ridge and Sandia).
Components for IBM’s quantum computer are on display at a science conference in Lausanne, Switzerland. (GeekWire Photo / Alan Boyle)
The U.S. Department of Energy is looking for experts to guide the White House and federal agencies through the weird world of quantum information science.
In addition to calling for the establishment of the advisory committee, the National Quantum Initiative Act sets aside $1.2 billion over five years to support research, development and workforce training relating to quantum information science.