

The future of quantum computing in the cloud

Quantum computing is the latest technology to catch the eyes of developers and cloud providers like AWS and Microsoft, but analysts predict it could still be years away from practical use.

AWS, Microsoft and other IaaS providers have jumped on the quantum computing bandwagon as they try to get ahead of the curve on this emerging technology.

Developers use quantum computing to encode problems as qubits, which compute multiple combinations of variables at once rather than exploring each possibility discretely. In theory, this could allow researchers to quickly solve problems involving different combinations of variables, such as breaking encryption keys, testing the properties of different chemical compounds or simulating different business models. Researchers have begun to demonstrate real-world examples of how these early quantum computers could be put to use.
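The classical baseline being compared here, checking each combination discretely, can be made concrete with a short brute-force sketch. The cost function below is a made-up toy objective for illustration, not an example drawn from the article:

```python
from itertools import product

def cost(bits):
    # Toy objective: penalize adjacent equal bits, so the best
    # assignment is a strictly alternating bit string.
    return sum(b1 == b2 for b1, b2 in zip(bits, bits[1:]))

n = 10
# A classical search evaluates the 2**n assignments one at a time;
# a quantum computer would prepare all of them in superposition.
best = min(product([0, 1], repeat=n), key=cost)
print(best)  # -> (0, 1, 0, 1, 0, 1, 0, 1, 0, 1)
```

At 10 variables the loop checks 1,024 candidates; every added variable doubles the work, which is the growth curve quantum hardware aims to sidestep.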

However, this technology is still being developed, so experts caution that it could take more than a decade for quantum computing to deliver practical value. In the meantime, a few cloud services, such as Amazon Braket and Microsoft Quantum, aim to get developers up to speed on writing quantum applications.

Quantum computing in the cloud has the potential to disrupt industries in a similar way as other emerging technologies, such as AI and machine learning. But quantum computing is still being established in university classrooms and career paths, said Bob Sutor, vice president of IBM Quantum Ecosystem Development. Similarly, major cloud providers are focusing primarily on education at this early stage.

"The cloud services today are aimed at preparing the industry for the soon-to-arrive day when quantum computers will begin being useful," said Itamar Sivan, co-founder and CEO of Quantum Machines, an orchestration platform for quantum computing.

There's still much to iron out regarding quantum computing and the cloud, but the two technologies appear to be a logical fit, for now. 

The IBM Q System One, introduced in January 2019, was the first quantum computing system designed for scientific and commercial use.

How quantum computing fits into the cloud model

Cloud-based quantum computing is more difficult to pull off than AI, so the ramp-up will be slower and the learning curve steeper, said Martin Reynolds, distinguished vice president of research at Gartner. For starters, quantum computers require highly specialized room conditions that are dramatically different from how cloud providers build and operate their existing data centers.

Reynolds believes practical quantum computers are at least a decade away. The biggest drawback lies in aligning the quantum state of qubits in the computer with a given problem, especially since quantum computers still haven't been proven to solve problems better than traditional computers.

Coders must also learn new math and logic skills to use quantum computing, since traditional digital programming techniques don't apply. IT teams need to develop specialized skills to understand how to apply quantum computing in the cloud, and to fine-tune both the algorithms and the hardware to make this technology work.

Current limitations aside, the cloud is an ideal way to consume quantum computing, because quantum computing has low I/O but deep computation, Reynolds said. Because cloud vendors have the technological resources and a large pool of users, they will inevitably be some of the first quantum-as-a-service providers and will look for ways to provide the best software development and deployment stacks.

Quantum computing could even supplement the general compute and AI services cloud providers currently offer, said Tony Uttley, president of Honeywell Quantum Solutions. In that scenario, quantum processors would integrate with classical cloud resources in a co-processing environment.

Simulate and access quantum with cloud computing

The cloud plays two key roles in quantum computing today, according to Hyoun Park, CEO and principal analyst at Amalgam Insights. The first is to provide an application development and test environment for developers to simulate the use of quantum computers through standard computing resources.

The second is to offer access to the few quantum computers currently available, much as mainframe leasing was common a generation ago. This improves the financial viability of quantum computing, since multiple users sharing a machine increase its utilization.

It takes significant computing power to simulate quantum algorithm behavior from a development and testing perspective. For the most part, cloud vendors want to provide an environment to develop quantum algorithms before loading these quantum applications onto dedicated hardware from other providers, which can be quite expensive.

However, classical simulations of quantum algorithms that use large numbers of qubits are not practical. "The issue is that the size of the classical computer needed will grow exponentially with the number of qubits in the machine," said Doug Finke, publisher of the Quantum Computing Report. So, a classical simulation of a 50-qubit quantum computer would require a classical computer with roughly 1 petabyte of memory. This requirement will double with every additional qubit. 
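Finke's scaling point can be sketched in a few lines. The 16 bytes per amplitude below is an assumption (double-precision complex values); real simulators vary in precision and overhead, which is why published estimates for a 50-qubit statevector differ by a few petabytes:

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    # A full statevector simulation stores 2**n complex amplitudes;
    # 16 bytes assumes double-precision complex values.
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 40, 50, 51):
    print(n, statevector_bytes(n) / 2**50, "PiB")
# Memory doubles with every added qubit: under this assumption,
# 50 qubits -> 16.0 PiB and 51 qubits -> 32.0 PiB.
```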


But classical simulations of problems that use a smaller number of qubits are useful, both as a tool to teach quantum algorithms to students and as a way for quantum software engineers to test and debug algorithms with "toy models" of their problem, Finke said. Once they debug their software, they should be able to scale it up to solve larger problems on a real quantum computer.
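A "toy model" of that kind can be only a few lines of plain Python. The sketch below (the helper names are illustrative, not from any vendor SDK) simulates the two-qubit circuit that prepares a Bell state, roughly the smallest program a quantum developer might debug classically before scaling up:

```python
import math

# Statevector of 2 qubits: 4 amplitudes indexed by the bit string |q1 q0>.
def apply_h(state, target):
    # Hadamard on one qubit: mixes each amplitude pair that differs
    # only in the target bit.
    s = 1 / math.sqrt(2)
    new = state[:]
    for i in range(len(state)):
        if not (i >> target) & 1:
            j = i | (1 << target)
            new[i] = s * (state[i] + state[j])
            new[j] = s * (state[i] - state[j])
    return new

def apply_cnot(state, control, target):
    # CNOT: swap the amplitude pair wherever the control bit is 1.
    new = state[:]
    for i in range(len(state)):
        if (i >> control) & 1 and not (i >> target) & 1:
            j = i | (1 << target)
            new[i], new[j] = state[j], state[i]
    return new

state = [1.0, 0.0, 0.0, 0.0]       # start in |00>
state = apply_h(state, 0)          # superposition on qubit 0
state = apply_cnot(state, 0, 1)    # entangle -> (|00> + |11>) / sqrt(2)
print(state)
```

The amplitudes land on |00> and |11> with equal weight, which a developer can verify instantly here but would need repeated shots to estimate on real hardware.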

In terms of putting quantum computing to use, organizations can currently use it to support last-mile optimization, encryption and other computationally challenging issues, Park said. This technology could also aid teams across logistics, cybersecurity, predictive equipment maintenance, weather predictions and more. Researchers can explore multiple combinations of variables in these kinds of problems simultaneously, whereas a traditional computer needs to compute each combination separately.

However, there are some drawbacks to quantum computing in the cloud. Developers should proceed cautiously when experimenting with applications that involve sensitive data, Finke said; to address this, many organizations prefer to install quantum hardware in their own facilities despite the operational hassles.

Also, a machine may not be immediately available when a quantum developer wants to submit a job through quantum services on the public cloud. "The machines will have job queues and sometimes there may be several jobs ahead of you when you want to run your own job," Finke said. Some vendors have implemented a reservation capability, so a user can book a quantum computer for a set time period to eliminate this problem.

Quantum cloud services to know

IBM was first to market with its Quantum Experience offering, which launched in 2016 and now has over 15 quantum computers connected to the cloud. Over 210,000 registered users have executed more than 70 billion circuits through the IBM Cloud and published over 200 papers based on the system, according to IBM.

IBM also started the Qiskit open source quantum software development platform and has been building an open community around it. According to GitHub statistics, it is currently the leading quantum development environment.

In late 2019, AWS and Microsoft introduced quantum cloud services offered through partners. 

Microsoft Quantum provides a quantum algorithm development environment, and from there users can transfer quantum algorithms to Honeywell, IonQ or Quantum Circuits Inc. hardware. Microsoft's Q# scripting offers a familiar Visual Studio experience for quantum problems, said Michael Morris, CEO of Topcoder, an on-demand digital talent platform.

Currently, this transfer involves the cloud providers installing a high-speed communication link from their data center to the quantum computer facilities, Finke said. This approach has many advantages from a logistics standpoint, because it makes things like maintenance, spare parts, calibration and physical infrastructure a lot easier. 

Amazon Braket similarly provides a quantum development environment and, when generally available, will provide time-based pricing to access D-Wave, IonQ and Rigetti hardware. Amazon says it will add more hardware partners as well. Braket offers a variety of different hardware architecture options through a common high-level programming interface, so users can test out the machines from the various partners and determine which one would work best with their application, Finke said.

Google has done considerable core research on quantum computing and is expected to launch its own quantum cloud service later this year. Google has been more focused on developing its in-house quantum computing capabilities and hardware than on providing access to these tools to its cloud users, Park said. In the meantime, developers can test quantum algorithms locally using Cirq, Google's Python framework for writing quantum programs.

In addition to the larger offerings from the major cloud providers, there are several alternative approaches to implementing quantum computers that are being provided through the cloud.

D-Wave is the furthest along, with a quantum annealer well suited to many optimization problems. Other alternatives include QuTech, which is working on a cloud offering of its small quantum machine built on spin qubit technology, and Xanadu, which is developing a quantum machine based on photonics.

Still testing the quantum filaments

Researchers are pursuing a variety of approaches to quantum computing -- using electrons, ions or photons -- and it's not yet clear which approaches will pan out for practical applications first.

"Nobody knows which approach is best, or which materials are best. We're at the Edison light bulb filament stage, where Edison reportedly tested thousands of ways to make a carbon filament until he got to one that lasted 1,500 hours," Reynolds said. In the meantime, recent cloud offerings promise to enable developers to start experimenting with these different approaches to get a taste of what's to come.
