A quantum computer taps directly into the fundamental fabric of reality – the strange and counterintuitive world of quantum mechanics – to speed computation.

**Quantum computing** studies theoretical computation systems (**quantum computers**) that make direct use of quantum-mechanical phenomena, such as superposition and entanglement, to perform operations on data. Quantum computers are different from binary digital electronic computers based on transistors. Whereas common digital computing requires that the data be encoded into binary digits (bits), each of which is always in one of two definite states (0 or 1), quantum computation uses quantum bits (qubits), which can be in superpositions of states. A quantum Turing machine is a theoretical model of such a computer, and is also known as the universal quantum computer. Quantum computers share theoretical similarities with non-deterministic and probabilistic computers. The field of quantum computing was initiated by the work of Paul Benioff and Yuri Manin in 1980, Richard Feynman in 1982, and David Deutsch in 1985. A quantum computer with spins as quantum bits was also formulated for use as a quantum space–time in 1968.

As of 2016, the development of actual quantum computers is still in its infancy, but experiments have been carried out in which quantum computational operations were executed on a very small number of quantum bits. Both practical and theoretical research continues, and many national governments and military agencies are funding quantum computing research in an effort to develop quantum computers for civilian, business, trade, environmental and national security purposes, such as cryptanalysis.

Large-scale quantum computers would theoretically be able to solve certain problems much more quickly than any classical computers that use even the best currently known algorithms, like integer factorization using Shor’s algorithm or the simulation of quantum many-body systems. There exist quantum algorithms, such as Simon’s algorithm, that run faster than any possible probabilistic classical algorithm. Given sufficient computational resources, a classical computer could in theory simulate any quantum algorithm, as quantum computation does not violate the Church–Turing thesis. On the other hand, quantum computers may be able to efficiently solve problems which are not *practically* feasible on classical computers.
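To make the factoring claim concrete, here is the classical baseline: a minimal trial-division factorizer. Its running time grows roughly with the square root of the input, which is exponential in the bit-length of the number; this is the cost that Shor's algorithm circumvents. (The code is an illustrative sketch, not from the source.)

```python
# Classical trial division: the naive baseline that Shor's algorithm
# beats. Runtime grows roughly with sqrt(N), i.e. exponentially in the
# bit-length of N, which is why factoring large numbers is infeasible
# classically.
def trial_division(n: int) -> list[int]:
    """Return the prime factors of n in ascending order."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # whatever remains is prime
    return factors

print(trial_division(15))     # [3, 5]
print(trial_division(65537))  # 65537 is prime: [65537]
```

A 2048-bit RSA modulus has on the order of 10^308 candidate divisors to test, which is why this approach, and every known classical improvement on it, fails at cryptographic scales.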

**How Quantum Computers Work**

The massive amount of processing power generated by computer manufacturers has not yet been able to quench our thirst for speed and computing capacity. In 1947, American computer engineer **Howard Aiken** said that just six electronic digital computers would satisfy the computing needs of the United States. Others have made similar errant predictions about the amount of computing power that would support our growing technological needs. Of course, Aiken didn’t count on the large amounts of data generated by scientific research, the proliferation of personal computers or the emergence of the Internet, which have only fueled our need for more, more and more computing power.

Will we ever have the amount of computing power we need or want? If, as **Moore’s Law** states, the number of transistors on a microprocessor continues to double every 18 months, the year 2020 or 2030 will find the circuits on a microprocessor measured on an atomic scale. And the logical next step will be to create **quantum computers**, which will harness the power of atoms and molecules to perform memory and processing tasks. Quantum computers have the potential to perform certain calculations significantly faster than any silicon-based computer.
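The doubling arithmetic behind Moore's Law is easy to sketch. The baseline figure below (42 million transistors, roughly a year-2000 desktop processor) is illustrative, not from the source:

```python
# Back-of-the-envelope Moore's-law extrapolation: transistor count
# doubling every 18 months (1.5 years) from an illustrative
# year-2000 baseline of 42 million transistors.
def transistors(year: int, base_year: int = 2000,
                base_count: int = 42_000_000) -> float:
    """Projected transistor count if doubling every 18 months."""
    doublings = (year - base_year) / 1.5
    return base_count * 2 ** doublings

for y in (2000, 2010, 2020, 2030):
    print(y, f"{transistors(y):.3g}")
```

Twenty doublings per thirty years multiply the count by about a million, which is how the extrapolation reaches atomic-scale circuit features in the 2020-2030 window.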

Scientists have already built basic quantum computers that can perform certain calculations; but a practical quantum computer is still years away. In this article, you’ll learn what a quantum computer is and just what it’ll be used for in the next era of computing.

You don’t have to go back too far to find the origins of quantum computing. While computers have been around for the majority of the 20th century, quantum computing was first theorized less than 30 years ago, by a physicist at the Argonne National Laboratory. **Paul Benioff** is credited with first applying quantum theory to computers in 1981. Benioff theorized about creating a quantum Turing machine. Most digital computers, like the one you are using to read this article, are based on the **Turing machine** model.

**Basis**

A classical computer has a memory made up of bits, where each bit is represented by either a one or a zero. A quantum computer maintains a sequence of qubits. A single qubit can represent a one, a zero, or any quantum superposition of those two qubit states; a pair of qubits can be in any quantum superposition of 4 states, and three qubits in any superposition of 8 states. In general, a quantum computer with *n* qubits can be in an arbitrary superposition of up to 2^*n* different states simultaneously (this compares to a normal computer that can only be in *one* of these states at any one time). A quantum computer operates by setting the qubits in a controlled initial state that represents the problem at hand and by manipulating those qubits with a fixed sequence of quantum logic gates. The sequence of gates to be applied is called a quantum algorithm. The calculation ends with a measurement, collapsing the system of qubits into one of the 2^*n* pure states, where each qubit is zero or one. The outcome can therefore be at most *n* classical bits of information. Quantum algorithms are often non-deterministic, in that they provide the correct solution only with a certain known probability.
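The state-vector picture above can be sketched directly on a classical machine (for small *n* only, since the vector has 2^*n* entries). This minimal simulation, an illustration rather than any production simulator, prepares an equal superposition of all 8 states of 3 qubits with Hadamard gates, then "measures" by sampling from the squared amplitudes:

```python
import numpy as np

# n qubits need a vector of 2**n complex amplitudes. A Hadamard on
# every qubit turns |000> into an equal superposition of all 2**n
# basis states; measurement collapses it to one classical n-bit string.
n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                 # start in |000>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
for q in range(n):                             # apply H to each qubit
    op = np.array([[1.0]])
    for i in range(n):
        op = np.kron(op, H if i == q else np.eye(2))
    state = op @ state

probs = np.abs(state) ** 2                     # Born rule: |amplitude|^2
outcome = np.random.choice(2 ** n, p=probs)    # the "measurement"
print(probs)                                   # each state: p = 1/8
print(format(int(outcome), f"0{n}b"))          # one random 3-bit string
```

Note the exponential blow-up the text describes: simulating 40 qubits this way would already need a vector of about 10^12 complex amplitudes.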

An example of an implementation of qubits of a quantum computer could start with the use of particles with two spin states: “down” and “up”. But in fact any system possessing an observable quantity *A*, which is *conserved* under time evolution such that *A* has at least two discrete and sufficiently spaced consecutive eigenvalues, is a suitable candidate for implementing a qubit. This is true because any such system can be mapped onto an effective spin-1/2 system.

**How D-Wave Systems Work**

Quantum computing uses an entirely different approach than classical computing. A useful analogy is to think of a landscape with mountains and valleys.

Solving optimization problems can be thought of as trying to find the lowest point on this landscape. Every possible solution is mapped to coordinates on the landscape, and the altitude of the landscape is the “energy” or “cost” of the solution at that point. The aim is to find the lowest point on the map and read the coordinates, as this gives the lowest energy, or optimal solution to the problem.

Classical computers running classical algorithms can only “walk over this landscape”. Quantum computers can tunnel through the landscape, making it faster to find the lowest point. The D-Wave processor considers all the possibilities simultaneously to determine the lowest-energy configuration. The computer returns many very good answers in a short amount of time – 10,000 answers in one second. This gives the user not only the optimal solution or a single answer, but also other alternatives to choose from.

D-Wave systems use “quantum annealing” to solve problems. Quantum annealing “tunes” qubits from their superposition state down to a classical state, returning a set of candidate answers ranked so that the best solution can be identified.
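The classical cousin of this process, simulated annealing, makes the landscape analogy concrete. The toy cost function below is invented for illustration; where quantum annealing can tunnel through barriers between valleys, this classical version must occasionally climb over them, which is what the temperature parameter allows:

```python
import math
import random

def landscape(x: float) -> float:
    """A bumpy 1-D cost function: many local valleys, global minimum at x=0."""
    return x * x + 3.0 * math.sin(5.0 * x) ** 2

def anneal(steps: int = 20_000, temp: float = 5.0,
           cooling: float = 0.999) -> tuple[float, float]:
    """Classical simulated annealing: a random walk that slowly 'cools'."""
    x = random.uniform(-10, 10)
    best_x, best_e = x, landscape(x)
    for _ in range(steps):
        candidate = x + random.gauss(0, 0.5)
        delta = landscape(candidate) - landscape(x)
        # Always accept downhill moves; accept uphill moves with
        # probability e^(-delta/T), so barriers can be climbed early on.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = candidate
            if landscape(x) < best_e:
                best_x, best_e = x, landscape(x)
        temp *= cooling  # gradually reduce the temperature
    return best_x, best_e

x, e = anneal()
print(f"lowest point found: x = {x:.3f}, energy = {e:.3f}")
```

The cooling schedule is the whole trick: too fast and the walker freezes in a local valley, too slow and the search wastes time. Quantum annealing replaces the thermal hops with quantum tunneling through the barriers.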

**Programming D-Wave**

To program the system a user maps their problem into this search for the lowest point. A user interfaces with the quantum computer by connecting to it over a network, as you would with a traditional computer. The user’s problems are sent to a server interface, which turns the optimization program into machine code to be programmed onto the chip. The system then executes a “quantum machine instruction” and the results are returned to the user.
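Problems for this kind of machine are typically expressed as a QUBO (quadratic unconstrained binary optimization): minimize an energy that is a weighted sum of binary variables and their pairwise products. The brute-force solver below illustrates only the *form* of problem a user maps onto the chip; it is not D-Wave's actual interface, and the tiny example problem is invented:

```python
from itertools import product

# A QUBO is given as {(i, j): weight}; diagonal entries (i, i) are
# linear terms (x_i * x_i == x_i for bits). The energy of a bit
# assignment is sum of weight * x_i * x_j over all entries.
def solve_qubo(Q: dict) -> tuple:
    """Exhaustively minimize a small QUBO; returns (bits, energy)."""
    n = 1 + max(k for pair in Q for k in pair)
    best = None
    for bits in product((0, 1), repeat=n):
        energy = sum(w * bits[i] * bits[j] for (i, j), w in Q.items())
        if best is None or energy < best[1]:
            best = (bits, energy)
    return best

# Tiny example: reward setting x0 and x1, but penalize setting both.
Q = {(0, 0): -1, (1, 1): -1, (0, 1): 2}
print(solve_qubo(Q))  # ((0, 1), -1): pick exactly one bit
```

Brute force takes 2^*n* evaluations, which is exactly why one would hand the search to specialized hardware; an annealer samples low-energy assignments of the same objective instead of enumerating them.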

D-Wave systems are designed to be used in conjunction with classical computers, as a “quantum co-processor”.

**Google’s quantum computer is 100 million times faster than your laptop**

**But is it a true quantum computer?**

Google has been sharing more details about its quantum computing project, which it runs in partnership with NASA. The tech company says its mammoth D-Wave 2X quantum computing machine has been running certain algorithms at 100,000,000 times the speed that a traditional computer chip can, and that could make a huge difference in the processing power at our disposal in the future.

In quantum computing (closely linked to quantum mechanics), quantum bits (or ‘qubits’) can simultaneously hold values of 1, 0, or both, rather than being set to 1 or 0 as traditional electronic bits are. The qubits are tiny physical systems held at temperatures just above absolute zero, and as more qubits are added, the available processing power goes up exponentially. Big data problems, such as weather forecasts or chemical analysis, could be dealt with much faster through the power of quantum computing.

We’re not quite there yet – to the best of our knowledge, anyway – but Google’s announcement marks another step forward in making quantum computing a reality that can have a substantial impact on technology and science research. As Jordan Novet at VentureBeat reports, it’s also a justification of the work D-Wave has been doing with its machines, which have also been sold to companies including Lockheed Martin and the Los Alamos National Laboratory.

Trying to get an understanding of the new developments without a physics degree and a mastery of scientific jargon isn’t easy, but essentially, researchers ran simulated tests of standard computer chips and compared them against the capabilities of the D-Wave 2X when doing the same sums – that’s where the 100 million speed increase was noted.

That’s an impressive figure, but we’re not quite in the age of quantum computing just yet. “While these results are intriguing and very encouraging, there is more work ahead to turn quantum enhanced optimization into a practical technology,” writes Google’s Hartmut Neven on the company’s blog. The team has also published a paper on its findings.

Quantum computing theory is all well and good, but we’re still waiting for confirmation that these ideas can actually work in practice. As Gizmodo notes, not everyone is convinced that the D-Wave 2X is a true quantum computer; there’s also some discussion around the efficiency of the algorithms used in Google’s simulation of the regular computer chip, which could possibly have been optimized further.

While scientists debate the various merits of this particular super machine, the likes of IBM and Microsoft are joining Google in the race to crack the quantum computing code. According to Microsoft, we could see a working quantum computer within the next 10 years – although D-Wave still maintains the necessary technology is already here.

**World’s most powerful quantum computer now online at USC**

Following a recent upgrade, the USC-Lockheed Martin Quantum Computing Center (QCC) based at the USC Information Sciences Institute (ISI) is now the leader in quantum processing capacity.

With the upgrade — to 1,098 qubits from 512 — the D-Wave 2X™ processor is enabling QCC researchers to continue their efforts to close the gap between academic research in quantum computation and real-world critical problems.

The new processor will be used to study how and whether quantum effects can speed up the solution of tough optimization, machine learning and sampling problems. Machine-learning algorithms are widely used in artificial intelligence tasks.


“Our QCC researchers have been studying and elucidating the capabilities of the D-Wave quantum annealing processors since 2011, starting with the first commercially released model, and we are fortunate to be able to continue this cutting-edge work with this new, third-generation model,” said Daniel Lidar, the scientific director of the USC-Lockheed Martin Quantum Computing Center.

The QCC is hosted at the USC Information Sciences Institute of the USC Viterbi School of Engineering. The QCC hosts one of two D-Wave systems that currently operate outside of D-Wave’s headquarters; the other, owned by Google, is hosted at NASA’s Ames Research Center. A third is being installed at Los Alamos National Laboratory.

“Research advances and new insights from the quantum computing research group at ISI have, over the past few years, re-shaped the national conversation about quantum computing. This significant new upgrade provides an even more powerful platform for the group to push the boundaries of quantum computing,” said Prem Natarajan, the Michael Keston Executive Director of the USC Information Sciences Institute.

“Quantum computing has the potential to solve complex engineering problems that classical computers cannot efficiently solve,” said Greg Tallant, Lockheed Martin fellow and lead for the USC-Lockheed Martin QCC. “This emerging technology is expected to achieve things like rapidly debugging millions of lines of software code and solving hard computational problems in the aerospace industry. At its fullest capability, quantum computing could advance the way we solve critical issues and plan for future generations.”

The QCC researchers expect that with continued development, the D-Wave 2X system will enable breakthrough results in quantum optimization.