As a failed physicist (my first academic pursuits were in theoretical physics and applied geophysics) and as the son of a well-known theoretical physicist, I have always been fascinated by quantum computing. It brought together my initial scientific interests and my chosen career in computer science, cryptography and cybersecurity. Admittedly, I never fully understood quantum mechanics, but its intersection with computing and its potential practical applications intrigue me. If you are as interested in the field as I am, here’s a brief history of quantum computing, with a short description of each important event and plenty of relevant links.
The full history of the quantum computing field would have to include thousands of scientists and hundreds of events. A more comprehensive timeline is available in the Wikipedia article “Timeline of quantum computing and communication.” In this article I selected only the events I believe were the most influential or that I personally found most fascinating. Every time I mention a year, I refer to the first instance a particular theory or algorithm was proposed or an implementation demonstrated; each of these, of course, continued to be developed and improved upon by an army of researchers.
Early Theoretical Foundations and Algorithmic Breakthroughs
1920 to 1985 – The Conceptual Beginnings
Modern quantum computing is based on the principles of quantum mechanics, a revolutionary scientific paradigm that was originally proposed in the early 20th century. In the 1920s and 1930s, pioneers such as Niels Bohr, Werner Heisenberg, and Erwin Schrödinger introduced concepts that challenged classical notions of physics, proposing that at subatomic levels, particles like electrons behave in ways that can only be accurately described by a probabilistic model, not deterministic laws. These ideas were the first steps toward understanding the quantum world. Key theories at that time include:
- Heisenberg’s Uncertainty Principle (1927): Heisenberg introduced the uncertainty principle, which asserts that the position and momentum of a particle cannot both be measured exactly at the same time.
- Schrödinger’s Wave Equation (1926): Schrödinger developed a wave equation that describes how the quantum state of a physical system changes over time; both relations are written out below.
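In modern notation, these two results are usually written as follows (standard textbook forms, included here only for reference):

```latex
% Heisenberg's uncertainty relation between position x and momentum p
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}

% Time-dependent Schrödinger equation for the wave function \psi of a system
% with Hamiltonian (energy operator) \hat{H}
i\hbar\,\frac{\partial}{\partial t}\,\psi(x,t) \;=\; \hat{H}\,\psi(x,t)
```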
These foundational theories are well-documented in sources like the Stanford Encyclopedia of Philosophy’s entries on Quantum Mechanics and historical accounts such as the book Quantum: Einstein, Bohr, and the Great Debate about the Nature of Reality by Manjit Kumar.
While the initial developments in quantum mechanics were focused on understanding atomic and subatomic particles, it wasn’t until the mid-20th century that scientists began to consider the computational implications of these quantum phenomena. Notably, in the 1950s and 1960s, researchers like Hugh Everett and David Bohm expanded the theoretical framework, which hinted at more complex computations underlying quantum theory.
1957 – The Many Worlds of Hugh Everett
Hugh Everett III made a significant contribution to the theoretical foundation of quantum mechanics with his “Many-Worlds Interpretation” (MWI) of quantum physics, first proposed in his 1957 doctoral thesis at Princeton University. While Everett’s work did not directly address quantum computing and is not typically mentioned in histories of quantum computing, I believe that his contribution to quantum computing is indirect but foundational as his theories have influenced the philosophical landscape in which quantum computing developed.
Everett’s MWI challenges the traditional Copenhagen interpretation of quantum mechanics, which states that a quantum system remains in superposition until it is observed, at which point it collapses to a single outcome. In contrast, Everett proposed that all possible outcomes of quantum measurements are physically realized, each in a different, newly created branch of the universe. This interpretation removes the need for wave function collapse, which Everett saw as an unnecessary complication.
Everett’s theory provides a crucial conceptual underpinning for quantum computing. The idea that multiple states can coexist and that the universe branches into multiple outcomes can be seen as a parallel to the way quantum computers perform calculations. In a quantum computer, qubits exist simultaneously in multiple states due to superposition, and quantum algorithms take advantage of entanglement and interference to perform complex calculations more efficiently than classical computers. Everett’s view of simultaneous realities mirrors the parallel processing capabilities of quantum computers. In theory, a quantum computer could process a vast number of possibilities simultaneously due to the superposition of qubits, akin to how every possible outcome exists simultaneously in Everett’s many-worlds.
Everett’s interpretation sparked considerable debate and discussion within the physics community regarding the fundamental understanding of reality. This philosophical inquiry has enriched the discussions around quantum technologies, pushing scientists and theorists to ponder the deeper implications of quantum mechanics and computing. The acceptance and interest in Everett’s interpretation have waxed and waned over the years, but it has inspired further theoretical explorations in quantum mechanics and quantum computing, influencing figures like David Deutsch, a pioneer in quantum computing, who supports the many-worlds interpretation.
1968 – Conjugate Coding and Stephen Wiesner
Stephen Wiesner made a key contribution to quantum information theory in the late 1960s and early 1970s with his invention of conjugate coding, which he detailed in his paper “Conjugate Coding.” The concept circulated informally among researchers in the early 1970s, but the paper was only published in 1983. Wiesner’s ideas laid important groundwork for the development of quantum cryptography and various protocols in quantum information processing, including quantum computing.
Wiesner introduced the idea of using two-state quantum systems (like qubits) for storing information. He proposed that each qubit could be encoded in one of two conjugate bases, which are related by a quantum Fourier transform and are mutually unbiased. This means that measuring the state of a qubit in one basis gives no information about what its state would be if measured in the other basis. The use of conjugate bases exploits the principle of quantum uncertainty, ensuring that information encoded in one basis is secure if an eavesdropper attempts to measure it in the other basis. This fundamental property of quantum mechanics is critical for ensuring the security of quantum cryptographic methods.
Wiesner’s ideas directly influenced the development of Quantum Key Distribution, particularly the BB84 protocol devised by Charles Bennett and Gilles Brassard in 1984. BB84 uses two sets of conjugate bases to encode information, ensuring secure communication by detecting any presence of an eavesdropper. This protocol and its variants are among the most well-known applications of quantum cryptography.
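To make the conjugate-bases idea concrete, here is a minimal Python sketch of a BB84-style exchange, assuming idealized noise-free qubits and no eavesdropper; the function and variable names are mine, not from the protocol papers. The single line that matters is the one modeling measurement: a bit read out in its preparation basis is recovered exactly, while a bit read out in the conjugate basis comes out random.

```python
import random

def bb84_sketch(n=16, seed=1):
    """Toy illustration of conjugate (mutually unbiased) bases in BB84."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.choice("+x") for _ in range(n)]   # '+' rectilinear, 'x' diagonal
    bob_bases   = [rng.choice("+x") for _ in range(n)]

    bob_results = [
        bit if a_basis == b_basis else rng.randint(0, 1)  # wrong basis -> random outcome
        for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases)
    ]

    # Sifting: keep only positions where Alice's and Bob's bases agree.
    key = [b for b, a_basis, b_basis in zip(bob_results, alice_bases, bob_bases)
           if a_basis == b_basis]
    return key

print(bb84_sketch())
```

An eavesdropper who guesses the wrong basis introduces exactly this kind of randomness into the sifted key, which is what BB84 detects by publicly comparing a sample of bits.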
Wiesner’s original application for conjugate coding was quantum money that cannot be counterfeited, thanks to the no-cloning theorem of quantum mechanics. This concept, while not commercially adopted, opened the door to thinking about quantum states as a means to secure unforgeable digital tokens.
Wiesner’s work is considered pioneering because it proposed using quantum states for information processing tasks long before the general feasibility of quantum computing was understood. His theoretical frameworks helped other scientists to explore further the application of quantum mechanics in computational and cryptographic contexts. While the practical impact of conjugate coding was initially limited due to the technological constraints of the time, its theoretical implications have been profound and enduring. As quantum technologies have advanced, the principles laid out by Wiesner have found new applications and continue to influence ongoing research in quantum computing and quantum communication.
1970 – James Park and the No-Cloning Theorem
The no-cloning theorem is a fundamental result in quantum mechanics and quantum information theory, with significant implications for quantum computing. Although the theorem is widely associated with the work published in 1982 by William Wootters and Wojciech Zurek in their paper “A single quantum cannot be cloned,” and independently by Dennis Dieks in his paper “Communication by EPR devices,” the foundational idea was first introduced by James Park in 1970.
James Park, in his 1970 paper titled “The Concept of Transition in Quantum Mechanics,” introduced the idea that quantum information encoded in an unknown state cannot be perfectly copied or cloned. Park is rarely mentioned in the history of quantum computing, the no-cloning theorem most often being attributed to Wootters, Zurek, and Dieks, but I believe that is wrong. Park’s paper contained an explicit mathematical proof of the impossibility of cloning an unknown quantum state; however, he did not view no-cloning as a major aspect of his paper, focusing instead on proving that nondisturbing quantum measurements are possible, with no-cloning emerging as a byproduct of that argument.
The concept introduced by Park was later formalized and became widely recognized through the works of Wootters, Zurek, and Dieks in the early 1980s. Their formal theorem states that it is impossible to create an identical copy of an arbitrary unknown quantum state. This arises from the linearity of quantum mechanics and the preservation of quantum information. The key points, with a short proof sketch after the list, are:
- Quantum operations are unitary, meaning they preserve the length of the state vectors in the system. This unitarity, coupled with the superposition principle, implies that a general quantum state cannot be exactly copied without affecting the original state.
- The no-cloning theorem also highlights the impossibility of measuring quantum states without disturbing them, reinforcing the idea that quantum information is fundamentally different from classical information.
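For readers who want to see why linearity alone rules out cloning, here is the standard textbook proof sketch (my summary, not taken from Park’s or Wootters and Zurek’s papers):

```latex
% Assume a unitary U that clones any state |\psi> onto a blank register |0>:
U\,|\psi\rangle|0\rangle = |\psi\rangle|\psi\rangle \quad\text{for all } |\psi\rangle.

% Applying it to the basis states gives
U\,|0\rangle|0\rangle = |0\rangle|0\rangle, \qquad U\,|1\rangle|0\rangle = |1\rangle|1\rangle.

% Linearity then forces
U\,\tfrac{1}{\sqrt{2}}\bigl(|0\rangle+|1\rangle\bigr)|0\rangle
  = \tfrac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr),

% but perfect cloning of the superposition would instead require
\tfrac{1}{2}\bigl(|0\rangle+|1\rangle\bigr)\bigl(|0\rangle+|1\rangle\bigr)
  = \tfrac{1}{2}\bigl(|00\rangle+|01\rangle+|10\rangle+|11\rangle\bigr).

% The two expressions differ, so no such U can exist.
```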
The no-cloning theorem provides a theoretical foundation for the security of quantum key distribution (QKD) protocols such as BB84. In QKD, the no-cloning theorem ensures that any attempt by an eavesdropper to intercept and measure the quantum keys will inevitably introduce detectable errors in the system, alerting the legitimate users to the presence of an eavesdropper.
While the no-cloning theorem prohibits duplicating quantum states, it has spurred the development of techniques like quantum teleportation, which allows a quantum state to be transferred from one particle to another over arbitrary distances, without violating the no-cloning principle.
The implications of the no-cloning theorem extend to quantum error correction and fault tolerance strategies. Quantum error correction codes are designed to protect quantum information against errors without needing to clone the information, thus adhering to the constraints imposed by the no-cloning theorem.
1980 – Paul Benioff and Quantum Mechanical Models of Computers
Paul Benioff’s contributions to quantum computing are vital. By demonstrating that quantum mechanics could be applied to computing processes, he set the stage for the development of quantum algorithms and technologies that researchers continue to explore and refine today.
In 1980 and 1982, Paul Benioff, a physicist at Argonne National Laboratory, published papers that described the first model of a quantum mechanical Turing machine. The papers are “The computer as a physical system: A microscopic quantum mechanical Hamiltonian model of computers as represented by Turing machines” and “Quantum mechanical Hamiltonian models of Turing machines.” His model proposed that a quantum system could perform calculations as a Turing machine does, but with the operations and states being governed by the laws of quantum mechanics. His work demonstrated that it was theoretically possible to build a computing machine where quantum mechanics dictated the processing of information. This idea was groundbreaking because it extended the concept of quantum mechanics beyond the atomic and subatomic scale to include macroscopic processes like computation.
Benioff’s theory was among the first to apply quantum mechanics explicitly to the field of information processing. He showed that quantum systems could theoretically perform classical computing tasks, but they could also utilize quantum effects such as superposition and entanglement to perform operations that no classical computer could achieve. His models anticipated the idea of quantum operations acting on two-state quantum systems, a precursor to the quantum gates, circuits, and qubits that later became cornerstones of quantum computing, influencing the development of quantum circuit design and the implementation of quantum algorithms.
1981 – Richard Feynman
The conceptual leap from quantum mechanical theories to quantum computing was largely inspired by Richard Feynman. In his seminal 1981 talk at the First Conference on the Physics of Computation at MIT, Feynman critically addressed the inefficiencies of simulating quantum phenomena on classical computers. Building on Paul Benioff’s work, he proposed that a new type of computer, based on the principles of quantum mechanics rather than classical physics, could potentially resolve these inefficiencies.
Feynman suggested that unlike classical computers, which use bits as the smallest unit of data, quantum computers could use quantum bits, or qubits, which can exist in multiple states simultaneously due to superposition. This would allow them to perform many calculations at once, dramatically increasing computational power for specific types of problems. A transcript of Feynman’s influential lecture can be found in the proceedings of the conference, detailed in the book Feynman and Computation.
Following Feynman’s proposal, the idea of quantum computing began to capture the imagination of both the public and the academic community. Throughout the 1980s and into the 1990s, more researchers started to explore the practical aspects of Feynman’s ideas, leading to the development of quantum algorithms and the theoretical underpinning of quantum error correction.
1985 – David Deutsch and the Universal Quantum Computer
David Deutsch’s formulation of the universal quantum computer in 1985 is seen as a watershed moment in the history of quantum computing, transforming it from a theoretical concept into a potential technological reality. Before Deutsch’s intervention, quantum computing was largely a speculative field influenced by the theoretical implications of quantum mechanics for computation. Feynman had already suggested that quantum systems could compute things differently than classical systems, but it was Deutsch who fleshed out the architecture for a universal quantum computer.
In his groundbreaking 1985 paper, “Quantum theory, the Church-Turing principle and the universal quantum computer,” Deutsch proposed a quantum computational model where quantum logic gates operate on a superposition of states, allowing the computer to perform many calculations simultaneously. This model laid the groundwork for all future quantum computing research and introduced the concept of quantum parallelism. His work demonstrated that a quantum computer could, in theory, solve problems beyond the reach of classical computers by exploiting this parallelism. He proposed that such a computer could perform any computation a classical computer could execute, and that for certain types of problems it could be dramatically more efficient, a prediction later borne out by quantum algorithms for factoring and searching.
Deutsch went further to describe how quantum gates could manipulate qubits in a superposition of states. This was analogous to how classical gates manipulate bits, but with the added complexity and power of quantum mechanics.
Shortly after his theoretical proposal, Deutsch developed the first quantum algorithm, now known as Deutsch’s Algorithm, which determines whether a one-bit function is constant or balanced using a single evaluation of the function. This was the first example of a quantum algorithm that could outperform its classical counterpart, even though the advantage was minimal.
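As an illustration, here is a small numpy simulation of the modern two-qubit formulation of Deutsch’s algorithm. It is a sketch following the usual textbook presentation, and the helper names are my own.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)

def oracle(f):
    """Build U_f |x>|y> = |x>|y XOR f(x)> as a 4x4 permutation matrix."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    """Decide whether f is constant or balanced with a single oracle call."""
    state = np.kron(np.array([1, 0]), np.array([0, 1]))  # start in |0>|1>
    state = np.kron(H, H) @ state                        # Hadamard both qubits
    state = oracle(f) @ state                            # one query to U_f
    state = np.kron(H, I) @ state                        # Hadamard the first qubit
    p1 = np.sum(np.abs(state[2:]) ** 2)                  # probability first qubit is |1>
    return "balanced" if p1 > 0.5 else "constant"

print(deutsch(lambda x: 0))      # constant  -> 'constant'
print(deutsch(lambda x: x))      # identity  -> 'balanced'
print(deutsch(lambda x: 1 - x))  # NOT       -> 'balanced'
```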
The implications of Deutsch’s work extended well beyond the confines of theoretical physics and computer science. His model not only validated the theoretical feasibility of quantum computing but also inspired a new generation of physicists and computer scientists to explore this new frontier. Following Deutsch’s model, researchers began to consider more seriously the hardware and practical aspects of constructing a quantum computer. This led to the exploration of various qubit types and the conceptualization of different quantum computing models, such as quantum annealing and adiabatic quantum computation.
Deutsch’s work also influenced academic curricula and fostered collaborations across disciplines, integrating concepts of quantum mechanics with computational theory. Institutions like MIT, Caltech, and the University of Oxford began intensive research programs into quantum computing, supported by emerging collaborations with government and industry.
1994 to 1996 – Quantum Algorithms Emerge
The development of algorithms specifically designed for quantum computers marked a significant turning point in demonstrating the practical capabilities and immense potential of quantum technology.
1994 – Peter Shor
In 1994, Peter Shor, a mathematician working at AT&T’s Bell Labs, introduced an algorithm that would become a cornerstone in the field of quantum computing. Shor’s Algorithm could factorize large integers exponentially faster than the best-known classical algorithms of the time. Shor’s algorithm demonstrated that quantum computers have the potential to break RSA encryption, the backbone of modern secure communications. RSA encryption relies heavily on the difficulty of factoring large numbers into their prime factors, a task for which Shor’s algorithm provided a quantum shortcut. This revelation stirred significant interest in quantum computing, particularly among governmental and financial institutions concerned with data security.
The original paper, “Algorithms for Quantum Computation: Discrete Logarithms and Factoring,” is available in case you want to dig into Shor’s method and its implications for cryptography.
The most striking implication of Shor’s Algorithm lies in its potential to undermine the security of RSA encryption, a widely used system underpinning the security of modern digital communications, including everything from online banking to secure email. RSA encryption’s security depends on the computational difficulty of factoring large composite numbers into their prime factors, a task presumed to be time-consuming and computationally expensive for conventional computers. Shor’s Algorithm changed the landscape by providing a quantum-based method to perform this task much more rapidly, potentially rendering traditional RSA encryption vulnerable.
The algorithm uses the principles of quantum superposition and entanglement to perform simultaneous operations on multiple numbers, drastically cutting down the time required for prime factorization. For instance, a number that might take a classical computer thousands of years to factor could be processed in a matter of hours on a sufficiently powerful quantum computer employing Shor’s Algorithm.
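To make the structure of the method concrete, here is a hedged Python sketch of the classical reduction of factoring to order finding that sits at the heart of Shor’s approach. The quantum part of the real algorithm is the order-finding step; below it is replaced by brute force, which of course removes the speedup but shows how a found order turns into factors. All names are mine.

```python
import math
import random

def find_order(a, N):
    """Brute-force the order r of a modulo N (the step Shor's algorithm does quantumly)."""
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

def shor_reduction(N, rng=random.Random(0)):
    """Factor N using the order-finding reduction, with a classical stand-in."""
    while True:
        a = rng.randrange(2, N)
        g = math.gcd(a, N)
        if g > 1:
            return g, N // g             # lucky guess already shares a factor with N
        r = find_order(a, N)
        if r % 2 == 1:
            continue                     # need an even order
        x = pow(a, r // 2, N)
        if x == N - 1:
            continue                     # trivial square root, try another a
        p, q = math.gcd(x - 1, N), math.gcd(x + 1, N)
        if 1 < p < N:
            return p, N // p
        if 1 < q < N:
            return q, N // q

print(shor_reduction(15))   # e.g. (3, 5)
print(shor_reduction(21))   # e.g. (3, 7)
```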
The revelation of such a powerful algorithm spurred significant interest in quantum computing, especially among entities that rely heavily on data security. Governments, financial institutions, and security firms, in particular, recognized the need to understand and potentially develop quantum computing technologies as a defense against the threat posed by quantum computers to existing cryptographic systems. This urgency has led to increased funding and research into quantum computing technologies, as well as a parallel push towards developing quantum-resistant cryptographic methods, often referred to as post-quantum cryptography.
Shor’s discovery also acted as a catalyst, accelerating the pace of research and investment in quantum computing. The algorithm provided concrete evidence that quantum computers could have practical and superior applications, thus transforming theoretical research into a more immediate technological pursuit. Universities, tech giants, and startups increased their investments in quantum computing, leading to rapid developments in quantum hardware and software.
The implications of Shor’s Algorithm extend beyond cryptography. The method also spurred broader interest in the development of other quantum algorithms that could similarly disrupt traditional approaches to solving complex mathematical, scientific, and engineering problems. Researchers have continued to explore quantum algorithms for applications in optimization, simulation, and artificial intelligence, leveraging the unique capabilities of quantum mechanics.
1996 – Lov Grover
In 1996, shortly after Peter Shor introduced his algorithm, Lov Grover of Bell Labs unveiled another seminal contribution to the field of quantum computing. Grover’s Algorithm, as it came to be known, described in his paper “A fast quantum mechanical algorithm for database search,” addressed a fundamental problem in computer science: searching unsorted databases. Grover’s method could search through unsorted data quadratically faster than any classical algorithm, presenting another compelling example of quantum computing’s potential.
Grover’s Algorithm fundamentally transforms the approach to searching within an unsorted database. Classical search algorithms must, in the worst case, check each element one by one until the target is found, resulting in a linear increase in search time as data size grows. Grover’s Algorithm, however, utilizes the principles of quantum superposition and entanglement to search through the database in a dramatically reduced number of steps.
By placing all possible database items into a quantum superposition, Grover’s Algorithm examines all items simultaneously. It uses a quantum operation known as the “oracle” which marks the item being searched for. Following this, a series of quantum operations called “amplitude amplification” is applied, which systematically increases the probability amplitude of the desired item’s quantum state, making it more likely to be observed upon measurement of the quantum state. This approach reduces the number of required operations from N (the total number of items in the database) to roughly sqrt(N), thus achieving a quadratic speedup.
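The following small numpy simulation, written as an illustration rather than taken from Grover’s paper, shows the oracle phase flip and the “inversion about the mean” (diffusion) step amplifying the marked item’s probability on an 8-item search space.

```python
import numpy as np

def grover_search(n_items=8, marked=5):
    """Simulate Grover's algorithm on n_items states, one of them marked."""
    # Uniform superposition over all items.
    state = np.ones(n_items) / np.sqrt(n_items)

    # Oracle: flip the sign (phase) of the marked item's amplitude.
    oracle = np.eye(n_items)
    oracle[marked, marked] = -1

    # Diffusion operator: inversion about the mean amplitude.
    diffusion = 2 * np.full((n_items, n_items), 1 / n_items) - np.eye(n_items)

    # Roughly (pi/4) * sqrt(N) iterations are optimal.
    iterations = int(round(np.pi / 4 * np.sqrt(n_items)))
    for _ in range(iterations):
        state = diffusion @ (oracle @ state)

    return state ** 2                     # measurement probabilities

probs = grover_search()
print(np.argmax(probs), probs.max())      # index 5 with probability ~0.95
```

After about (π/4)√N iterations the marked item is measured with probability close to one; running more iterations actually starts to lower it again, which is why the iteration count matters.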
The improvement offered by Grover’s Algorithm is quadratic, which, while less dramatic than Shor’s exponential speedup, still signifies a substantial enhancement over classical capabilities. The development of Grover’s Algorithm not only showcased the raw computational power inherent in quantum mechanics but also highlighted the practical applicability of quantum computing to more routine computing tasks, such as database searches. This demonstrated that quantum computing could have broad and meaningful applications across various sectors.
Grover’s Algorithm has implications that extend beyond mere database searching. It provides a framework for solving a variety of other problems more efficiently than classical computers can. For instance, it can be adapted for use in solving optimization problems, pattern recognition, and even machine learning tasks, where finding a solution often boils down to searching through a large set of possibilities.
The algorithm also plays a critical role in cryptanalysis, particularly in scenarios where a brute-force search can be employed. While Grover’s Algorithm does not break cryptographic systems as dramatically as Shor’s, it effectively halves the bit strength of symmetric cryptographic systems, meaning that an encryption method considered secure with a 256-bit key against classical attacks would need a 512-bit key to be secure against quantum attacks using Grover’s Algorithm.
The introduction of Shor’s and Grover’s algorithms was crucial for illustrating that quantum computing could transcend theoretical research and address practical problems with significant efficiency gains. By showing that specific computational tasks could be performed much more efficiently by quantum computers, these algorithms not only challenged existing paradigms in computational mathematics and data security but also firmly established quantum computing as a field of huge practical consequence.
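The “halving” is simply the square root at work, under the assumption that the attack is a plain brute-force key search:

```latex
% Brute-forcing a k-bit symmetric key classically takes on the order of 2^k trials;
% Grover's algorithm needs only about the square root of that:
\sqrt{2^{k}} = 2^{k/2}
% e.g. a 256-bit key offers roughly 2^{128} of effective work against a quantum attacker,
% which is why key sizes are doubled to restore the original security margin.
```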
The Evolution of Quantum Computing
1995 and 1996 – Quantum Error Correction Emerges
Another important step in the history of quantum computing was the development of quantum error correction codes in the mid-1990s. It represented a crucial leap forward for quantum computing, addressing the significant challenge of error susceptibility in quantum systems.
Quantum computers, unlike their classical counterparts, operate on qubits that are highly sensitive to external disturbances. This sensitivity leads to errors from decoherence and other quantum noise, which can severely impact the reliability of quantum computations. Some of the most common sources of errors in quantum computing include:
- Phase Flip Errors (Dephasing): A qubit in a superposition of states can lose its phase coherence due to environmental disturbances. This loss, known as a phase flip or dephasing error, impacts the qubit’s ability to maintain the phase information of its quantum state. In current hardware, coherence times typically range from microseconds to seconds, depending on the qubit technology.
- Bit Flip Errors (Depolarization): External influences such as thermal vibrations, electromagnetic interference, and cosmic rays can cause a qubit to switch between the states inadvertently. Known as bit flip errors, these depolarizing effects can lead to significant computational inaccuracies.
- Gate Operation Errors: Quantum gates manipulate qubits to perform computations. Despite their critical role, these gates occasionally function imperfectly and introduce errors during their operation. This is a significant concern especially for operations involving two qubits, where the precision of the gate is even more critical.
The development of quantum error correction codes was a critical response to this challenge, aiming to protect quantum information much like classical error correction but under the constraints of quantum mechanics.
The concept of quantum error correction was first proposed by Peter Shor in 1995 in his paper “Scheme for reducing decoherence in quantum computer memory.” Shor introduced a nine-qubit code, which was the first practical scheme to correct arbitrary single-qubit errors. Shor’s scheme demonstrated that it was possible to detect and correct quantum errors without measuring or destroying the quantum information contained in the qubits. This work was foundational, proving that error correction was feasible in a quantum context.
Following closely on the heels of Shor’s work, Andrew Steane, an Oxford physicist, made further significant contributions to quantum error correction. In 1996, Steane introduced a seven-qubit code, now known as the Steane code, which employs a more sophisticated approach using the principles of classical error correction theory applied to the quantum domain. The Steane code not only corrected errors but did so more efficiently, requiring fewer qubits than Shor’s nine-qubit code. Steane’s paper, detailing this code, is titled “Error Correcting Codes in Quantum Theory.”
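Shor’s nine-qubit code layers a three-qubit phase-flip code on top of a three-qubit bit-flip (repetition) code. As a hedged illustration of the simpler bit-flip half only, here is a small numpy statevector sketch: it encodes one logical qubit into three physical qubits, injects a single bit-flip error, and uses the two parity checks (the error syndrome) to find and undo the flip without ever learning the encoded amplitudes. It is a toy model, not Shor’s or Steane’s full construction, and all function names are mine.

```python
import numpy as np

def apply_x(state, qubit):
    """Apply a bit-flip (Pauli X) to one of the three physical qubits."""
    new = np.empty_like(state)
    for idx in range(8):
        new[idx ^ (1 << (2 - qubit))] = state[idx]  # qubit 0 is the leftmost bit
    return new

def encode(alpha, beta):
    """Encode alpha|0> + beta|1> as alpha|000> + beta|111>."""
    state = np.zeros(8, dtype=complex)
    state[0b000], state[0b111] = alpha, beta
    return state

def syndrome(state):
    """Parities Z0Z1 and Z1Z2; deterministic for at most one bit-flip error."""
    idx = int(np.argmax(np.abs(state)))             # any basis state in the support
    b = [(idx >> (2 - q)) & 1 for q in range(3)]
    return (b[0] ^ b[1], b[1] ^ b[2])

alpha, beta = 0.6, 0.8
logical = encode(alpha, beta)

corrupted = apply_x(logical, 1)                     # bit-flip error on qubit 1

# Look up which qubit to fix from the two parity checks.
lookup = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}
bad_qubit = lookup[syndrome(corrupted)]
recovered = apply_x(corrupted, bad_qubit) if bad_qubit is not None else corrupted

print(np.allclose(recovered, logical))              # True: the error was corrected
```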
The implications of these developments in quantum error correction were profound, both theoretically and practically.
- Fault-Tolerance: Quantum error correction is the cornerstone of fault-tolerant quantum computing, which is essential for the practical realization of quantum computers. It allows quantum processors to function correctly even when some of the qubits are erroneous, thus enabling longer computations necessary for useful tasks.
- Scalability: By demonstrating that errors in quantum computations can be systematically corrected, error correction codes addressed one of the biggest barriers to scaling quantum computers to a larger number of qubits.
- Theoretical Expansion: The principles established by Shor and Steane spurred a wave of research and development in quantum error correction, leading to a variety of new codes and techniques, such as topological codes and surface codes, which continue to evolve today.
The field of quantum error correction continues to be a vibrant area of research. Modern advancements involve the integration of error correction with quantum hardware development, aiming to implement these codes in physical quantum systems. Researchers are also exploring the potential of adaptive error correction techniques and the integration of machine learning methods to optimize error correction strategies.
1995 to present – Physical Realization and Challenges
Transitioning from theory to practice involved the physical realization of qubits, the fundamental units of quantum information. Various approaches were explored:
- Trapped Ions: Demonstrated by researchers like Chris Monroe, trapped ion technology uses ions as qubits, manipulating them with lasers to perform quantum operations.
- Superconducting Qubits: Developed by teams including those at IBM and Google, superconducting circuits use electrical currents and Josephson junctions to create and manipulate qubits.
- Photonic Qubits: In photonic systems, qubits are encoded in the properties of photons, offering advantages in stability and transmission.
Each approach has its own set of challenges and advantages, contributing to a diverse technological ecosystem in quantum computing.
1995 – First Quantum Logic Gate Using Trapped Ions
In 1995, a significant experimental breakthrough in quantum computing was achieved by Christopher Monroe and David Wineland at the National Institute of Standards and Technology (NIST) in Boulder, Colorado. They successfully realized the first quantum logic gate, specifically the controlled-NOT (CNOT) gate, using trapped ions. The experiment is described in the paper “Demonstration of a Fundamental Quantum Logic Gate.”
This achievement followed the theoretical proposal made by Ignacio Cirac and Peter Zoller in 1995, which outlined a method to implement quantum computation with trapped ions. The proposal described in “Quantum Computations with Cold Trapped Ions” presented a detailed blueprint for using trapped ions to perform quantum computations. The central idea involved using the internal quantum states of ions as qubits and the collective vibrational modes of the ion chain as a bus for mediating quantum gates between qubits. Their scheme provided a coherent way to manipulate quantum information stored in the ions, thus setting the stage for Monroe and Wineland’s experimental realization.
The experiment conducted by Monroe and Wineland demonstrated the fundamental operations necessary for quantum computation by implementing a CNOT gate, which is essential for quantum error correction and many quantum algorithms. A CNOT gate is a two-qubit quantum gate that flips the state of a ‘target’ qubit conditional on the state of a ‘control’ qubit. In their experiment, Monroe and Wineland used laser pulses to manipulate the quantum states of the ions precisely, achieving the necessary control to execute the CNOT operation.
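For readers who have not seen it written out, the CNOT gate is the 4 x 4 unitary below. A quick numpy check (my own snippet, not taken from the paper) shows it flipping the target only when the control is |1>, and creating entanglement when the control is in superposition.

```python
import numpy as np

# Basis ordering |control, target>: |00>, |01>, |10>, |11>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

ket0 = np.array([1, 0])
ket1 = np.array([0, 1])

# Control |1>, target |0>  ->  target is flipped, giving |11>
print(CNOT @ np.kron(ket1, ket0))          # [0, 0, 0, 1]

# Control in superposition (|0>+|1>)/sqrt(2), target |0>
plus = (ket0 + ket1) / np.sqrt(2)
print(CNOT @ np.kron(plus, ket0))          # (|00> + |11>)/sqrt(2): an entangled Bell state
```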
The experiment utilized beryllium ions confined in an electromagnetic trap. The qubits were encoded in the electronic states of the ions, and the ions were cooled nearly to their ground state of motion using laser cooling techniques. By using laser beams to couple the internal states of the ions to their collective motional state, Monroe and Wineland were able to control the quantum state transitions needed to perform the CNOT gate.
This experiment was pivotal for several reasons:
- It provided the first proof of concept that quantum logic gates, the building blocks of a quantum computer, could be physically implemented using trapped ions. This was a significant step forward from theoretical proposals to tangible, experimental verification.
- The success of this experiment encouraged further research into trapped ion technology, which has since become one of the leading platforms for developing quantum computers. The high degree of control over ion qubits and the relatively long coherence times achievable with trapped ions make this technology particularly promising for quantum computing.
- The demonstration of quantum logic operations using trapped ions inspired advancements in other areas of quantum technology, including quantum simulations and quantum metrology.
The work of Monroe and Wineland has had a lasting impact on the field of quantum computing. David Wineland was awarded the Nobel Prize in Physics in 2012, jointly with Serge Haroche, for methods to measure and manipulate individual quantum systems, highlighting the fundamental role of this research in the broader context of quantum physics.
Today, trapped ion technology continues to be at the forefront of quantum computing initiatives, with multiple companies and research groups developing trapped ion quantum computers and related technologies.
1999 – First Demonstration of Superconducting Qubits
The pioneering work of Yasunobu Nakamura and his colleagues Yuri Pashkin and Jaw-Shen Tsai in 1999 marks a significant milestone in the history of quantum computing. Their experiments demonstrated for the first time that superconducting circuits could be used as qubits, thereby showcasing the potential of superconductivity for quantum computation. The experiment is described in “Coherent control of macroscopic quantum states in a single-Cooper-pair box.”
Prior to Nakamura’s experiment, the feasibility of using superconducting circuits for quantum computing was largely theoretical. Superconducting materials offered promising properties for quantum computation, including low dissipation and the ability to demonstrate macroscopic quantum phenomena, such as flux quantization and the Josephson effect. However, demonstrating that these systems could behave as true quantum bits and maintain quantum coherence over time was still an experimental challenge.
Nakamura’s team used a superconducting charge qubit, also known as a Cooper pair box. This setup consisted of a small superconducting island connected to a reservoir through a Josephson junction. The island could be charged with Cooper pairs (pairs of electrons with opposite spin and momentum), and the number of Cooper pairs could be controlled using a voltage applied through a gate capacitor. The two quantum states in the charge qubit corresponded to different numbers of Cooper pairs on the island. The crucial aspect of the experiment was controlling and observing the coherent quantum state dynamics between these states.
Nakamura and his team observed coherent oscillations between the two charge states by manipulating the gate voltage and the Josephson energy. They were able to initiate and then measure the coherent superposition of the charge states, observing the system as it oscillated back and forth between them. The experiment successfully measured the coherence time of the qubit, which is the time over which the qubit could maintain its quantum state without degrading. The demonstration of a sufficiently long coherence time was crucial for validating the potential of superconducting circuits in quantum computing.
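In the standard textbook description of the Cooper pair box (my summary, not taken from Nakamura’s paper), the device is governed by a charging term and a Josephson tunneling term, and near the charge degeneracy point it reduces to an effective two-level system, i.e., a qubit:

```latex
% Cooper-pair box Hamiltonian: charging energy E_C, Josephson energy E_J,
% number of excess Cooper pairs \hat{n}, gate charge n_g set by the gate voltage
H = 4E_C\,(\hat{n}-n_g)^2 \;-\; E_J\cos\hat{\varphi}

% Restricted to the two lowest charge states near n_g = 1/2,
% it becomes an effective two-level (qubit) Hamiltonian:
H \approx -\tfrac{1}{2}\bigl[\,\epsilon(n_g)\,\sigma_z + E_J\,\sigma_x\,\bigr],
\qquad \epsilon(n_g) = 4E_C\,(1-2n_g)
```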
The ability to demonstrate and control quantum coherence in a superconducting circuit was a breakthrough. It showed that quantum bits could be realized in solid-state systems, opening up practical pathways to building quantum computers. Following this experiment, research into superconducting qubits accelerated. Various types of superconducting qubits, such as flux, phase, and transmon qubits, were explored and developed based on the foundational understanding gained from these early experiments. This experiment spurred advancements in the design and fabrication of superconducting circuits, improvements in isolation from environmental noise, and innovations in qubit control and measurement techniques. The success of superconducting qubits has led to significant commercial interest, with companies like IBM, Google, and Rigetti Computing investing heavily in developing superconducting quantum processors.
2001 – Linear Optical Quantum Computing and the KLM Scheme
The work often credited with kickstarting the field of optical quantum computing was the 2001 proposal showing that quantum logic gates could be built from linear optical elements alone. This foundational work was done by Emanuel Knill, Raymond Laflamme, and Gerald J. Milburn, and it profoundly impacted the development of quantum computing using photons.
In their landmark paper “A scheme for efficient quantum computation with linear optics” published in 2001, Knill, Laflamme, and Milburn proposed a method to implement quantum computation using linear optical elements such as beam splitters, phase shifters, and mirrors, in conjunction with single-photon sources and detectors. The KLM scheme demonstrated that it was theoretically possible to perform universal quantum computing without the need for strong nonlinear interactions between photons, which were previously thought to be a requirement. Instead, they showed that conditional measurement could effectively induce the necessary nonlinearity for quantum gate operations.
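To give a flavor of the “linear optical elements” involved (my own illustrative snippet, not the KLM gate itself): in the dual-rail encoding, a single photon in one of two optical modes represents a qubit, a phase shifter adds a phase to one mode, and a beam splitter mixes the two modes. Together these passive elements realize arbitrary single-qubit rotations; the hard part that KLM solved is the entangling gate, which additionally needs ancilla photons and measurement.

```python
import numpy as np

# Dual-rail encoding: |0> = photon in mode A, |1> = photon in mode B.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

def phase_shifter(phi):
    """Adds a phase phi to mode B only."""
    return np.array([[1, 0],
                     [0, np.exp(1j * phi)]])

def beam_splitter(theta=np.pi / 4):
    """Mixes the two modes; theta = pi/4 gives a 50/50 beam splitter."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

# A photon entering mode A leaves a 50/50 beam splitter in a superposition
# of both output modes: a single-qubit rotation implemented passively.
print(beam_splitter() @ ket0)                       # [0.707..., 0.707...]

# Beam splitter + phase shifter + beam splitter = a Mach-Zehnder interferometer,
# which can realize any single-qubit rotation on the dual-rail qubit.
mz = beam_splitter() @ phase_shifter(np.pi / 2) @ beam_splitter()
print(np.round(mz @ ket0, 3))
```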
Using photons as qubits has distinct advantages, including relatively low decoherence rates since photons do not interact strongly with their environment. Additionally, photons can be easily manipulated, transmitted over long distances, and can operate at room temperature, making them ideal for quantum communication and certain types of quantum computation tasks. Despite these advantages, the practical implementation of the KLM scheme and other photon-based quantum computing models poses challenges, primarily in terms of efficiently generating, manipulating, and detecting single photons.
Following the theoretical proposal by Knill, Laflamme, and Milburn, numerous experiments have aimed to realize practical optical quantum computing systems. These include demonstrations of basic quantum gates, quantum teleportation, and quantum entanglement using linear optics. The field has expanded to explore various architectures, such as integrated quantum photonics, where photonic circuits are fabricated on chips to perform complex quantum computations more compactly and with higher stability.
Today, optical quantum computing continues to be a vibrant area of research. Advances in integrated photonic technology have made it possible to create miniaturized and scalable quantum circuits on a chip, which is critical for the development of practical quantum computing devices. Optical quantum computing technologies are foundational for the development of the quantum internet, where quantum information is transmitted over long distances using photonic qubits.
2001 – First Experimental Demonstration of Shor’s Algorithm
The first experimental demonstration of Shor’s algorithm was a landmark event. This experiment, conducted in 2001, was a collaborative effort by researchers at IBM’s Almaden Research Center and Stanford University. The experiment is described in the paper “Experimental realization of Shor’s quantum factoring algorithm using nuclear magnetic resonance.” It successfully demonstrated the factorization of the number 15 using nuclear magnetic resonance (NMR) quantum computing. The experiment utilized a liquid-state NMR quantum computer: the “computer” was essentially a solution of a custom-synthesized molecule, a perfluorobutadienyl iron complex, whose seven spin-1/2 nuclei (five fluorine and two carbon) served as qubits. These nuclear spins could be manipulated using radiofrequency pulses. The roughly 10^18 identical molecules in the sample each acted as an independent copy of the same seven-qubit computer, and the NMR signal measured the ensemble average of their states.
This demonstration was crucial as it served as a proof of concept that Shor’s algorithm could be implemented on a physical quantum computing system. Even though the number 15 is trivial to factor with classical computers, factoring it using a quantum algorithm under real-world conditions proved the potential of quantum computing. The experiment paved the way for subsequent experiments and developments in other quantum computing technologies, such as superconducting qubits and trapped ions, which offer better prospects for scalability and are currently leading the race towards building a practical quantum computer.
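As a concrete illustration of the numbers involved (my own worked example following the standard reduction, not the exact run reported in the paper), factoring N = 15 with the base a = 7 proceeds as follows:

```latex
% Order finding: the powers of 7 modulo 15 cycle with period r = 4
7^1 \equiv 7,\quad 7^2 \equiv 4,\quad 7^3 \equiv 13,\quad 7^4 \equiv 1 \pmod{15}
\;\;\Rightarrow\;\; r = 4

% Since r is even and 7^{r/2} = 49 \equiv 4 \not\equiv -1 \pmod{15},
% the factors come from greatest common divisors:
\gcd(7^{2}-1,\,15) = \gcd(48,15) = 3, \qquad
\gcd(7^{2}+1,\,15) = \gcd(50,15) = 5
```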
Recent Developments and Commercialization
Since the early 2000s, the field of quantum computing has seen significant advancements, both in technological development and in commercialization efforts. The experimental demonstration of Shor’s algorithm in 2001 proved to be one of the key catalyzing events, spurring increased interest and investment from both the public and private sectors.
After 2001, research expanded beyond NMR-based quantum computers to several other promising quantum computing platforms. These include superconducting qubits, trapped ions, topological qubits, photonic systems, and silicon quantum dots. Each technology has its strengths and challenges, and the diversity of approaches helps to tackle different aspects of quantum computing.
IBM, Google, and Rigetti Computing have made substantial progress with superconducting qubits, while companies like IonQ and Honeywell have developed quantum computers based on trapped ions.
Major tech companies, including IBM, Google, Microsoft, and Intel, have invested heavily in developing quantum computing technologies and infrastructure. IBM was among the first to make quantum computers accessible to the public via cloud platforms, launching the IBM Quantum Experience in 2016.
Alongside hardware developments, there has been a growing emphasis on quantum software development. Quantum programming languages and software frameworks such as Qiskit (IBM), Cirq (Google), and Q# (Microsoft) have been developed to aid programmers in creating algorithms for quantum computers.
Startups like D-Wave Systems, Rigetti Computing, IonQ, and others have also been crucial in pushing the boundaries of quantum computing technology, with D-Wave focusing on quantum annealing, a specialized type of quantum computing.
The quantum algorithm library continues to expand beyond Shor’s and Grover’s algorithms, exploring areas like quantum machine learning, optimization, and materials science simulations.
Governments worldwide have recognized the strategic importance of quantum computing. The United States, the European Union, China, and others have launched significant funding initiatives to support quantum computing research and development.
Academic institutions and national labs have formed research consortia and partnerships with industry leaders to advance quantum computing technologies and train the next generation of quantum scientists.
As we move further into the 2020s, the commercialization of quantum computing continues to grow, driven by advancements in technology, increased investment, and expanding practical applications. The next decade promises to be even more transformative as quantum computing moves from experimental demonstrations to broader real-world implementations.
Entering the Era of Quantum Supremacy
The race to achieve “quantum supremacy,” a point where quantum computers can perform tasks beyond the reach of even the most powerful classical supercomputers, has defined the recent phase of quantum computing development. In 2019, Google announced that it had achieved quantum supremacy using their 53-qubit Sycamore processor. They claimed to have performed a specific quantum computation in 200 seconds that would take the world’s most powerful supercomputer approximately 10,000 years to complete, demonstrating a practical example of a quantum computer solving a problem faster than any classical computer.
While this was a groundbreaking demonstration, the reality is that, as of today, the applications of quantum computing are very narrow. Current implementations are error-prone, and in leading superconducting devices coherence is maintained for less than a millisecond. It will take quite a few years before we develop fault-tolerant qubits, which would signal the transition to practically useful quantum computing. For my views on when to expect resilient quantum computers, see my article “Q-Day Predictions: Anticipating the Arrival of Cryptanalytically Relevant Quantum Computers (CRQC).”
Marin Ivezic
For over 30 years, Marin Ivezic has been protecting critical infrastructure and financial services against cyber, financial crime and regulatory risks posed by complex and emerging technologies.
He held multiple interim CISO and technology leadership roles in Global 2000 companies.