Quantum computing can become scalable through error correction, but logical error rates only decrease with system size when physical errors are sufficiently uncorrelated. During computation, unused high energy levels of the qubits can become excited, creating leakage states that are long-lived and mobile. Particularly for superconducting transmon qubits, this leakage opens a path to errors that are correlated in space and time. Here, we report a reset protocol that returns a qubit to the ground state from all relevant higher-level states. We test its performance with the bit-flip stabilizer code, a simplified version of the surface code for quantum error correction. We investigate the accumulation and dynamics of leakage during error correction. Using this protocol, we find lower rates of logical errors and an improved scaling and stability of error suppression with increasing qubit number. This demonstration provides a key step on the path towards scalable quantum computing.
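The bit-flip stabilizer code mentioned above can be illustrated with a minimal classical sketch: three data bits encode one logical bit, and two ZZ-type parity checks flag any single bit-flip error. The function names and the idealized, noiseless checks are our own simplification, not the experimental protocol:

```python
# Minimal classical sketch of a bit-flip (repetition) code.
# Three data bits encode one logical bit; two parity checks between
# neighbouring pairs locate any single bit-flip error.

def encode(logical_bit):
    """Repetition-code encoding: copy the logical bit onto three data bits."""
    return [logical_bit] * 3

def syndromes(data):
    """Parities of neighbouring pairs, as measured by the stabilizer checks."""
    return (data[0] ^ data[1], data[1] ^ data[2])

def correct(data):
    """Decode by majority vote, implemented via the two parity outcomes."""
    s = syndromes(data)
    # Each single-flip location produces a unique syndrome pattern.
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(s)
    if flip is not None:
        data[flip] ^= 1
    return data

data = encode(0)
data[1] ^= 1                        # inject a single bit-flip error
assert syndromes(data) == (1, 1)    # both checks fire for the middle bit
assert correct(data) == [0, 0, 0]   # decoding recovers the codeword
```

In the experiment the checks are quantum parity measurements cycled repeatedly; the sketch only shows why two checks suffice to locate one flipped bit among three.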
Leakage errors occur when a quantum system leaves the two-level qubit subspace. Reducing these errors is critically important for quantum error correction to be viable. To quantify leakage errors, we use randomized benchmarking in conjunction with measurement of the leakage population. We characterize single-qubit gates in a superconducting qubit, and by refining our use of Derivative Reduction by Adiabatic Gate (DRAG) pulse shaping along with detuning of the pulses, we obtain gate errors consistently below 10⁻³ and leakage rates at the 10⁻⁵ level. With the control optimized, we find that a significant portion of the remaining leakage is due to incoherent heating of the qubit.
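The DRAG pulse shaping referred to above pairs a Gaussian in-phase envelope with a quadrature component proportional to its time derivative, scaled by the transmon anharmonicity. A minimal sketch of the envelopes, with placeholder parameter values rather than the calibrated ones used in the experiment:

```python
import numpy as np

# Illustrative DRAG envelopes: a Gaussian in-phase drive I(t) and a
# quadrature correction Q(t) proportional to dI/dt divided by the
# anharmonicity. All numerical values below are placeholders.

def drag_envelopes(t, sigma, amp, anharm, lam=0.5):
    """Return (I, Q) envelopes for a DRAG pulse centred at t = 0."""
    i_env = amp * np.exp(-t**2 / (2 * sigma**2))       # Gaussian drive
    di_dt = -t / sigma**2 * i_env                      # its time derivative
    q_env = -lam * di_dt / anharm                      # DRAG correction
    return i_env, q_env

t = np.linspace(-50e-9, 50e-9, 1001)                   # 100 ns window
i_env, q_env = drag_envelopes(t, sigma=10e-9, amp=1.0,
                              anharm=2 * np.pi * -200e6)
# Q is antisymmetric in time and vanishes at the pulse centre, where dI/dt = 0;
# it suppresses the drive's spectral weight at the leakage transition.
```

Detuning of the pulses, the other knob mentioned in the abstract, shifts the drive frequency and is not shown here.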
Since the inception of quantum mechanics, its validity as a complete description of reality has been challenged due to predictions that defy classical intuition. For many years it was unclear whether predictions like entanglement and projective measurement represented real phenomena or artifacts of an incomplete model. Bell inequalities (BI) provided the first quantitative test to distinguish between quantum entanglement and a yet undiscovered classical hidden variable theory. The Leggett-Garg inequality (LGI) provides a similar test for projective measurement, and more recently has been adapted to include variable-strength measurements to study the process of measurement itself. Here we probe the intersection of entanglement and measurement through the lens of the hybrid Bell-Leggett-Garg inequality (BLGI). By correlating data from ancilla-based weak measurements and direct projective measurements, we quantify, for the first time, the effect of measurement strength on entanglement collapse. Violation of the BLGI, which we achieve only at the weakest measurement strengths, offers compelling evidence of the completeness of quantum mechanics while avoiding several loopholes common to previous experimental tests. This uniquely quantum result significantly constrains the nature of any possible classical theory of reality. Additionally, we demonstrate that with sufficient scale and fidelity, a universal quantum processor can be used to study richer fundamental physics.
A quantum computer can solve hard problems – such as prime factoring, database searching, and quantum simulation – at the cost of needing to protect fragile quantum states from error. Quantum error correction provides this protection, by distributing a logical state among many physical qubits via quantum entanglement. Superconductivity is an appealing platform, as it allows for constructing large quantum circuits, and is compatible with microfabrication. For superconducting qubits the surface code is a natural choice for error correction, as it uses only nearest-neighbour coupling and rapidly-cycled entangling gates. The gate fidelity requirements are modest: the per-step fidelity threshold is only about 99%. Here, we demonstrate a universal set of logic gates in a superconducting multi-qubit processor, achieving an average single-qubit gate fidelity of 99.92% and a two-qubit gate fidelity up to 99.4%. This places Josephson quantum computing at the fault-tolerant threshold for surface code error correction. Our quantum processor is a first step towards the surface code, using five qubits arranged in a linear array with nearest-neighbour coupling. As a further demonstration, we construct a five-qubit Greenberger-Horne-Zeilinger (GHZ) state using the complete circuit and full set of gates. The results demonstrate that Josephson quantum computing is a high-fidelity technology, with a clear path to scaling up to large-scale, fault-tolerant quantum circuits.
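The five-qubit GHZ construction on a linear array follows the standard textbook circuit: a Hadamard on the first qubit followed by a chain of nearest-neighbour CNOTs. A small statevector simulation sketches this (the gate decomposition is the generic one, not necessarily the exact native-gate sequence used on the processor):

```python
import numpy as np

# Statevector sketch of the five-qubit GHZ circuit: H on qubit 0,
# then CNOTs along the linear chain 0-1, 1-2, 2-3, 3-4.
N = 5

def apply_h(state, q):
    """Apply a Hadamard to qubit q of an N-qubit statevector."""
    state = state.reshape([2] * N)
    h = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    state = np.tensordot(h, state, axes=([1], [q]))
    return np.moveaxis(state, 0, q).reshape(-1)

def apply_cnot(state, control, target):
    """Apply a CNOT (control < target) by flipping the target axis of the
    control = 1 slice of the statevector."""
    state = state.reshape([2] * N).copy()
    idx = [slice(None)] * N
    idx[control] = 1                       # select the control = 1 subspace
    state[tuple(idx)] = np.flip(state[tuple(idx)], axis=target - 1)
    return state.reshape(-1)

state = np.zeros(2**N)
state[0] = 1.0                             # start in |00000>
state = apply_h(state, 0)
for q in range(N - 1):
    state = apply_cnot(state, q, q + 1)
# Only the all-0 and all-1 amplitudes survive: (|00000> + |11111>)/sqrt(2).
```

The same entangling depth (one H plus four sequential CNOTs) is what makes a linear nearest-neighbour array sufficient for this state.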
A fundamental challenge for quantum information processing is reducing the impact of environmentally-induced errors. Quantum error detection (QED) provides one approach to handling such errors, in which errors are rejected when they are detected. Here we demonstrate a QED protocol based on the idea of quantum un-collapsing, using this protocol to suppress energy relaxation due to the environment in a three-qubit superconducting circuit. We encode quantum information in a target qubit, and use the other two qubits to detect and reject errors caused by energy relaxation. This protocol improves the storage time of a quantum state by a factor of roughly three, at the cost of a reduced probability of success. This constitutes the first experimental demonstration of an algorithm-based improvement in the lifetime of a quantum state stored in a qubit.
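The detect-and-reject tradeoff at the heart of QED can be illustrated with a toy classical Monte Carlo: trials in which a relaxation event is flagged are discarded, so the fidelity of the kept trials rises while the success probability falls. This is a deliberately crude stand-in for the three-qubit un-collapsing circuit; the decay probability and detection efficiency below are arbitrary placeholders:

```python
import random

# Toy Monte Carlo of detect-and-reject error handling. Each trial, the
# stored excitation relaxes with probability p_decay; a detection step
# flags relaxation with efficiency eta, and flagged trials are rejected.

def run(n_trials, p_decay, reject, eta=0.9, seed=0):
    """Return (success probability, fidelity of kept trials)."""
    rng = random.Random(seed)
    kept, correct = 0, 0
    for _ in range(n_trials):
        decayed = rng.random() < p_decay
        detected = decayed and rng.random() < eta
        if reject and detected:
            continue                # error detected: reject this trial
        kept += 1
        correct += not decayed
    return kept / n_trials, correct / kept

p_keep, fid = run(100_000, p_decay=0.3, reject=True)
p_all, fid_raw = run(100_000, p_decay=0.3, reject=False)
# Rejecting flagged trials raises the kept-trial fidelity above the raw
# storage fidelity, at the cost of p_keep < 1.
```

The experimental protocol achieves the same kind of tradeoff coherently, preserving the unknown encoded quantum state rather than a classical bit.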
The act of measurement bridges the quantum and classical worlds by projecting a superposition of possible states into a single, albeit probabilistic, outcome. The time-scale of this "instantaneous" process can be stretched using weak measurements, so that it takes the form of a gradual random walk towards a final state. Remarkably, the interim measurement record is sufficient to continuously track and steer the quantum state using feedback. We monitor the dynamics of a resonantly driven quantum two-level system, a superconducting quantum bit, using a near-noiseless parametric amplifier. The high-fidelity measurement output is used to actively stabilize the phase of Rabi oscillations, enabling them to persist indefinitely. This new functionality shows promise for fighting decoherence and defines a path for continuous quantum error correction.
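The feedback loop described above can be caricatured classically: the Rabi phase undergoes a random walk (dephasing), a noisy measurement record estimates the phase error each cycle, and a proportional correction steers it back. The gain and noise levels below are illustrative, not the experimental values, and the model ignores measurement backaction:

```python
import random
import math

# Toy model of measurement-based phase stabilization: phase diffusion
# plus a noisy per-cycle estimate of the phase error, fed back with a
# proportional gain.

def run(n_steps, gain, dephasing=0.05, meas_noise=0.2, seed=1):
    """Simulate n_steps cycles; return the phase-error trajectory."""
    rng = random.Random(seed)
    phase_err = 0.0
    errors = []
    for _ in range(n_steps):
        phase_err += rng.gauss(0.0, dephasing)            # phase diffusion
        record = phase_err + rng.gauss(0.0, meas_noise)   # noisy measurement
        phase_err -= gain * record                        # feedback correction
        errors.append(phase_err)
    return errors

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

free = run(2000, gain=0.0)     # no feedback: the phase random-walks away
locked = run(2000, gain=0.2)   # feedback keeps the phase error bounded
```

Even a noisy record suffices to lock the phase, which is the key point of the experiment: the oscillations persist because the feedback continuously counteracts the diffusion.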