Robust quantum computation requires encoding delicate quantum information into degrees of freedom that are hard for the environment to change. Quantum encodings have been demonstrated in many physical systems by observing and correcting storage errors, but applications require not just storing information; we must accurately compute even with faulty operations. The theory of fault-tolerant quantum computing illuminates a way forward by providing a foundation and collection of techniques for limiting the spread of errors. Here we implement one of the smallest quantum codes in a five-qubit superconducting transmon device and demonstrate fault-tolerant state preparation. We characterize the resulting codewords through quantum process tomography and study the free evolution of the logical observables. Our results are consistent with fault-tolerant state preparation in a protected qubit subspace.
The resonator-induced phase (RIP) gate is a multi-qubit entangling gate that allows a high degree of flexibility in qubit frequencies, making it attractive for quantum operations in large-scale architectures. We experimentally realize the RIP gate with four superconducting qubits in a three-dimensional (3D) circuit-quantum-electrodynamics architecture, demonstrating high-fidelity controlled-Z (CZ) gates in pair subspaces between all possible pairs of qubits from two different 4-qubit devices. These qubits are arranged within a wide range of frequency detunings, up to as large as 1.8 GHz. We further show a dynamical multi-qubit refocusing scheme that isolates individual two-qubit interactions, and combine these interactions to generate a four-qubit Greenberger-Horne-Zeilinger state.
Quantum error correction will be a necessary component for realizing scalable quantum computers with physical qubits. Theoretically, it is possible to perform arbitrarily long computations if the error rate is below a threshold value. The two-dimensional surface code permits relatively high fault-tolerant thresholds at the ~1% level, and only requires a latticed network of qubits with nearest-neighbor interactions. Superconducting qubits have continued to steadily improve in coherence, gate, and readout fidelities, to become a leading candidate for implementation into larger quantum networks. Here we describe characterization experiments and calibration of a system of four superconducting qubits arranged in a planar lattice, amenable to the surface code. Insights into the particular qubit design and a comparison between simulated and experimentally determined parameters are given. Single- and two-qubit gate tune-up procedures are described, and results for simultaneously benchmarking pairs of two-qubit gates are given. All controls are eventually used for an arbitrary error detection protocol described in separate work [Corcoles et al., Nature Communications, 6, 2015].
High-fidelity measurements are important for the physical implementation of quantum information protocols. Current methods for classifying measurement trajectories in superconducting qubit systems produce fidelities that are systematically lower than those predicted by experimental parameters. Here, we place current classification methods within the framework of machine learning (ML) algorithms and improve on them by investigating more sophisticated ML approaches. We find that non-linear algorithms and clustering methods produce significantly higher assignment fidelities that help close the gap to the fidelity achievable under ideal noise conditions. Clustering methods group trajectories into natural subsets within the data, which allows for the diagnosis of specific systematic errors. We find large clusters in the data associated with relaxation processes and show these are the main source of discrepancy between our experimental and achievable fidelities. These error diagnosis techniques help provide a concrete path forward to improve qubit measurements.
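As one illustration of the clustering idea (a minimal sketch, not the paper's actual classifier, features, or data), a 2-means clustering of synthetic single-shot I/Q points separates the ground- and excited-state distributions; relaxation events would appear as nominally excited-state shots landing in the ground-state cluster:

```python
import numpy as np

def kmeans2(points, iters=25, seed=1):
    """Minimal 2-means clustering of single-shot I/Q readout points."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=2, replace=False)]
    labels = np.zeros(len(points), dtype=int)
    for _ in range(iters):
        # Assign each shot to its nearest center, then recompute centers.
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for k in (0, 1):
            if np.any(labels == k):
                centers[k] = points[labels == k].mean(axis=0)
    return labels

# Well-separated synthetic I/Q blobs standing in for |0> and |1> records.
rng = np.random.default_rng(0)
shots0 = rng.normal([0.0, 0.0], 0.25, size=(400, 2))  # prepared |0>
shots1 = rng.normal([2.0, 0.0], 0.25, size=(400, 2))  # prepared |1>
labels = kmeans2(np.vstack([shots0, shots1]))
```

Real readout records are time series rather than single I/Q points, and the paper's non-linear methods are more sophisticated; the point here is only that clusters found without labels can expose systematic errors such as relaxation.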
To build a fault-tolerant quantum computer, it is necessary to implement a quantum error correcting code. Such codes rely on the ability to extract information about the quantum error syndrome while not destroying the quantum information encoded in the system. Stabilizer codes are attractive solutions to this problem, as they are analogous to classical linear codes, have simple and easily computed encoding networks, and allow efficient syndrome extraction. In these codes, syndrome extraction is performed via multi-qubit stabilizer measurements, which are bit and phase parity checks up to local operations. Previously, stabilizer codes have been realized in nuclear spins, trapped ions, and superconducting qubits. However, these implementations lack the ability to perform fault-tolerant syndrome extraction, which continues to be a challenge for all physical quantum computing systems. Here we experimentally demonstrate a key step towards solving this problem by using a two-by-two lattice of superconducting qubits to perform syndrome extraction and arbitrary error detection via simultaneous quantum non-demolition stabilizer measurements. This lattice represents a primitive tile for the surface code, which is a promising stabilizer code for scalable quantum computing. Furthermore, we successfully show the preservation of an entangled state in the presence of an arbitrary applied error through high-fidelity syndrome measurement. Our results bolster the promise of employing lattices of superconducting qubits for larger-scale fault-tolerant quantum computing.
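As a classical illustration of parity-check syndrome extraction (weight-4 plaquette/star checks of a surface-code tile, not necessarily the exact stabilizers measured in the experiment), a Pauli error on four qubits can be written as a pair of bit vectors (x, z); a ZZZZ check flips when the X-part has odd weight, and an XXXX check flips when the Z-part does:

```python
import numpy as np

def syndrome(x_err, z_err):
    """Syndrome bits of weight-4 ZZZZ and XXXX checks for a Pauli error
    given by its X-part and Z-part bit vectors over the four qubits."""
    zzzz = int(np.sum(x_err) % 2)  # ZZZZ anticommutes with odd X/Y weight
    xxxx = int(np.sum(z_err) % 2)  # XXXX anticommutes with odd Z/Y weight
    return zzzz, xxxx

I4 = np.zeros(4, dtype=int)
X0 = np.array([1, 0, 0, 0])  # error on qubit 0 of the tile
assert syndrome(X0, I4) == (1, 0)  # pure X (bit flip): only ZZZZ flags it
assert syndrome(I4, X0) == (0, 1)  # pure Z (phase flip): only XXXX flags it
assert syndrome(X0, X0) == (1, 1)  # Y error: both checks flag it
assert syndrome(I4, I4) == (0, 0)  # no error: trivial syndrome
```

This is why the combination of a bit-parity and a phase-parity check suffices to detect an arbitrary single-qubit error: X, Y, and Z each produce a distinct nontrivial syndrome.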
We demonstrate enhanced relaxation and dephasing times of transmon qubits, up to ~60 μs, by fabricating the interdigitated shunting capacitors using titanium nitride (TiN). Compared to lift-off aluminum deposited simultaneously with the Josephson junction, this represents as much as a six-fold improvement and provides evidence that previous planar transmon coherence times are limited by surface losses from two-level system (TLS) defects residing at or near interfaces. Concurrently, we observe an anomalous temperature-dependent frequency shift of TiN resonators which is inconsistent with the predicted TLS model.
Quantum process tomography is a necessary tool for verifying quantum gates
and diagnosing faults in architectures and gate design. We show that the
standard approach of process tomography is grossly inaccurate in the case where
the states and measurement operators used to interrogate the system are
generated by gates that have some systematic error, a situation all but
unavoidable in any practical setting. These errors in tomography cannot be
fully corrected through oversampling or by performing a larger set of
experiments. We present an alternative method for tomography to reconstruct an
entire library of gates in a self-consistent manner. The essential ingredient
is to define a likelihood function that assumes nothing about the gates used
for preparation and measurement. In order to make the resulting optimization
tractable we linearize about the target, a reasonable approximation when
benchmarking a quantum computer as opposed to probing a black-box function.
We implement a complete randomized benchmarking protocol on a system of two
superconducting qubits. The protocol consists of randomizing over gates in the
Clifford group, which are generated experimentally via an improved two-qubit
cross-resonance gate implementation and single-qubit unitaries. From this we
extract an optimal average error per Clifford of 0.0936. We also perform an
interleaved experiment, alternating our optimal two-qubit gate with random
two-qubit Clifford gates, to obtain a two-qubit gate error of 0.0653. We
compare these values with a two-qubit gate error of ~0.12 obtained from quantum
process tomography, which is likely limited by state preparation and
measurement errors.
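The error per Clifford quoted above is typically extracted by fitting the randomized-benchmarking decay F(m) = A p^m + B over sequence length m and converting the depolarizing parameter p to an average error r = (d - 1)(1 - p)/d. A minimal sketch on noiseless synthetic data (the constants A, B and the sequence lengths here are illustrative, not the experiment's):

```python
import numpy as np

d = 4                               # two-qubit Hilbert-space dimension
p_true = 1 - 0.0936 * d / (d - 1)   # depolarizing parameter matching r = 0.0936
m = np.arange(1, 30)                # Clifford sequence lengths
fidelity = 0.75 * p_true**m + 0.25  # ideal decay F(m) = A p^m + B

# Fit the log of the decaying part to recover p, then convert to r.
slope = np.polyfit(m, np.log(fidelity - 0.25), 1)[0]
p_fit = np.exp(slope)
r = (d - 1) * (1 - p_fit) / d       # average error per Clifford
```

With real data one fits A, B, and p jointly (B absorbs state-preparation and measurement errors), which is exactly why RB, unlike process tomography, is insensitive to those errors.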
The control and handling of errors arising from cross-talk and unwanted
interactions in multi-qubit systems is an important issue in quantum
information processing architectures. We introduce a benchmarking protocol that
provides information about the amount of addressability present in the system
and implement it on coupled superconducting qubits. The protocol consists of
performing randomized benchmarking on each qubit individually and then
simultaneously; the amount of addressability is related to the difference
between the average gate
fidelities of those experiments. We present the results on two similar samples
with different amounts of cross-talk and unwanted interactions, which agree
with predictions based on simple models for the amount of residual coupling.
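The addressability metric described above can be sketched numerically. The decay constants below are hypothetical depolarizing parameters for one qubit benchmarked alone and then simultaneously with its neighbor (d = 2 for a single qubit), not measured values:

```python
d = 2  # single-qubit Hilbert-space dimension

def error_per_gate(p):
    """Convert an RB depolarizing parameter p to an average gate error."""
    return (d - 1) * (1 - p) / d

p_alone = 0.995  # hypothetical decay, qubit benchmarked individually
p_simul = 0.990  # hypothetical decay, benchmarked simultaneously with neighbor

# A positive difference signals addressability errors such as cross-talk.
delta = error_per_gate(p_simul) - error_per_gate(p_alone)
```

A qubit whose error rate is unchanged under simultaneous benchmarking (delta near zero) is well addressed; residual couplings show up as a positive delta.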
We report a superconducting artificial atom with an observed quantum
coherence time of T2* = 95 μs and energy relaxation time T1 = 70 μs. The system
consists of a single-Josephson-junction transmon qubit embedded in an otherwise
empty copper waveguide cavity whose lowest eigenmode is dispersively coupled to
the qubit transition. We attribute the factor of four increase in the coherence
quality factor relative to previous reports to device modifications aimed at
reducing qubit dephasing from residual cavity photons. This simple device holds
great promise as a robust and easily produced artificial quantum system whose
intrinsic coherence properties are sufficient to allow tests of quantum error
correction.