Quantum error correction (QEC) is an essential step towards realising scalable quantum computers. Theoretically, it is possible to achieve arbitrarily long protection of quantum information from corruption due to decoherence or imperfect controls, so long as the error rate is below a threshold value. The two-dimensional surface code (SC) is a fault-tolerant error-correction protocol that has garnered considerable attention for actual physical implementations, due to its relatively high error threshold of ~1% and its restriction to planar lattices with nearest-neighbour interactions. Here we demonstrate a necessary element for SC error correction: high-fidelity parity detection of two code qubits via measurement of a third syndrome qubit. The experiment is performed on a sub-section of the SC lattice with three superconducting transmon qubits, in which two independent outer code qubits are joined to a central syndrome qubit via two linking bus resonators. With all-microwave high-fidelity single- and two-qubit nearest-neighbour entangling gates, we demonstrate entanglement distributed across the entire sub-section by generating a three-qubit Greenberger-Horne-Zeilinger (GHZ) state with fidelity ~94%. Then, via high-fidelity measurement of the syndrome qubit, we deterministically entangle the otherwise uncoupled outer code qubits in either an even- or odd-parity Bell state, conditioned on the syndrome state. Finally, to fully characterize this parity readout, we develop a new measurement tomography protocol to obtain a fidelity metric (90% and 91%). Our results reveal a straightforward path for expanding superconducting circuits towards larger networks for the SC and eventually a primitive logical-qubit implementation.
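The parity-detection step described above can be sketched as an idealized statevector simulation (perfect gates and no decoherence, unlike the actual microwave-gate experiment; the qubit ordering and CNOT decomposition are my own illustrative choices): measuring the syndrome qubit projects the two uncoupled code qubits onto an even- or odd-parity Bell state.

```python
import numpy as np

def cnot(control, target, n=3):
    # Full 2^n x 2^n CNOT matrix; qubit 0 is the most significant bit.
    dim = 2 ** n
    M = np.zeros((dim, dim))
    for i in range(dim):
        bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        j = sum(b << (n - 1 - q) for q, b in enumerate(bits))
        M[j, i] = 1
    return M

plus = np.array([1, 1]) / np.sqrt(2)
zero = np.array([1, 0])
# Qubits 0 and 2 are the outer code qubits, qubit 1 is the central syndrome.
psi = np.kron(np.kron(plus, zero), plus)
psi = cnot(0, 1) @ psi   # map the ZZ parity of the code qubits...
psi = cnot(2, 1) @ psi   # ...onto the syndrome qubit

def project_syndrome(psi, outcome):
    # Project qubit 1 onto |outcome> and return the normalized code-qubit state.
    out = np.zeros(4)
    for i in range(8):
        bits = [(i >> (2 - q)) & 1 for q in range(3)]
        if bits[1] == outcome:
            out[2 * bits[0] + bits[2]] += psi[i]
    return out / np.linalg.norm(out)

bell_even = np.array([1, 0, 0, 1]) / np.sqrt(2)
bell_odd  = np.array([0, 1, 1, 0]) / np.sqrt(2)
f_even = abs(bell_even @ project_syndrome(psi, 0)) ** 2
f_odd  = abs(bell_odd  @ project_syndrome(psi, 1)) ** 2
```

In this noiseless sketch both conditional fidelities are exactly 1; the ~90% figures quoted in the abstract reflect real gate and measurement errors.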
We demonstrate rapid, first-order sideband transitions between a
superconducting resonator and a frequency-modulated transmon qubit. The qubit
contains a substantial asymmetry between its Josephson junctions, leading to a
linear portion of the energy band near the resonator frequency. The sideband
transitions are driven with a magnetic flux signal of a few hundred MHz coupled
to the qubit. This modulates the qubit splitting at a frequency near the
detuning between the dressed qubit and resonator frequencies, leading to rates
up to 85 MHz for exchanging quanta between the qubit and resonator.
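The mechanism can be illustrated with a toy numerical integration (my own parameter choices in dimensionless units, not the paper's device values): in the single-excitation manifold {|e,0>, |g,1>}, modulating the qubit-resonator detuning at a frequency near the static detuning activates a first-order sideband that exchanges the quantum.

```python
import numpy as np

def simulate(w_mod, g=0.02, delta0=1.0, eps_mod=1.0, t_max=400.0, dt=0.01):
    """Fixed-step RK4 integration of the single-excitation dynamics with
    modulated detuning delta(t) = delta0 + eps_mod*cos(w_mod*t).
    Basis {|e,0>, |g,1>}; returns the maximum |g,1> population reached."""
    def f(t, y):
        d = delta0 + eps_mod * np.cos(w_mod * t)
        H = np.array([[d / 2, g], [g, -d / 2]], dtype=complex)
        return -1j * (H @ y)
    psi = np.array([1.0, 0.0], dtype=complex)  # quantum starts in the qubit
    max_transfer = 0.0
    for k in range(int(t_max / dt)):
        t = k * dt
        k1 = f(t, psi)
        k2 = f(t + dt / 2, psi + dt / 2 * k1)
        k3 = f(t + dt / 2, psi + dt / 2 * k2)
        k4 = f(t + dt, psi + dt * k3)
        psi = psi + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        psi /= np.linalg.norm(psi)
        max_transfer = max(max_transfer, abs(psi[1]) ** 2)
    return max_transfer

on_res = simulate(w_mod=1.0)   # modulation at the detuning: sideband resonance
off_res = simulate(w_mod=0.7)  # detuned modulation: little exchange
```

On resonance the effective exchange rate is approximately g*J1(eps_mod/w_mod), so nearly complete population transfer occurs, while off-resonant modulation leaves the quantum in the qubit.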
Quantum process tomography is a necessary tool for verifying quantum gates
and diagnosing faults in architectures and gate design. We show that the
standard approach of process tomography is grossly inaccurate in the case where
the states and measurement operators used to interrogate the system are
generated by gates that have some systematic error, a situation all but
unavoidable in any practical setting. These errors in tomography cannot be
fully corrected through oversampling or by performing a larger set of
experiments. We present an alternative method for tomography to reconstruct an
entire library of gates in a self-consistent manner. The essential ingredient
is to define a likelihood function that assumes nothing about the gates used
for preparation and measurement. In order to make the resulting optimization
tractable we linearize about the target, a reasonable approximation when
benchmarking a quantum computer as opposed to probing a black-box function.
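The linearize-about-the-target idea can be sketched in a toy numpy example (this is my own simplified one-qubit model, not the paper's estimator): the same imperfect gates are used for preparation, evolution, and measurement, and small over-rotation errors are recovered by least squares on probabilities linearized about the ideal targets.

```python
import numpy as np

def rx(theta):  # rotation about x
    return np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                     [-1j * np.sin(theta / 2), np.cos(theta / 2)]])

def ry(theta):  # rotation about y
    return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                     [np.sin(theta / 2),  np.cos(theta / 2)]], dtype=complex)

# Gate sequences (0 = X90, 1 = Y90); every gate shares the same systematic error.
SEQS = [[0], [1], [0, 0], [1, 1], [0, 1], [1, 0]]

def probs(eps):
    """P(measure |0>) for each sequence applied to |0>, with over-rotation
    errors eps = (ex, ey) entering every use of the gates."""
    gates = [rx(np.pi / 2 + eps[0]), ry(np.pi / 2 + eps[1])]
    out = []
    for seq in SEQS:
        U = np.eye(2, dtype=complex)
        for gi in seq:
            U = gates[gi] @ U
        out.append(abs(U[0, 0]) ** 2)
    return np.array(out)

eps_true = np.array([0.013, -0.021])  # hidden systematic errors
p_meas = probs(eps_true)              # simulated "experimental" data
p0 = probs(np.zeros(2))               # predictions of the ideal target gates

# Linearize the probabilities about the targets (finite-difference Jacobian)
h = 1e-6
J = np.column_stack([(probs([h, 0.0]) - p0) / h,
                     (probs([0.0, h]) - p0) / h])
eps_hat, *_ = np.linalg.lstsq(J, p_meas - p0, rcond=None)
```

Because nothing in the fit assumed ideal preparation or measurement gates, the recovered errors agree with the true ones up to the quadratic terms dropped by the linearization.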
The control and handling of errors arising from cross-talk and unwanted
interactions in multi-qubit systems is an important issue in quantum
information processing architectures. Weintroduce a benchmarking protocol that
provides information about the amount of addressability present in the system
and implement it on coupled superconducting qubits. The protocol consists of
performing randomized benchmarking on each qubit individually and then
simultaneously, and the amount of addressability is related to the difference of the average gate
fidelities of those experiments. We present the results on two similar samples
with different amounts of cross-talk and unwanted interactions, which agree
with predictions based on simple models for the amount of residual coupling.
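The extraction of such a figure of merit can be sketched in a noise-free toy (the decay constants and the r = (1 - alpha)/2 error conversion are assumed here for illustration; real RB data would include sampling noise and fitted amplitudes): fit the individual and simultaneous RB decays, then take the difference of the implied average errors per gate.

```python
import numpy as np

m = np.arange(1, 101)  # RB sequence lengths

# Assumed single-qubit decay constants: individual vs simultaneous driving,
# where crosstalk degrades the simultaneous decay.
alpha_ind, alpha_sim = 0.995, 0.990
p_ind = 0.5 + 0.5 * alpha_ind ** m   # survival model P(m) = A*alpha^m + B
p_sim = 0.5 + 0.5 * alpha_sim ** m

def fit_alpha(m, p):
    """Recover the decay constant from P(m) = 0.5*alpha^m + 0.5 (noise-free)."""
    slope, _ = np.polyfit(m, np.log(p - 0.5), 1)
    return np.exp(slope)

a_ind, a_sim = fit_alpha(m, p_ind), fit_alpha(m, p_sim)
# Average error per gate for one qubit (d = 2): r = (1 - alpha)/2; the
# addressability figure compares the two experiments.
delta_r = (1 - a_sim) / 2 - (1 - a_ind) / 2
```

With these assumed decays the fitted difference is 0.0025 error per gate; a larger gap between the individual and simultaneous fits signals worse addressability.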