Surface code error correction on a defective lattice

Shota Nagayama, Austin G. Fowler, Dominic Horsman, Simon J. Devitt, Rodney D Van Meter

Research output: Contribution to journal › Article

7 Citations (Scopus)

Abstract

The yield of physical qubits fabricated in the laboratory is much lower than that of classical transistors in production semiconductor fabrication. Actual implementations of quantum computers will be susceptible to loss in the form of physically faulty qubits. Though these physical faults must negatively affect the computation, we can deal with them by adapting error-correction schemes. In this paper we have simulated statically placed single-fault lattices and lattices with randomly placed faults at functional qubit yields of 80%, 90%, and 95%, showing the practical performance of a defective surface code by employing actual circuit constructions and realistic errors on every gate, including identity gates. We extend Stace et al.'s superplaquette solution for dynamic losses in the surface code to handle static losses such as physically faulty qubits [1]. The single-fault analysis shows that a static loss at the periphery of the lattice has a less negative effect than a static loss at the center. The random-fault analysis shows that 95% yield is good enough to build a large-scale quantum computer. The local gate error rate threshold is , and a code distance of seven suppresses the residual error rate below the original error rate at . A yield of 90% is also good enough when badly fabricated quantum computation chips are discarded, while 80% yield does not show enough error suppression even when 90% of the chips are discarded. We evaluated several metrics for predicting chip performance, and found that the average, over stabilizers, of the product of the number of data qubits and the stabilizer measurement cycle time gave the strongest correlation with logical error rates. Our analysis will help in selecting usable quantum computation chips from the pool of all fabricated chips.
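
The chip-ranking metric described above can be made concrete with a short sketch. The Python snippet below is not the authors' simulator: the edge-based lattice layout, the plaquette-merging rule, the function name superplaquette_metric, and the "weight + 2 steps" cycle-time model are all illustrative assumptions. It merges plaquettes that share a faulty data qubit into superplaquettes, in the spirit of the superplaquette construction cited above, and then averages the product of (super)stabilizer weight and an assumed measurement cycle time over all stabilizers.

# A minimal, self-contained sketch (not the authors' code) of the chip-ranking
# metric named in the abstract: the average, over (super)stabilizers, of the
# product of the number of data qubits in the stabilizer and its measurement
# cycle time.  Only faulty data qubits are modelled; faulty ancilla qubits are
# ignored here, and the cycle-time model is an assumption for illustration.

import random


def superplaquette_metric(L, qubit_yield, seed=0):
    """Average weight * cycle-time over Z-(super)plaquettes of an L x L
    plaquette grid whose data qubits sit on the edges between plaquettes."""
    rng = random.Random(seed)

    # Data qubits on edges; True = functional, False = fabrication fault.
    # horiz[r][c] is the edge above plaquette (r, c); vert[r][c] is to its left.
    horiz = [[rng.random() < qubit_yield for _ in range(L)] for _ in range(L + 1)]
    vert = [[rng.random() < qubit_yield for _ in range(L + 1)] for _ in range(L)]

    # Union-find over plaquettes: a faulty interior data qubit forces the two
    # plaquettes that share it to merge into one superplaquette.
    parent = list(range(L * L))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(a, b):
        parent[find(b)] = find(a)

    for r in range(1, L):                      # interior horizontal edges
        for c in range(L):
            if not horiz[r][c]:
                union((r - 1) * L + c, r * L + c)
    for r in range(L):                         # interior vertical edges
        for c in range(1, L):
            if not vert[r][c]:
                union(r * L + c - 1, r * L + c)

    # Superstabilizer weight: a functional edge counts only if its two
    # neighbouring plaquettes lie in different superplaquettes (shared interior
    # qubits cancel in the product operator) or if it is on the lattice boundary.
    weight = {}

    def touch(plaq):
        root = find(plaq)
        weight[root] = weight.get(root, 0) + 1

    for r in range(L + 1):
        for c in range(L):
            if not horiz[r][c]:
                continue
            above = (r - 1) * L + c if r > 0 else None
            below = r * L + c if r < L else None
            if above is not None and below is not None and find(above) == find(below):
                continue
            for p in (above, below):
                if p is not None:
                    touch(p)
    for r in range(L):
        for c in range(L + 1):
            if not vert[r][c]:
                continue
            left = r * L + c - 1 if c > 0 else None
            right = r * L + c if c < L else None
            if left is not None and right is not None and find(left) == find(right):
                continue
            for p in (left, right):
                if p is not None:
                    touch(p)

    # Assumed cycle time: one CNOT per data qubit plus ancilla initialisation
    # and measurement, i.e. w + 2 steps for a weight-w (super)stabilizer.
    scores = [w * (w + 2) for w in weight.values()]
    return sum(scores) / len(scores) if scores else 0.0


if __name__ == "__main__":
    for y in (1.00, 0.95, 0.90, 0.80):
        print(f"yield {y:.2f}: metric = {superplaquette_metric(20, y):.1f}")

Under these assumptions, lower yield produces larger superplaquettes and therefore a larger metric value, which is the direction in which the abstract reports correlation with logical error rates.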

Original language: English
Article number: 023050
Journal: New Journal of Physics
Volume: 19
Issue number: 2
DOI: 10.1088/1367-2630/aa5918
Publication status: Published - 2017 Feb 1

Keywords

  • fault tolerant quantum computation
  • quantum error correction
  • qubit loss
  • surface code
  • topological quantum error correction

ASJC Scopus subject areas

  • Physics and Astronomy (all)

Cite this

Surface code error correction on a defective lattice. / Nagayama, Shota; Fowler, Austin G.; Horsman, Dominic; Devitt, Simon J.; Van Meter, Rodney D.

In: New Journal of Physics, Vol. 19, No. 2, 023050, 01.02.2017.

@article{0e8fd7a1eb734a59b494c26ae4bcdf87,
title = "Surface code error correction on a defective lattice",
abstract = "The yield of physical qubits fabricated in the laboratory is much lower than that of classical transistors in production semiconductor fabrication. Actual implementations of quantum computers will be susceptible to loss in the form of physically faulty qubits. Though these physical faults must negatively affect the computation, we can deal with them by adapting error-correction schemes. In this paper we have simulated statically placed single-fault lattices and lattices with randomly placed faults at functional qubit yields of 80{\%}, 90{\%}, and 95{\%}, showing practical performance of a defective surface code by employing actual circuit constructions and realistic errors on every gate, including identity gates. We extend Stace et al's superplaquettes solution against dynamic losses for the surface code to handle static losses such as physically faulty qubits [1]. The single-fault analysis shows that a static loss at the periphery of the lattice has less negative effect than a static loss at the center. The randomly faulty analysis shows that 95{\%} yield is good enough to build a large-scale quantum computer. The local gate error rate threshold is , and a code distance of seven suppresses the residual error rate below the original error rate at . 90{\%} yield is also good enough when we discard badly fabricated quantum computation chips, while 80{\%} yield does not show enough error suppression even when discarding 90{\%} of the chips. We evaluated several metrics for predicting chip performance, and found that the average of the product of the number of data qubits and the cycle time of a stabilizer measurement of stabilizers gave the strongest correlation with logical error rates. Our analysis will help with selecting usable quantum computation chips from among the pool of all fabricated chips.",
keywords = "fault tolerant quantum computation, quantum error correction, qubit loss, surface code, topological quantum error correction",
author = "Shota Nagayama and Fowler, {Austin G.} and Dominic Horsman and Devitt, {Simon J.} and {Van Meter}, {Rodney D}",
year = "2017",
month = "2",
day = "1",
doi = "10.1088/1367-2630/aa5918",
language = "English",
volume = "19",
journal = "New Journal of Physics",
issn = "1367-2630",
publisher = "IOP Publishing Ltd.",
number = "2",

}

TY - JOUR

T1 - Surface code error correction on a defective lattice

AU - Nagayama, Shota

AU - Fowler, Austin G.

AU - Horsman, Dominic

AU - Devitt, Simon J.

AU - Van Meter, Rodney D

PY - 2017/2/1

Y1 - 2017/2/1

KW - fault tolerant quantum computation

KW - quantum error correction

KW - qubit loss

KW - surface code

KW - topological quantum error correction

UR - http://www.scopus.com/inward/record.url?scp=85014380924&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85014380924&partnerID=8YFLogxK

U2 - 10.1088/1367-2630/aa5918

DO - 10.1088/1367-2630/aa5918

M3 - Article

AN - SCOPUS:85014380924

VL - 19

JO - New Journal of Physics

JF - New Journal of Physics

SN - 1367-2630

IS - 2

M1 - 023050

ER -