### Abstract

The yield of physical qubits fabricated in the laboratory is much lower than that of classical transistors in production semiconductor fabrication. Actual implementations of quantum computers will be susceptible to loss in the form of physically faulty qubits. Though these physical faults inevitably degrade the computation, we can deal with them by adapting error-correction schemes. In this paper we simulated statically placed single-fault lattices and lattices with randomly placed faults at functional qubit yields of 80%, 90%, and 95%, showing the practical performance of a defective surface code by employing actual circuit constructions and realistic errors on every gate, including identity gates. We extend Stace et al.'s superplaquette solution for dynamic losses in the surface code to handle static losses such as physically faulty qubits [1]. The single-fault analysis shows that a static loss at the periphery of the lattice has a less negative effect than a static loss at the center. The random-fault analysis shows that 95% yield is sufficient to build a large-scale quantum computer. The local gate error rate threshold is , and a code distance of seven suppresses the residual error rate below the original error rate at . 90% yield is also sufficient when we discard badly fabricated quantum computation chips, while 80% yield does not show enough error suppression even when 90% of the chips are discarded. We evaluated several metrics for predicting chip performance, and found that the average, over all stabilizers, of the product of the number of data qubits and the cycle time of a stabilizer measurement gives the strongest correlation with logical error rates. Our analysis will help in selecting usable quantum computation chips from among the pool of all fabricated chips.
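The chip-selection metric described above can be illustrated with a short sketch. This is not the authors' code: the stabilizer shapes and cycle times below are hypothetical placeholders, chosen only to show how merging plaquettes around a faulty qubit (into a superplaquette touching more data qubits and needing a longer measurement cycle) raises the metric.

```python
def chip_metric(stabilizers):
    """Average of (number of data qubits) x (measurement cycle time),
    taken over all (super)stabilizers on a defective lattice.

    stabilizers: list of (n_data_qubits, cycle_time) pairs.
    """
    return sum(n * t for n, t in stabilizers) / len(stabilizers)

# A defect-free plaquette touches 4 data qubits and measures in 1 cycle;
# a superplaquette formed around a faulty qubit touches more data qubits
# and needs a longer cycle (6 qubits / 2 cycles here is illustrative).
chip_a = [(4, 1)] * 24                    # no fabrication defects
chip_b = [(4, 1)] * 20 + [(6, 2)] * 2     # two superplaquettes from faults

# A lower metric predicts a lower logical error rate, so chip_a is preferred.
print(chip_metric(chip_a), chip_metric(chip_b))
```

Under this sketch, a defect-free chip scores 4.0 while the defective chip scores higher, matching the reported correlation between the metric and logical error rates.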

| Original language | English |
| --- | --- |
| Article number | 023050 |
| Journal | New Journal of Physics |
| Volume | 19 |
| Issue number | 2 |
| DOIs | https://doi.org/10.1088/1367-2630/aa5918 |
| Publication status | Published - 2017 Feb 1 |

### Keywords

- fault tolerant quantum computation
- quantum error correction
- qubit loss
- surface code
- topological quantum error correction

### ASJC Scopus subject areas

- Physics and Astronomy(all)

### Cite this

**Surface code error correction on a defective lattice.** / Nagayama, Shota; Fowler, Austin G.; Horsman, Dominic; Devitt, Simon J.; Van Meter, Rodney D.

Research output: Contribution to journal › Article

Nagayama, S., Fowler, A. G., Horsman, D., Devitt, S. J., & Van Meter, R. D. (2017). Surface code error correction on a defective lattice. *New Journal of Physics*, *19*(2), 023050. https://doi.org/10.1088/1367-2630/aa5918

TY - JOUR

T1 - Surface code error correction on a defective lattice

AU - Nagayama, Shota

AU - Fowler, Austin G.

AU - Horsman, Dominic

AU - Devitt, Simon J.

AU - Van Meter, Rodney D.

PY - 2017/2/1

Y1 - 2017/2/1

N2 - The yield of physical qubits fabricated in the laboratory is much lower than that of classical transistors in production semiconductor fabrication. Actual implementations of quantum computers will be susceptible to loss in the form of physically faulty qubits. Though these physical faults inevitably degrade the computation, we can deal with them by adapting error-correction schemes. In this paper we simulated statically placed single-fault lattices and lattices with randomly placed faults at functional qubit yields of 80%, 90%, and 95%, showing the practical performance of a defective surface code by employing actual circuit constructions and realistic errors on every gate, including identity gates. We extend Stace et al.'s superplaquette solution for dynamic losses in the surface code to handle static losses such as physically faulty qubits [1]. The single-fault analysis shows that a static loss at the periphery of the lattice has a less negative effect than a static loss at the center. The random-fault analysis shows that 95% yield is sufficient to build a large-scale quantum computer. The local gate error rate threshold is , and a code distance of seven suppresses the residual error rate below the original error rate at . 90% yield is also sufficient when we discard badly fabricated quantum computation chips, while 80% yield does not show enough error suppression even when 90% of the chips are discarded. We evaluated several metrics for predicting chip performance, and found that the average, over all stabilizers, of the product of the number of data qubits and the cycle time of a stabilizer measurement gives the strongest correlation with logical error rates. Our analysis will help in selecting usable quantum computation chips from among the pool of all fabricated chips.

AB - The yield of physical qubits fabricated in the laboratory is much lower than that of classical transistors in production semiconductor fabrication. Actual implementations of quantum computers will be susceptible to loss in the form of physically faulty qubits. Though these physical faults inevitably degrade the computation, we can deal with them by adapting error-correction schemes. In this paper we simulated statically placed single-fault lattices and lattices with randomly placed faults at functional qubit yields of 80%, 90%, and 95%, showing the practical performance of a defective surface code by employing actual circuit constructions and realistic errors on every gate, including identity gates. We extend Stace et al.'s superplaquette solution for dynamic losses in the surface code to handle static losses such as physically faulty qubits [1]. The single-fault analysis shows that a static loss at the periphery of the lattice has a less negative effect than a static loss at the center. The random-fault analysis shows that 95% yield is sufficient to build a large-scale quantum computer. The local gate error rate threshold is , and a code distance of seven suppresses the residual error rate below the original error rate at . 90% yield is also sufficient when we discard badly fabricated quantum computation chips, while 80% yield does not show enough error suppression even when 90% of the chips are discarded. We evaluated several metrics for predicting chip performance, and found that the average, over all stabilizers, of the product of the number of data qubits and the cycle time of a stabilizer measurement gives the strongest correlation with logical error rates. Our analysis will help in selecting usable quantum computation chips from among the pool of all fabricated chips.

KW - fault tolerant quantum computation

KW - quantum error correction

KW - qubit loss

KW - surface code

KW - topological quantum error correction

UR - http://www.scopus.com/inward/record.url?scp=85014380924&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85014380924&partnerID=8YFLogxK

U2 - 10.1088/1367-2630/aa5918

DO - 10.1088/1367-2630/aa5918

M3 - Article

AN - SCOPUS:85014380924

VL - 19

JO - New Journal of Physics

JF - New Journal of Physics

SN - 1367-2630

IS - 2

M1 - 023050

ER -