How many qubits are needed to outperform conventional computers? How can a quantum computer be protected from the effects of decoherence? And how can fault-tolerant, large-scale quantum computers with more than 1000 qubits be designed? These are the three basic questions we deal with in this article. We also discuss the five key areas of quantum computing: qubit technologies, qubit quality, qubit count, qubit connectivity, and qubit architectures.

Earlier we discussed **7 Core Qubit Technologies for Quantum Computing**, **7 Key Requirements for Quantum Computing**, **Spin-orbit Coupling Qubits for Quantum Computing and AI**, **Quantum Computing Algorithms for Artificial Intelligence**, **Quantum Computing and Artificial Intelligence**, **Quantum Computing with Many World Interpretation Scopes and Challenges**, and **Quantum Computer with Superconductivity at Room Temperature**. Here, we focus on practical issues in designing large-scale quantum computers.

Instead of running only on zeros and ones, quantum computers operate on superpositions, a continuum of states between zero and one. And instead of performing one calculation before moving on to the next, quantum computers can work on many computational paths simultaneously.

Unlike the binary bits of information in ordinary computers, "qubits" consist of quantum particles that have some probability of being in each of two states, represented as |0⟩ and |1⟩, simultaneously. When qubits interact, their possible states become interdependent (entangled), each one's chances of |0⟩ and |1⟩ hinging on those of the other. Moreover, quantum information does not have to be encoded into two-level systems; it can also be encoded into continuous observables.
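As a minimal sketch of entanglement (a purely classical simulation, with illustrative state labels, not a real quantum device), the Bell state (|00⟩ + |11⟩)/√2 can be sampled from its amplitude vector:

```python
import random

# A two-qubit state is a vector of 4 amplitudes over |00>, |01>, |10>, |11>.
# The Bell state (|00> + |11>) / sqrt(2) is maximally entangled: each qubit
# alone looks 50/50, but the two measurement outcomes always agree.
amps = [2 ** -0.5, 0.0, 0.0, 2 ** -0.5]
probs = [a * a for a in amps]            # Born rule: P(i) = |amplitude_i|^2
assert abs(sum(probs) - 1.0) < 1e-12     # probabilities sum to 1

def measure() -> str:
    """Sample one joint measurement outcome: '00', '01', '10' or '11'."""
    r, acc = random.random(), 0.0
    for bits, p in zip(("00", "01", "10", "11"), probs):
        acc += p
        if r < acc:
            return bits
    return "11"

# The two qubits' results are perfectly correlated ("entangled"):
# only '00' and '11' ever occur.
assert all(m in ("00", "11") for m in (measure() for _ in range(1000)))
```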

The speed requirements of various applications grow with the complexity of the problems, and the speed advantage of quantum computers over classical computers is enormous. *The key to quantum computation speed is that every additional qubit doubles the potential computing power of a quantum machine.*
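This doubling can be made concrete: an n-qubit register is described by 2^n complex amplitudes, so each extra qubit doubles the size of the state space. A minimal Python sketch:

```python
# Each additional qubit doubles the dimension of the state space an
# n-qubit machine explores: 2**n complex amplitudes.
def state_space_dim(n_qubits: int) -> int:
    return 2 ** n_qubits

assert state_space_dim(1) == 2
assert state_space_dim(2) == 2 * state_space_dim(1)   # one more qubit doubles it
assert state_space_dim(10) == 1024
# 50 qubits already index about 10**15 amplitudes; 1000 qubits index 2**1000,
# far beyond anything a classical machine could store explicitly.
assert state_space_dim(50) == 2 ** 50
```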

The objective of 1000-qubit fault-tolerant quantum computing is to compute accurately even when gates have a high probability of error each time they are used. Theoretically, accurate quantum computing is possible with error probabilities as high as 3% per gate, which is remarkably high. The resources required for quantum computing depend on the error probabilities of the gates. It is possible to implement non-trivial quantum computations at error probabilities as high as 1% per gate.

## Quantum Computing Promises

Based on complexity theory, quantum computers can solve certain complex problems in exponentially less time than classical computers. They can provide faster solutions to factoring and searching problems than classical computers. Factoring is basically finding the prime factors of a large composite integer, a problem for which efficient quantum algorithms have been discovered. Quantum computers can also give physicists better ways than classical computers to simulate complex quantum systems.

The primary applications of quantum computing relate to the physical simulation of quantum systems, new drug discovery, new material design, complex financial modeling, molecular biology, omics and precision medicine, complex optimization, quantum artificial intelligence, and neural network training for machine learning applications.

## Need for 1000-Qubit Quantum Computing

As the number of qubits increases, the system explores an exponentially growing number of quantum states. In theory, the more qubits, the more powerful a quantum computer becomes. Moreover, the key reason for needing so many qubits is dealing with their noise and fragility. At 1,000 qubits, there is only limited error correction and fault tolerance, but at 1,000,000 the system has full fault tolerance, which is key to making it a fairly general-purpose universal quantum computer.

## Quantum Supremacy

One major milestone on the road of quantum computing is "quantum supremacy," the point where a quantum machine can overcome the performance of the best classical computers on certain complex tasks. In theory, achieving quantum supremacy requires a computer of more than 50 qubits. However, engineering limitations, decoherence, unknown behavior of the qubits, and noise have scaled up the qubit requirements for quantum supremacy. It is estimated that, with a reasonable gate error rate, a 1,000-qubit universal gate-based quantum computer will be the most practical route to demonstrating quantum supremacy.

Moore's law says that processing speeds for silicon-based transistors double every two years, as more transistors are crammed onto smaller chips; more precisely, computing power doubles approximately every 18 months. Extrapolating in the spirit of Moore's law, 1,000 qubits are required to achieve quantum supremacy. According to our Compassionate AI Lab estimation, based on Moore's law and other parameters, 1,000 qubits can be achieved by the year 2023, with operational availability by the year 2025.
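Purely as an illustration of this kind of extrapolation (the 72-qubit starting point and the 18-month doubling period are assumptions for the sketch, not measured data):

```python
import math

# Illustrative only: if qubit counts doubled every 18 months (a Moore's-law
# style assumption) starting from a 72-qubit device, reaching 1,000 qubits
# would take log2(1000 / 72) doublings.
start_qubits, target, months_per_doubling = 72, 1000, 18

doublings = math.log2(target / start_qubits)   # ~3.8 doublings needed
years = doublings * months_per_doubling / 12   # ~5.7 years at 18 months each

assert 3.5 < doublings < 4.0
assert 5.5 < years < 6.0
```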

## Present State of Qubits in Quantum Computers

50-qubit noisy machines have been developed, 100-qubit noisy machines are knocking on the door, and even 1000-qubit machines are perhaps only a few years away. Presently, the largest operational gate-based quantum computer is a 20-qubit system from IBM Q, with an average two-qubit gate error rate of about 5 percent; gate error rates for the others are not publicly known. IBM, Intel, and Google have each reported testing quantum processors containing 50, 49, and 72 qubits, respectively, all realized using superconducting circuits.

For quantum supremacy, that is, to outperform classical computers, a machine with more than 50 qubits and an average gate error rate of around 0.1 percent is required.

## Quantum Fault Tolerance System Classifications

Fault tolerance in quantum computing can be broadly classified into three groups: hardware fault tolerance, software fault tolerance, and system-level fault tolerance.

### Hardware Fault Tolerance: Fault Tolerant Quantum Gates

The main challenge is to construct a universal set of quantum gates that can act on the encoded data blocks without introducing an excessive number of errors. Once our hardware meets a specified standard of accuracy, quantum error-correcting codes and fault-tolerant procedures enable us to perform arbitrarily long quantum computations with arbitrarily high reliability.

Progress in gate-based quantum computing can be monitored by tracking the key properties that define the quality of a quantum processor: the effective error rates of the single-qubit and two-qubit operations, the inter-qubit connectivity, and the number of qubits contained within a single hardware module. Topologically protected quantum gates on cluster states can reduce the error rate.

### Software Fault Tolerance: Quantum Error Correction

In general, quantum error correction (QEC) is performed repeatedly on the logical qubits during the quantum computation process. In large-scale quantum computation, a large number of physical qubits are needed per logical qubit to obtain highly accurate results. Quantum error correction and fault-tolerant computation is now a much larger field, and many new codes, techniques, and methodologies have been developed to implement error correction for large-scale quantum algorithms.

The basic principle is this: if the probability of a single error occurring is *p*, then the probability of two (independent) errors is *p²*. A natural condition for an error correction scheme to work is therefore that the error probability *p* be much smaller than one, so that *p²* is very close to zero.
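For example, the three-qubit repetition code fails only when two or more independent errors strike, giving a logical error rate of 3p²(1 - p) + p³ = 3p² - 2p³. A quick check in Python:

```python
# With the three-qubit repetition code, a logical error needs at least two
# independent physical errors, so the logical error rate is
#   p_L = 3*p**2*(1 - p) + p**3 = 3*p**2 - 2*p**3,
# which is far below p whenever p << 1.
def logical_error_rate(p: float) -> float:
    return 3 * p**2 - 2 * p**3

p = 0.01                      # 1% physical error probability
pL = logical_error_rate(p)    # 2.98e-4: two orders of magnitude better
assert pL < p / 30
```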

### System Fault Tolerance: Fault Tolerant Recovery

Dynamic Recovery: In dynamic recovery, a special mechanism is essential to discover faults in the units, switch out a faulty module, switch in a spare, and carry out the software actions necessary to error-check, restore, and continue computation, such as rollback, initialization, retry, and restart.

The discovery of quantum error correction has greatly improved the long-term prospects for quantum computing technology. This result is considered a landmark of quantum computation, a proof of principle that quantum processing is possible.

## Seven Key Challenges for 1000-Qubit Quantum Computing:

In order to build an effective quantum computer, one must create a physical system that can control, manipulate, and measure the states of the qubits precisely enough to carry out computations. Creating stable qubit arrays is a challenging task. The most formidable enemy of the quantum computer is decoherence. A quantum state is extremely fragile, as it interacts with the environment. Simply observing the state of a qubit changes it, and a qubit is also extremely difficult to isolate from outside noise that would likewise change its state. The information stored in the quantum bit registers decays, resulting in errors and the failure of the computation.

Qubit noise is a complicated phenomenon. One reason is that its causes are often hard to evaluate and eliminate; further technical and mathematical difficulties arise in measuring and quantifying it. There is also the issue of 'fidelity', a measure of the quality or 'precision' of the physical system: how close the real system is to the ideal system. In theory, a quantum computer cannot function for practical purposes if the precision is below about 99 percent, and the general consensus is that 99.99 percent precision will be necessary to make a practical quantum computer. The key challenges for developing 1000-qubit quantum computers are as follows:

- Reducing the gap between quantum algorithms and quantum circuit performance
- Establishing qubits with high coherence times
- Establishing low gate error rates
- Providing stable qubit-to-qubit connectivity
- Developing much greater circuit depth
- Providing significant redundancy for each qubit: true fault tolerance with error correction
- Achieving non-cryogenic operating temperatures, aiming to bridge the gap between current computers, which operate at around 27°C, and quantum computers, which function below -180°C
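The precision figures above (99 percent versus 99.99 percent) can be illustrated with a back-of-the-envelope calculation: if each gate succeeds with fidelity f, a g-gate circuit succeeds with probability roughly f^g, assuming independent gate errors:

```python
# Why ~99.99% precision matters: if every gate succeeds with fidelity f,
# a circuit of g gates succeeds with probability roughly f**g
# (assuming independent errors).
def circuit_success(fidelity: float, gates: int) -> float:
    return fidelity ** gates

# At 99% per gate, a 1,000-gate circuit almost always fails...
assert circuit_success(0.99, 1000) < 1e-4
# ...while at 99.99% per gate it still succeeds about 90% of the time.
assert circuit_success(0.9999, 1000) > 0.90
```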

## 1. Quantum Error Correction (QEC)

The field of quantum computing advanced with the discovery of quantum error correction (QEC) codes in the mid-1990s (Shor 1995; Laflamme et al. 1996; Steane 1996). Shor's breakthrough made quantum computation on a practical level look possible. Quantum error-correcting codes are needed to overcome the daunting error rates of the physical qubits. Presently, superconducting, trapped-ion, and topological qubits are the three most promising approaches for creating the quantum data plane, and each has some inbuilt physical error rate. The aim of quantum error correction is to achieve fault-tolerant quantum computation that can deal not only with noise on stored quantum information but also with faulty quantum gates, faulty state preparation, and faulty measurements.
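A minimal sketch of the QEC idea, using the classical analogue of the three-qubit bit-flip code (majority voting over three copies; a real QEC code must also preserve superpositions and measure syndromes without reading the data, which this toy model ignores):

```python
import random

# Toy model of the three-qubit bit-flip code: store a logical bit as three
# copies, flip each copy independently with probability p, then repair by
# majority vote. A single flipped copy is always corrected.
def encode(bit: int) -> list[int]:
    return [bit, bit, bit]

def apply_noise(block: list[int], p: float) -> list[int]:
    return [b ^ (random.random() < p) for b in block]

def decode(block: list[int]) -> int:
    return int(sum(block) >= 2)          # majority vote

trials = 10_000
errors = sum(decode(apply_noise(encode(0), p=0.05)) for _ in range(trials))
# Unencoded, ~5% of trials would be wrong; encoded, roughly 0.7% are.
assert errors / trials < 0.02
```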

## 2. Overcoming the Decoherence Issue

Decoherence is one of the primary obstacles to the realization of large-scale quantum computing. The decoherence of a qubit arises from interaction with its environment and worsens as the number of qubits grows. Alternative attractive routes to fault-tolerant quantum computation should therefore be evaluated; one promising area is fault-tolerant quantum computation based on topologically protected non-Abelian anyons.

## 3. Establishing Stable Qubit to Qubit Connectivity

The power of quantum computers comes from the ability to generate a collective entangled state. An entangled state is generated by coupling a pair of qubits using a two-qubit operation, so a machine can entangle only the qubits that have a link between them. However, owing to engineering constraints, existing technologies offer limited connectivity: each qubit couples only to a few of its neighboring qubits. The higher the connectivity, the easier it is to fit a quantum calculation into the structure.
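A small sketch of why connectivity matters: on a coupling graph, two qubits can be entangled directly only if they are adjacent; otherwise their states must first be brought together with SWAP gates, roughly one per extra step of graph distance. The linear layout below is an assumed example topology, not any particular device:

```python
from collections import deque

# Breadth-first search on the coupling graph gives the minimum distance
# between two qubits; entangling qubits at distance d costs about d - 1
# extra SWAP gates before the two-qubit gate can be applied.
def distance(coupling: dict[int, list[int]], a: int, b: int) -> int:
    seen, queue = {a}, deque([(a, 0)])
    while queue:
        node, d = queue.popleft()
        if node == b:
            return d
        for nxt in coupling[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, d + 1))
    return -1                              # qubits not connected at all

# A 5-qubit line: 0-1-2-3-4, each qubit coupled only to its neighbours.
line = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
assert distance(line, 0, 1) == 1           # adjacent: direct two-qubit gate
assert distance(line, 0, 4) == 4           # distant: ~3 SWAPs needed first
```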

## 4. Increasing Redundancy and Scalability of the qubits

The scalability issue is one of the hardest problems in quantum computing: it is not just having a larger number of qubits, but also being able to entangle them. Fault tolerance at scale is the dynamic method used to keep interconnected systems together and to sustain reliability and availability in distributed systems. Hardware and software redundancy are the known fault-tolerance techniques in distributed systems, and replication is typically used as a general fault-tolerance method to protect against system failure. Each qubit in a processor can introduce error into a computation, which makes it hard to strike a balance between low error rates and the 'power' of a quantum processor.

## 5. Reducing the Gap between Quantum Algorithms and Quantum Circuit Implementation

A significant resource gap remains between quantum algorithms and practical quantum computing implementations, and there is an urgent research need for work on hardware, software, and architectures to close it. This includes quantum programming language design, quantum software and hardware verification, defining and perforating abstraction boundaries, cross-layer optimization, managing parallelism and quantum communication, mapping and scheduling computations, reducing control complexity, machine-specific optimizations, learning error patterns, and much more.

## 6. Developing much greater quantum circuit depth

Instead of bits, quantum circuits manipulate qubits, which can represent the classical Boolean values but also superpositions of them. Circuit depth is the number of layers of gates in the quantum circuit. Quantum computing also considers families of circuits that act on inputs of different sizes. A quantum computation (or quantum algorithm) can be described as a collection of qubits acted on by quantum operators, represented by tensor products of quantum gates. Any quantum operation is represented by a unitary matrix, so each quantum circuit is inherently reversible. Efficient design and construction of universal quantum circuits can improve the performance of quantum computers in many ways.
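As a small worked example of unitarity and reversibility, the Hadamard gate H is its own inverse: applying it twice returns every state to where it started, so H·H equals the identity matrix:

```python
import math

# The Hadamard gate: a 2x2 unitary that maps |0> and |1> to equal
# superpositions. Because it is unitary (and real-symmetric), H @ H = I,
# so applying it twice undoes it -- the circuit is reversible.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def matmul(a, b):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

HH = matmul(H, H)
identity = [[1.0, 0.0], [0.0, 1.0]]
assert all(abs(HH[i][j] - identity[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```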

Greater quantum circuit depth can be achieved in many ways, such as **decomposition into a tensor product of low-dimensional subsystems, collective or selective cluster measurement techniques, and adding different bases to the initial qubit states**.

## 7. High quality qubits and low gate error rate

Measurement of a classical bit does not disturb its state; a measurement of a qubit, however, destroys its coherence and irrevocably disturbs the superposition state. Quantum entanglement allows multiple states to be acted on simultaneously, unlike classical bits, which can hold only one value at a time. Quantum computers perform calculations by manipulating qubits within a register; a qubyte (quantum byte) is a collection of eight qubits. Increasing the quality of the qubits means good results can be obtained with smaller error correction codes that have less overhead.

## Summary:

Designing and developing a large-scale quantum computer is one of the most complex challenges of modern scientific research. Here, we discussed seven key strategies for developing 1000-qubit fault-tolerant quantum computing systems. We also discussed quantum supremacy and issues such as qubit-to-qubit connectivity, quantum error correction, and the decomposition of quantum circuits.

There are also many other approaches to improving the quality of the qubits. One approach is to reject the notion that we are limited to qubits (which have two basis states plus superposition) and adopt a unit with more possible states: "qudits" can have 10 or more states (a 10-state version might be called a 10-state qubit). Other approaches look to a higher density of qubits, possibly relying on the spin of a single electron.

According to our estimation, 1,000 qubits can be achieved by the year 2023, with operational availability by the year 2025. The seven key strategies discussed here point towards the near-term demonstration of quantum advantage and quantum supremacy.

The development of hybrid quantum-classical algorithms can also exploit the best of both worlds, but it requires higher fidelity and better connectivity across the device. Significant work remains, and many open questions need to be tackled to achieve the goal of building a large-scale, scalable quantum computer.

## Source Books:

- Compassionate Artificial Intelligence: Frameworks and Algorithms by Dr. Amit Ray
- Compassionate Superintelligence, AI 5.0 by Dr. Amit Ray
- Quantum Computing Algorithms for Artificial Intelligence By Dr. Amit Ray

## References:

1. Topological fault-tolerance in cluster state quantum computation, New Journal of Physics.
2. Quantum Computing in the NISQ era and beyond, John Preskill, Caltech.
3. Digitized adiabatic quantum computing with a superconducting circuit, Nature.
4. What would you do with 1000 qubits?
5. A blueprint for demonstrating quantum supremacy with superconducting qubits.
6. Parallelizing quantum circuits.
7. Tracking Quantum Error Correction.
