Quantum computing comfort zones

By Clemens Schäfermeier | Team Leader | attocube systems AG
17 March 2021

Quantum computing has made its way into “everyday” media, from television to newspapers. Owing to the unavoidable loss of information over various communication channels, quantum computers already face a fate similar to that of the popular metaphor of a “quantum leap”: a portrait of a can-do-it-all machine is quickly drawn in the most vivid colours. Alternatively, the threat it poses to today’s data security is used to forecast the darker side of the coming technology, a narrative notoriously exploited by pseudo-scientists.
It is not surprising that a new technology, whose intricacies astonish even the experts and which has not yet been applied to actual real-world problems, has gained a reputation that falls short of describing reality. Nor is it surprising that, when multiple solutions to a challenge are at hand, the one that shows the first results receives the greatest attention. Willingly or unwillingly, this can lead to building on technologies that are fast to realise but not efficient in the long run. It is, so to speak, the time-to-market rule for technology.

Quantum computers are different in many respects, yet there appears to be no exception to that time-to-market rule. Not least since Google and IBM demonstrated their technological advances towards real-world applications, the time-to-market race in quantum computing has shown a clear winner: so-called gate-based models, realised in cryogenic environments. Why has this approach attracted the most attention? First, gate-based quantum computing adopts well-developed concepts from classical computing: a sequential set of operations, realised by a (finite) set of basis gates. Second, cryogenic environments have been the proving grounds for solid-state quantum experiments; a solid-state quantum computer is thus inevitably connected to cryogenics. As a result, today’s newspapers show photos of cryogenic equipment to visualise quantum computers.

Let us review which other routes towards quantum computers exist, and what benefits they might offer in the end. Before doing so, let us motivate why there can be other solutions in the first place. To anticipate the overall answer: because the underlying concept is easily overlooked once a particular solution is at hand. In the following, we discuss the two aspects mentioned above: “gate-basedness” and cryogenic hardware.

The particular solution of gate-based computing originates from the earliest theoretical discussions of quantum computers, starting in the 1980s [10.1007/BF01011339, 10.1007/BF02650179, 10.1007/BF01342185]. First experimental realisations were published less than two decades later [10.1103/PhysRevLett.75.4714, https://news.mit.edu/1999/quantum] – a “fast-lane” development compared with many theoretical ideas in quantum mechanics. So what is the underlying concept behind gate-basedness? Quantum computing – that is, the quantum part of it – builds upon superposition and entanglement [10.1098/rspa.2002.1097]. These ingredients make room for the characteristic weirdness of quantum mechanics and are the differentiating factor between classical and quantum bits (a technical aside: when focusing on pure states). The spirit of gate-based quantum computing is to establish these features by starting with the most fundamental qubits and forcing them along a sequence of logic gates. Along the way, entanglement is built up by gates connecting the lanes. As the input states follow those lanes, one creates “quantum truth tables”. On paper, this is extendable to any desired complexity. In reality, the hurdle to overcome is that the more complex the algorithm, the more gates are required – which in turn calls for longer and longer storage times (coherence is the keyword of the overall scheme) and more efficient gate implementations. In addition, each gate needs to be controlled, demanding more wiring and, eventually, better space management.

So if we assume that most quantum-computational benefits require entanglement and superposition, we might ask: is there another path to them? One that shifts the demand to something more, say, scalable? To repeat in simplified terms: in orthodox gate-based quantum computation, we start with unentangled states, and an algorithm is then implemented by means of gates. The difficult part is the “channel” between input and result, not the input. What if we could move the difficulty from the channel to the generation of the input states? It may turn out that creating large entangled states becomes comparatively easier the more demanding the task becomes. If that means fewer operations on the input, it comes as a benefit. In fact, from quantum metrology we learned the tenet that the same measurement sensitivity is achievable with a “simple” input state and a challenging measurement operation – or vice versa [10.1103/PhysRevLett.98.223601]. This tenet is an exploitable consequence of time reversibility in quantum mechanics. While the analogy is not watertight and sweeps many critical concepts under the rug, we draw it for the sake of clarity.

As the reader may now suspect, such an approach also exists for quantum computing. Most commonly it is referred to as measurement-based quantum computing, introduced in the 2000s [10.1103/PhysRevLett.86.5188, 10.1038/nphys1157, arXiv:quant-ph/0508124v2]. While there are variations of measurement-based quantum computing, the underlying approach is to create an entangled input state that is sufficiently large to support the amount of information required for the task. Next, only a few basic measurements need to be performed on that state. These measurements are essentially the “gates” of a gate-based approach; however, there is no need for coherence between the gates.
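To make the gate-based picture concrete, here is a minimal sketch in Python/NumPy – deliberately independent of any quantum-computing framework – of the smallest entangling circuit: an unentangled two-qubit input is sent through a Hadamard gate (creating superposition) and a CNOT gate (connecting the lanes and creating entanglement).

```python
import numpy as np

# Computational basis state |0> for a single qubit
zero = np.array([1, 0], dtype=complex)

# Hadamard gate: creates an equal superposition (|0> + |1>)/sqrt(2)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT gate: flips the target qubit if the control is |1>;
# this is the gate that "connects the lanes" and builds entanglement
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start from the unentangled product state |00>
state = np.kron(zero, zero)

# Sequence of gates: H on qubit 1, then CNOT from qubit 1 to qubit 2
state = CNOT @ np.kron(H, np.eye(2)) @ state

# Result: amplitudes ~[0.707, 0, 0, 0.707], i.e. the maximally
# entangled Bell state (|00> + |11>)/sqrt(2)
print(np.round(state, 3))
```

Every gate-based algorithm is, at its core, a longer sequence of such operations – which is why gate count and the demand for coherence time grow together.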
Though such states are difficult to generate, experimental proof of several thousand entangled modes, so-called cluster states, was published in 2019 [10.1126/science.aay2645, 10.1126/science.aay4354]. The resulting number of qubits was still below ten in these implementations, and error correction was lacking in these demonstrations as well. Since this field of quantum computing is rather young, its potential is yet to be unravelled. To name one major technical advantage: the demonstrated generation of cluster states was performed with photons at telecom wavelengths, so there is no need for transduction between “stationary” and “flying” qubits, as in IBM’s quantum computer. Xanadu is one of the few companies that capitalised early on measurement-based concepts [10.1103/PhysRevA.98.032316]. PsiQuantum is also worth mentioning: while their exact approach remains undisclosed, they gathered $215 million in funding for what is speculated to be a measurement-based quantum computer [https://psiquantum.com/news/]. We should note that there is no free lunch for measurement-based concepts either, which has motivated theoretical proposals to combine gate- and measurement-based approaches into a “hybrid” concept [10.1038/srep05364].
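To contrast this with the gate-based sketch above, the following shows the smallest measurement-based primitive in the same NumPy style: the input qubit is entangled with a |+> ancilla via a controlled-Z gate (a two-qubit cluster state), a single measurement is performed on the input qubit, and an outcome-dependent Pauli correction follows. The net effect is a Hadamard gate on the surviving qubit, realised purely by measurement. This is a textbook illustration of the principle, not a rendering of the cited photonic experiments.

```python
import numpy as np

rng = np.random.default_rng(42)

X = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli-X for the correction
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # for verification
CZ = np.diag([1, 1, 1, -1]).astype(complex)     # entangling controlled-Z gate

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)    # |+>
minus = np.array([1, -1], dtype=complex) / np.sqrt(2)  # |->

# Arbitrary input qubit |psi> to be processed by measurement alone
psi = np.array([0.6, 0.8], dtype=complex)

# Step 1: build a two-qubit cluster state by entangling |psi> with |+>
state = CZ @ np.kron(psi, plus)

# Step 2: measure qubit 1 in the {|+>, |->} basis
amp_plus = plus.conj() @ state.reshape(2, 2)    # <+| applied to qubit 1
p_plus = np.linalg.norm(amp_plus) ** 2          # probability of outcome "+"
m = 0 if rng.random() < p_plus else 1           # simulated measurement outcome

basis = plus if m == 0 else minus
out = basis.conj() @ state.reshape(2, 2)        # collapse onto the outcome
out = out / np.linalg.norm(out)                 # remaining state of qubit 2

# Step 3: outcome-dependent Pauli correction (classical feed-forward)
if m == 1:
    out = X @ out

# Qubit 2 now carries H|psi>: a gate implemented without any gate operation
print(np.round(out, 3), np.round(H @ psi, 3))
```

Note that the entangling resource (the CZ bond) is prepared up front; the computation itself consists only of the measurement and the classical correction, which is exactly the shift of difficulty from the channel to the input state described above.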

Now that we have covered the first paradigm, circuit implementation, let us turn to the apparent need for cryogenic hardware. Whenever qubits are realised, the key is to preserve their coherence over the computation time. Interactions with “the environment”, usually depicted as an abstract medium that scrambles information, destroy that important feature. While, in theory, no information is ever lost, this idea of the environment is a practical concept. Shielding the qubit from its environment is commonly achieved by cutting off unwanted paths to the system of interest. A viable way is to lower the energy of the environment to a point where it is negligible compared with the qubit’s “activation” energy – one “freezes out” the paths to the qubit.

Why special cooling equipment (cryostats) is required in solid-state systems becomes clear when equating energy to frequency (E = hf) and energy to temperature (E ∝ k_BT). For operating frequencies between 1 and 10 GHz, the corresponding temperature is on the order of 50 to 500 mK (−273.10 to −272.65 °C); see the numerical sketch below. To shield a qubit of that frequency from its surroundings, the environment should be cooled to temperatures at least one order of magnitude lower. Thus cryogenics are found in almost any solid-state quantum computer. Of course, as systems grow bigger, more heat is potentially introduced to the processing unit, in turn increasing the demand for cooling power.

There are, however, a few ways to overcome the need for cryogenics. One is to cut the coupling between the environment and the qubit rather than damping the environment; this is achievable with resonators. Another way, seized by startups such as Xanadu, is to increase the frequency of the qubit. A drastic increase is achieved by moving from stationary qubits, for instance realised by Josephson junctions, to flying qubits, that is, photons. A photon at telecom wavelength has an energy of 800 meV, corresponding to a temperature of about 9 000 K. Thermal contributions to a 1550 nm photon at room temperature are therefore extremely unlikely. Hence, photonic quantum computers are often operated outside of cryostats; sometimes cooling is required to enable efficient detection or creation of photons, but that is not a general rule. Why, then, do cryostats represent quantum computers in the public eye? Because current cleanroom technology can be adapted to the production of solid-state systems. Making optical quantum computers compact will require transferring optical setups that stretch over square-metre-sized tables onto photonic chips. Their maturity does not yet match that of solid-state platforms, and mass production is yet to come – potentially on par with semiconductor technology.
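As a quick numerical check of the energy scales above, the following sketch uses scipy.constants to reproduce the quoted numbers: the temperature equivalent T = hf/k_B of a 1–10 GHz qubit, and the energy and temperature equivalent of a 1550 nm telecom photon.

```python
from scipy.constants import h, k, c, e  # Planck, Boltzmann, speed of light, elementary charge

# Temperature equivalent T = h*f/k_B of typical solid-state qubit frequencies
for f in (1e9, 10e9):
    print(f"{f / 1e9:4.0f} GHz  ->  {h * f / k * 1e3:5.0f} mK")
# 1 GHz -> ~48 mK, 10 GHz -> ~480 mK: hence the need for dilution refrigerators

# Energy and temperature equivalent of a 1550 nm telecom photon
wavelength = 1550e-9
E = h * c / wavelength                  # photon energy in joules
print(f"1550 nm  ->  {E / e * 1e3:.0f} meV  ~  {E / k:.0f} K")
# ~800 meV, ~9300 K: far above thermal energies at room temperature (~300 K)
```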

To conclude, we have discussed the current paradigms of quantum computing and offered a brief overview of approaches outside the apparent comfort zone, which consists of gate-based quantum computers working at very low temperatures. As a simple bottom line, the way towards quantum computing should be kept open for exploration. Since investors, public and private, are not necessarily experts in the technology itself and are, furthermore, not always motivated by decade-long success, this is a plea directed rather at scientists. By offering alternative approaches, we can not only decrease the risk of ending up in dead ends. We can also make room for public participation and decrease the risk of fraudulent usage; at the very least, we can provide some grounds for the proper use of a technology that holds great promise for solving pressing problems across the world. The more investors embark on quantum technology, specifically computing, the more vital it becomes to stress the importance of alternative routes towards a technology still outside our comfort zone.


Clemens Schäfermeier studied Physical Engineering at FH Münster, MPI Hannover, and MPI Erlangen, and received his PhD from DTU in Copenhagen for work on quantum optics in communication and metrology. He was a postdoc at TU Delft and has been with the research department of attocube systems since 2018.
