Digital Annealing – a bridge technology for quantum computing


26 March 2020


Quantum computing is at the forefront of the digital world’s attention right now, following Google’s claim to have solved a random sampling task using its Sycamore quantum processor much faster than would be possible using even the fastest supercomputer.

The case is still contentious, but there is no doubt anymore that prototype quantum devices that exploit quantum phenomena actually exist. Many of the world’s leading organizations have already moved beyond investigation and experimentation and are standing up proofs of concept and business cases. Major auto OEMs have announced quantum computing programs. Pharmaceutical companies and chemical companies are looking at areas such as molecular matching for new drug and material discoveries. Utility companies are aiming to optimize ROI from new asset investment, while banks and insurance companies are seeking to optimize portfolio and credit risks. Governments, too, are fascinated by the potential to achieve climate change targets faster, through optimization of transport systems to reduce pollution from traffic jams.

Quantum annealing and digital annealing

Among other things, quantum computing promises the ability to improve business processes by solving a class of ‘combinatorial optimization’ problems: identifying the best solution from a finite but extremely large set of options. ‘Annealing’ is a probabilistic technique for doing this by approximating the global optimum of a given function. Until now, using annealing to tackle a combinatorial optimization problem has meant a trade-off between precision and risk. Seeking high precision implied more time to calculate the answer – often more time than was available – while accepting a ‘good enough’ answer introduced risk and the need for a safety buffer. The more precise the calculation you can achieve, the more cost-efficient the final process will be.
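To make the annealing idea concrete, here is a minimal simulated-annealing sketch in Python. This is an illustration only, not Fujitsu's implementation: the objective function and parameters are invented. It flips one bit at a time and occasionally accepts worse moves with a temperature-dependent probability, trading computation time for precision exactly as described above.

```python
import math
import random

def simulated_annealing(energy, n_bits, steps=20000, t_start=2.0, t_end=0.01, seed=0):
    """Minimize `energy` over bitstrings of length n_bits via simulated annealing."""
    rng = random.Random(seed)
    state = [rng.randint(0, 1) for _ in range(n_bits)]
    e = energy(state)
    best, best_e = state[:], e
    for step in range(steps):
        # Geometric cooling schedule from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (step / steps)
        i = rng.randrange(n_bits)          # propose a single-bit flip
        state[i] ^= 1
        new_e = energy(state)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if new_e <= e or rng.random() < math.exp((e - new_e) / t):
            e = new_e
            if e < best_e:
                best, best_e = state[:], e
        else:
            state[i] ^= 1                  # reject the move: undo the flip
    return best, best_e

# Toy objective (an invented example): prefer exactly `target` bits set.
target = 4
cost = lambda s: (sum(s) - target) ** 2
solution, value = simulated_annealing(cost, n_bits=8)
```

Lowering the temperature more slowly (more steps) raises the chance of finding the true optimum – the precision-versus-time trade-off in miniature.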

The limitations of true quantum computing

Quantum annealing solves the speed side of this equation. However, despite recent breakthrough announcements, it is unlikely to be available for solving real-world scenarios, or ready for practical use in enterprises, in the near future.

On the algorithm side, years of experimentation mean that quantum algorithms are now maturing. However, the very early quantum computers available today are still unable to take advantage of these advances. In quantum annealers, one problem is the ‘chain break’: the chains of physical qubits that represent a single logical variable break as the problem scale increases, resulting in sub-optimal solutions or errors.

In order to produce the correct output for a problem, quantum bits (qubits) must remain in a quantum state at near-absolute-zero temperatures, free from any outside interference, including cosmic rays and stray magnetic fields. Without all this, the qubits collapse out of their delicate entangled state, losing all quantum acceleration and, of course, rendering any calculation impossible.

The fragility of these quantum states makes quantum computing prone to error and creates a corresponding need for error correction. This consumes a sizable proportion of an already sparse pool of qubits, making it practically impossible to solve large-scale problems; as a result, quantum computing has largely been restricted to research purposes. There is also a frequent misconception in quantum computing about the difference between physical and logical qubits. Today, based on the systems available in the market, the required error correction depends on the problem and the algorithm being used. For quantum gate computers (e.g. IBM, Google), estimates range between 100 and 1,000 physical qubits per error-corrected logical qubit, in some cases even more. For quantum annealers, noise is less problematic thanks to their adiabatic ‘analog’ process. However, the sparse connectivity between physical qubits makes it necessary to represent each logical qubit by a set of physical qubits. A fully-connected logical qubit typically requires approximately 30 to 80 physical qubits.
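A back-of-the-envelope sketch of what these overhead factors mean in practice. The machine sizes below are hypothetical round numbers; only the overhead ranges come from the estimates above.

```python
# Illustration of the physical-to-logical qubit overhead figures quoted above.
# The overhead factors are the ranges from the text, not vendor specifications.
def logical_qubits(physical, overhead):
    """Usable logical qubits when each one costs `overhead` physical qubits."""
    return physical // overhead

# Hypothetical gate-model machine with 1,000 physical qubits,
# at 100-1,000 physical qubits per error-corrected logical qubit.
gate_best = logical_qubits(1000, 100)    # optimistic end of the range -> 10
gate_worst = logical_qubits(1000, 1000)  # pessimistic end -> 1

# Hypothetical annealer with 5,000 physical qubits,
# at ~30 physical qubits per fully-connected logical qubit.
anneal_best = logical_qubits(5000, 30)
```

Even under optimistic assumptions, the usable problem size shrinks dramatically once error correction and embedding overheads are paid.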

To put this in perspective: at the TCS Quantum Symposium held in Bombay on April 7, 2019 [1], quantum algorithms performed much better on classical computing systems in tests intended to showcase real-world quantum computers – re-establishing the fact that quantum computers are simply not ready yet, and that we are still searching for the best way to build a less error-prone quantum computer.

Fujitsu’s scientists were keen to find a way to solve these critical quantum optimization problems and were among the first to realize that the software being developed for quantum computers could be applied to digital architectures. Based on this insight, they created the Digital Annealer, a new circuit design inspired by quantum phenomena.

Digital annealing is more precise, more robust than quantum annealing – and available today

Today’s quantum annealers struggle with large-scale problems due to the limited number of connections between qubits. The Digital Annealer, by contrast, has a fully-connected architecture. It can therefore solve large-scale combinatorial optimization problems very quickly and – hugely important – more accurately today than quantum annealing with its limited qubit connectivity. The technology also has none of the cost, energy and deployment hurdles faced by quantum devices today.

Digital Annealing has been described by independent industry analysts as a unique opportunity to preempt quantum computing and achieve the first stage benefits of optimization today, working within current data center constraints. They talk about creating a ‘bridge’ to the quantum future – getting the benefits of combinatorial optimization today while also learning how true quantum computing can be applied to operations in the future.

How quantum-inspired optimization is being used today

There are tangible benefits in almost all areas of industry, ranging from the optimization of logistics and manufacturing processes, to materials research in the chemical and pharmaceutical industries, to significantly improved portfolio and risk calculation in the financial industry.

In automotive manufacturing, the Digital Annealer has been applied in job-shop scheduling, engineering design and just-in-time manufacturing optimization for robot positioning for chassis welding, which has a significant impact on manufacturing efficiency and cost. For the automotive manufacturer BMW, Digital Annealer is calculating the best assignment of welds (or seams) to PVC sealing robots, as well as the optimal path for the robots in setting out from and returning to their base positions. Currently, prototype quantum computing solutions addressing this challenge are able to compute optimization for about seven seams. Working with the Fujitsu Digital Annealer, a trip of 64 seams can be calculated. This increase from seven to 64 seams is not just 9x the number of seams: the number of possible combinations to choose from increases by a factor of 10^100, a total far beyond the estimated number of atoms in the whole universe.
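The scale of that jump can be sanity-checked with simple factorial counting. This is a sketch only, treating the tour purely as an ordering problem; real models add further choices (such as which robot takes which seam and in which direction each seam is traversed) that push the count higher still.

```python
import math

def orderings(n):
    """Number of ways to sequence n seams as a simple tour: n! orderings."""
    return math.factorial(n)

small = orderings(7)    # 5,040 sequences: tractable even by naive enumeration
large = orderings(64)   # roughly 1.3 * 10**89 sequences for the ordering alone
ratio = large // small  # the blow-up factor between the two problem sizes
```

Even counting orderings alone, the 64-seam problem is dozens of orders of magnitude beyond what enumeration – or a seven-seam-limited quantum prototype – can reach.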

For the car maker, this has resulted in production of more vehicles without investment in additional resources and has led to a reduction in paint-shop costs – which account for 30 to 50 percent of an automotive OEM’s manufacturing costs.

In financial services, main incubator GmbH, Commerzbank’s research and development unit, has successfully concluded a loan portfolio management Proof of Concept (PoC) leveraging the Digital Annealer. Focusing on receivables from vehicle leasing contracts, the PoC optimized the selection of several thousand vehicle leasing assets for a securitization portfolio. Critical factors taken into simultaneous consideration included regulatory requirements, absolute volume limits and percentage limits for specific asset characteristics needed to achieve greater risk diversification.
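A selection problem of this shape can be sketched as a penalized binary optimization. The numbers below are invented, and this is not the PoC’s actual formulation; note also that production QUBO encodings typically square the constraint violation (with slack variables) rather than penalizing it linearly, as this toy does for brevity.

```python
import itertools

# Hypothetical mini-portfolio: (volume, expected_return) per leasing asset.
assets = [(30, 4.0), (20, 2.5), (50, 6.0), (40, 3.0)]
VOLUME_CAP = 70   # assumed absolute volume limit (stand-in for a real constraint)
PENALTY = 10.0    # weight large enough to make violations uncompetitive

def objective(x):
    """Negative portfolio return, plus a penalty for exceeding the volume cap."""
    volume = sum(v for (v, _), xi in zip(assets, x) if xi)
    ret = sum(r for (_, r), xi in zip(assets, x) if xi)
    excess = max(0, volume - VOLUME_CAP)
    return -ret + PENALTY * excess

# Brute force over all 2**4 selections; an annealer searches this same space
# heuristically when thousands of assets make enumeration impossible.
best = min(itertools.product((0, 1), repeat=len(assets)), key=objective)
```

The same pattern extends to the PoC’s additional constraints – percentage limits per asset characteristic simply become further penalty terms in the objective.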

The Port of Hamburg is one of the largest seaports in the world, with an annual throughput of 136.5 million tons and up to 12,000 trucks calling at the port every day. It is also located in the heart of a busy city and has been prone to congestion, causing delays to freight transfers. The Port of Hamburg Authority is working with the Fujitsu Digital Annealer team on a PoC to find a new approach to traffic management in the port area. Together they developed a combinatorial optimization model with excellent evaluation performance and scalability for realistic street and traffic simulations, which can be used to optimize traffic flow at traffic lights in the port area.

As these examples show, Digital Annealing makes it possible to solve complex combinatorial optimization problems under real-time conditions – even before practically usable quantum computers become available to companies.



About the author:

Carsten Meurer is Head of Sales Financial Services at Fujitsu, where he focuses on new and innovative solutions and their use in the financial and insurance industries.