Quantum Machine Learning in Telecommunication

By Fred Fung, PhD  |  Researcher  |  Huawei Technologies Duesseldorf GmbH
February 24, 2022

As the world awaits the emergence of powerful quantum computers, stakeholders in the optical telecommunication industry need to become acquainted with the implications of the quantum machine learning that future quantum computing will bring. It would be wise to prepare to harness this potential power, and to get involved in prototype experimentation.

Quantum technologies hold the promise of solving problems faster. Much progress has been made in quantum computing research worldwide, fueled by increasing investments from governments and private companies. These efforts span every part of the quantum computing ecosystem, including quantum programming languages, compilers, middleware, control software and hardware, algorithms, various physical implementations (such as superconducting circuits, trapped ions, optics, cold atoms, and artificial atoms), and interconnects. It is prime time for telecommunication researchers to understand and investigate how to harness the power of quantum computing to solve telecommunication problems.

In current telecommunication settings, the input and output data of a problem are classical. The classical input data must therefore be packaged into quantum states before being fed to a quantum circuit, and likewise the output quantum states must be transformed into classical readouts through quantum measurements.

Quantum measurements are inherently probabilistic, so quantum algorithms fit well with classical tasks that do not rely on exact results. Machine learning is likewise a field whose techniques are well suited to such tasks. Despite this similarity, it is not trivial to devise quantum machine learning algorithms that work well, let alone outperform their classical counterparts.

Machine learning is useful for solving problems that have no known model or no viable programming approach, such as clustering, regression, dimensionality reduction, anomaly detection, and language processing. There are many problems in telecommunication where machine learning produces good results. Solving a problem faster, or achieving a better solution in a shorter time, often means better performance and/or lower costs.

Fiber nonlinearity is a major impairment in optical fiber communication, and here we discuss how quantum machine learning can help.

Optical coherent communication has become the prevailing technology for long-distance, high-speed communication, thanks to advances in fast digital signal processing. In 2005, the demonstration of digital carrier-phase estimation led to increased interest in optical coherent communication.

A typical optical coherent communication system is shown in Figure 1. The input digital data is processed by a digital signal processor to form a data-dependent waveform carried by both the in-phase and the quadrature-phase components of an optical light field. The optical signal is generated by a laser, shaped by a modulator, and travels over an optical fiber to the receiver. An example of 4-level quadrature amplitude modulation (4-QAM) is shown. Optical signals experience many impairments as they travel over optical fibers. These impairments include chromatic dispersion (different spectral components of a signal travel at different speeds), polarization mode dispersion (the two polarizations of light travel at different speeds), polarization rotation, and nonlinear distortion caused by the Kerr effect (an electro-optic effect that makes the fiber's refractive index depend on the transmitted signal power).
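The mapping of digital data onto the in-phase (I) and quadrature (Q) components can be sketched as follows. This is a minimal illustration, assuming a Gray-coded 4-QAM constellation; the function and constellation mapping are hypothetical, not taken from any specific system:

```python
import numpy as np

# Gray-coded 4-QAM constellation: each bit pair selects one of four
# points at the corners of a square in the I/Q (complex) plane.
CONSTELLATION = {
    (0, 0): 1 + 1j,
    (0, 1): -1 + 1j,
    (1, 1): -1 - 1j,
    (1, 0): 1 - 1j,
}

def modulate_4qam(bits):
    """Map a flat bit sequence to complex 4-QAM symbols (I + jQ)."""
    assert len(bits) % 2 == 0, "4-QAM carries two bits per symbol"
    pairs = zip(bits[0::2], bits[1::2])
    return np.array([CONSTELLATION[p] for p in pairs])

symbols = modulate_4qam([0, 0, 1, 1, 1, 0])
print(symbols)  # three complex symbols, one per bit pair
```

Each complex symbol's real part drives the in-phase component of the light field and its imaginary part drives the quadrature-phase component.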

Nonlinear distortion is often a problem for high-order modulation signals in long-haul, high-speed communication. The receiver has to compensate for the nonlinear effects in order to recover the transmitted signal with as low an error probability as possible. Figure 2 shows an example of the nonlinear distortion of 16-QAM signal points in coherent optical communication. The problem is often to divide the optical phase space into decision regions for the 16 symbols and to assign a symbol (decision region) to each incoming received signal.
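The simplest decision rule assigns each received point to the nearest ideal constellation point (minimum Euclidean distance in the I/Q plane), which corresponds to rectangular decision regions. A minimal sketch, with hypothetical names and an ideal (undistorted) 16-QAM grid:

```python
import numpy as np

# Ideal 16-QAM constellation: a 4x4 grid with I and Q levels in {-3, -1, 1, 3}.
levels = np.array([-3, -1, 1, 3])
GRID = np.array([i + 1j * q for i in levels for q in levels])

def decide(received):
    """Assign each received point to the nearest ideal symbol
    (minimum Euclidean distance in the I/Q plane)."""
    received = np.atleast_1d(np.asarray(received, dtype=complex))
    # Distance from every received point to every constellation point.
    d = np.abs(received[:, None] - GRID[None, :])
    return GRID[np.argmin(d, axis=1)]

print(decide([2.8 - 1.2j, -0.4 + 3.3j]))  # [ 3.-1.j -1.+3.j]
```

Under nonlinear distortion the optimal decision regions are no longer rectangular, which is where machine learning methods come in.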

Machine learning methods such as artificial neural networks (ANNs), support vector machines (SVMs), and k-means clustering have been proposed to mitigate the problem. K-means clustering has the advantage that no labeled training data is required.

The central elements of k-means clustering are centroids in the phase space. Every signal point in the phase space is associated with a centroid, by design the nearest one. As signal points are processed, each new point is assigned to the nearest centroid and added to the cluster of points that centroid represents. The centroid's position is then updated to the new center of its cluster.
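The update described above can be sketched as a streaming k-means loop over complex-valued signal points. This is a toy sketch under simplifying assumptions (each cluster is seeded with its own centroid to keep the running mean well defined; all names are hypothetical):

```python
import numpy as np

def stream_kmeans(points, centroids):
    """Online k-means as described above: each new point joins the
    cluster of its nearest centroid, and that centroid then moves to
    the mean of all points assigned to it so far."""
    centroids = np.array(centroids, dtype=complex)
    sums = centroids.copy()            # running sum per cluster
    counts = np.ones(len(centroids))   # seed each cluster with its centroid
    for p in points:
        k = np.argmin(np.abs(centroids - p))   # nearest-centroid assignment
        sums[k] += p
        counts[k] += 1
        centroids[k] = sums[k] / counts[k]     # recenter the cluster
    return centroids
```

In the equalization setting, the centroids track the distorted positions of the transmitted symbol points, so the decision regions adapt to the channel.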

The quantum version of k-means clustering has been studied and offers the prospect of a speedup over the polynomial runtime of the classical (i.e., non-quantum) version [S. Lloyd, M. Mohseni, and P. Rebentrost, "Quantum algorithms for supervised and unsupervised machine learning," arXiv:1307.0411, 2013].

The core of quantum k-means is that the (Euclidean) distance calculation is replaced by a statistical estimation process, in which a single execution of an estimation unit produces one distance estimate. Collecting multiple estimates increases the accuracy.
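The flavor of this trade-off can be illustrated with a classical toy model. The sketch below does not simulate an actual quantum circuit; it merely assumes, for illustration, that each run of the estimation unit yields a binary outcome whose probability encodes the squared distance, so that averaging many "shots" recovers the distance with accuracy growing in the shot count. All names and the probability mapping are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)

def one_shot(a, b, scale):
    """Toy model of a single run of the estimation unit: a binary
    measurement whose success probability encodes the squared
    distance |a - b|^2 (normalized by `scale` so it stays <= 1)."""
    p = min(abs(a - b) ** 2 / scale, 1.0)
    return rng.random() < p

def estimate_distance(a, b, shots, scale=32.0):
    """Average many shots to estimate |a - b|; accuracy grows with shots."""
    hits = sum(one_shot(a, b, scale) for _ in range(shots))
    return np.sqrt(scale * hits / shots)

exact = abs((3 + 1j) - (-1 - 1j))          # exact distance, about 4.47
print(estimate_distance(3 + 1j, -1 - 1j, shots=10_000))
```

The point of the toy model is only that the quantum routine trades one exact arithmetic distance for a statistical estimate whose precision is bought with repeated executions.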

However, the potential speedup offered by the quantum k-means construction relies on clever encodings of the initial classical information onto the quantum state. The difficulty of loading the initial information may erase any potential speedup offered by the core quantum algorithm [E. Tang, "Quantum principal component analysis only achieves an exponential speedup because of its state preparation assumptions," Phys. Rev. Lett., vol. 127, p. 060503, Aug. 2021]. Further investigation is therefore needed to determine whether the quantum algorithms are beneficial and outperform their classical counterparts in a whole system that includes the data-loading part. It is also possible that the quantum algorithms are advantageous only in certain parameter ranges (such as certain feature dimensions). One should thus not blindly apply quantum algorithms in the hope that a speedup is guaranteed, but rather consider the application problem at hand with its actual parameters.

On the ANN front, some research has also focused on how to solve nonlinearity compensation in optical communication. An ANN can function as a classifier and can reproduce the nonlinear input-output relationship of a function that is unknown or difficult to implement. Training data is needed to tune the parameters of the ANN and may be formed by the transmitter periodically sending a pre-agreed sequence of signals to the receiver. The trained ANN is then applied to predict the outputs (for example, the information symbols) from the received signals (for example, the distorted information symbol points in the optical phase space). For instance, an ANN has been proposed as a nonlinear equalizer to efficiently mitigate fiber nonlinearity in coherent optical orthogonal frequency-division multiplexing (CO-OFDM) systems [E. Giacoumidis, S. T. Le, M. Ghanbarisabagh, M. McCarthy, I. Aldaya, S. Mhatli, et al., "Fiber nonlinearity-induced penalty reduction in CO-OFDM by ANN-based nonlinear equalization," Opt. Lett. 40, 5113-5116 (2015)].
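The training pipeline described above (pre-agreed training sequence, parameter tuning, then prediction on new received signals) can be sketched end to end. This is a deliberately simple toy, not the equalizer of the cited work: the channel model is a Kerr-like phase rotation that, for constant-modulus 4-QAM, reduces to a fixed rotation, and all names, sizes, and hyperparameters are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy channel (assumption): Kerr-like phase rotation by gamma * |x|^2.
def channel(x, gamma=0.1):
    return x * np.exp(1j * gamma * np.abs(x) ** 2)

# Pre-agreed training sequence of 4-QAM symbols, as described in the text.
qam = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j])
tx = rng.choice(qam, size=2000)
rx = channel(tx)

# Tiny one-hidden-layer network trained as an equalizer by plain
# full-batch gradient descent on the mean-squared error.
X = np.column_stack([rx.real, rx.imag])   # inputs: distorted I/Q points
Y = np.column_stack([tx.real, tx.imag])   # targets: clean I/Q points
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 2)); b2 = np.zeros(2)
lr = 0.1
for _ in range(1000):
    H = np.tanh(X @ W1 + b1)
    err = (H @ W2 + b2) - Y
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    gH = err @ W2.T * (1 - H ** 2)        # backprop through tanh
    gW1 = X.T @ gH / len(X); gb1 = gH.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

# Apply the trained equalizer to fresh distorted symbols.
test_tx = rng.choice(qam, size=200)
test_rx = channel(test_tx)
H = np.tanh(np.column_stack([test_rx.real, test_rx.imag]) @ W1 + b1)
eq = H @ W2 + b2
mse = np.mean((eq - np.column_stack([test_tx.real, test_tx.imag])) ** 2)
print(f"equalized MSE: {mse:.4f}")
```

A practical equalizer would use a richer input (neighboring symbols, both polarizations) and a proper training framework, but the structure, training on a known sequence and then predicting symbols from distorted points, is the same.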

A large body of research has proposed implementing neural networks with quantum circuits in various ways. However, many such investigations relate more to neuroscience-inspired Hopfield networks, which implement associative memory well, than to machine learning. To the best of our knowledge, there has been no proposal of a fully functional and efficient quantum version of a neural network.

In summary, while quantum machine learning methods may not be the antidote for all telecommunication problems, especially in view of the cost of loading classical input data onto quantum states, they certainly have the potential to offer advantages in special scenarios, and more understanding and investigation are needed.

Fred Fung has been doing research in quantum key distribution and quantum information for over 10 years. His current areas of focus include practical quantum key distribution systems, security proofs, implementation loopholes, and certification. He holds a PhD degree from the University of Toronto.
