Researchers at NUST MISIS have developed a new quantum computing protocol that improves the search for optimal solutions. The approach is based on deliberately introducing specially designed noise channels. In the future, this development could significantly increase the accuracy and speed of quantum computations.

Despite their great potential, quantum machine learning models face serious challenges in training and optimization. Because of the vast number of possible solutions, many of which are suboptimal, algorithms often get "stuck" before reaching the best ones. The protocol developed at NUST MISIS helps regularize optimization landscapes through specially designed noise channels.
Ordinarily, noise interferes with the efficient operation of quantum algorithms. Any interaction with the environment—such as random temperature fluctuations or electromagnetic disturbances—causes errors in calculations. The team demonstrated that the use of specialized noise channels effectively smooths out small-scale fluctuations in the loss function, enabling algorithms to find higher-quality solutions.
“When we train a model—be it a classical neural network or a quantum algorithm—it has a loss function. This function measures how far the model is from solving a problem correctly: the higher the loss, the worse the performance. A model can have many parameters, such as rotations, phases, or weights. Each combination of these parameters produces a result, and the loss function assigns it a value—a ‘height.’ Imagine you’re standing on a mountain trying to reach the lowest point. The height tells you how far you are from your goal. Along the way, you encounter many small pits and dips, where it’s easy to get stuck without ever reaching the bottom. That’s usually what happens—we wander and fall into local traps. Our method is like filling those small pits with sand. It levels the surface, making the path smoother: you don’t get stuck and can keep moving toward the goal. In this way, adding noise—regularization—smooths the landscape and makes it much easier to find the optimal solution,” says Dr. Nikita Nemkov, Senior Research Fellow at the NUST MISIS Quantum Information Technology Laboratory.
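The "filling the pits with sand" picture can be illustrated with a short numerical sketch. The toy 1D landscape, the Gaussian parameter noise, and all numbers below are invented for illustration and are not the quantum loss function from the study; the sketch only shows the general principle that averaging a loss over random parameter perturbations suppresses the fast ripples that create local minima while preserving the overall shape.

```python
import numpy as np

# Toy 1D "loss landscape": a broad valley plus fast ripples that act as
# small local minima (illustrative only, not the study's quantum loss).
def loss(theta):
    return np.cos(theta) + 0.3 * np.cos(8 * theta)

# Smoothing by noise: average the loss over random parameter perturbations.
# A fixed noise sample is reused for every theta (common random numbers),
# so the smoothed curve itself stays smooth.
rng = np.random.default_rng(0)
noise = rng.normal(0.0, 0.4, size=2000)

def smoothed_loss(theta):
    return np.mean(loss(theta + noise))

thetas = np.linspace(-np.pi, np.pi, 400)
raw = loss(thetas)
smooth = np.array([smoothed_loss(t) for t in thetas])

def count_local_minima(v):
    # Interior grid points that are lower than both neighbors.
    return int(np.sum((v[1:-1] < v[:-2]) & (v[1:-1] < v[2:])))

print("local minima, raw:", count_local_minima(raw))
print("local minima, smoothed:", count_local_minima(smooth))
```

The smoothed curve has far fewer local minima than the raw one, which is the sense in which noise-based regularization "levels the surface" for the optimizer.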
The protocol introduces a controlled amount of noise into specific elements of a quantum circuit, which smooths out the loss function. The team tested the algorithm on benchmark problems and on a quantum convolutional neural network. In both cases the protocol improved performance: the likelihood of finding the correct solution increased severalfold compared with traditional methods. Detailed results of the study are published in Physical Review A (a Q1 journal).
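As a minimal sketch of how a noise channel reshapes a quantum loss landscape, the toy below uses a single qubit and a standard depolarizing channel; the circuit, loss, and noise placement are assumptions made here for illustration, not the circuits used in the study. The channel mixes the state with the maximally mixed state, which uniformly rescales the expectation-value landscape and damps its variations.

```python
import numpy as np

# Single-qubit toy: loss(theta) = <Z> after a parameterized X-rotation,
# with an optional depolarizing channel of strength p (a textbook noise
# model; invented here for illustration).
I2 = np.eye(2, dtype=complex)
Z = np.diag([1.0, -1.0]).astype(complex)

def rx(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

def loss(theta, p=0.0):
    rho = np.array([[1, 0], [0, 0]], dtype=complex)  # start in |0><0|
    U = rx(theta)
    rho = U @ rho @ U.conj().T
    rho = (1 - p) * rho + p * I2 / 2  # depolarizing channel
    return float(np.real(np.trace(Z @ rho)))

# Without noise the landscape is cos(theta); with depolarizing noise it
# is (1 - p) * cos(theta), i.e. the same shape with damped variations.
print(loss(0.0, 0.0))  # 1.0
print(loss(0.0, 0.5))  # 0.5
```

In this toy the damping is a uniform rescaling; the point of the published protocol is the more subtle effect that carefully placed channels suppress the small-scale structure of the loss while keeping the features the optimizer needs.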
“The difficulty of training variational quantum algorithms and quantum machine learning models is well known. Our proposed protocol can be combined with an existing method for mitigating local minima—the quantum natural gradient optimizer—as well as complement other approaches to optimizing quantum loss functions. Technically, the protocol requires few additional resources and can be applied both in classical quantum circuit simulators and on real quantum devices,” says Dr. Alexey Fedorov, Director of the Institute of Physics and Quantum Engineering at NUST MISIS and head of the Quantum Information Technology research group at the RCC.
The study was supported by a Russian Science Foundation grant, as well as within the framework of the NUST MISIS strategic technological project Quantum Internet under the Russian Ministry of Education and Science’s Priority-2030 program.