An international team of researchers has developed a new method for parametrizing machine-learning interatomic potentials (MLIPs) to simulate magnetic materials, making the prediction of their properties much more reliable and accurate. A key feature of the new approach is that the models of interatomic interactions are trained on so-called ‘magnetic forces’. The work opens the door to faster discovery and design of materials for next-generation electronics, medicine, and sensors. The study, supported by a grant from the Russian Science Foundation (No. 22-73-10206), was published in Computational Materials Science.
Magnetic materials are all around us – from compass needles and refrigerator magnets to sophisticated devices in computers, medical tomographs, and industrial sensors. Controlling magnetism at the atomic level is key to future technologies such as spintronics, which exploits both the charge and the spin of the electron, targeted drug delivery using magnetic nanoparticles, and ultra-sensitive sensors.
Traditionally, the properties of materials are studied experimentally. However, this research is often too expensive and requires ultrapure samples (impurities strongly affect magnetism) and complex equipment. This is where simulation comes in.
One of the most accurate simulation methods, Density Functional Theory (DFT), is based on quantum mechanics but requires enormous computing resources to accurately calculate the properties of a material. Simulating even a few thousand atoms is a daunting task, yet it is at this scale that many important effects, such as lattice defects or phase transitions, can be studied.
To overcome this obstacle, researchers are developing MLIPs – AI-based models that learn to predict the energy of a system and the forces acting on its atoms from data obtained in accurate but slow DFT calculations. MLIPs are orders of magnitude faster than DFT, which makes it possible to model large systems and long processes.
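As a rough illustration of the idea only (this is not the authors' model), the toy Python sketch below fits a generic regressor to stand-in "DFT" energies and then evaluates it on unseen configurations; the descriptors, data, and choice of regressor are all illustrative assumptions.

    # Toy illustration only: a generic regressor standing in for an MLIP.
    # Descriptors, "energies", and the model choice are illustrative assumptions.
    import numpy as np
    from sklearn.kernel_ridge import KernelRidge

    rng = np.random.default_rng(0)
    X_train = rng.random((200, 8))        # stand-in descriptors of 200 DFT configurations
    y_train = X_train @ rng.random(8)     # stand-in "DFT energies" for those configurations

    surrogate = KernelRidge(kernel="rbf", alpha=1e-3)
    surrogate.fit(X_train, y_train)       # expensive reference data in, cheap model out

    X_new = rng.random((5, 8))            # configurations never computed with DFT
    print(surrogate.predict(X_new))       # evaluated in a fraction of a second

Once trained, such a surrogate is queried instead of the quantum-mechanical code for every new configuration, which is where the speed-up comes from.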
However, standard MLIPs are not sufficient for studying magnetic materials: the magnetic moments of the atoms have to be accounted for explicitly in the functional form of the potential, which is what gave rise to their magnetic counterparts. This creates a new problem: training magnetic MLIPs requires much more data from even more expensive spin-polarized DFT calculations, because the models must capture not only the arrangement of the atoms but also the magnitude and direction of their magnetic moments.
The authors were able to create accurate and reliable MLIPs that require a limited amount of expensive training data. Their key idea was to train MLIPs on magnetic forces, that is, the negative derivatives of the energy with respect to the magnetic moments, in addition to energies, atomic forces, and stresses. The training was performed on data computed for about 2,600 different atomic configurations of an iron-aluminum (Fe-Al) alloy with different component ratios. This system, with its interesting magnetic properties, is used in various technological applications.
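In rough notation (a sketch of the general idea only; the exact functional form and weights used by the authors are not given in this article), the fitting objective becomes a weighted sum of squared errors in energies, forces, stresses, and magnetic forces, where the last term is the new ingredient:

    \mathcal{L} = w_E\,\bigl(E^{\mathrm{MLIP}} - E^{\mathrm{DFT}}\bigr)^2
                + w_f \sum_i \bigl\|\mathbf{f}_i^{\mathrm{MLIP}} - \mathbf{f}_i^{\mathrm{DFT}}\bigr\|^2
                + w_\sigma \bigl\|\boldsymbol{\sigma}^{\mathrm{MLIP}} - \boldsymbol{\sigma}^{\mathrm{DFT}}\bigr\|^2
                + w_T \sum_i \bigl\|\mathbf{T}_i^{\mathrm{MLIP}} - \mathbf{T}_i^{\mathrm{DFT}}\bigr\|^2,
    \qquad
    \mathbf{T}_i = -\frac{\partial E}{\partial \mathbf{m}_i}

Here E is the energy, f_i the force on atom i, σ the stress, m_i the magnetic moment of atom i, T_i the corresponding magnetic force, and the weights w are illustrative placeholders.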
A comparison of the potentials trained only on energies, forces, and stresses with those trained also on magnetic forces showed significant advantages of the new approach.
Importantly, the new method reduced the prediction error for magnetic forces tenfold, while the accuracy for energies and conventional forces remained virtually unchanged. The models trained on magnetic forces also proved to be more accurate in predicting the equilibrium magnetic moments of iron atoms.
Equally important was the increased reliability of the trained MLIPs. In geometric optimization of iron-aluminum structures, the models not trained on magnetic forces either failed to relax the atomic structure or produced physically meaningless results. Models trained on magnetic forces showed 100% reliability: the relaxation calculations completed successfully and gave physically meaningful results, which is crucial for the practical use of MLIPs. In fact, training on magnetic forces helps to obtain a reliable model even with a relatively small training dataset.
The team successfully applied the best of the generated potentials to simulate the behavior of Fe-Al at room temperature (300 K) using molecular dynamics. The simulation reproduced the thermal expansion of the material observed in experiment; the slight difference in the values may be due to the limitations of the DFT data used to build the training dataset. This means that the new approach can be used to study dynamics and temperature effects.
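As a hedged illustration of what such a room-temperature run looks like in code, here is a minimal molecular-dynamics sketch built on the open-source ASE toolkit; the trained potential from the paper is not wrapped here, so ASE's built-in Lennard-Jones calculator serves purely as a stand-in, and the supercell size, time step, and friction are illustrative choices.

    # Minimal MD sketch with ASE; the Lennard-Jones calculator below is only a
    # placeholder for the trained magnetic MLIP, which is not wrapped here.
    from ase.build import bulk
    from ase.calculators.lj import LennardJones
    from ase.md.langevin import Langevin
    from ase.md.velocitydistribution import MaxwellBoltzmannDistribution
    from ase import units

    atoms = bulk("Fe", "bcc", a=2.87, cubic=True) * (4, 4, 4)  # small bcc iron supercell
    atoms.calc = LennardJones()          # stand-in; replace with the trained MLIP calculator

    MaxwellBoltzmannDistribution(atoms, temperature_K=300)     # initialize velocities at 300 K
    dyn = Langevin(atoms, timestep=1.0 * units.fs,
                   temperature_K=300, friction=0.02)           # thermostatted dynamics at 300 K
    dyn.run(1000)                                              # 1 ps trajectory

In an actual study, the calculator line would point to the trained magnetic MLIP, and the lattice constant would be tracked along the trajectory to extract the thermal expansion.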
Ivan Novikov, an associate professor at the HSE Faculty of Computer Science, an associate professor at the MIPT Department of Chemical Physics of Functional Materials, and a senior research scientist at Skoltech, comments: “The key idea behind our study was to show that magnetic forces, which are usually ignored when training the potentials, carry additional information about interatomic interactions in magnetic materials. By taking these forces into account when training the potentials, we were able to not only make the prediction of magnetic properties more accurate, but, just as importantly, improve the credibility of the simulations. We can now simulate complex magnetic systems more reliably with the same amount of expensive quantum computing, making such studies more affordable and consistent.”
The novelty of the research lies in the systematic development, application, and comprehensive validation of the new method. The study provides convincing evidence that the proposed approach not only works, but also brings significant gains in simulation reliability and accuracy, especially with a limited budget of quantum-mechanical calculations. In the future, reliable and fast MLIPs will enable effective virtual screening and optimization of the compositions of new magnetic alloys, materials for permanent magnets, magnetocaloric materials (for magnetic cooling), and spintronic components. It is now possible to simulate large systems with tens of thousands of atoms and to examine the effects of defects, grain boundaries, and nanostructuring on magnetic properties, as well as to study magnetic phase transitions, for example, to determine the Curie temperature. Understanding magnetism at the atomic level is essential for improving the performance of electric motors, generators, transformers, data recording devices, and medical diagnostic and therapeutic systems such as MRI.
The new method can work hand in hand with active-learning algorithms, which identify, while a simulation is running, the quantum-mechanical calculations that are essential for further refinement of the model. This will further reduce the number of DFT calculations required.
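Schematically, such a workflow might look like the sketch below; every function in it is a hypothetical stub standing in for the MLIP, DFT, and molecular-dynamics codes, since no specific implementation is prescribed here.

    # Schematic active-learning loop; all functions are hypothetical stubs standing
    # in for the MLIP, DFT, and MD codes (no specific package is implied).
    import random

    def train_mlip(dataset):                      # stub: refit the potential on current data
        return {"n_train": len(dataset)}

    def extrapolation_grade(potential, config):   # stub: how far the model is extrapolating
        return random.random()

    def run_spin_polarized_dft(config):           # stub: the expensive reference calculation
        return config

    def advance_md_step(potential, config):       # stub: one MLIP-driven dynamics step
        return {"positions": random.random()}

    dataset = [run_spin_polarized_dft({"id": i}) for i in range(10)]   # small initial DFT set
    potential = train_mlip(dataset)
    config = {"positions": 0.0}

    for _ in range(1000):
        config = advance_md_step(potential, config)
        if extrapolation_grade(potential, config) > 0.99:   # model is unsure here
            dataset.append(run_spin_polarized_dft(config))  # only then pay for DFT
            potential = train_mlip(dataset)                 # refine the model on the fly

The point of the loop is that the expensive quantum-mechanical step is triggered only for the few configurations the model cannot yet describe, rather than for the whole trajectory.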
The study was carried out by researchers from Skoltech, MIPT, HSE, the Institute of Solid State Chemistry and Mechanochemistry of the Siberian Branch of RAS, the Emanuel Institute of Biochemical Physics of RAS, and their colleagues from Germany, Norway, the United States, and Austria.