
# Machine learning for metallurgy I. A neural-network potential for Al-Cu

Abstract

High-strength metal alloys achieve their performance via careful control of precipitates and solutes. The nucleation, growth, and kinetics of precipitation, and the resulting mechanical properties, are inherently atomic-scale phenomena, particularly during early-stage nucleation and growth. Atomistic modeling using interatomic potentials is a desirable tool for understanding the detailed phenomena involved in precipitation and strengthening, which require length and time scales far larger than those accessible by first-principles methods. Current interatomic potentials for alloys are not, however, sufficiently accurate for such studies. Here a family of neural-network potentials (NNPs) for the Al-Cu system is presented as a first example of a machine learning potential that can achieve near-first-principles accuracy for many different metallurgically important aspects of this alloy. High-fidelity predictions are shown for intermetallic compounds, elastic constants, dilute solid-solution energetics, precipitate-matrix interfaces, generalized stacking fault energies and surfaces for slip in matrix and precipitates, antisite defect energies, and other quantities. The NNPs also capture the subtle entropically induced transition between θ′ and θ at temperatures around 600 K. Many comparisons are made with the state-of-the-art angular-dependent potential for Al-Cu, demonstrating the significant quantitative benefit of a machine learning approach. A preliminary kinetic Monte Carlo study shows the NNP to predict the emergence of GP zones in Al-4at%Cu at T = 300 K, in agreement with experiments. These studies show that the NNP has significant transferability to defects and properties outside the structures used to train it, but also exhibits some errors, highlighting that the use of any interatomic potential requires careful validation in application to specific metallurgical problems of interest.
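The core construction behind an NNP of this kind, summing per-atom neural-network energies evaluated on descriptors of each atom's local environment (the Behler-Parrinello scheme), can be sketched as below. The radial symmetry function, the one-neuron network, and all numerical values are illustrative assumptions, not the paper's trained model.

```python
# Sketch of a Behler-Parrinello-style neural-network potential.
# Every parameter here (eta, cutoff, network weights) is hypothetical;
# a real NNP uses many trained symmetry functions and larger networks.
import math

def descriptor(i, positions, eta=0.5, cutoff=6.0):
    """One radial symmetry function: G_i = sum_j exp(-eta*r_ij^2) * fc(r_ij)."""
    g = 0.0
    for j, pj in enumerate(positions):
        if j == i:
            continue
        r = math.dist(positions[i], pj)
        if r < cutoff:
            fc = 0.5 * (math.cos(math.pi * r / cutoff) + 1.0)  # smooth cutoff
            g += math.exp(-eta * r * r) * fc
    return g

def atomic_energy(g, w1=1.0, b1=0.0, w2=-1.0, b2=0.0):
    """Tiny one-hidden-neuron network mapping a descriptor to an atomic energy."""
    return w2 * math.tanh(w1 * g + b1) + b2

def total_energy(positions):
    """E_total is the sum of per-atom network energies."""
    return sum(atomic_energy(descriptor(i, positions))
               for i in range(len(positions)))

# Example: a dimer at 2.5 (hypothetical length units).
positions = [(0.0, 0.0, 0.0), (2.5, 0.0, 0.0)]
E = total_energy(positions)
```

Because the descriptors depend only on interatomic distances, the predicted energy is invariant under rigid translations and rotations of the structure, a key property inherited by the real NNP.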


Related concepts (32)


Interatomic potential

Interatomic potentials are mathematical functions to calculate the potential energy of a system of atoms with given positions in space. Interatomic potentials are widely used as the physical basis of molecular mechanics and molecular dynamics simulations in computational chemistry, computational physics and computational materials science to explain and predict materials properties.
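A minimal concrete instance of such a function is the classic Lennard-Jones pair potential, V(r) = 4ε[(σ/r)¹² − (σ/r)⁶]; the sketch below uses illustrative reduced units (ε = σ = 1), not parameters fitted to any real material.

```python
# Minimal interatomic potential: the Lennard-Jones pair potential.
# Parameters are illustrative reduced units, not a fit to a real material.
def lennard_jones(r, eps=1.0, sigma=1.0):
    """V(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6 * sr6 - sr6)

# The minimum sits at r = 2**(1/6)*sigma with well depth -eps.
r_min = 2.0 ** (1.0 / 6.0)
```

Pair potentials like this are cheap but far too crude for metallurgy, which motivates the machine-learned many-body forms discussed above.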

Machine learning

Machine learning (ML) is an umbrella term for solving problems whose algorithms would be cost-prohibitive for human programmers to develop; instead, machines 'discover' their 'own' algorithms from data, without being explicitly told what to do by any human-developed algorithm. Recently, generative artificial neural networks have surpassed the results of many previous approaches.

Force field (chemistry)

In the context of chemistry and molecular modelling, a force field is a computational method that is used to estimate the forces between atoms within molecules and also between molecules. More precisely, the force field refers to the functional form and parameter sets used to calculate the potential energy of a system of atoms or coarse-grained particles in molecular mechanics, molecular dynamics, or Monte Carlo simulations. The parameters for a chosen energy function may be derived from experiments in physics and chemistry, calculations in quantum mechanics, or both.
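In any such force field, the forces used in molecular dynamics are the negative gradient of the potential energy. A minimal sketch with an assumed harmonic bond term, E(r) = ½k(r − r₀)², checks the analytic force −dE/dr against a central finite difference; the parameters are hypothetical.

```python
# Forces are the negative gradient of the potential energy.
# Harmonic bond term with hypothetical parameters k=2.0, r0=1.5.
def bond_energy(r, k=2.0, r0=1.5):
    return 0.5 * k * (r - r0) ** 2

def analytic_force(r, k=2.0, r0=1.5):
    """F = -dE/dr = -k*(r - r0)."""
    return -k * (r - r0)

def numerical_force(r, h=1e-6):
    """Central finite difference of -dE/dr, for cross-checking."""
    return -(bond_energy(r + h) - bond_energy(r - h)) / (2.0 * h)

f = analytic_force(2.0)  # restoring force pulling the bond back toward r0
```

The same gradient relation holds for NNPs, where forces are obtained by differentiating the network output with respect to atomic positions.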

Related MOOCs (23)

Neuronal Dynamics - Computational Neuroscience of Single Neurons

The activity of neurons in the brain and the code used by these neurons is described by mathematical neuron models at different levels of detail.

Neuronal Dynamics 2- Computational Neuroscience: Neuronal Dynamics of Cognition

This course explains the mathematical and computational models that are used in the field of theoretical neuroscience to analyze the collective dynamics of thousands of interacting neurons.

Related publications (35)

Atomic simulations using machine-learning interatomic potentials (MLIPs) have gained a lot of popularity owing to their accuracy in comparison to conventional empirical potentials. However, the transferability of MLIPs to systems outside the training set poses ...

Federico Grasselli, Paolo Pegolo

Accessing the thermal transport properties of glasses is a major issue for the design of production strategies in the glass industry, as well as for the plethora of applications and devices where glasses are employed. From the computational standpoint, the che ...

Statistical (machine-learning, ML) models are more and more often used in computational chemistry as a substitute for more expensive ab initio and parametrizable methods. While the ML algorithms are capable of learning physical laws implicitly from data, ad ...