Designing new materials with neural networks

Materials World magazine, 30 Oct 2018

Computational materials science could get a boost from artificial neural networks. Khai Trung Le reports.

The complexity of materials design has long encouraged the use of computational methods. But the list of variables is vast – multiple elements, phase field combinations, compounds and metastable materials, among countless others – and programming for each is a monumental task. Now, a team from the University of California San Diego (UCSD), USA, has turned to artificial neural networks to make computational materials design more efficient.

Computational materials science rests principally on two approaches – computational simulation and experimental measurement – but both can be difficult to use effectively. Yue Liu, a lecturer at Jiangxi Normal University, China, stated in the 2017 paper, Materials discovery and design using machine learning, that ‘it is difficult to use these two methods to accelerate materials discovery and design’, noting that repetitive experimental and theoretical characterisation studies are time-consuming and inefficient, with significant progress often depending on ‘chemical intuition and serendipity’.

Shyue Ping Ong, Associate Professor of NanoEngineering at UCSD, said, ‘Predicting the stability of materials is a central problem in materials science, physics and chemistry. On one hand, you have traditional chemical intuition such as Linus Pauling’s five rules that describe stability for crystals in terms of the radii and packing of ions. On the other, you have expensive quantum mechanical computations to calculate the energy gained from forming a crystal that have to be done on supercomputers. What we have done is to use artificial neural networks to bridge these two worlds.’

Artificial neural networks are one of the principal tools of machine learning – brain-inspired systems that seek to emulate the way humans learn. Although artificial neural networks are based on Warren McCulloch and Walter Pitts’ 1943 theory, A logical calculus of the ideas immanent in nervous activity, it is only in recent years that they have become prominent in AI development, now that computer scientists have access to both vast computational power and vast quantities of data.

The UCSD team’s artificial neural networks predict a crystal’s formation energy from two inputs – the electronegativity and ionic radius of the constituent atoms. The models were designed to identify stable materials in two classes of crystals – garnets and perovskites – chosen for their common use in LED lights, rechargeable lithium-ion batteries, and solar cells. The work is described in the paper, Deep neural networks for accurate predictions of crystal stability, published in Nature Communications.
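
To picture how this works, consider a minimal sketch in Python. This is not the UCSD team’s code or architecture – the descriptor layout, layer sizes and randomly initialised weights below are all illustrative assumptions – but it shows the general shape of a feed-forward network that maps electronegativity and ionic-radius descriptors to a formation energy:

    import numpy as np

    rng = np.random.default_rng(0)

    def relu(x):
        return np.maximum(0.0, x)

    class FormationEnergyNet:
        """Toy multilayer perceptron: descriptor vector -> formation energy."""
        def __init__(self, n_inputs, hidden=(16, 16)):
            sizes = (n_inputs, *hidden, 1)
            # Random weights stand in for parameters learned during training.
            self.weights = [rng.normal(0.0, 0.1, (m, n))
                            for m, n in zip(sizes, sizes[1:])]
            self.biases = [np.zeros(n) for n in sizes[1:]]

        def predict(self, x):
            for W, b in zip(self.weights[:-1], self.biases[:-1]):
                x = relu(x @ W + b)  # hidden layers with ReLU activation
            return (x @ self.weights[-1] + self.biases[-1]).item()

    # Hypothetical descriptor for a cubic ABX3 perovskite: the Pauling
    # electronegativity and Shannon ionic radius (angstroms) of the A, B
    # and X species, flattened into one vector.
    descriptor = np.array([
        0.95, 1.44,   # A site, e.g. Sr2+
        1.54, 0.605,  # B site, e.g. Ti4+
        3.44, 1.40,   # X site, e.g. O2-
    ])

    model = FormationEnergyNet(n_inputs=descriptor.size)
    print(f"Formation energy: {model.predict(descriptor):+.3f} eV/atom")

In the published model the parameters are fitted to quantum mechanical reference calculations. Once trained, each prediction costs only a few small matrix multiplications rather than a supercomputer run, which is what makes the approach fast.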

Ong claimed the models are up to 10 times more accurate than previous machine learning models, and fast enough to screen thousands of materials within hours on a consumer laptop. Depending on the variables and descriptors used, they classify stable versus unstable perovskites with an accuracy of 70–80%.
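
Screening at that scale is conceptually a simple loop – run every candidate through the model and keep those whose predicted formation energy falls below a stability threshold. The sketch below is again illustrative rather than the team’s actual pipeline: the stand-in predictor, the randomly generated candidate pool and the zero-energy threshold are all assumptions:

    import numpy as np

    rng = np.random.default_rng(1)

    def predict_formation_energy(descriptor):
        # Stand-in for a trained network's prediction (eV/atom); in
        # practice this would call the model's predict() method.
        return float(descriptor.mean() - 2.0 + rng.normal(0.0, 0.2))

    # Hypothetical candidate pool: composition name -> descriptor vector.
    candidates = {f"candidate_{i:04d}": rng.uniform(0.5, 3.5, 6)
                  for i in range(5000)}

    THRESHOLD = 0.0  # eV/atom; negative predicted energy counts as stable

    stable = {}
    for name, d in candidates.items():
        e_form = predict_formation_energy(d)
        if e_form < THRESHOLD:
            stable[name] = e_form

    print(f"{len(stable)} of {len(candidates)} candidates classified stable")
    for name, e in sorted(stable.items(), key=lambda kv: kv[1])[:5]:
        print(f"{name}: {e:+.3f} eV/atom")

In practice, the field’s standard stability criterion is the energy above the convex hull of competing phases rather than a simple sign test, but the workflow – predict, threshold, rank – is the same.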

The model has been made publicly available at crystals.ai, where users can compute the formation energy of any garnet or perovskite, and the researchers plan to extend the use of artificial neural networks to other crystal prototypes and a wider range of material properties.

Despite its complexity, computational materials science remains a thriving field. Recent examples include the Lancaster University Materials Science Institute, UK, which opened in 2016 and specialises in the use of computational processes (see Materials World, May 2016, page 14), and the University of Tokyo, Japan, applying an algorithm originally designed to beat computer games to materials design (see Materials World, October 2017, page 12).


You can access the team’s model at crystals.ai, and you can read the paper, Deep neural networks for accurate predictions of crystal stability, at go.nature.com/2QTLUz3