Amp: A modular approach to machine learning in atomistic simulations

Published: 19 Aug 2016 | Version 1 | DOI: 10.17632/b3jnxf8m7c.1

Viewed: 370 | Downloaded: 71
Contributor(s): Alireza Khorshidi, Andrew A. Peterson

Description of this data

Electronic structure calculations, such as those employing Kohn–Sham density functional theory or ab initio wavefunction theories, have allowed atomistic-level understanding of a wide variety of phenomena and properties of matter at small scales. However, the computational cost of electronic structure methods increases drastically with length and time scales, which makes these methods impractical for long-time-scale molecular dynamics simulations or large systems. Machine-learning techniques can provide potentials that match the quality of electronic structure calculations, provided sufficient training data. These potentials can then be used to rapidly simulate large, long-time-scale phenomena at a quality similar to that of the parent electronic structure approach. Machine-learning potentials usually take a bias-free mathematical form and can readily be developed for a wide variety of systems. Electronic structure calculations have favorable properties for machine learning: namely, they are noiseless, and targeted training data can be produced on demand. This paper discusses our modular approach to atomistic machine learning through the development of the open-source Atomistic Machine-learning Package (Amp), which allows representations of both the total and the atom-centered potential energy surface, in both periodic and non-periodic systems. Potentials developed through the atom-centered approach are simultaneously applicable to systems of various sizes. Interpolation can be enhanced by introducing custom descriptors of the local environment; we demonstrate this in the current work for Gaussian-type, bispectrum, and Zernike-type descriptors.

Amp has an intuitive, modular structure with an interface through the Python scripting language, yet has parallelizable Fortran components for demanding tasks. It is designed to integrate closely with the widely used Atomic Simulation Environment (ASE), which makes it compatible with a wide variety of commercial and open-source electronic structure codes. Finally, we demonstrate that the neural-network model inside Amp can accurately interpolate electronic structure energies, as well as forces, of thousands of multi-species atomic systems.
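The Gaussian-type descriptors mentioned in the abstract follow the Behler–Parrinello symmetry-function scheme, in which each atom's local environment is encoded by smooth, rotation-invariant fingerprints. As a minimal illustrative sketch (not Amp's actual implementation), the radial "G2" fingerprint of an atom sums a Gaussian of each neighbor distance, damped by a cosine cutoff; the parameter values below (eta, Rs, rc) are arbitrary examples, not Amp defaults:

```python
import math

def cosine_cutoff(r, rc):
    """Cosine cutoff function: decays smoothly to zero at r = rc."""
    if r >= rc:
        return 0.0
    return 0.5 * (math.cos(math.pi * r / rc) + 1.0)

def g2_fingerprint(positions, center, eta=0.05, rs=0.0, rc=6.5):
    """Radial (G2) Gaussian symmetry function for the atom at index `center`.

    positions: list of (x, y, z) coordinates for all atoms.
    Returns the sum over neighbors j of exp(-eta*(r_ij - rs)^2) * fc(r_ij).
    """
    xi = positions[center]
    total = 0.0
    for j, xj in enumerate(positions):
        if j == center:
            continue
        rij = math.dist(xi, xj)
        total += math.exp(-eta * (rij - rs) ** 2) * cosine_cutoff(rij, rc)
    return total
```

Because such fingerprints depend only on local interatomic distances, the same trained atom-centered potential can be applied to systems of different sizes, as the abstract notes.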

Experiment data files

This data is associated with the following peer reviewed publication:

Amp: A modular approach to machine learning in atomistic simulations

Cite this article

Alireza Khorshidi, Andrew A. Peterson, Amp: A modular approach to machine learning in atomistic simulations, Computer Physics Communications, November 2016, Volume 207, Pages 310-324, ISSN 0010-4655, http://dx.doi.org/10.1016/j.cpc.2016.05.010

Published in: Computer Physics Communications

Latest version

  • Version 1 (published: 2016-08-19)

    DOI: 10.17632/b3jnxf8m7c.1

    Cite this dataset

    Khorshidi, Alireza; Peterson, Andrew A. (2016), “Amp: A modular approach to machine learning in atomistic simulations”, Mendeley Data, v1, http://dx.doi.org/10.17632/b3jnxf8m7c.1

Categories

Natural Sciences

Licence

GPLv3

The files associated with this dataset are licensed under the GNU General Public License, Version 3 (GPLv3).

The GNU General Public License is a free, copyleft license for software and other kinds of works.
