September 1, 2014

Machine-learning acceleration of electronic structure calculations

Trained to predict outcomes of DFT calculations.

Despite advances in computational power and methodology, electronic structure calculations remain expensive. Calculations are typically limited to systems of tens of atoms, which restricts the sorts of questions they can answer.

We are developing open-source, ASE-compatible machine-learning software to predict the outcomes of potential-energy surface calculations. The figure to the right shows a simple training and test set of images; a network was fit to the training images and correctly predicted the energies of the test set. The software is freely available via Bitbucket and is under active development.

The package implements two basic types of machine-learning acceleration. In the simpler, a conventional neural network predicts the outcome of potential-energy calculations based solely on the Cartesian (x, y, z) coordinates of the atoms, which restricts it to identically sized systems. In the second, we implement the Behler-Parrinello approach (drawing on the work of Behler, Parrinello, Artrith, and others), which builds local symmetry functions about each atom in the simulation, making the method extensible to large system sizes.
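To make the second approach concrete, here is a minimal sketch, in plain NumPy rather than the package's actual API, of a Behler-Parrinello radial (G2) symmetry function with a cosine cutoff. The function names and parameter values (`eta`, `rs`, `rc`) are illustrative choices for this example, not defaults taken from the package.

```python
import numpy as np

def cutoff(r, rc):
    """Cosine cutoff f_c(r) that decays smoothly to zero at r = rc."""
    return np.where(r < rc, 0.5 * (np.cos(np.pi * r / rc) + 1.0), 0.0)

def g2_fingerprint(positions, i, eta=0.5, rs=0.0, rc=6.5):
    """Radial G2 symmetry function for atom i:
    G2_i = sum over j != i of exp(-eta * (R_ij - R_s)**2) * f_c(R_ij).
    `positions` is an (N, 3) array of Cartesian coordinates; eta, rs, and rc
    are example parameters, not package defaults."""
    rij = np.linalg.norm(positions - positions[i], axis=1)
    rij = np.delete(rij, i)  # exclude the atom's distance to itself
    return np.sum(np.exp(-eta * (rij - rs) ** 2) * cutoff(rij, rc))

# Example: fingerprint of the first atom in a small three-atom cluster.
atoms = np.array([[0.0, 0.0, 0.0],
                  [0.0, 0.0, 2.0],
                  [0.0, 2.0, 0.0]])
print(g2_fingerprint(atoms, i=0))
```

Because the fingerprint depends only on interatomic distances within the cutoff sphere, it is invariant to translation and rotation and does not depend on the total number of atoms, which is what allows the trained model to extend to larger systems.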

Access the code at bitbucket.org.