Presentation Title

Using machine learning techniques to compromise future runs in the search for the global minimum of chemically relevant structures.

Faculty Mentor

Dr. Groves

Start Date

17-11-2018 8:30 AM

End Date

17-11-2018 10:30 AM

Location

HARBESON 67

Session

POSTER 1

Type of Presentation

Poster

Subject Area

Engineering / Computer Science

Abstract

Machine learning has proven effective for processing large volumes of data efficiently. In this project, we aim to enhance the search scheme for the global optimization of chemical structures using the pre-existing software package Atomistic Machine-learning Package (AMP)1. Our approach tests a neural network (NN) to characterize its capabilities, comparing its predictions against DFTB+, a fast semi-empirical method. The NN is tested by varying the number of nodes and hidden layers. The comparison is quantified by an R2 value from a fit of the two calculated potential energies to a 1:1 line. Recent studies have suggested that such neural networks work best with 3 or 4 hidden layers of between 32 and 120 nodes each. However, our results appear to contradict this trend at smaller numbers of nodes with three or four hidden layers. A potential explanation is our use of organized clusters together with a train/test cycle: the network is trained on one cluster of molecules, and each trained cluster is then tested on the remaining structures. Further demonstrations of this methodology will be presented in this poster.
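As an illustration of the comparison metric described above, the sketch below computes an R2 value for NN-predicted energies against a reference method using the 1:1 (identity) line, so that R2 = 1 only when the two methods agree exactly. This is a minimal standalone sketch: the function name `r2_to_identity` and the sample energies are hypothetical and are not part of AMP or DFTB+.

```python
import numpy as np

def r2_to_identity(e_ref, e_pred):
    """R^2 of predicted energies measured against the 1:1 line (y = x).

    Residuals are taken from the identity line rather than from a fitted
    regression line, so any systematic offset between the two methods
    lowers the score.
    """
    e_ref = np.asarray(e_ref, dtype=float)
    e_pred = np.asarray(e_pred, dtype=float)
    ss_res = np.sum((e_pred - e_ref) ** 2)          # deviation from y = x
    ss_tot = np.sum((e_ref - e_ref.mean()) ** 2)    # total variance in reference
    return 1.0 - ss_res / ss_tot

# Hypothetical potential energies (eV) for a handful of structures:
e_dftb = [-1.20, -0.85, -0.50, -0.10]   # reference method
e_nn   = [-1.18, -0.88, -0.49, -0.12]   # neural-network prediction
print(r2_to_identity(e_dftb, e_nn))
```

Pinning the fit to the 1:1 line, rather than allowing a free slope and intercept, is a deliberately stricter measure: it rewards a network only for reproducing the reference energies themselves, not merely their ordering.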
