Second-Derivative Based Searches

At present there are nine possible single-ended search types that employ Hessian calculation (or update) and full diagonalisation. They are specified by the SEARCH keyword with a parameter represented by the internal integer variable INR, with allowed values ranging from 0 to 8. The two most common types are 0, for minimisation, and 2, for transition state searching. The steps have the same form in both cases,[73] except that we go downhill in every direction when minimising, and uphill in one direction when looking for a transition state. For these search types convergence to stationary points with the wrong Hessian index (i.e. not 0 or 1 negative Hessian eigenvalues, respectively) is detected and countermeasures are taken as specified below.
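The shared step construction is easiest to see in the Hessian eigenbasis. The following Python fragment is a schematic illustration only: it uses a plain -F_i/|lambda_i| step per mode rather than the exact shifted step of ref. [73], and omits trust-radius scaling. Downhill along every eigenvector gives a minimisation step; flipping the sign along one chosen mode gives a transition state step.

```python
import numpy as np

def ef_step(grad, hess, ts_mode=None, eps=1e-8):
    """Schematic eigenvector-following step (not the exact OPTIM formula).

    Step downhill (-F_i/|lam_i|) along every Hessian eigenvector, except
    that for a transition state search we step uphill (+F_i/|lam_i|)
    along the single chosen mode `ts_mode`.
    """
    lam, vecs = np.linalg.eigh(hess)      # eigenvalues in ascending order
    F = vecs.T @ grad                     # gradient components per mode
    step = np.zeros_like(grad)
    for i, (li, Fi) in enumerate(zip(lam, F)):
        if abs(li) < eps:                 # skip zero eigenvalues
            continue
        sign = +1.0 if (ts_mode is not None and i == ts_mode) else -1.0
        step += sign * (Fi / abs(li)) * vecs[:, i]
    return step
```

On a quadratic surface this step reaches the stationary point in one move: downhill in every direction for a minimum, uphill along the negative mode for a saddle.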

The eigenvector to be followed 'uphill' in an eigenvector-following transition state search is specified by an integer following the keyword MODE in odata. Modes are counted in ascending order of eigenvalue: MODE 1 selects the eigenvector corresponding to the smallest eigenvalue, MODE 2 the second smallest, and so on. The value specified by MODE is only effective on the first step: for subsequent steps the critical eigenvector is chosen by a maximum overlap criterion, using the dot product with the eigenvector that was followed at the previous step. MODE 0 means that the eigenvector corresponding to the smallest Hessian eigenvalue is chosen at every step.

Quite often, especially if we are taking large steps in a region far from convergence, the largest modulus dot product may fall well below unity and give rise to ambiguity. At present the program takes the following action if the maximum overlap is less than 0.7. The chosen eigenvector is simply set to the same one that was followed in the last step, where the eigenvectors are numbered in terms of the corresponding eigenvalues arranged in ascending order. Zero eigenvalues are excluded from the counting. The same action is taken if the maximum overlap is with an eigenvector lying more than 8 places above the one followed at the previous step, when arranged according to ascending eigenvalue.
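These selection rules can be sketched as follows. This is an illustrative Python fragment, not the OPTIM source; it assumes zero eigenvalues have already been removed from the ordering, as described above.

```python
import numpy as np

def choose_mode(vecs, prev_vec, prev_index, overlap_tol=0.7, max_jump=8):
    """Schematic mode selection by maximum overlap.

    `vecs[:, i]` holds the current Hessian eigenvectors sorted by
    ascending eigenvalue (zero modes already excluded); `prev_vec` and
    `prev_index` are the eigenvector and index followed at the last step.
    """
    overlaps = np.abs(vecs.T @ prev_vec)   # |dot product| with old mode
    best = int(np.argmax(overlaps))
    # Fall back to the previous index if the best overlap is ambiguous
    # (< 0.7) or lies more than 8 places above the old mode.
    if overlaps[best] < overlap_tol or best > prev_index + max_jump:
        return prev_index
    return best
```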

The behaviour for other search types is as follows. SEARCH 1 specifies a pseudo-Newton-Raphson search where the formula for the steps is the same as the eigenvector-following step for search types 0 and 2, but we search uphill or downhill depending only upon the sign of the eigenvalue corresponding to the direction in question.[73] Of course, this step tends to the conventional Newton-Raphson step near a stationary point. For SEARCH 1 convergence to stationary points of any Hessian index is allowed.
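A schematic version of this step, again a sketch rather than the exact eigenvalue-shifted formula of ref. [73]:

```python
import numpy as np

def pseudo_nr_step(grad, hess, eps=1e-8):
    """Schematic pseudo-Newton-Raphson step (SEARCH 1).

    The per-mode step -F_i/lam_i means the sign of each eigenvalue alone
    decides uphill versus downhill in that direction, so the search can
    converge to a stationary point of any Hessian index.
    """
    lam, vecs = np.linalg.eigh(hess)
    F = vecs.T @ grad
    mask = np.abs(lam) > eps              # exclude zero eigenvalues
    return -vecs[:, mask] @ (F[mask] / lam[mask])
```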

Search types 3 and 4 are the same as 0 and 2, except that a pseudo-third-derivative correction is applied to the step.[73] These search types are now redundant because dynamic scaling using a trust radius and separate maximum step sizes for each direction seems to work much better (see §8).

Search type 5 is the same as type 0 except that the system is rotated into principal axes first.

Search types 6 and 7 are for steepest-descent energy minimisations using the Page-McIver method[74] with analytic first and second derivatives at each step. Search type 7 can converge to a saddle point, whereas search type 6 can only converge to a true minimum.
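Within a locally quadratic region the steepest-descent equations integrate analytically in the Hessian eigenbasis, which is the idea behind the Page-McIver local quadratic approximation. The sketch below shows that integrated step; the choice of the pseudo-time t (tied in practice to a target arc length) and all OPTIM-specific details are omitted.

```python
import numpy as np

def page_mciver_step(grad, hess, t):
    """Schematic Page-McIver local-quadratic-approximation step.

    With g(x) ~ g0 + H (x - x0), the steepest-descent equation
    dx/dt = -g(x) integrates to
        x(t) - x0 = -sum_i (F_i / lam_i) (1 - exp(-lam_i t)) v_i,
    where F_i = g0 . v_i and (lam_i, v_i) are Hessian eigenpairs.
    """
    lam, vecs = np.linalg.eigh(hess)
    F = vecs.T @ grad
    step = np.zeros_like(grad)
    for li, Fi, vi in zip(lam, F, vecs.T):
        if abs(li) < 1e-8:
            step += -Fi * t * vi          # lam -> 0 limit: plain SD step
        else:
            step += -(Fi / li) * (1.0 - np.exp(-li * t)) * vi
    return step
```

For a quadratic surface this reproduces the exact steepest-descent trajectory, each mode decaying as exp(-lambda_i t).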

Search type 8 is a steepest-ascent transition state search using a modification of the Page-McIver steepest-descent algorithm.

Search types 0, 6 and 7 can be used as minimisers for pathway calculations. In this case it is possible to use a subset of eigenvalues and eigenvectors by supplying USEEV n in odata. The number of second derivative-based minimisation steps can be controlled with PATHSDSTEPS n, after which OPTIM will switch to LBFGS.
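For illustration, a hypothetical odata fragment combining these keywords might look like the following: SEARCH 0 requests a second-derivative minimisation, USEEV restricts the calculation to a 20-eigenvector subset, and PATHSDSTEPS switches to LBFGS after 50 second-derivative steps. The numerical values are placeholders, not recommendations.

```
SEARCH 0
USEEV 20
PATHSDSTEPS 50
```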

David Wales 2017-09-25