We first observe that for all tolerances, convergence is achieved using just two iterations of the LOBPCG eigensolver.
Knyazev, lobpcg.m, MATLAB implementation of the locally optimal block preconditioned conjugate gradient method, accessed 2015-12-09.
Solution of the local eigenvalue problems by LOBPCG with block size 10 and the indicated maximum number of iterations.
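A setup of this kind maps directly onto the standard SciPy interface to LOBPCG. The following minimal sketch uses an illustrative diagonal SPD test matrix (the matrix, tolerance, and iteration cap here are ours, not taken from the experiments in the paper):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import lobpcg

# Illustrative SPD test matrix with eigenvalues 1, 2, ..., 100
n = 100
A = diags([np.arange(1.0, n + 1)], [0]).tocsr()

rng = np.random.default_rng(0)
X = rng.standard_normal((n, 10))   # random initial block, block size 10

# Request the 10 smallest eigenpairs with a capped iteration count
vals, vecs = lobpcg(A, X, largest=False, maxiter=100, tol=1e-8)
print(np.sort(vals))   # close to [1, 2, ..., 10]
```

A preconditioner would be supplied through the `M` argument of `lobpcg`; it is omitted here since the test matrix is already well conditioned.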
These numerical experiments provide clear evidence for regarding LOBPCG as practically optimal within the class of preconditioned eigensolvers we consider.
In the present paper, we shall consider two methods, the preconditioned steepest descent (PSD) and the locally optimal preconditioned conjugate gradient method (LOPCG), as well as their block analogs, the block preconditioned steepest descent (BPSD) and the locally optimal block preconditioned conjugate gradient method (LOBPCG), in the form they appear in [34, 36].
A different version of the preconditioned block steepest descent is described in , where the Rayleigh-Ritz method on step 6 is split into two parts, similar to that of the LOBPCG II method of , which we discuss later.
Our second LOBPCG algorithm, Algorithm 3.2, is similar to Algorithm 3.1 but utilizes an extra set of vectors, analogous to the conjugate directions used in the standard preconditioned conjugate gradient linear solver.
We want to highlight that the main loop of Algorithm 3.2 of LOBPCG can be implemented with only one application of the preconditioner T, one matrix-vector product Bx, and one matrix-vector product Ax per iteration; see  for details.
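To make the locally optimal iteration concrete, here is a minimal single-vector sketch (B = I, an explicit preconditioner matrix T). For clarity it recomputes A-products inside the Rayleigh-Ritz step rather than doing the one-matvec bookkeeping described above; the function name and structure are ours, not from the paper:

```python
import numpy as np

def lobpcg_single(A, x, T=None, maxiter=50, tol=1e-8):
    """Single-vector locally optimal iteration (sketch, B = I).

    Each step performs Rayleigh-Ritz on span{x, w, p}, where
    w = T (A x - lam x) is the preconditioned residual and p plays
    the role of the conjugate direction from the previous step.
    """
    n = len(x)
    x = x / np.linalg.norm(x)
    if T is None:
        T = np.eye(n)          # no preconditioning by default
    p = None
    for _ in range(maxiter):
        Ax = A @ x
        lam = x @ Ax           # Rayleigh quotient
        r = Ax - lam * x
        if np.linalg.norm(r) < tol:
            break
        w = T @ r
        cols = [x, w] if p is None else [x, w, p]
        Q, _ = np.linalg.qr(np.column_stack(cols))  # orthonormal trial basis
        evals, evecs = np.linalg.eigh(Q.T @ (A @ Q))
        x_new = Q @ evecs[:, 0]    # Ritz vector for the smallest Ritz value
        # next direction: the component of x_new orthogonal to x
        p = x_new - (x @ x_new) * x
        x = x_new / np.linalg.norm(x_new)
    return lam, x
```

Dropping `p` from the trial subspace turns this into plain preconditioned steepest descent, which is exactly the structural difference between the steepest descent and locally optimal variants.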
A different version of the LOBPCG, called LOBPCG II, is described in , where the Rayleigh-Ritz method on step 6 of Algorithm 3.2 is split into two parts.
Other versions of LOBPCG are possible, e.g., the successive eigenvalue relaxation technique of  can be trivially applied to the LOBPCG.
Comparing the BPSD and LOBPCG algorithms, one realizes that the only difference is that LOBPCG uses an extra set of directions $p_j^{(i)}$ in the trial subspace of the Rayleigh-Ritz method.
A seemingly natural idea is to try to accelerate the LOPCG and LOBPCG by adding more vectors to the trial subspace.