LOBPCG

Acronym: LOBPCG
Definition: Locally Optimal Block Preconditioned Conjugate Gradient
References in periodicals archive
This estimate is not applicable to BPSD and LOBPCG as it cannot be used recursively.
1 provides us with the only presently known nonasymptotic theoretical convergence rate estimate of the LOBPCG with m > 1.
Our goal is to test PSD, BPSD, LOPCG and LOBPCG methods using multigrid preconditioners for the stiffness matrix A:
The iterations of PSD and LOBPCG are stopped if λ ...
Block versions, BPSD and LOBPCG, are each started on U ...
This is the case after 13 BPSD iterations but only 8 LOBPCG steps, which again shows the superiority of LOBPCG.
4 demonstrates several properties of LOBPCG that we also observe in other similar tests:
increasing the block size of LOBPCG accelerates the convergence of the extreme eigenpairs.
The LOBPCG in this test behaves similarly to the block Lanczos method applied to A ...
A class of such methods, in which the multigrid only appears as a black-box tool for constructing the preconditioner of the stiffness matrix and the base iterative algorithm is one of the well-known off-the-shelf preconditioned gradient methods, such as the LOBPCG method, is argued to be a reasonable choice for large-scale engineering computations.
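
As a rough illustration of this setup, the following is a minimal sketch (not taken from the quoted papers) that runs SciPy's scipy.sparse.linalg.lobpcg on a simple stiffness-like matrix with a user-supplied black-box preconditioner. A diagonal (Jacobi) operator stands in here for the multigrid preconditioner discussed in the text:

import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, lobpcg

n = 100
# 1-D Laplacian as a simple symmetric stand-in for a stiffness matrix A
A = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")

# Black-box preconditioner approximating A^{-1}; a Jacobi (diagonal) stand-in
# is used here instead of the multigrid preconditioner described in the text
inv_diag = 1.0 / A.diagonal()
M = LinearOperator((n, n), matvec=lambda x: inv_diag * np.ravel(x), dtype=float)

# Random block of initial vectors; the block size sets how many
# extreme eigenpairs are iterated on simultaneously
rng = np.random.default_rng(0)
X = rng.standard_normal((n, 4))

# Compute the 4 smallest eigenpairs of A by preconditioned block iteration
eigvals, eigvecs = lobpcg(A, X, M=M, largest=False, tol=1e-8, maxiter=500)
print(eigvals)

In the setting of the quoted papers, the operator M would instead apply a multigrid cycle for the stiffness matrix rather than a diagonal scaling.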
The LOBPCG method can be recommended as practically the optimal method within the whole class of preconditioned eigensolvers for symmetric eigenproblems.