Talk Abstract

Variants of Krylov Projection Methods for Eigenvalue Problems

Most of the successful methods for solving large eigenvalue problems
consist of combining projection-type techniques with a few other
strategies such as deflation and preconditioning. The standard
Petrov-Galerkin conditions used to define projection methods give rise
to a variety of different methods, some of which have been considered
in recent years. These methods involve two subspaces: a subspace
*K*
from which the approximate eigenvector is extracted (called the
"right" subspace) and a subspace *L* of the same dimension used to
define constraints for computing these eigenvectors (the "left"
subspace). One common choice is to take the left and right subspaces to
be the same; this leads to a standard orthogonal projection method,
exemplified by the Lanczos algorithm in the Hermitian case and the
Arnoldi algorithm in the non-Hermitian case. A somewhat intriguing
variation, derived by comparison with the GMRES algorithm for solving
linear systems, is to take *L=AK*. An (incorrect) motivation for this
approach is that it acts as a projection operator for the inverse of
the matrix. We will consider these options and a few others, which
involve inverting the operator. We will compare some of these variants
both experimentally and theoretically.
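The two Petrov-Galerkin choices described above can be illustrated with a small numerical sketch (not part of the talk, and the function names and tolerances are illustrative). An Arnoldi factorization A V_m = V_{m+1} H yields the standard Ritz values as the eigenvalues of the square Hessenberg block H_m (the choice L = K), while the choice L = AK leads to the generalized problem (H* H) y = theta H_m* y, whose eigenvalues are the harmonic Ritz values.

```python
import numpy as np

def arnoldi(A, v0, m):
    """Build an orthonormal basis V of the Krylov subspace K_m(A, v0)
    and the (m+1) x m Hessenberg matrix H satisfying A V_m = V_{m+1} H."""
    n = len(v0)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):                  # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:                 # breakdown: invariant subspace found
            return V[:, :j + 1], H[:j + 2, :j + 1]
        V[:, j + 1] = w / H[j + 1, j]
    return V, H

def ritz_values(A, v0, m):
    """Orthogonal projection (L = K): eigenvalues of the square block H_m."""
    _, H = arnoldi(A, v0, m)
    Hm = H[:-1, :]
    return np.linalg.eigvals(Hm)

def harmonic_ritz_values(A, v0, m):
    """Oblique projection (L = A K): with x = V y, the condition
    (A x - theta x) _|_ A K becomes (H* H) y = theta H_m* y."""
    _, H = arnoldi(A, v0, m)
    Hm = H[:-1, :]
    # Reduce the generalized problem to a standard one (Hm assumed invertible).
    M = np.linalg.solve(Hm.conj().T, H.conj().T @ H)
    return np.linalg.eigvals(M)
```

When the Krylov subspace captures an invariant subspace exactly, both extractions reproduce the corresponding eigenvalues; for partial subspaces the two sets of approximations generally differ, which is the comparison the talk investigates.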