Computational & Technology Resources
an online resource for computational, engineering & technology publications
Civil-Comp Proceedings
ISSN 1759-3433 CCP: 79
PROCEEDINGS OF THE SEVENTH INTERNATIONAL CONFERENCE ON COMPUTATIONAL STRUCTURES TECHNOLOGY
Edited by: B.H.V. Topping and C.A. Mota Soares
Paper 229
A Jacobi-Davidson Type Projection Method for General Nonlinear Eigenproblems
H. Voss
Section of Mathematics, Hamburg University of Technology, Germany

Full Bibliographic Reference for this paper
H. Voss, "A Jacobi-Davidson Type Projection Method for General Nonlinear Eigenproblems", in B.H.V. Topping, C.A. Mota Soares, (Editors), "Proceedings of the Seventh International Conference on Computational Structures Technology", Civil-Comp Press, Stirlingshire, UK, Paper 229, 2004. doi:10.4203/ccp.79.229
Keywords: eigenvalue, eigenvector, nonlinear eigenproblem, iterative projection method, Jacobi-Davidson method, rational eigenproblem, damped vibrations of structures.
Summary
In this paper we consider the nonlinear eigenvalue problem

T(lambda) x = 0,

where T(lambda) is a family of large and sparse n-by-n matrices depending on a parameter lambda in some domain D of the complex plane. For linear sparse eigenproblems, iterative projection methods such as the Lanczos, Arnoldi, rational Krylov or Jacobi-Davidson methods are very efficient. In these approaches one determines approximations to the wanted eigenvalues and corresponding eigenvectors from projections of the large eigenproblem onto low-dimensional subspaces which are expanded in the course of the algorithm. The small projected eigenproblems are solved by standard techniques.
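The projection idea for the linear case can be sketched as follows. This is a minimal illustration assuming NumPy, not the method proposed in the paper: an orthonormal Arnoldi basis of a Krylov subspace is built, the large problem is projected onto it, and the small projected problem is solved by a dense eigensolver. The diagonal test matrix is chosen only so that the exact eigenvalues are known.

```python
import numpy as np

def arnoldi(A, b, k):
    """Orthonormal basis V of span{b, Ab, ..., A^(k-1) b} and the
    projected matrix H = V* A V (Hessenberg leading block)."""
    n = b.size
    V = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        w = A @ V[:, j]
        for i in range(j + 1):           # Gram-Schmidt orthogonalisation
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:          # invariant subspace found early
            return V[:, :j + 1], H[:j + 1, :j + 1]
        V[:, j + 1] = w / H[j + 1, j]
    return V[:, :k], H[:k, :k]

# "large" eigenproblem A x = lambda x (diagonal here, for transparency)
n = 100
A = np.diag(np.arange(1.0, n + 1.0))
V, H = arnoldi(A, np.ones(n), 30)

# Ritz values: eigenvalues of the small projected problem
ritz = np.linalg.eigvals(H)
print(np.max(ritz.real))   # approximates the largest eigenvalue, 100
```

Already with a 30-dimensional search space the extreme Ritz values approximate the extreme eigenvalues of the 100-dimensional problem closely; this is the behaviour the nonlinear methods below aim to reproduce.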
In some sense, Ruhe [1] generalised the rational Krylov approach to sparse nonlinear eigenvalue problems by nesting the linearisation of the nonlinear problem by regula falsi and the solution of the resulting linear eigenproblem by Arnoldi's method, where the regula falsi iteration and the Arnoldi recursion are knit together. Similarly as in the rational Krylov process, he constructs a sequence of subspaces onto which the problem is projected and which are expanded in the course of the iteration.
In this paper we suggest an iterative projection method for the nonlinear eigenproblem in which the two subtasks mentioned in the last paragraph are handled separately. We order the eigenvalues in some way and determine them one after another. Inverse iteration with variable shifts converges quadratically to simple eigenvalues. Therefore, expanding the search space by the direction obtained from one step of inverse iteration, applied to the current Ritz pair, can be expected to yield fast convergence of the projection method.
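Inverse iteration with variable shifts can be sketched for a small quadratic eigenproblem as follows. This is an illustrative sketch assuming NumPy; the diagonal coefficient matrices are made-up test data chosen so that the exact eigenvalues are known, and the iteration shown is the classical nonlinear inverse iteration, not the projection method of the paper.

```python
import numpy as np

# quadratic eigenproblem T(lam) x = (lam^2 M + lam C + K) x = 0,
# diagonal so the exact eigenvalues are known: -1, -2 and -2, -3
M = np.eye(2)
C = np.diag([3.0, 5.0])
K = np.diag([2.0, 6.0])

T  = lambda lam: lam**2 * M + lam * C + K
dT = lambda lam: 2 * lam * M + C            # derivative T'(lam)

# nonlinear inverse iteration with variable shifts
lam = -0.8                                  # initial shift near eigenvalue -1
x = np.array([1.0, 1.0]) / np.sqrt(2.0)
for _ in range(15):
    u = np.linalg.solve(T(lam), dT(lam) @ x)   # inverse iteration step
    lam = lam - (x @ x) / (x @ u)              # shift (eigenvalue) update
    x = u / np.linalg.norm(u)
    if np.linalg.norm(T(lam) @ x) < 1e-12:     # residual small: converged
        break

print(lam)   # converges to the eigenvalue closest to the shift, -1
```

The error in the shift roughly squares from step to step (0.2, 0.08, 0.01, 1.5e-4, ...), which is the quadratic convergence the summary refers to.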
For linear eigenproblems it is well known that the Jacobi-Davidson method yields the same expansion of the search space as inverse iteration if the correction equation is solved exactly, and that solving it only approximately still yields fast convergence.
The same holds true for nonlinear eigenproblems if the search space is expanded by the solution t of the correction equation

(I - T'(theta) x x* / (x* T'(theta) x)) T(theta) (I - x x* / (x* x)) t = -T(theta) x,   t orthogonal to x,

and as in the linear case this equation has to be solved only approximately in a Jacobi-Davidson-like manner. For a symmetric problem allowing a minmax characterisation of its eigenvalues this was discussed in [3]. The present paper deals with the general case.