Constructive Analysis of Eigenvalue Problems in Control under Numerical Uncertainty


Pavel Osinenko*, Grigory Devadze, and Stefan Streif

Abstract: The eigenvalue problem plays a central role in linear algebra and in its applications to control and optimization methods. In particular, many matrix decompositions rely upon the computation of eigenvalue-eigenvector pairs, such as the diagonal or Jordan normal forms. Perturbation theory and various regularization techniques help address some numerical difficulties of computing eigenvectors, but they often rely on per se uncomputable quantities, such as a minimal gap between eigenvalues. In this note, the eigenvalue problem is revisited within constructive analysis, which allows numerical uncertainty to be considered explicitly. Exact eigenvectors are substituted by approximate ones in a suitable format. Examples showing the influence of computation precision are provided.

Keywords: Approximate solutions, constructive analysis, eigenvalues, eigenvectors, fundamental theorem of algebra.

1. INTRODUCTION

Let A be a complex-valued n × n matrix. Its characteristic polynomial is given as

    P_A(λ) = det(A − λI).                                        (1)

An eigenpair (v_j, λ_j) consists of a root λ_j, j ∈ {1, . . . , n}, of (1) and a vector v_j satisfying

    A v_j = λ_j v_j.                                             (2)
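As a quick illustration of (1) and (2) (this sketch is not from the paper; it merely assumes NumPy is available), one can compute eigenpairs numerically and check the defining relation (2). The check only holds up to floating-point precision, which is exactly the kind of numerical uncertainty considered in this note:

```python
import numpy as np

# Illustrative sketch (assumes NumPy; not part of the original paper):
# compute eigenpairs of a small matrix and check relation (2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)

for j in range(A.shape[0]):
    lam_j = eigvals[j]        # eigenvalue lambda_j
    v_j = eigvecs[:, j]       # corresponding eigenvector (column j)
    # A v_j and lambda_j v_j agree only up to floating-point tolerance.
    print(np.allclose(A @ v_j, lam_j * v_j))
```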

Computing eigenpairs is crucial for many control and optimization methods [1–3], because these methods are often based on certain matrix decompositions such as the diagonal normal form, the Jordan form, or the singular value decomposition (SVD). For example, SVD is used in H2-, H∞-, and µ-optimal controllers to decouple the original system into lower-dimensional ones, e.g., [4], and requires computing left and right eigenvectors. SVD is also used in deriving reduced models and controllers which preserve important system properties such as closed-loop stability, observability, or controllability [5, 6]. In model predictive control, the SVD of the cost function's Hessian is often used (see, e.g., [7]). System identification is another field of application of SVD. For instance, Zhang et al. [8] used SVD in the extended Kalman filter to cope with numerical ill-conditioning. Eigenvectors are used in computing the Jordan normal form for stability analysis of linear systems [9]. For a survey on controller design based on matrix normal forms, refer to [10].
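To make the link between the SVD and eigenvectors mentioned above concrete, the following minimal sketch (assuming NumPy; not taken from the paper or the cited works) verifies that the left singular vectors of a matrix A are eigenvectors of A Aᴴ, with the squared singular values as the corresponding eigenvalues:

```python
import numpy as np

# Minimal sketch (assumes NumPy; not part of the original paper):
# left singular vectors of A are eigenvectors of A @ A.conj().T.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0]])

U, s, Vh = np.linalg.svd(A, full_matrices=False)

for k in range(len(s)):
    lhs = (A @ A.conj().T) @ U[:, k]   # (A A^H) u_k
    rhs = (s[k] ** 2) * U[:, k]        # sigma_k^2 u_k
    print(np.allclose(lhs, rhs))       # True up to floating-point error
```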

Computing eigenvectors amounts to finding non-trivial solutions to the system of linear equations (2). The classical method of solving systems of linear equations is Gaussian elimination (see, e.g., [11]). In general, solving a system of linear equations Av = b, where A is the coefficient matrix, b is the data vector, and v is an unknown vector, amounts to comparing real numbers to zero. In computations, real numbers are represented by computer programs, or algorithms, that compute the respective approximations. There is, however, no algorithm that can decide whether α = β or α ≠ β for arbitrary real numbers α and β. Such an algorithm would be equivalent to solving the problem of deciding