Orthogonality

Orthogonality is the mathematical formalization of the geometrical property of perpendicularity, as adapted to general inner product spaces. In linear algebra, bases consisting of mutually orthogonal elements play an essential role in theoretical developments, in a broad range of applications, and in the design of practical numerical algorithms. Computations become dramatically simpler and less prone to numerical instabilities when performed in orthogonal coordinate systems. Indeed, many large-scale modern applications would be impractical, if not completely infeasible, were it not for the dramatic simplifying power of orthogonality.

The duly famous Gram–Schmidt process will convert an arbitrary basis of an inner product space into an orthogonal basis. In Euclidean space, the Gram–Schmidt process can be reinterpreted as a new kind of matrix factorization, in which a nonsingular matrix is factored as A = Q R, the product of an orthogonal matrix Q and an upper triangular matrix R; a short code sketch of this construction appears below. The Q R factorization and its generalizations are used in statistical data analysis as well as in the design of numerical algorithms for computing eigenvalues and eigenvectors. In function space, the Gram–Schmidt algorithm is employed to construct orthogonal polynomials and other useful systems of orthogonal functions.

Orthogonality is motivated by geometry, and orthogonal matrices, meaning those whose columns form an orthonormal system, are of fundamental importance in the mathematics of symmetry, in image processing, and in computer graphics, animation, and cinema, [5, 12, 72, 73]. The orthogonal projection of a point onto a subspace turns out to be the closest point or least squares minimizer, as we discuss in Chapter 5. Yet another important fact is that the four fundamental subspaces of a matrix that were introduced in Chapter 2 come in mutually orthogonal pairs. This observation leads directly to a new characterization of the compatibility conditions for linear algebraic systems known as the Fredholm alternative, whose extensions are used in the analysis of linear boundary value problems, differential equations, and integral equations, [16, 61]. The orthogonality of eigenvector and eigenfunction bases for symmetric matrices and self-adjoint operators provides the key to understanding the dynamics of discrete and continuous mechanical, thermodynamical, electrical, and quantum mechanical systems.

One of the most fertile applications of orthogonal bases is in signal processing. Fourier analysis decomposes a signal into its simple periodic components (sines and cosines), which form an orthogonal system of functions, [61, 77]. Modern digital media, such as CDs, DVDs, and MP3s, are based on discrete data obtained by sampling a physical signal. The Discrete Fourier Transform (DFT) uses orthogonality to decompose the sampled signal vector into a linear combination of sampled trigonometric functions (or, more accurately, complex exponentials); a second sketch below illustrates this decomposition. Basic data compression and noise removal algorithms are applied to the discrete Fourier coefficients.
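To make the Gram–Schmidt and Q R connection concrete, here is a minimal sketch in Python with NumPy, not taken from the text: it uses the classical variant of Gram–Schmidt for clarity (the modified variant is usually preferred in practice for numerical stability), and assumes the columns of A are linearly independent. The function name gram_schmidt_qr and the sample matrix are illustrative choices.

```python
import numpy as np

def gram_schmidt_qr(A):
    """Classical Gram-Schmidt on the columns of A, returning Q (orthonormal
    columns) and upper triangular R with A = Q R."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].astype(float)
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # coefficient of a_j along q_i
            v -= R[i, j] * Q[:, i]        # subtract that component
        R[j, j] = np.linalg.norm(v)       # length of the orthogonal remainder
        Q[:, j] = v / R[j, j]             # normalize
    return Q, R

A = np.array([[1., 1., 0.],
              [1., 0., 1.],
              [0., 1., 1.]])
Q, R = gram_schmidt_qr(A)
print(np.allclose(Q @ R, A))             # True: A = Q R
print(np.allclose(Q.T @ Q, np.eye(3)))   # True: columns of Q are orthonormal
```

The same loop, read column by column, is exactly the Gram–Schmidt process applied to the basis formed by the columns of A; the entries of R simply record the coefficients the process computes along the way.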
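The function space construction mentioned above can be illustrated the same way. The following sketch assumes the inner product ⟨f, g⟩ = ∫ f(x) g(x) dx over [−1, 1] and applies Gram–Schmidt to the monomials 1, x, x²; the helper name inner is introduced here for illustration. The result is, up to scaling, the first three Legendre polynomials.

```python
import numpy as np
from numpy.polynomial import polynomial as P

def inner(f, g):
    """<f, g> = integral of f*g over [-1, 1]; f, g are coefficient arrays
    in ascending order (numpy.polynomial.polynomial convention)."""
    antideriv = P.polyint(P.polymul(f, g))
    return P.polyval(1.0, antideriv) - P.polyval(-1.0, antideriv)

# Monomials 1, x, x^2 as coefficient arrays
monomials = [np.array([1.0]), np.array([0.0, 1.0]), np.array([0.0, 0.0, 1.0])]

orthogonal = []
for p in monomials:
    q = p
    for u in orthogonal:
        coef = inner(q, u) / inner(u, u)  # component of q along u
        q = P.polysub(q, coef * u)        # subtract it
    orthogonal.append(q)

for q in orthogonal:
    print(np.round(q, 6))   # 1;  x;  x^2 - 1/3  (proportional to Legendre P0, P1, P2)
```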
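The claim that the orthogonal projection of a point onto a subspace is its least squares minimizer, developed in Chapter 5, can also be checked numerically. A small sketch, assuming the columns of a matrix A span the subspace; the particular A and b are illustrative.

```python
import numpy as np

A = np.array([[1., 0.],
              [1., 1.],
              [1., 2.]])          # columns span a plane in R^3
b = np.array([6., 0., 0.])

# Least squares minimizer x* of ||A x - b||, and the resulting closest point
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
closest = A @ x_star

# Orthogonal projection onto col(A): P = A (A^T A)^{-1} A^T
proj = A @ np.linalg.inv(A.T @ A) @ A.T

print(np.allclose(proj @ b, closest))       # True: projection = least squares fit
print(np.allclose(A.T @ (b - closest), 0))  # True: residual is orthogonal to col(A)
```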
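Finally, a minimal sketch of the DFT viewed through orthogonality: the sampled complex exponentials below are mutually orthogonal, so each Fourier coefficient is just an inner product with the signal vector, which is then recovered as a linear combination of those exponentials. (NumPy's fft follows the same convention up to scaling; the sample signal is an arbitrary illustrative choice.)

```python
import numpy as np

n = 8
t = np.arange(n)
signal = np.sin(2 * np.pi * t / n) + 0.5 * np.cos(2 * np.pi * 3 * t / n)

# Sampled complex exponentials: E[k, j] = exp(2*pi*i*k*j/n), one per row
E = np.exp(2j * np.pi * np.outer(np.arange(n), t) / n)

# Orthogonality: <e_k, e_l> = n if k == l, else 0
print(np.allclose(E.conj() @ E.T, n * np.eye(n)))   # True

# Coefficients as inner products, then reconstruct the signal exactly
coeffs = E.conj() @ signal / n
reconstructed = coeffs @ E
print(np.allclose(reconstructed, signal))            # True
```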