Dual subspace learning via geodesic search on Stiefel manifold


ORIGINAL ARTICLE

Lijun Liu · Rendong Ge · Jiana Meng · Guangjie You



Received: 22 May 2013 / Accepted: 14 November 2013
© Springer-Verlag Berlin Heidelberg 2013

L. Liu (✉) · R. Ge
School of Science, Dalian Nationalities University, Dalian 116600, China
e-mail: [email protected]

J. Meng
College of Computer Science and Technology, Dalian Nationalities University, Dalian 116600, China
e-mail: [email protected]

G. You
College of Foreign Languages and Cultures, Dalian Nationalities University, Dalian 116600, China

Abstract  Oja's principal subspace algorithm is a well-known and powerful technique for learning and tracking principal information of time series. However, Oja's algorithm diverges when performing the task of minor subspace analysis. In the present paper, we transform Oja's algorithm into a dual learning algorithm that accomplishes principal subspace analysis as well as minor subspace analysis via geodesic search on the Stiefel manifold. Inherent stability is also guaranteed for the proposed geodesic-based algorithm, owing to the fact that the weight matrix rigorously evolves on the compact Stiefel manifold. The effectiveness of the proposed algorithm is further verified in the numerical simulation section.

Keywords  Principal subspace analysis · Minor subspace analysis · Dual learning · Stiefel manifold
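To illustrate the stability property claimed above, the following is a minimal sketch of ours, not the authors' algorithm: the direction G and the step size eta are placeholders. It shows why updates of the form W <- expm(eta*S) W, with S skew-symmetric, keep the weight matrix exactly on the compact Stiefel manifold St(n, p) = {W : W^T W = I}; expm(eta*S) is an orthogonal matrix, so orthonormality of the columns of W is preserved at every step.

    # Sketch only: rotating W by expm(eta * S), with S skew-symmetric,
    # preserves W^T W = I, so the iterate never leaves the Stiefel manifold.
    # The direction G is a random stand-in for a gradient-type direction.
    import numpy as np
    from scipy.linalg import expm

    rng = np.random.default_rng(0)
    n, p, eta = 8, 3, 0.1

    W, _ = np.linalg.qr(rng.standard_normal((n, p)))   # start on St(n, p)

    for _ in range(100):
        G = rng.standard_normal((n, p))   # placeholder search direction
        S = G @ W.T - W @ G.T             # skew-symmetric by construction
        W = expm(eta * S) @ W             # orthogonal rotation of W

    print(np.linalg.norm(W.T @ W - np.eye(p)))   # ~1e-14: still orthonormal

Curves of the form t -> expm(t*S) W are one standard family of geodesic search curves on the Stiefel manifold; in exact arithmetic no re-orthonormalization step is ever required, which is the sense in which the stability is inherent.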

1 Introduction

The interest in subspace-based methods stems from the fact that they split the observations into a set of desired components and a set of disturbing components, which can be viewed in terms of signal and noise subspaces. These methods have applications in numerous domains, including adaptive filtering, source localization, and parameter estimation [1–4].

The problem can be stated as follows. Consider a sequence of zero-mean n-dimensional random data vectors whose correlation matrix is denoted Cxx. The principal subspace of Cxx is the subspace spanned by the p eigenvectors of Cxx associated with the p largest eigenvalues; the minor subspace of Cxx is the subspace spanned by the p eigenvectors associated with the p smallest eigenvalues. We aim at estimating and tracking the p-dimensional principal and minor subspaces of Cxx.

In practice, a straightforward calculation of this so-called subspace weight matrix requires the eigenvalue decomposition (EVD) of the n-by-n correlation matrix Cxx. As the complexity of EVD methods [5, Chap. 7] is of order O(n^3), this approach is intractable for real-time applications. Therefore, the art of subspace tracking consists in recursively updating a matrix as close as possible to this exact solution with the lowest possible complexity.

The merger of signal processing and neural networks in the early 1990s [6] brought much attention to a method originated by Oja [7] and applied by many others [8]. This class of neural algorithms performs subspace tracking in an online manner, operating on a sample-by-sample basis and updating the subspace estimate as each new observation arrives.
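As a concrete, hypothetical illustration of the above (the dimensions, step size, and iteration count are arbitrary), the sketch below first computes the principal and minor subspaces of a correlation matrix Cxx by the O(n^3) EVD baseline, and then runs Oja's sample-by-sample subspace rule, W <- W + eta * (x y^T - W y y^T) with y = W^T x, which converges toward the principal subspace; flipping the sign of the update, a naive attempt at minor subspace analysis, makes the iterates diverge instead.

    # Sketch: EVD baseline vs. Oja's online subspace rule (illustrative sizes).
    import numpy as np

    rng = np.random.default_rng(1)
    n, p, eta = 6, 2, 0.005

    A = rng.standard_normal((n, n))
    Cxx = A @ A.T / n                       # a correlation-type matrix

    # Batch baseline: full EVD, cost O(n^3).
    _, V = np.linalg.eigh(Cxx)              # eigenvalues in ascending order
    principal = V[:, -p:]                   # basis of the principal subspace
    minor = V[:, :p]                        # basis of the minor subspace

    # Oja's subspace rule, driven one sample at a time with x ~ N(0, Cxx).
    L = np.linalg.cholesky(Cxx)
    W = 0.1 * rng.standard_normal((n, p))
    for _ in range(50000):
        x = L @ rng.standard_normal(n)
        y = W.T @ x
        W += eta * (np.outer(x, y) - W @ np.outer(y, y))

    # W now approximately spans the principal subspace (noise-limited by the
    # constant step size); the same loop with "W -= eta * (...)" blows up
    # rather than converging to the minor subspace.
    P = principal @ principal.T
    print(np.linalg.norm(W - P @ W))        # small residual

This divergence of the sign-flipped rule is exactly the failure mode, noted in the abstract, that motivates the geodesic reformulation studied in this paper.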