Manifold-Preserving Common Subspace Factorization for Feature Matching
RESEARCH ARTICLE
Weidong Yan 1 & ShaoJun Shi 1 & Wei Lin 1 & Lulu Pan 1 & Jinhuan Wen 1
Received: 17 October 2014 / Accepted: 17 September 2015 / © Indian Society of Remote Sensing 2016
Abstract A method, called Manifold-Preserving Common Subspace Factorization, is presented for feature matching. Motivated by the Graph Regularized Non-negative Matrix Factorization (GNMF) algorithm (Deng et al. 2011), we extend GNMF to a joint factorization of two feature matrices that share a common basis matrix. An iterative multiplicative updating algorithm is proposed to optimize the objective, and its convergence is guaranteed theoretically. Our feature matching algorithm operates on the new representations in the common subspace spanned by the basis vectors. Experiments are conducted on synthetic and real-world data. The results show that the manifold-preserving common subspace factorization algorithm provides better matching rates than other matrix-factorization techniques.

Keywords Nonnegative matrix factorization (NMF) · Common subspace factorization · Feature matching · Image registration
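The abstract's core idea — factorizing two feature matrices against one shared nonnegative basis and then matching on the resulting coefficient columns — can be sketched roughly as follows. This is an illustrative sketch only: it uses plain multiplicative NMF updates and omits the paper's graph-regularization (manifold-preserving) terms, and all function names are ours, not the authors'.

```python
import numpy as np

def joint_nmf(X1, X2, r, n_iter=300, eps=1e-9, seed=0):
    """Jointly factorize X1 ~ W @ H1 and X2 ~ W @ H2 with a shared
    nonnegative basis W, via standard multiplicative updates.
    (Sketch: the paper's graph-regularization terms are omitted.)"""
    rng = np.random.default_rng(seed)
    d = X1.shape[0]
    W = rng.random((d, r)) + eps
    H1 = rng.random((r, X1.shape[1])) + eps
    H2 = rng.random((r, X2.shape[1])) + eps
    X = np.hstack([X1, X2])  # both views, for the shared-basis update
    for _ in range(n_iter):
        # Update each view's coefficients with the shared basis fixed.
        H1 *= (W.T @ X1) / (W.T @ W @ H1 + eps)
        H2 *= (W.T @ X2) / (W.T @ W @ H2 + eps)
        # Update the common basis against the concatenated data.
        H = np.hstack([H1, H2])
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H1, H2

def match_features(H1, H2):
    """Match columns of H1 to columns of H2 by cosine similarity of
    their representations in the common subspace."""
    A = H1 / (np.linalg.norm(H1, axis=0, keepdims=True) + 1e-12)
    B = H2 / (np.linalg.norm(H2, axis=0, keepdims=True) + 1e-12)
    return np.argmax(A.T @ B, axis=1)
```

The shared `W` is what couples the two views: once both coefficient matrices live in the same r-dimensional subspace, matching reduces to a nearest-neighbor search over columns.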
* Weidong Yan [email protected]
1 School of Science, Northwestern Polytechnical University, Xi'an 710072, China

Introduction

In recent years, the demand for feature matching has increased significantly. Feature matching is a key component in many computer vision tasks (Xiang et al. 2006; Xiang et al. 2008), including stereo matching, image retrieval, image registration, and visual tracking. Many techniques have been proposed for feature matching; recently, matrix factorization-based correspondence methods have attracted considerable attention. For instance, Scott and Longuet-Higgins (Scott and Longuet-Higgins 1999) build an inter-image similarity matrix between the feature points of the two images being matched and show how to recover correspondences via a singular value decomposition of this matrix. The method fails when the rotation or scaling between the images is too large. To overcome this problem, Shapiro and Brady (Shapiro and Brady 1992) developed an extension of the Scott and Longuet-Higgins method, in which point sets are matched by comparing the eigenvectors of a point proximity matrix. Wang and Hancock (Wang and Hancock 2004) investigate the performance of kernel PCA with a polynomial kernel function for solving the point correspondence problem and discuss its relationship with Shapiro and Brady's correspondence method. Canonical correlation analysis (CCA) is a classical multivariate method for describing linear dependencies between sets of variables. Kuss and Graepel (Kuss and Graepel 2003) study the geometry of kernel CCA and show how the canonical correlation between configurations of points mapped into kernel feature spaces can be determined while preserving the geometry of the original method. The intrinsic structure of the data is very important for improving matching under the ass
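The SVD-based correspondence scheme attributed above to Scott and Longuet-Higgins can be sketched roughly as: build a Gaussian inter-image proximity matrix, replace its singular values by ones, and read correspondences from mutual row/column maxima of the result. This is a hedged reconstruction of the general recipe, not the authors' exact formulation; the `sigma` parameter and the mutual-maximum decision rule are our illustrative choices.

```python
import numpy as np

def slh_correspondence(P1, P2, sigma=1.0):
    """SVD-based correspondence in the Scott & Longuet-Higgins style:
    Gaussian proximity matrix -> SVD -> unit singular values ->
    mutual row/column maxima. (Illustrative sketch.)"""
    # Pairwise squared distances between the two point sets.
    D = ((P1[:, None, :] - P2[None, :, :]) ** 2).sum(-1)
    G = np.exp(-D / (2 * sigma ** 2))
    U, _, Vt = np.linalg.svd(G, full_matrices=False)
    P = U @ Vt  # "amplified" matrix with all singular values set to 1
    # i matches j when P[i, j] dominates both its row and its column.
    rows = P.argmax(axis=1)
    cols = P.argmax(axis=0)
    return [(i, j) for i, j in enumerate(rows) if cols[j] == i]
```

As the surrounding text notes, this construction relies on proximity alone, which is why large rotations or scale changes between the two images break it.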