Spherical coordinate-based kernel principal component analysis
ORIGINAL PAPER
Yitong Guo1 · Bingo Wing-Kuen Ling1

Received: 10 September 2019 / Revised: 19 August 2020 / Accepted: 25 August 2020
© Springer-Verlag London Ltd., part of Springer Nature 2020
Abstract

This paper proposes a spherical coordinate-based kernel principal component analysis (PCA), in which the kernel function is the nonlinear transform from the Cartesian coordinate system to the spherical coordinate system. In particular, first, the vectors represented in the Cartesian coordinate system are expressed in the spherical coordinate system. Then, certain rotational angles or radii of the vectors are set to their corresponding mean values. Finally, the processed vectors represented in the spherical coordinate system are expressed back in the Cartesian coordinate system. As the degrees of freedom of the processed vectors represented in the spherical coordinate system are reduced, the dimension of the manifold of the processed vectors represented in the Cartesian coordinate system is also reduced. Moreover, since the conversion between the vectors represented in the Cartesian coordinate system and those represented in the spherical coordinate system involves only some of the elements in the vectors, the required computational power for the conversion is low. Numerical simulation results show that the mean squared reconstruction error via the spherical coordinate-based kernel PCA is lower than that via the conventional PCA, and the required computational power is significantly reduced.

Keywords Kernel principal component analysis · Spherical coordinate system · Computational complexity
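The pipeline described in the abstract (convert each vector to spherical coordinates, fix some of the angles to their dataset means, and convert back) can be sketched in a few lines of NumPy. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names, the choice of fixing the last m angles (rather than the radii or other angles), and the naive arithmetic averaging of angles are all assumptions made for the sketch.

```python
import numpy as np

def to_spherical(x):
    """Convert an n-dim Cartesian vector to hyperspherical
    coordinates (r, phi_1, ..., phi_{n-1})."""
    n = x.shape[0]
    r = np.linalg.norm(x)
    phi = np.zeros(n - 1)
    for k in range(n - 2):
        tail = np.linalg.norm(x[k:])
        # Clip guards against tiny floating-point overshoot beyond [-1, 1].
        phi[k] = np.arccos(np.clip(x[k] / tail, -1.0, 1.0)) if tail > 0 else 0.0
    phi[n - 2] = np.arctan2(x[n - 1], x[n - 2])  # last angle spans (-pi, pi]
    return r, phi

def to_cartesian(r, phi):
    """Inverse transform: x_k = r * sin(phi_1)...sin(phi_{k-1}) * cos(phi_k)."""
    n = phi.shape[0] + 1
    x = np.zeros(n)
    s = 1.0  # running product of sines
    for k in range(n - 1):
        x[k] = r * s * np.cos(phi[k])
        s *= np.sin(phi[k])
    x[n - 1] = r * s
    return x

def spherical_kpca_reduce(X, m):
    """Replace the last m angles of every sample with their dataset mean,
    reducing the manifold dimension of the processed vectors by m.
    Note: plain arithmetic averaging of angles ignores the 2*pi wrap-around
    of the last angle; a circular mean would be more robust."""
    rs, phis = zip(*(to_spherical(x) for x in X))
    phis = np.array(phis)
    mean_phi = phis.mean(axis=0)
    out = np.empty_like(X)
    for i, (r, phi) in enumerate(zip(rs, phis)):
        phi2 = phi.copy()
        phi2[-m:] = mean_phi[-m:]
        out[i] = to_cartesian(r, phi2)
    return out
```

Note that each conversion touches only the elements of one vector, so the cost per sample is O(n), with no eigenvalue decomposition of a covariance matrix, which is the computational advantage the abstract claims over the conventional PCA.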
Bingo Wing-Kuen Ling, [email protected]
Yitong Guo, [email protected]

1 School of Information Engineering, Guangdong University of Technology, Guangzhou 510006, China

1 Introduction

The conventional PCA rotates the principal axes so that most of the energy of the vectors is localized in the subspace spanned by some of the rotated principal axes. Therefore, by representing the vectors using only some of the rotated principal axes, the dimensions of the vectors are reduced. For some high-dimensional signals [1–3], such as digital images, the conventional PCA can significantly reduce the computational power required for further processing. To perform the rotations of the principal axes, a unitary transform is applied to the vectors. Since the unitary transform is linear, it does not help to improve the classification accuracy for nonlinearly separable pattern recognition problems. To address these issues, the kernel PCA [4] was proposed. In particular, nonlinear operators are first applied to the vectors. Then, the conventional PCA is applied to these nonlinearly processed vectors. However, the eigenvalue decomposition is required to be performed on the covariance matrix of these vectors [5] for both the conventional PCA and the kernel PCA. Nevertheless, the required computational power for performing