Nonlinear Regression on Riemannian Manifolds and Its Applications to Neuro-Image Analysis
1 Department of CISE, University of Florida, Gainesville, Florida, USA
{monami,rudrasis,vemuri}@cise.ufl.edu
2 Department of Applied Physiology and Kinesiology, University of Florida, Florida, USA
{eofori,vcourt}@ufl.edu
Abstract. Regression in its most common form, where the independent and dependent variables are in Rn, is a ubiquitous tool in science and engineering. Recent advances in medical imaging have led to the widespread availability of manifold-valued data, giving rise to problems where the independent variables are manifold-valued and the dependent variables are real-valued, or vice versa. The most common method of regression on a manifold is geodesic regression, the counterpart of linear regression in Euclidean space. Often, however, the relation between the variables is highly complex, and geodesic regression can prove inaccurate; in such cases a non-linear regression model is necessary. In this work we present a novel kernel-based non-linear regression method for estimating mappings of the form M → Rn or Rn → M, where M is a Riemannian manifold. A key advantage of this approach is that the manifold-valued data are not required to inherit an ordering from the data in Rn. We present several synthetic and real data experiments, along with comparisons to the state-of-the-art geodesic regression method in the literature, validating the effectiveness of the proposed algorithm.
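For readers unfamiliar with kernel regression, the Euclidean version of the underlying idea (Nadaraya–Watson smoothing) can be sketched as follows. This is a generic illustration only, not the paper's manifold formulation; the function and parameter names are hypothetical.

```python
import numpy as np

def nw_kernel_regress(x_train, y_train, x_query, bandwidth=0.3):
    """Nadaraya-Watson kernel regression with a Gaussian kernel.

    Euclidean illustration only; the paper generalizes this kind of
    kernel weighting to manifold-valued inputs or outputs.
    """
    # Pairwise squared distances between query and training inputs
    d2 = (x_query[:, None] - x_train[None, :]) ** 2
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))   # Gaussian kernel weights
    w /= w.sum(axis=1, keepdims=True)          # normalize per query point
    return w @ y_train                         # locally weighted average

# Noisy samples of a nonlinear function
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 100)
y = np.sin(x) + 0.1 * rng.standard_normal(100)

# Smooth estimate of the regression function at the training inputs;
# errors are largest near the boundary, a known bias of this estimator.
y_hat = nw_kernel_regress(x, y, x)
```

The estimate at each query point is a kernel-weighted average of the training responses, which is exactly the kind of averaging that must be replaced by an intrinsic (Fréchet) mean when the responses live on a manifold.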
This research was funded in part by the NIH grant NS066340 to BCV. Corresponding author.

© Springer International Publishing Switzerland 2015. M. Banerjee et al., in: N. Navab et al. (Eds.): MICCAI 2015, Part I, LNCS 9349, pp. 719–727, 2015. DOI: 10.1007/978-3-319-24553-9_88

1 Introduction

Regression is an essential tool of quantitative analysis for finding the relation between independent and dependent variables: given a training set of paired observations, we seek a mapping between them. When both variables lie in Euclidean space and are linearly related, i.e., yi = a xi + b for a set {xi, yi}, a common way to solve for the unknowns a and b is the linear least-squares estimator, which minimizes the sum of squared residuals over the training set. In many real applications, however, the relation is seldom linear, so a non-linear least-squares estimator or a more sophisticated regression tool such as Support Vector Regression [4] is used.

Often, either the independent or the dependent variables are manifold-valued, i.e., they lie on a smooth Riemannian manifold. In such instances, embedding the manifold-valued variables in Euclidean space (e.g., via the Whitney embedding [1]) might result in a poor estimate of the underlying model. Moreover, since a general manifold lacks a global vector-space structure, a linear combination of points on the manifold need not lie on the manifold. For example, if the data points lie in Kendall's shape space [14], an arbitrary linear combination of shapes is in general not a valid shape.
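The failure of linear averaging on a manifold is easy to check numerically. In this toy example (not from the paper), the Euclidean mean of two points on the unit sphere S2 falls strictly inside the sphere, while the intrinsic (geodesic) midpoint along the great circle remains on it:

```python
import numpy as np

# Two points on the unit sphere S^2, a simple Riemannian manifold
p = np.array([1.0, 0.0, 0.0])
q = np.array([0.0, 1.0, 0.0])

# Euclidean (arithmetic) mean leaves the manifold:
euclidean_mean = 0.5 * (p + q)
print(np.linalg.norm(euclidean_mean))   # ~0.707: inside the sphere

# The geodesic midpoint (spherical interpolation at t = 1/2)
# walks half the angle between p and q along the great circle:
theta = np.arccos(np.clip(p @ q, -1.0, 1.0))
midpoint = (np.sin(theta / 2) * p + np.sin(theta / 2) * q) / np.sin(theta)
print(np.linalg.norm(midpoint))         # 1.0: a valid point on the sphere
```

This is precisely why intrinsic constructions such as geodesics and Fréchet means, rather than linear combinations, are used for statistics on manifolds.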