Jensen Bregman LogDet Divergence Optimal Filtering in the Manifold of Positive Definite Matrices

Abstract. In this paper, we consider the problem of optimal estimation of a time-varying positive definite matrix from a collection of noisy measurements. We assume that this positive definite matrix evolves according to an unknown GARCH (generalized auto-regressive conditional heteroskedasticity) model whose parameters must be estimated from experimental data. The main difficulty here, compared against traditional parameter estimation methods, is that the estimation algorithm should take into account the fact that the matrix evolves on the PD manifold. As we show in the paper, measuring the estimation error using the Jensen Bregman LogDet (JBLD) divergence leads to computationally tractable (and in many cases convex) problems that can be efficiently solved using first-order methods. Further, since it is known that this metric provides a good surrogate of the Riemannian manifold metric, the resulting algorithm respects the non-Euclidean geometry of the manifold. In the second part of the paper we show how to exploit this model in a maximum likelihood setup to obtain optimal estimates of the unknown matrix. In this case, the use of the JBLD metric allows for obtaining an alternative representation of Gaussian conjugate priors that results in closed-form solutions for the maximum likelihood estimate. In turn, this leads to computationally efficient algorithms that take into account the non-Euclidean geometry. These results are illustrated with several examples using both synthetic and real data.

Keywords: GARCH model · Jensen Bregman LogDet divergence · Covariance feature · Manifold · Optimal filter
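For reference, the JBLD divergence between two symmetric positive definite matrices X and Y is J(X, Y) = logdet((X + Y)/2) − (1/2) logdet(XY). The following is a minimal sketch of this computation in Python/NumPy; it is our own illustration of the standard formula, not code from the paper, and the function name jbld is an assumption.

```python
import numpy as np

def jbld(X, Y):
    """Jensen Bregman LogDet divergence between two SPD matrices:
    J(X, Y) = logdet((X + Y) / 2) - (1/2) * logdet(X @ Y).
    """
    # slogdet avoids the overflow/underflow of det() for larger matrices
    _, ld_mid = np.linalg.slogdet((X + Y) / 2.0)
    _, ld_x = np.linalg.slogdet(X)
    _, ld_y = np.linalg.slogdet(Y)
    # logdet(X Y) = logdet(X) + logdet(Y)
    return ld_mid - 0.5 * (ld_x + ld_y)

# Example: the divergence is zero iff X == Y
A = np.array([[2.0, 0.5], [0.5, 1.0]])
print(jbld(A, A))          # ~0.0
print(jbld(A, np.eye(2)))  # > 0
```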

1 Introduction

Covariance matrices are ubiquitous in computer vision, in problems ranging from tracking [7,8,16–18,23,29,30,32] to object detection [27,28], person re-identification [11], activity recognition [15], face recognition [21] and Diffusion Tensor Imaging (DTI) [9,22].

This work was supported in part by NSF grants IIS-1318145 and ECCS-1404163; AFOSR grant FA9550-15-1-0392; and the Alert DHS Center of Excellence under Award Number 2013-ST-061-ED0001.

Fig. 1. Two examples where covariance features are used to describe a target. On the left, the appearance of the target car has roughly constant covariance. On the right, the covariance of the appearance of the spinning ball changes over time.

Applications outside the computer vision field include economics [3], fault detection [20] and power systems [6]. Most of these applications require estimating the present value of a covariance matrix from a combination of noisy measurements and past historical data, with the main difficulty here arising from the need to account for the fact that these matrices evolve on a Riemannian manifold. For example, [23] proposed to use as covariance estimate the Karcher mean of the measurements, as a counterpart to the use of the arithmetic mean update in Euclidean space. However,
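As context for the Karcher mean mentioned above: under the affine-invariant Riemannian metric it has no closed form and is typically computed by a fixed-point iteration (Riemannian gradient descent). The sketch below is our own illustration of that standard iteration, not the algorithm of [23]; the function name karcher_mean, the initialization, and the stopping rule are assumptions.

```python
import numpy as np
from scipy.linalg import expm, logm, sqrtm, inv

def karcher_mean(mats, max_iter=100, tol=1e-10):
    """Fixed-point iteration for the Karcher mean of SPD matrices
    under the affine-invariant metric:
        X <- X^{1/2} expm( mean_i logm(X^{-1/2} A_i X^{-1/2}) ) X^{1/2}
    """
    X = sum(mats) / len(mats)  # arithmetic mean as a starting point
    for _ in range(max_iter):
        Xh = np.real(sqrtm(X))   # matrix square root (real for SPD input)
        Xih = inv(Xh)
        # average of the log-maps of the samples at the current iterate
        T = sum(np.real(logm(Xih @ A @ Xih)) for A in mats) / len(mats)
        X = Xh @ expm(T) @ Xh
        if np.linalg.norm(T, "fro") < tol:  # gradient-norm stopping rule
            break
    return X

# Example: for commuting matrices this reduces to the geometric mean
A = np.diag([1.0, 4.0])
B = np.diag([4.0, 1.0])
print(karcher_mean([A, B]))  # ~ diag(2, 2)
```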