Adjustment of Gauss-Helmert Models with Autoregressive and Student Errors

Abstract

In this contribution, we extend the Gauss-Helmert model (GHM) with t-distributed errors (previously established by K.R. Koch) by including autoregressive (AR) random deviations. This model allows us to take into account unknown forms of colored noise as well as heavy-tailed white noise components within observed time series. We show that this GHM can be adjusted in principle through constrained maximum likelihood (ML) estimation, and also conveniently via an expectation maximization (EM) algorithm. The resulting estimator is self-tuning in the sense that the tuning constant, which occurs here as the degree of freedom of the underlying scaled t-distribution and which controls the thickness of the tails of that distribution's probability density function, is adapted optimally to the actual data characteristics. We use this model and algorithm to adjust 2D measurements of a circle within a closed-loop Monte Carlo simulation and subsequently within an application involving GNSS measurements.

Keywords

Autoregressive process · Circle fitting · Constrained maximum likelihood estimation · Expectation maximization algorithm · Gauss-Helmert model · Scaled t-distribution · Self-tuning robust estimator

1 Introduction

B. Kargoll () · M. Omidalizarandi · H. Alkhatib
Geodetic Institute, Leibniz University Hannover, Hannover, Germany
e-mail: [email protected]

When the deterministic model used to approximate observations is characterized by condition equations in which multiple observations and unknown parameters are linked with each other, an adjustment by means of the Gauss-Helmert model (GHM) is often the procedure of choice. The classical formulation of that method, being based on the method of least squares, does not require the specification of a probability density function (pdf) for the random deviations or the observables. When the observables are outlier-afflicted or heavy-tailed, this least-squares approach can be expected to break down. However, it can be turned into an outlier-resistant ('robust') procedure by including a re-weighting

or variance-inflation scheme based on a heavy-tailed error law such as Student's t-distribution (Koch 2014a,b). This procedure is implemented as an expectation maximization (EM) algorithm, which allows for the estimation not only of the parameters within the condition equations, but also of the scale factor and degree of freedom of the underlying t-distribution. The latter feature turns the method into a self-tuning robust estimator in the sense of Parzen (1979). An additional common characteristic of observables that complicates their adjustment is autocorrelation or colored noise, phenomena that frequently occur with electronic instruments measuring at a high sampling rate (cf. Kuhlmann 2003). When the data covariance matrix is unknown or too large, autoregressive (AR) or AR moving average (ARMA) processes enable a parsimonious modeling of correlations in situations when the measurements can be treated as a time series (e.g., Schuh 2003). Such processes
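To make the two noise features discussed above concrete, the following minimal Python sketch (all numerical values are illustrative and not taken from the paper) simulates an AR(1) process driven by scaled t-distributed white noise, recovers the AR coefficient from the empirical autocovariances via the Yule-Walker relation, whitens the series with the estimated filter, and then computes t-distribution-based weights of the kind used in an EM re-weighting step, so that observations with large standardized residuals are down-weighted:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy error model: AR(1) colored noise driven by scaled t-distributed
# white noise (illustrative choices: alpha = 0.6, nu = 5, sigma = 0.1).
n, alpha, nu, sigma = 2000, 0.6, 5.0, 0.1
u = sigma * rng.standard_t(nu, size=n)      # heavy-tailed white noise
e = np.empty(n)
e[0] = u[0]
for t in range(1, n):
    e[t] = alpha * e[t - 1] + u[t]          # AR(1) recursion -> colored noise

# Step 1: estimate the AR(1) coefficient via the Yule-Walker relation
# alpha_hat = gamma(1) / gamma(0) with empirical autocovariances.
e0 = e - e.mean()
alpha_hat = (e0[1:] @ e0[:-1]) / (e0 @ e0)

# Step 2: decorrelate (whiten) the series with the estimated filter.
u_hat = e[1:] - alpha_hat * e[:-1]

# Step 3: t-based re-weighting as in one EM iteration. A rough robust
# scale estimate uses the median of squared residuals divided by the
# median of the chi-squared(1) distribution (~0.4549).
s2 = np.median(u_hat**2) / 0.4549
w = (nu + 1.0) / (nu + u_hat**2 / s2)       # small weight for large residuals

print(f"alpha_hat = {alpha_hat:.3f}")
print(f"weight range: [{w.min():.2f}, {w.max():.2f}]")
```

In the actual algorithm the degree of freedom nu is itself estimated from the data rather than fixed, which is what makes the estimator self-tuning; the sketch fixes nu only to keep the weighting step visible in isolation.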