The BAB algorithm for computing the total least trimmed squares estimator



ORIGINAL ARTICLE

Zhipeng Lv¹ · Lifen Sui¹

Received: 21 January 2020 / Accepted: 26 August 2020
© Springer-Verlag GmbH Germany, part of Springer Nature 2020

Abstract

Robust estimation in the errors-in-variables (EIV) model remains a difficult problem because of leverage points and the masking and swamping effects. In this contribution, a new robust estimator is introduced for the EIV model. The method is a follow-up to least trimmed squares, which applies to the Gauss–Markov model when only the observation vector contains outliers. We call this estimator the total least trimmed squares (TLTS) estimator because its criterion function consists of squared orthogonal residuals. The TLTS estimator excludes the largest squared orthogonal residuals from the criterion function, thereby allowing the fit to ignore outliers. The TLTS estimator inherits appropriate equivariance properties, namely regression, scale and affine equivariance, and attains the maximal 50% asymptotic breakdown point with respect to the observations y under a special cofactor matrix structure. The TLTS estimate can be obtained directly by exhaustive evaluation. We further develop another algorithm for the TLTS estimator, based on the branch-and-bound method, that avoids exhaustive evaluation, although the cofactor matrix of the independent variables then needs to have a certain block structure. Finally, two simulation studies provide insights into the robustness and efficiency of the proposed algorithms.

Keywords Errors-in-variables model · Robust estimation · Total least trimmed squares · Exhaustive evaluation · Branch-and-bound · High breakdown point · Equivariance
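The exhaustive-evaluation idea described in the abstract can be illustrated with a minimal sketch (this is not the authors' BAB algorithm): fit a TLS solution to every size-h subset of the observations and keep the fit whose sum of squared orthogonal residuals over its own subset is smallest. All function and variable names here are illustrative, and an identity cofactor matrix is assumed.

```python
import numpy as np
from itertools import combinations

def tls_fit(A, y):
    # Total least squares via the SVD of the augmented matrix [A | y]:
    # the solution comes from the right singular vector belonging to
    # the smallest singular value.
    M = np.hstack([A, y.reshape(-1, 1)])
    v = np.linalg.svd(M)[2][-1]
    if abs(v[-1]) < 1e-12:          # no finite TLS solution
        return None
    return -v[:-1] / v[-1]

def orth_residuals_sq(A, y, x):
    # Squared orthogonal distances of the points (a_i, y_i) to the
    # fitted hyperplane a^T x = y.
    r = A @ x - y
    return r**2 / (1.0 + x @ x)

def tlts_exhaustive(A, y, h):
    # Enumerate all h-subsets and keep the TLS fit whose sum of squared
    # orthogonal residuals over its own subset is smallest.
    best_obj, best_x = np.inf, None
    for idx in combinations(range(len(y)), h):
        idx = list(idx)
        x = tls_fit(A[idx], y[idx])
        if x is None:
            continue
        obj = orth_residuals_sq(A[idx], y[idx], x).sum()
        if obj < best_obj:
            best_obj, best_x = obj, x
    return best_x

# Toy example: a straight line through the origin with two gross outliers;
# trimming down to h = 4 of the 6 points recovers the slope 2.
A = np.arange(1.0, 7.0).reshape(-1, 1)
y = 2.0 * A.ravel()
y[0] += 10.0                        # outlier
y[1] -= 8.0                         # outlier
x_hat = tlts_exhaustive(A, y, 4)
```

Since the clean points lie exactly on the line, the trimmed criterion is minimized (at zero) by the outlier-free subset, and the slope is recovered exactly. The cost of this brute-force search grows combinatorially with the number of observations, which is what motivates the branch-and-bound algorithm developed in the paper.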

1 Introduction

The purpose of regression analysis is to fit equations to observed variables. In classical theory, the coefficient matrix is assumed to be exactly known and only the observation vector is subject to normally distributed random errors. In this case, the most popular regression estimator is the least-squares (LS) method proposed by Gauss and Legendre (Plackett 1972). More recently, it has been recognized that real observations usually do not entirely satisfy these assumptions. In some applications, the coefficient matrix is also perturbed by other types of error, such as environmental, human and instrument errors, and an errors-in-variables (EIV) model should be adopted; the LS method is biased and inconsistent in such cases (van Huffel and Vandewalle 1991). The total least-squares (TLS) method was introduced to solve the EIV model by Golub and van Loan (1980), and their analysis

¹ Zhipeng Lv (corresponding author), [email protected]
Institute of Surveying and Mapping, Information Engineering University, Zhengzhou 450001, China

and algorithm were based on the singular value decomposition (SVD) method. van Huffel and Vandewalle (1991) presented a comprehensive description of the TLS method, from its development to its applications. Many researchers have focused on solving EIV models in the field of geodesy
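The SVD-based TLS solution referred to here can be sketched as follows: take the SVD of the augmented matrix [A | y] and read the estimate off the right singular vector belonging to the smallest singular value. The function name and the noise-free check are illustrative, not from the paper.

```python
import numpy as np

def tls_svd(A, y):
    """TLS estimate for A x ~ y with errors in both A and y.

    The right singular vector v of [A | y] for the smallest singular
    value spans the (approximate) null space, so A v[:n] + y v[n] ~ 0
    and x = -v[:n] / v[n] (Golub & van Loan, 1980).
    """
    A = np.asarray(A, float)
    y = np.asarray(y, float)
    n = A.shape[1]
    _, _, Vt = np.linalg.svd(np.hstack([A, y.reshape(-1, 1)]))
    v = Vt[-1]
    if abs(v[n]) < 1e-12:
        raise np.linalg.LinAlgError("no finite TLS solution")
    return -v[:n] / v[n]

# Noise-free sanity check: with exact data the smallest singular value is
# zero and TLS reproduces the true parameters.
A = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])
x_true = np.array([2.0, -0.5])
x_hat = tls_svd(A, A @ x_true)
```

With noisy data the smallest singular value is merely small rather than zero, and the same formula yields the TLS estimate; a single gross outlier, however, can still pull this fit arbitrarily far, which is the weakness the TLTS estimator addresses.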