Minimax Linear Estimation with the Probability Criterion under Unimodal Noise and Bounded Parameters
STOCHASTIC SYSTEMS
A. S. Arkhipov and K. V. Semenikhin

Moscow Aviation Institute, Moscow, Russia
e-mail: [email protected], [email protected]

Received December 2, 2019; revised January 23, 2020; accepted January 30, 2020
Abstract—We consider a linear regression model with a vector of bounded parameters and a centered noise vector that has an uncertain unimodal distribution but a known covariance matrix. We pose the minimax estimation problem for a linear combination of the unknown parameters under the probability criterion. The minimax estimate is obtained by minimizing a probability bound over the region of possible values of the variance and squared bias attained by all linear estimates. We establish that the resulting robust solution is less conservative than those obtained for wider classes of distributions.

Keywords: minimax estimation, probability criterion, bounded parameters, unimodal noise, worst-case distribution

DOI: 10.1134/S0005117920070024
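The abstract describes the minimax estimate as the minimizer of a probability bound expressed through the variance and squared bias of a linear estimate. The following Python sketch illustrates two such bounds under stated assumptions: the distribution-free Chebyshev bound (variance + bias^2)/eps^2 and, for an error distribution additionally assumed unimodal, the Vysochanskii–Petunin inequality applied to the deviation of the error from its mean. The function name and the particular bounds are illustrative assumptions and are not the bound derived in the paper.

import math

def error_probability_bound(variance, bias, eps, unimodal=True):
    """Upper-bound P(|estimation error| >= eps) for a linear estimate whose
    error has the given variance and (absolute) bias.

    Chebyshev/Markov on the squared error gives (variance + bias**2) / eps**2.
    Under the extra assumption that the error is unimodal, the
    Vysochanskii-Petunin inequality for the deviation from the mean can
    tighten the bound once eps exceeds the bias.
    """
    # Distribution-free bound: P(|e| >= eps) <= E[e^2] / eps^2.
    cheb = min(1.0, (variance + bias**2) / eps**2)
    if not unimodal:
        return cheb

    r = eps - abs(bias)   # |e| >= eps implies |e - E[e]| >= eps - |bias|
    if r <= 0:
        return cheb       # threshold does not exceed the bias; no improvement
    if r >= math.sqrt(8.0 / 3.0 * variance):
        vp = 4.0 * variance / (9.0 * r**2)                # main branch
    else:
        vp = 4.0 * variance / (3.0 * r**2) - 1.0 / 3.0    # small-threshold branch
    return min(cheb, max(0.0, min(1.0, vp)))

# Example: variance 0.25, bias 0.1, threshold 1.0
print(error_probability_bound(0.25, 0.1, 1.0))           # unimodal bound
print(error_probability_bound(0.25, 0.1, 1.0, False))    # Chebyshev only

The comparison shows the qualitative point made in the abstract: restricting the class of noise distributions (here, to unimodal ones) yields a smaller probability bound and hence a less conservative guarantee.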
1. INTRODUCTION

Minimax statements of estimation problems arise when one needs to construct an estimate that has the best accuracy under the worst-case combination of uncertain characteristics of the observation model. With this interpretation, the estimation problem is formulated as an optimization problem whose objective is to choose an estimate minimizing the maximum value of the error. In choosing an optimization formulation, it is necessary to take into account that wide classes of uncertainty lead to overly conservative and therefore inefficient statistical decisions. It is therefore important to look for minimax statements that yield estimates combining the properties of robustness and efficiency. One possible solution to this problem is a comparative analysis of the minimax decisions corresponding to different classes of distributions.

In this work, the main quality criterion for estimates is the error probability, i.e., the probability of the estimation error exceeding a given threshold. This probability criterion was used by Bahadur to define a special concept of asymptotic efficiency [1]. In contrast to the classical approach proposed by Fisher and developed by Rao and Cramér, this concept is based on a comparison of error probabilities rather than mean squared errors [2]. Non-asymptotic bounds for the error probability have been obtained in statistical problems of pattern recognition and machine learning [3]. For constructing minimax linear estimates of scalar parametric functions under the probability criterion, the generalized Chebyshev inequality was used in [4].

To estimate multidimensional parameters, another approach close to minimizing the error probability is confidence estimation, where the problem is to construct the smallest confidence region. For models containing both Gaussian noise and uncertain parameters, a nonlinear confidence estimation method was developed in [5, 6]. For st
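As a rough illustration of the kind of optimization discussed above, the sketch below chooses the weights of a linear estimate by minimizing a distribution-free Chebyshev-type bound on the error probability, under assumed model details: a norm-bounded parameter vector (||theta|| <= rho), a zero-mean noise vector with known covariance R, and a scalar parametric function ell^T theta. The function name, the choice of bound, and the optimizer are illustrative assumptions; the paper's construction for unimodal noise and the worst-case distribution is not reproduced here.

import numpy as np
from scipy.optimize import minimize

def minimax_linear_estimate(X, R, ell, rho, eps):
    """Sketch: choose weights c for the linear estimate c^T y of ell^T theta
    by minimizing a Chebyshev-type bound on P(|error| >= eps), assuming
    ||theta|| <= rho, E[noise] = 0, and Cov(noise) = R.

    Error = c^T xi + (X^T c - ell)^T theta, so
      variance          = c^T R c,
      worst-case bias^2 = rho^2 * ||X^T c - ell||^2.
    """
    def bound(c):
        var = c @ R @ c
        bias2 = rho**2 * np.sum((X.T @ c - ell)**2)
        return (var + bias2) / eps**2

    # Start from the minimum-norm unbiased weights (X^T c = ell) and let the
    # optimizer trade bias for variance.
    c0 = np.linalg.lstsq(X.T, ell, rcond=None)[0]
    res = minimize(bound, c0, method="Nelder-Mead")
    return res.x, bound(res.x)

# Toy example: 5 observations, 2 bounded parameters
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 2))
R = 0.1 * np.eye(5)
c, p_bound = minimax_linear_estimate(X, R, ell=np.array([1.0, 0.0]), rho=2.0, eps=1.0)
print(c, p_bound)

The worst-case squared bias over the ball ||theta|| <= rho is attained in the direction of X^T c - ell, which is why it enters the bound as rho^2 ||X^T c - ell||^2; smaller parameter bounds rho push the optimal weights toward lower variance at the cost of bias.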