Maximum likelihood and the maximum product of spacings from the viewpoint of the method of weighted residuals



Takuya Kawanishi
Received: 7 July 2019 / Revised: 24 April 2020 / Accepted: 29 April 2020
© SBMAC - Sociedade Brasileira de Matemática Aplicada e Computacional 2020

Abstract

In parameter estimation, the maximum-likelihood method (ML) does not work for some distributions. In such cases, the maximum product of spacings method (MPS) is used as an alternative. However, the advantages and disadvantages of the MPS, its variants, and the ML remain unclear. These methods are based on the Kullback–Leibler divergence (KLD), and we consider applying the method of weighted residuals (MWR) to it. We prove that, after transforming the KLD to an integral over [0, 1], applying the collocation method yields the ML, while applying the Galerkin method yields the MPS and Jiang's modified MPS (JMMPS); furthermore, zero boundary conditions yield the ML and JMMPS, whereas non-zero boundary conditions yield the MPS. Additionally, we establish formulas for the approximate differences among the ML, MPS, and JMMPS estimators. Our simulations for seven distributions demonstrate that, for zero-boundary-condition parameters, the ML and JMMPS achieve better bias convergence rates than the MPS; however, regarding the MSE for small samples, the relative performance of the methods varies with the distribution and parameters. For non-zero-boundary-condition parameters, the MPS outperforms the other methods: it yields an unbiased estimator and the smallest MSE among the methods. We demonstrate that, from the viewpoint of the MWR, the counterpart of the ML is the JMMPS, not the MPS. This KLD-MWR approach provides a unified view for comparing estimators and a new tool for analyzing and selecting them.

Keywords: Parameter estimation · Kullback–Leibler divergence · Bias · Mean squared error · Point collocation method · Galerkin method

Mathematics Subject Classification: 62F10 · 62F12 · 65N30

Communicated by Clémentine Prieur. We thank Maxine Garcia, PhD, from Edanz Group (www.edanzediting.com/ac) for editing a draft of this manuscript.

Takuya Kawanishi
[email protected]
Institute of Science and Engineering, Kanazawa University, Kakuma-machi, Kanazawa 920-1192, Japan


1 Introduction

The maximum-likelihood method (ML) is one of the most popular methods of parameter estimation in statistics. However, the ML does not work for some distributions. For example, Smith (1985) showed that for the three-parameter Weibull distribution, F(x) = 1 − exp[−{(x − μ)/σ}^γ], if γ < 2 then the ML estimator (MLE) is not consistent, and if γ < 1 then no MLE exists. An alternative to the MLE in such cases is the maximum product of spacings method (MPS, Cheng and Amin 1983), also known as the maximum spacing method (Ranneby 1984), in which we maximize the product of spacings instead of the likelihood. The MPS can be applied t
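To make the contrast between the two objectives concrete, the following is a minimal sketch (not part of the original paper) of the ML and MPS objective functions for the three-parameter Weibull distribution above. The MPS objective follows the standard definition of Cheng and Amin (1983): the sum of log spacings D_i = F(x_(i)) − F(x_(i−1)) of the ordered sample, with F(x_(0)) = 0 and F(x_(n+1)) = 1. All function names and parameter values here are illustrative assumptions.

```python
import math
import random

def weibull3_cdf(x, mu, sigma, gamma):
    """Three-parameter Weibull CDF: F(x) = 1 - exp(-((x - mu)/sigma)**gamma)."""
    if x <= mu:
        return 0.0
    return 1.0 - math.exp(-(((x - mu) / sigma) ** gamma))

def weibull3_pdf(x, mu, sigma, gamma):
    """Density obtained by differentiating the CDF above."""
    if x <= mu:
        return 0.0
    z = (x - mu) / sigma
    return (gamma / sigma) * z ** (gamma - 1.0) * math.exp(-(z ** gamma))

def log_likelihood(sample, mu, sigma, gamma):
    """ML objective: sum of log densities over the sample."""
    return sum(math.log(weibull3_pdf(x, mu, sigma, gamma)) for x in sample)

def mps_objective(sample, mu, sigma, gamma):
    """MPS objective: sum of log spacings of the ordered sample's CDF values,
    padded with F = 0 below and F = 1 above (Cheng and Amin 1983)."""
    u = sorted(weibull3_cdf(x, mu, sigma, gamma) for x in sample)
    u = [0.0] + u + [1.0]
    return sum(math.log(u[i + 1] - u[i]) for i in range(len(u) - 1))

# Draw a sample by inverse-transform sampling, then evaluate both objectives
# at the true parameters (values chosen arbitrarily, for illustration only).
random.seed(0)
mu, sigma, gamma = 0.0, 2.0, 1.5
sample = [mu + sigma * (-math.log(1.0 - random.random())) ** (1.0 / gamma)
          for _ in range(50)]
print(log_likelihood(sample, mu, sigma, gamma))
print(mps_objective(sample, mu, sigma, gamma))
```

Note that the MPS objective stays finite even where the likelihood is unbounded (e.g., γ < 1 with μ approaching the sample minimum), which is the practical motivation for the method.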