Implementation of a goodness-of-fit test through Khmaladze martingale transformation
Jiwoong Kim1

Received: 4 January 2019 / Accepted: 26 February 2020
© Springer-Verlag GmbH Germany, part of Springer Nature 2020
Abstract
The Khmaladze martingale transformation provides an asymptotically-distribution-free method for goodness-of-fit testing. Its usage is not restricted to testing for normality; it can also be used to test for a location-scale family of distributions, such as the logistic and Cauchy distributions. Despite these merits, the Khmaladze martingale transformation has not enjoyed the popularity it deserves because it is computationally expensive: it entails complex and time-consuming computations, including optimization, integration of a fractional function, and matrix inversion. To overcome these computational challenges, this paper proposes a fast algorithm that implements the Khmaladze martingale transformation method. To that end, the proposed algorithm is equipped with a novel strategy, named integration-in-advance, which rigorously exploits the structure of the Khmaladze martingale transformation.

Keywords Asymptotically-distribution-free · Integration-in-advance strategy · Location-scale family · Normality test
1 Introduction

A classical goodness-of-fit problem, that is, the problem of testing whether a random sample comes from a specific distribution or from a given parametric family of distributions, has long been of interest to many fields. For example, testing for normality has been crucial in the vast literature of the social and physical sciences. The existence of more than 40 tests for normality—which include the Jarque and Bera (1980) test and the D'Agostino (1971) test—well reflects the importance of the goodness-of-fit test for normality: see Dufour et al. (1998) and references therein for more detail.
Jiwoong Kim, [email protected]
1 Clinical Trial Center, Ajou University Medical Center, 164, World Cup-ro, Yeongtong-gu, Suwon 16499, Republic of Korea
In parallel with the development of the normality tests, various nonparametric tests for general distributions have also been proposed in the statistical literature. The best-known nonparametric test is the Kolmogorov–Smirnov (K–S) test. When the K–S test is used to test for a known continuous distribution, the asymptotic distribution of its test statistic does not depend on the null distribution: the K–S test is said to be asymptotically-distribution-free (ADF). However, the K–S test loses this property whenever parameters of the null distribution must be estimated: see Durbin (1973) for more detail. Consequently, critical values determined under the assumption of known parameters are no longer valid. In such a case, the Monte Carlo method is employed to compute empirical critical values; an example of a modified version of the K–S test that uses Monte-Carlo-generated critical values is the Lilliefors (1967) test. However, the Lilliefors test is designed to test for normality only. Seeking a test for gen
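The Monte Carlo approach described above can be sketched as follows. This is an illustrative example, not code from the paper: it computes a Lilliefors-type K–S statistic for normality with mean and variance estimated from the sample, then simulates the statistic's null distribution to obtain an empirical critical value. All function names here are the author's own for illustration.

```python
import numpy as np
from math import erf, sqrt

def ks_statistic_estimated_normal(x):
    """K-S distance between the empirical CDF of x and the normal CDF
    whose mean and standard deviation are estimated from x itself."""
    n = len(x)
    mu, sigma = x.mean(), x.std(ddof=1)
    z = np.sort((x - mu) / sigma)
    # Standard normal CDF evaluated at the standardized order statistics.
    cdf = np.array([0.5 * (1.0 + erf(v / sqrt(2.0))) for v in z])
    # Supremum distance, checked just before and after each jump of the ECDF.
    ecdf_hi = np.arange(1, n + 1) / n
    ecdf_lo = np.arange(0, n) / n
    return max(np.max(ecdf_hi - cdf), np.max(cdf - ecdf_lo))

def monte_carlo_critical_value(n, alpha=0.05, n_sim=5000, seed=0):
    """Simulate the null distribution of the statistic under normality
    and return the (1 - alpha) empirical quantile as the critical value."""
    rng = np.random.default_rng(seed)
    stats = [ks_statistic_estimated_normal(rng.standard_normal(n))
             for _ in range(n_sim)]
    return float(np.quantile(stats, 1.0 - alpha))

cv = monte_carlo_critical_value(n=50)
```

For n = 50 and alpha = 0.05 the simulated critical value is roughly 0.12, noticeably smaller than the classical K–S table value for the same setting; this is exactly why the ADF property is lost and tables computed under known parameters become invalid once parameters are estimated.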