Double Penalized Semi-Parametric Signed-Rank Regression with Adaptive LASSO
KWESSI Eddy
Trinity Place, Trinity University, San Antonio, TX 78258, USA. Email: [email protected].
DOI: 10.1007/s11424-020-9097-9
Received: 20 March 2019 / Revised: 24 October 2019
© The Editorial Office of JSSC & Springer-Verlag GmbH Germany 2020
This paper was recommended for publication by Editor DONG Yuexiao.

Abstract  In this paper, a semi-parametric regression model with an adaptive LASSO penalty imposed on both the linear and the nonlinear components of the model is considered. The model is rewritten so that a signed-rank technique can be used for estimation. The nonlinear part consists of a covariate that enters the model nonlinearly via an unknown function that is estimated using B-splines. The author shows that the resulting estimator is consistent under heavy-tailed distributions, and asymptotic normality results are given. Monte Carlo simulations as well as practical applications are studied to assess the validity of the proposed estimation method.

Keywords  Adaptive LASSO, B-splines, penalty, rank regression, semi-parametric.
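To fix notation, the following display sketches the kind of double-penalized criterion the abstract describes; the dispersion D_n, the Wilcoxon scores, and the adaptive weights shown here are standard choices assumed for illustration and are not quoted from the paper.

$$
(\widehat{\beta}, \widehat{\tau}) \;=\; \operatorname*{arg\,min}_{\beta,\,\tau}\;
D_n\big(y - X\beta - N\tau\big)
\;+\; \lambda_1 \sum_{j=1}^{p} w_j \lvert \beta_j \rvert
\;+\; \lambda_2 \sum_{k=1}^{K} v_k \lvert \tau_k \rvert,
\qquad
D_n(e) \;=\; \sum_{i=1}^{n} a\!\big(R(\lvert e_i \rvert)\big)\,\lvert e_i \rvert,
$$

where $R(\lvert e_i \rvert)$ is the rank of $\lvert e_i \rvert$ among $\lvert e_1 \rvert, \ldots, \lvert e_n \rvert$, $a(\cdot)$ is a score function (e.g., Wilcoxon scores $a(t) = t/(n+1)$), $N$ is the B-spline basis matrix for the nonlinear covariate, and the adaptive weights are built from a pilot estimate, e.g., $w_j = 1/\lvert \widetilde{\beta}_j \rvert$.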
1 Introduction
Nonparametric regression has proven to be a powerful tool for modeling the relationship between variables of interest. To reduce the curse of dimensionality, semi-parametric models are generally preferred to fully nonlinear models. There are many ways to estimate the nonlinear part of a semi-parametric model: local averaging, kernels, local polynomials, splines, and wavelets. With the exception of wavelets, these methods produce linear smoothers; that is, the estimate of the nonlinear function f(x) takes the form f̂(x) = N^T τ, where τ is a vector of coefficients and N is a matrix of weights that does not depend on τ. Splines, local averaging, and kernels are used most often nowadays, and there is an extensive literature on the application of these methods in semi-parametric regression. Brunk[1] proposed a regression estimation technique without the linear part, and Cleveland[2] proposed a local polynomial regression called the lowess regression; Brunk and Johansen[3] and Wright[4] designed isotonic (monotone increasing) least-squares estimators, while Wang and Huang[5] developed L1-estimators. Isotonic M-estimators were proposed by He and Shi[6], Álvarez and Yohai[7], and others. Ramsay[8] proposed a monotonic B-spline estimation, which was further developed by He and Shi[6], Lu, et al.[9–11], and Du, et al.[12]. The interested reader can refer to the book by Schumaker[13] for a good
reference on B-splines, and the one by Ruppert, et al.[14] for a thorough discussion of semi-parametric regression. The linear part of a semi-parametric regression is often estimated using a least-squares loss function, which is adequate for normally distributed errors. When the error distribution is heavy-tailed, the least-squares approach may no longer be appropriate because estimates become inaccurate. Moreover, the problem of overfitting may arise, which
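To make the linear-smoother idea above concrete, here is a minimal sketch of a B-spline fit in which the estimate is exactly of the form f̂(x) = N^T τ. It assumes SciPy ≥ 1.8 (for BSpline.design_matrix); the knot placement, the t₃ noise, and the test function sin(2πx) are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(0)

# Simulated data: y = f(x) + heavy-tailed (t_3) noise; f is unknown to the fitter.
n = 200
x = np.sort(rng.uniform(0.0, 1.0, n))
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_t(df=3, size=n)

# Cubic B-spline basis on [0, 1] with clamped (repeated) boundary knots.
k = 3
interior = np.linspace(0.1, 0.9, 9)
t = np.r_[np.zeros(k + 1), interior, np.ones(k + 1)]

# Design matrix N: one column per basis function, evaluated at the data points.
N = BSpline.design_matrix(x, t, k).toarray()

# Least-squares estimate of tau; the fit fhat = N @ tau is a linear smoother:
# for fixed knots, N does not depend on tau or y.
tau, *_ = np.linalg.lstsq(N, y, rcond=None)
fhat = N @ tau
print("max abs error vs. true f:", np.max(np.abs(fhat - np.sin(2 * np.pi * x))))
```

Because N is fixed once the knots are chosen, replacing the least-squares step with a signed-rank or penalized criterion changes only how τ is computed, not the smoother structure.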