Description
For linear regression, the traditional technique addresses the case where the number of observations n is greater than the number of predictor variables p (n > p). When n < p, the classical method fails to estimate the coefficients. This book provides a solution to this problem for the case of correlated predictors: a new regularization and variable selection method, the Sparse Ridge Fusion (SRF), is proposed. For highly correlated predictors (the grouping effect), simulated examples and a real data set show that the SRF outperforms the lasso, the elastic net, and the S-Lasso; the results also show that the SRF can select more predictor variables than the sample size n, whereas the lasso selects at most n variables.
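The final claim, that the lasso can select at most n variables when n < p while penalties with a ridge component can select more, can be illustrated with a small simulation. The sketch below is not the book's SRF estimator; it uses scikit-learn's LassoLars and ElasticNet as stand-ins, and the dimensions, correlation structure, and penalty values are illustrative assumptions.

# Minimal sketch (illustrative only, not the book's SRF method): simulate
# n < p data with highly correlated predictor groups and compare how many
# variables a LARS-based lasso versus an elastic net selects.
import numpy as np
from sklearn.linear_model import LassoLars, ElasticNet

rng = np.random.default_rng(0)
n, p = 50, 200                       # fewer observations than predictors

# Highly correlated predictors (grouping effect): 20 latent factors,
# each repeated in a block of 10 noisy copies.
latent = rng.normal(size=(n, 20))
X = np.repeat(latent, 10, axis=1) + 0.1 * rng.normal(size=(n, p))

beta = np.zeros(p)
beta[:80] = 1.0                      # 80 truly relevant, correlated predictors
y = X @ beta + rng.normal(size=n)

# LARS-based lasso: its active set can contain at most n variables.
lasso = LassoLars(alpha=0.01).fit(X, y)

# Elastic net: the ridge component removes that limit and tends to keep
# grouped predictors together, the behaviour SRF also aims to improve on.
enet = ElasticNet(alpha=0.01, l1_ratio=0.5, max_iter=10000).fit(X, y)

print("lasso selects:      ", int(np.sum(lasso.coef_ != 0)), "of", p)
print("elastic net selects:", int(np.sum(enet.coef_ != 0)), "of", p)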
About the Author
Nozad H. Mahmood is from Kurdistan and lives in Sulaymaniyah. He obtained his master's degree in Statistical Computing from the University of Central Florida College of Science in Orlando, Florida. He currently works as a statistical adviser and data analyst and is a lecturer at Sulaimani University.
Manufacturer information:
BoD - Books on Demand
In de Tarpen 42
22848 Norderstedt
DE
E-Mail: info@bod.de