Shrinkage Parameters for Each Explanatory Variable Found Via Particle Swarm Optimization in Ridge Regression


Eren Bas*
Erol Egrioglu
Vedide Rezan Uslu

Abstract

Ridge regression is an improved estimation method for regression analysis when the assumption of independence among the explanatory variables cannot be met, a situation known as the multicollinearity problem. One way to deal with multicollinearity is to give up the unbiasedness of the estimator of β: ridge regression estimates the regression coefficients with bias in order to reduce their variance. One of the most important problems in ridge regression is deciding what the value of the shrinkage parameter (k) should be. In almost all studies in the literature, k is taken as a single value. In this study, unlike those studies, we find a different k value corresponding to each diagonal element of the variance-covariance matrix of the regression coefficients, instead of a single k, by using a new algorithm based on particle swarm optimization. To evaluate its performance, the proposed method is first applied to real-life data sets and compared with other methods suggested in the ridge regression literature. Finally, two simulation studies are performed and the performance of the proposed method under different conditions is evaluated against other methods from the ridge regression literature.
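To make the idea concrete, the following is a minimal sketch, not the authors' published algorithm: it fits a generalized ridge estimator beta_hat = (X'X + K)^(-1) X'y, where K = diag(k_1, ..., k_p) carries one shrinkage parameter per explanatory variable, and searches the vector (k_1, ..., k_p) with a basic particle swarm optimizer. The hold-out mean squared error used as the fitness function and all PSO settings (swarm size, inertia weight w, acceleration constants c1 and c2, upper bound k_max) are illustrative assumptions; the abstract does not specify them.

import numpy as np

def generalized_ridge(X, y, k):
    # beta_hat = (X'X + K)^(-1) X'y with K = diag(k_1, ..., k_p)
    return np.linalg.solve(X.T @ X + np.diag(k), X.T @ y)

def fitness(k, X_tr, y_tr, X_va, y_va):
    # Illustrative objective (an assumption): mean squared error on a hold-out set.
    beta = generalized_ridge(X_tr, y_tr, k)
    resid = y_va - X_va @ beta
    return float(np.mean(resid ** 2))

def pso_ridge(X_tr, y_tr, X_va, y_va, n_particles=30, n_iter=200,
              k_max=10.0, w=0.7, c1=1.5, c2=1.5, seed=0):
    # Basic global-best PSO over non-negative shrinkage vectors k in [0, k_max]^p.
    rng = np.random.default_rng(seed)
    p = X_tr.shape[1]
    pos = rng.uniform(0.0, k_max, size=(n_particles, p))   # candidate k vectors
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([fitness(k, X_tr, y_tr, X_va, y_va) for k in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    gbest_val = pbest_val.min()
    for _ in range(n_iter):
        r1 = rng.random((n_particles, p))
        r2 = rng.random((n_particles, p))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, k_max)                # keep every k_j >= 0
        vals = np.array([fitness(k, X_tr, y_tr, X_va, y_va) for k in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        if vals.min() < gbest_val:
            gbest = pos[np.argmin(vals)].copy()
            gbest_val = vals.min()
    return gbest, generalized_ridge(X_tr, y_tr, gbest)

In practice the explanatory variables would typically be standardized before estimation, as is customary in ridge regression, and the chosen k vector would then be assessed against competing ridge estimators.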


Article Details

Bas, E., Egrioglu, E., & Uslu, V. R. (2017). Shrinkage Parameters for Each Explanatory Variable Found Via Particle Swarm Optimization in Ridge Regression. Trends in Computer Science and Information Technology, 2(1), 012–020. https://doi.org/10.17352/tcsit.000005
Research Articles

Copyright (c) 2017 Bas E, et al.

Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

Licensing and protecting author rights is the central aim and core of the publishing business. Peertechz dedicates itself to making it easier for people to share and build upon the work of others while remaining consistent with the rules of copyright. Peertechz licensing terms are formulated to facilitate reuse of the manuscripts published in its journals, to take maximum advantage of Open Access publication, and to disseminate knowledge.

We support 'libre' open access, which defines Open Access in its true sense: free-of-charge online access together with usage rights. The usage rights are granted through the use of a specific Creative Commons license.

Peertechz complies with [CC BY 4.0]

Explanation

'CC' stands for Creative Commons license. 'BY' signifies that users must give attribution to the creator when the published manuscripts are used or shared. This license allows for redistribution, commercial and non-commercial, as long as the work is passed along unchanged and in whole, with credit to the author.

Please note that Creative Commons user licenses are non-revocable. We recommend that authors check whether their funding body requires a specific license.

With this license, after publishing with Peertechz, authors may share their research by posting a free draft copy of their article to any repository or website.
'CC BY' license permissions:

License Name: CC BY 4.0
Permission to read and download: Yes
Permission to display in a repository: Yes
Permission to translate: Yes
Commercial uses of manuscript: Yes

Authors should note that the Creative Commons license is focused on making creative works available for discovery and reuse. Creative Commons licenses provide an alternative to standard copyright, allowing authors to specify the ways their works can be used without having to grant permission for each individual request. Authors who want to reserve all of their rights under copyright law should not use CC licenses.

Hoerl AE, Kennard RW (1970) Ridge regression: biased estimation for nonorthogonal problems. Technometrics 12: 55–67. Link: https://goo.gl/5ZV56T

Hoerl AE, Kennard RW, Baldwin KF (1975) Ridge regression: some simulations. Communications in Statistics 4: 105–123. Link: https://goo.gl/QGgP3L

McDonald GC, Galarneau DI (1975) A Monte Carlo evaluation of some ridge-type estimators. Journal of the American Statistical Association 70: 407–412. Link: https://goo.gl/7ZN2co

Lawless JF, Wang P (1976) A simulation study of ridge and other regression estimators. Communications in Statistics – Theory and Methods 14: 1589–1604. Link: https://goo.gl/WfUz0p

Hocking RR, Speed FM, Lynn MJ (1976) A class of biased estimators in linear regression. Technometrics 18: 425–437. Link: https://goo.gl/NEjsRY

Gunst RF, Mason RL (1977) Biased estimation in regression: an evaluation using mean squared error. Journal of the American Statistical Association 72: 616–628. Link: https://goo.gl/HfxIin

Wichern D, Churchill G (1978) A comparison of ridge estimators. Technometrics 20: 301–311. Link: https://goo.gl/U6OiUQ

Lawless JF (1978) Ridge and related estimation procedures. Communications in Statistics – Theory and Methods 7: 139–164. Link: https://goo.gl/KceYME

Nordberg L (1982) A procedure for determination of a good ridge parameter in linear regression. Communications in Statistics 11: 285–309. Link: https://goo.gl/pNqtc2

Saleh AK, Kibria BM (1993) Performances of some new preliminary test ridge regression estimators and their properties. Communications in Statistics – Theory and Methods 22: 2747–2764. Link: https://goo.gl/4XqQNd

Haq MS, Kibria BMG (1996) A shrinkage estimator for the restricted linear regression model: ridge regression approach. Journal of Applied Statistical Science 3: 301–316. Link: https://goo.gl/sjCZrw

Kibria BM (2003) Performance of some new ridge regression estimators. Communications in Statistics – Simulation and Computation 32: 419–435. Link: https://goo.gl/3OJp6a

Pasha GR, Shah MA (2004) Application of ridge regression to multicollinear data. Journal of Research Science 15: 97–106. Link: https://goo.gl/5eP2I5

Khalaf G, Shukur G (2005) Choosing ridge parameters for regression problems. Communications in Statistics – Theory and Methods 34: 1177–1182. Link: https://goo.gl/Nu1Xs4

Norliza A, Maizah HA, Robin A (2006) A comparative study on some methods for handling multicollinearity problems. Mathematika 22: 109–119. Link: https://goo.gl/Tlyqej

Alkhamisi MA, Shukur G (2007) A Monte Carlo study of recent ridge parameters. Communications in Statistics – Simulation and Computation 36: 535–547. Link: https://goo.gl/Mv2FMY

Mardikyan S, Cetin E (2008) Efficient choice of biasing constant for ridge regression. International Journal of Contemporary Mathematical Sciences 3: 527–536. Link: https://goo.gl/oOgsiH

Prago-Alejo RJ, Torre-Trevino LM, Pina-Monarrez MR (2008) Optimal determination of k constant of ridge regression using a simple genetic algorithm. Electronics, Robotics and Automotive Mechanics Conference. Link: https://goo.gl/uPVi0B

Dorugade AV, Kashid DN (2010) Alternative method for choosing ridge parameter for regression. Applied Mathematical Sciences 4: 447–456. Link: https://goo.gl/E7MYJ5

Al-Hassan Y (2010) Performance of new ridge regression estimators. Journal of the Association of Arab Universities for Basic and Applied Sciences 9: 23–26. Link: https://goo.gl/zTjoEe

Ahn JJ, Byun HW, Oh KJ, Kim TY (2012) Using ridge regression with genetic algorithm to enhance real estate appraisal forecasting. Expert Systems with Applications 39: 8369–8379. Link: https://goo.gl/TM0Udi

Uslu VR, Egrioglu E, Bas E (2014) Finding optimal value for the shrinkage parameter in ridge regression via particle swarm optimization. American Journal of Intelligent Systems 4: 142-147. Link: https://goo.gl/U06GuG

Chitsaz S, Ahmed SE (2012) Shrinkage estimation for the regression parameter matrix in multivariate regression model. Journal of Statistical Computation and Simulation 82: 309-323. Link: https://goo.gl/lDIZzU

Firinguetti L (1997) Ridge regression in the context of a system of seemingly unrelated regression equations. Journal of Statistical Computation and Simulation 56: 145-162. Link: https://goo.gl/wWROue

Halawa AM, El Bassiouni MY (2000) Tests of regression coefficients under ridge regression models. Journal of Statistical Computation and Simulation 65: 341-356. Link: https://goo.gl/LUQbtW

Dorugade AV, Kashid DN (2010) Variable selection in linear regression based on ridge estimator. Journal of Statistical Computation and Simulation 80: 1211-1224. Link: https://goo.gl/A0WJm7

Golam Kibria BM (2004) Performance of the shrinkage preliminary test ridge regression estimators based on the conflicting of W, LR and LM tests. Journal of Statistical Computation and Simulation 74: 793–810. Link: https://goo.gl/TMvLYe

Roozbeh M, Arashi M, Niroumand HA (2011) Ridge regression methodology in partial linear models with correlated errors. Journal of Statistical Computation and Simulation 81: 517-528. Link: https://goo.gl/Y2r4Nz

Simpson JR, Montgomery DC (1996) A biased-robust regression technique for the combined outlier-multicollinearity problem. Journal of Statistical Computation and Simulation 56: 1–22. Link: https://goo.gl/qgK7Fz

Uzuke CA, Mbegbu JI, Nwosu CR (2015) Performance of Kibria, Khalaf and Shukur's methods when the eigenvalues are skewed. Communications in Statistics – Simulation and Computation. Link: https://goo.gl/VdvgIo

Wong KY, Chiu SN (2015) An iterative approach to minimize the mean squared error in ridge regression. Computational Statistics 30: 625–639. Link: https://goo.gl/rdHK1p

Dorugade AV (2014) New ridge parameters for ridge regression. Journal of the Association of Arab Universities for Basic and Applied Sciences 15: 94–99. Link: https://goo.gl/wpNPWf

Khalaf G (2013) An optimal estimation for the ridge regression parameter. Journal of Fundamental and Applied Statistics 5: 11–19. Link: https://goo.gl/bo4bde

Muniz G, Golam Kibria BM, Månsson K, Ghazi S (2012) On developing ridge regression parameters: a graphical investigation. Sort-Statistics and Operations Research Transactions 36: 115-138. Link: https://goo.gl/o13EFW

Muniz G, Golam Kibria BM (2009) On some ridge regression estimators: an empirical comparisons. Communications in Statistics - Simulation and Computation 38: 621-630. Link: https://goo.gl/wqCKbh

Nomura M (1988) On the almost unbiased ridge regression estimation. Communications in Statistics – Simulation and Computation 17: 729–743. Link: https://goo.gl/5kX0MM

Montgomery DC, Peck EA, Vining GG (2006) Introduction to Linear Regression Analysis. John Wiley and Sons. Link: https://goo.gl/M3tgXY

Batah FS, Ramanathan T, Gore SD (2008) The efficiency of modified jackknife and ridge type regression estimators: a comparison. Surveys in Mathematics and its Applications 3: 111–122. Link: https://goo.gl/bw8Xcf

Hoerl AE, Kennard RW (1976) Ridge regression: iterative estimation of the biasing parameter. Communications in Statistics – Theory and Methods 5: 77–88. Link: https://goo.gl/VxoSpF

Hoerl AE, Kennard RW (1970) Ridge regression: applications to nonorthogonal problems. Technometrics 12: 69–82. Link: https://goo.gl/HKnemY

Firinguetti L (1999) A generalized ridge regression estimator and its finite sample properties. Communications in Statistics-Theory and Methods 28: 1217-1229. Link: https://goo.gl/VTzhbj

Kennedy J, Eberhart R (1995) Particle swarm optimization. In: Proceedings of the IEEE International Conference on Neural Networks, Piscataway, NJ, USA. IEEE Press, 1942–1948.

Chatterjee S, Hadi AS (2006) Regression Analysis by Example. John Wiley and Sons. Link: https://goo.gl/Dx6iqn

Gibbons DG (1981) A simulation study of some ridge estimators. Journal of the American Statistical Association 76: 131–139. Link: https://goo.gl/XMzqJs