Department of Mathematics and Statistics, P. K. Campus, Tribhuvan University, Kathmandu, Nepal
American Journal of Applied Mathematics and Statistics. 2020, Vol. 8 No. 2, 39-42
DOI: 10.12691/ajams-8-2-1
Copyright © 2020 Science and Education Publishing
Cite this paper: Noora Shrestha. Detecting Multicollinearity in Regression Analysis. American Journal of Applied Mathematics and Statistics. 2020; 8(2):39-42. doi: 10.12691/ajams-8-2-1.
Correspondence to: Noora Shrestha, Department of Mathematics and Statistics, P. K. Campus, Tribhuvan University, Kathmandu, Nepal. Email: shresthanoora@gmail.com

Abstract
Multicollinearity occurs when a multiple linear regression analysis includes several predictor variables that are significantly correlated not only with the dependent variable but also with each other. Multicollinearity can cause variables that are in fact significant to appear statistically insignificant. This paper discusses three primary techniques for detecting multicollinearity, using questionnaire survey data on customer satisfaction. The first two techniques are the correlation coefficients and the variance inflation factor; the third is the eigenvalue method. It is observed that product attractiveness is a more plausible driver of customer satisfaction than the other predictors. Furthermore, when multicollinearity is present, advanced regression procedures such as principal components regression, weighted regression, and ridge regression can be used to mitigate its effects.
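The variance inflation factor mentioned above can be computed directly from the definition: for each predictor, regress it on the remaining predictors and set VIF = 1 / (1 − R²). The following is a minimal sketch in Python with NumPy, using synthetic data (the variable names and data are illustrative, not from the paper's survey):

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X (n_samples x p).

    VIF_j = 1 / (1 - R_j^2), where R_j^2 is the coefficient of
    determination from regressing column j on the other columns
    (with an intercept). VIF > 10 is a common rule-of-thumb
    threshold for serious multicollinearity.
    """
    n, p = X.shape
    vifs = []
    for j in range(p):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])  # design matrix with intercept
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        tss = (y - y.mean()) @ (y - y.mean())
        r2 = 1.0 - (resid @ resid) / tss
        vifs.append(1.0 / (1.0 - r2))
    return np.array(vifs)

# Illustrative data: x3 is nearly a copy of x1, so x1 and x3 are collinear.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)
x3 = x1 + 0.05 * rng.normal(size=200)
X = np.column_stack([x1, x2, x3])
print(vif(X))  # VIFs for x1 and x3 are large (> 10); x2 stays near 1
```

The same design matrix also supports the eigenvalue method: eigenvalues of the predictors' correlation matrix near zero (equivalently, a large condition number) signal multicollinearity.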
Keywords