Journal of New Theory, vol. 41, pp. 1-17, 2022 (Peer-Reviewed Journal)
The least-squares estimator in a regression model is sensitive to multicollinearity and autocorrelation problems. To deal with multicollinearity, the Ridge, Liu, and other Ridge-type biased estimators have been proposed in the statistical literature. The recently proposed Kibria-Lukman estimator is one
of the Ridge-type estimators. The literature has compared the Kibria-Lukman estimator with the others using the mean square error criterion for the linear regression
model. A previous study examined the Kibria-Lukman estimator's performance under a first-order autoregressive error structure. The originality of this paper lies in evaluating the performance of the Kibria-Lukman estimator under the mean square error criterion when the errors follow a second-order autoregressive process. The scalar mean square error of the Kibria-Lukman estimator under the second-order autoregressive error structure was evaluated using
a Monte Carlo simulation and two real examples, and compared with the Generalized Least-Squares, Ridge, and Liu estimators. The findings revealed that when the model variance was small, the mean square error of the Kibria-Lukman estimator was very close to those of the popular biased estimators. As the model variance grew, the Kibria-Lukman estimator no longer tracked the popular biased estimators as closely as it did under small variance. However, according to the mean square error criterion, the Kibria-Lukman estimator outperformed the Generalized Least-Squares estimator in all cases considered.
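
For a concrete picture of the comparison described above, the sketch below simulates a collinear regression model with second-order autoregressive errors and estimates the scalar mean square error of the Generalized Least-Squares, Ridge, Liu, and Kibria-Lukman estimators over repeated samples. It is a minimal illustration in Python, not the authors' code: the design, sample size, AR(2) coefficients, collinearity level, and the plug-in choices of the biasing parameters k and d are assumptions made here, and the biased estimators are written in their standard textbook forms on the GLS-transformed (pre-whitened) model.

```python
# Minimal Monte Carlo sketch, assuming standard textbook forms of the estimators,
# computed in the pre-whitened (GLS-transformed) model with S = X'X and X'y:
#   GLS:            b    = S^{-1} X'y
#   Ridge(k):       b_R  = (S + kI)^{-1} X'y
#   Liu(d):         b_d  = (S + I)^{-1} (X'y + d b)
#   Kibria-Lukman:  b_KL = (S + kI)^{-1} (S - kI) b
# All settings below are illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

n, p = 50, 4                  # sample size and number of regressors (assumed)
rho1, rho2 = 0.5, 0.3         # stationary AR(2) error coefficients (assumed)
sigma_u = 1.0                 # innovation standard deviation (increase to raise model variance)
beta = np.ones(p)             # true coefficient vector (assumed)
gamma_c = 0.99                # degree of collinearity among regressors (assumed)
reps = 1000                   # Monte Carlo replications

# Collinear design: all columns share a common component
Z = rng.normal(size=(n, p + 1))
X = np.sqrt(1 - gamma_c**2) * Z[:, :p] + gamma_c * Z[:, [p]]

# AR(2) autocovariances from the Yule-Walker relations, then the error covariance Omega
g = np.empty(n)
g[0] = sigma_u**2 * (1 - rho2) / ((1 + rho2) * ((1 - rho2)**2 - rho1**2))
g[1] = rho1 * g[0] / (1 - rho2)
for j in range(2, n):
    g[j] = rho1 * g[j - 1] + rho2 * g[j - 2]
Omega = g[np.abs(np.subtract.outer(np.arange(n), np.arange(n)))]

# Pre-whitening matrix P with P Omega P' = I, so least squares on (PX, Py) is GLS
P = np.linalg.cholesky(np.linalg.inv(Omega)).T

sq_err = {name: [] for name in ("GLS", "Ridge", "Liu", "KL")}
for _ in range(reps):
    # Generate AR(2) errors with a burn-in period, then the response
    e = np.zeros(n + 200)
    u = rng.normal(scale=sigma_u, size=n + 200)
    for t in range(2, n + 200):
        e[t] = rho1 * e[t - 1] + rho2 * e[t - 2] + u[t]
    y = X @ beta + e[-n:]

    Xs, ys = P @ X, P @ y
    S, Xy = Xs.T @ Xs, Xs.T @ ys
    b_gls = np.linalg.solve(S, Xy)

    # Plug-in biasing parameters: Hoerl-Kennard-Baldwin style k and a fixed Liu d (assumed)
    sig2 = np.sum((ys - Xs @ b_gls) ** 2) / (n - p)
    k = p * sig2 / np.sum(b_gls**2)
    d = 0.5

    I_p = np.eye(p)
    ests = {
        "GLS": b_gls,
        "Ridge": np.linalg.solve(S + k * I_p, Xy),
        "Liu": np.linalg.solve(S + I_p, Xy + d * b_gls),
        "KL": np.linalg.solve(S + k * I_p, (S - k * I_p) @ b_gls),
    }
    for name, b in ests.items():
        sq_err[name].append(np.sum((b - beta) ** 2))

for name, vals in sq_err.items():
    print(f"{name:>5}: estimated scalar MSE = {np.mean(vals):.4f}")
```

Re-running the sketch with larger values of sigma_u mimics the "growing model variance" scenario of the abstract; the relative ordering of the estimated mean square errors can then be compared across settings.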