Then $T_r' S T_r = \Lambda_r$; the PCRE of $\beta$ can be written as
We now propose two new classes of estimators by combining the PCRE with the AURE and the AULE, namely the almost unbiased ridge principal components estimator (AURPCE) and the almost unbiased Liu principal components estimator (AULPCE), as follows:
If $k = 0$, then $\hat{\beta}_{AU}(r, k) = \hat{\beta}(r) = T_r T_r' \hat{\beta}$, the PCRE.
If $d = 0$, then $\hat{\beta}_{AU}(r, d) = T_r T_r' \hat{\beta}$, the PCRE.
Thus $\hat{\beta}_{AU}(r, k)$ can be regarded as a generalization of the PCRE, OLSE, and AURE, while $\hat{\beta}_{AU}(r, d)$ can be regarded as a generalization of the PCRE, OLSE, and AULE.
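As a concrete illustration, here is a minimal sketch of the AURPCE in Python. It assumes the estimator applies the standard almost unbiased ridge filter $(I_r - k^2(\Lambda_r + kI_r)^{-2})$ in the retained eigenbasis; the paper's exact expression is not reproduced here, and the function name `aurpce` is illustrative. The AULPCE is analogous, with the Liu filter in place of the ridge filter.

```python
import numpy as np

def aurpce(X, y, r, k):
    """Sketch of the almost unbiased ridge principal components
    estimator (AURPCE), assuming the form
        beta_AU(r, k) = T_r (I_r - k^2 (Lambda_r + k I_r)^{-2})
                            Lambda_r^{-1} T_r' X' y,
    where T_r holds the eigenvectors of X'X for its r largest
    eigenvalues.  For k = 0 the filter is the identity and the
    estimator reduces to the PCRE, as stated in the text.
    """
    S = X.T @ X
    eigvals, eigvecs = np.linalg.eigh(S)    # ascending eigenvalues
    lam = eigvals[::-1][:r]                 # r largest eigenvalues
    T_r = eigvecs[:, ::-1][:, :r]           # matching eigenvectors
    filt = 1.0 - k**2 / (lam + k) ** 2      # diagonal AU ridge filter
    return T_r @ (filt / lam * (T_r.T @ X.T @ y))
```

With $r = p$ and $k = 0$ the estimator coincides with the OLSE, and for $k > 0$ the filter shrinks each retained component, consistent with the generalization property noted above.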
Let us consider the AURPCE, AULPCE, AURE, AULE, PCRE, and OLSE and compute their estimated MSE values at different levels of multicollinearity, namely $\gamma = 0.7, 0.85, 0.9, 0.999$, representing weak, strong, and severe collinearity among the explanatory variables (see Tables 1 and 2).
The simulation results in Tables 1 and 2 show that, in most cases, the AURPCE and AULPCE have smaller estimated MSE values than the AURE, AULE, PCRE, and OLSE, respectively, which agrees with our theoretical findings.
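The kind of comparison reported in Tables 1 and 2 can be sketched as follows, assuming a McDonald-Galarneau-type design in which the explanatory variables share pairwise correlation $\gamma^2$; the paper's exact data-generating process, parameter values, and replication count are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, r, reps = 50, 4, 2, 500
beta = np.ones(p)             # illustrative true coefficient vector
results = {}

for gamma in (0.7, 0.85, 0.9, 0.999):
    mse_ols = mse_pcr = 0.0
    for _ in range(reps):
        # Columns of X share pairwise correlation gamma^2
        Z = rng.standard_normal((n, p + 1))
        X = np.sqrt(1 - gamma**2) * Z[:, :p] + gamma * Z[:, [p]]
        y = X @ beta + rng.standard_normal(n)
        b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
        # PCRE: project onto the r leading eigenvectors of X'X
        _, vecs = np.linalg.eigh(X.T @ X)
        T_r = vecs[:, ::-1][:, :r]
        b_pcr = T_r @ np.linalg.solve(T_r.T @ X.T @ X @ T_r,
                                      T_r.T @ X.T @ y)
        mse_ols += np.sum((b_ols - beta) ** 2) / reps
        mse_pcr += np.sum((b_pcr - beta) ** 2) / reps
    results[gamma] = (mse_ols, mse_pcr)
    print(f"gamma={gamma}: MSE(OLSE)={mse_ols:.3f}  MSE(PCRE)={mse_pcr:.3f}")
```

As $\gamma$ approaches 1 the empirical MSE of the OLSE deteriorates sharply, while the PCRE remains stable, mirroring the pattern the tables describe.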
(1) $\hat{\beta}_r(1, 1) = \hat{\beta}_r = T_r(T_r' X' X T_r)^{-1} T_r' X' y$ is the PCRE;
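The PCRE formula above can be computed directly; a minimal sketch in Python (the function name `pcre` is illustrative):

```python
import numpy as np

def pcre(X, y, r):
    """Principal components regression estimator (PCRE):
        beta_r = T_r (T_r' X'X T_r)^{-1} T_r' X' y,
    where the columns of T_r are the eigenvectors of X'X belonging
    to its r largest eigenvalues.
    """
    S = X.T @ X
    eigvals, eigvecs = np.linalg.eigh(S)    # ascending eigenvalues
    T_r = eigvecs[:, ::-1][:, :r]           # r leading eigenvectors
    M = T_r.T @ S @ T_r                     # = Lambda_r
    return T_r @ np.linalg.solve(M, T_r.T @ X.T @ y)
```

With $r = p$ (all components retained) the PCRE reduces to the OLSE, which is a convenient sanity check.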
For the OLSE, PCRE, r-k class estimator, r-d class estimator, Liu-type estimator (LTE), and the new estimator (PCTTE), the estimated mean square error (MSE) values are obtained by replacing all unknown model parameters with their respective least squares estimators in the corresponding expressions.
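This plug-in device can be sketched for any estimator that is linear in the OLSE, $\tilde{\beta} = A\hat{\beta}$, which covers the estimators above for fixed shrinkage parameters. The variance-plus-squared-bias decomposition used below is a standard assumption, not the paper's exact expressions:

```python
import numpy as np

def estimated_mse_linear(X, y, A):
    """Plug-in estimated scalar MSE for beta_tilde = A @ beta_hat
    (A = I gives the OLSE itself):
        MSE = sigma^2 tr(A (X'X)^{-1} A') + ||(A - I) beta||^2,
    with sigma^2 and beta replaced by their least squares
    estimates s^2 and beta_hat.
    """
    n, p = X.shape
    S_inv = np.linalg.inv(X.T @ X)
    b_ols = S_inv @ X.T @ y
    sigma2 = np.sum((y - X @ b_ols) ** 2) / (n - p)   # s^2
    var_term = sigma2 * np.trace(A @ S_inv @ A.T)
    bias = (A - np.eye(p)) @ b_ols                    # plug-in bias
    return var_term + bias @ bias
```

For $A = I$ this returns the familiar $s^2 \operatorname{tr}((X'X)^{-1})$, the estimated MSE of the OLSE.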
From Figure 3, we see that for fixed $d$, if $0 < d < k$, then the new estimator is better than the PCRE, which agrees with Theorem 1.
We then discuss the superiority of the new estimator over the OLSE, PCRE, r-k class estimator, r-d class estimator, and Liu-type estimator in the mean square error sense.