Prove multicollinearity by regressing one X on ALL the others?

Guest

I have a multiple regression with 4 independent variables. It has high
predictive value (R2 = 0.82, and the P-value for F = 0.0004). However, two
of the variables (X1 and X2) have high P-values for t (0.86 and 0.3,
respectively).

I suspect multicollinearity. However, the pairwise correlations between the
X variables are inconclusive: the highest is r = 0.64. Is that high enough
to prove multicollinearity?

Regardless, I was wondering if it is OK to do a multiple correlation
analysis of X1 on the other X variables, to demonstrate intercorrelation and
multicollinearity. When regressing X1 on the other variables, r = 0.86 and
R2 = 0.74. Does that prove multicollinearity?
 
Guest

Multicollinearity is used with at least two very different meanings in the
literature.

1. Predictor variables that very nearly lie in a reduced-dimensional
subspace, so that it is difficult or impossible to solve numerically for
unique least squares estimates. In this case
MDETERM(MMULT(TRANSPOSE(xmatrix),xmatrix)) is nearly zero (see the sketch
after the references below). I see no evidence of this in the information
that you provided.

2. Predictor variables that are sufficiently "correlated" that it is
difficult to separate their unique contributions. This may in fact be
happening to you. For more information, you might find some of the following
articles to be instructive:
Sharpe & Roberts (1997) American Statistician 51:46-48
Schey (1993) American Statistician 47:26-30
Hamilton (1988) American Statistician 42:89-90
Lewis & Escobar (1986) The Statistician 35:17-26
Lewis, Escobar, & Geaghan (1985) J. Statistical Computation & Simulation
22:51-66
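
For point 1, a minimal sketch of the same computation in Python/NumPy rather
than Excel: it forms X'X and takes its determinant, just as
MDETERM(MMULT(TRANSPOSE(xmatrix),xmatrix)) does. The data are hypothetical,
with one column built as a near-exact combination of two others so that the
determinant comes out nearly zero.

import numpy as np

# Hypothetical predictor matrix: 30 observations, 4 predictors, where the
# last column is almost a linear combination of the first two.
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 4))
X[:, 3] = X[:, 0] + X[:, 1] + 1e-8 * rng.normal(size=30)

xtx = X.T @ X                              # MMULT(TRANSPOSE(xmatrix), xmatrix)
print("det(X'X) =", np.linalg.det(xtx))    # near zero under meaning 1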

Jerry
 