Jim Clark gave a fine answer to the question posed by Sangdon Lee. However, I am curious about the correlation and R-square figures Sangdon reported. The R-squares for the simple linear regressions on X1 and X2 alone are (-.2)^2 = .04 and (.3)^2 = .09, yet Sangdon says that the R-sq for the multiple regression is "ONLY" 0.3. I find this surprisingly high, not low. In the examples I have seen, the R-sq for the combined model is at most the sum of the individual R-squares. Is it even possible for the opposite to occur?
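For concreteness, here is the arithmetic I am puzzling over, as a minimal check in Python. It assumes nothing beyond the correlations Sangdon reported and the textbook formula for the multiple R-sq with two standardized predictors:

    # Multiple R^2 for two standardized predictors, computed from the
    # pairwise correlations alone via the textbook formula
    #   R^2 = (r1^2 + r2^2 - 2*r1*r2*r12) / (1 - r12^2)
    r1  = -0.2   # corr(Y, X1), as reported
    r2  =  0.3   # corr(Y, X2), as reported
    r12 =  0.6   # corr(X1, X2), as reported

    r_sq = (r1**2 + r2**2 - 2*r1*r2*r12) / (1 - r12**2)
    print(round(r_sq, 4))   # 0.3156 -- larger than .04 + .09 = .13

If that formula applies here, the reported R-sq of 0.3 is at least internally consistent, and the combined model really does exceed the sum of the individual R-squares. It would also bear on the range question: least-squares fitted values satisfy Var(Yhat) = R^2 * Var(Y), so with standardized Y and R-sq near .3 the predictions have a standard deviation near sqrt(.3) = .55 and would plausibly span roughly -1 to 1 while Y spans -3 to 3.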
Rich Einsporn

Sangdon Lee wrote:
> Greetings!
>
> I have one Y and two Xs (X1 and X2), and am trying to perform multiple
> linear regression. All Xs and Y variables are standardized (zero mean
> and unit variance). X1 and X2 are moderately correlated (r=0.6) and
> the correlation of X1 and X2 to Y is -0.2 and 0.3, respectively.
> ANOVA shows that the linear regression is significant at p=0.05, and
> X1 and X2 are also significant. However the r-square is only 0.3.
>
> When I plot the Y versus the predicted Y, I found that Y has a range
> of -3 to 3, but the predicted Y shows the range of -1 to 1. Could
> somebody explain why the predicted values show much smaller ranges?
>
> Thank you very much in advance.
>
> Sangdon Lee
> General Motors Tech Center
> [EMAIL PROTECTED]

--
Dr. Rich Einsporn
Associate Professor, Dept. of Statistics
The University of Akron
[EMAIL PROTECTED]
http://gozips.uakron.edu/~rle