Hi
On Thu, 24 Jan 2002, Rich Ulrich wrote:
> On 24 Jan 2002 07:09:23 -0800, [EMAIL PROTECTED] (Rich Einsporn)
> wrote:
> > Jim Clark gave a fine answer to the question posed by Sangdon Lee.
> > However, I am curious about the correlation and R-square figures given by
> > Sangdon. Apparently, the R-squares for the simple linear regressions on
> > X1 and X2 are (-.2)^2 = .04 and (.3)^2 = .09, [...]
On 24 Jan 2002 07:09:23 -0800, [EMAIL PROTECTED] (Rich Einsporn)
wrote:
> Jim Clark gave a fine answer to the question posed by Sangdon Lee.
> However, I am curious about the correlation and R-square figures given by
> Sangdon. Apparently, the R-squares for the simple linear regressions on
> X1 and X2 are (-.2)^2 = .04 and (.3)^2 = .09, [...]
There is an interesting article called "Sometimes R^2 > r^2_{yx1} +
r^2_{yx2}: Correlated Variables Are Not Always Redundant" by David
Hamilton in The American Statistician, May 1987, 41(2), pp. 129-134.
The paper gives an example in which there is little correlation between
y and either x1 or x2 taken separately, yet the multiple R^2 from
regressing y on both predictors together is close to 1.
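Hamilton's point is easy to reproduce numerically. The sketch below is not
from the paper itself; it just builds a synthetic y that happens to be an
exact linear combination of two highly correlated predictors (numpy, arbitrary
seed and sample size), so each simple correlation is small while the multiple
R^2 is essentially 1:

```python
import numpy as np

rng = np.random.default_rng(0)
n, rho = 100_000, 0.98          # rho = corr(x1, x2), deliberately high
z1, z2 = rng.standard_normal((2, n))
x1 = z1
x2 = rho * z1 + np.sqrt(1 - rho**2) * z2
# y is an exact linear combination of x1 and x2, scaled to unit variance;
# its correlation with each predictor alone is only sqrt((1 - rho)/2) = 0.1
y = (x1 - x2) / np.sqrt(2 - 2 * rho)

r1 = np.corrcoef(y, x1)[0, 1]   # about  0.1
r2 = np.corrcoef(y, x2)[0, 1]   # about -0.1
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
r_squared = 1 - resid.var() / y.var()   # essentially 1
```

So r1^2 + r2^2 is about 0.02, yet R^2 is 1: the two predictors are anything
but redundant.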
Jim Clark gave a fine answer to the question posed by Sangdon Lee.
However, I am curious about the correlation and R-square figures given by
Sangdon. Apparently, the R-squares for the simple linear regressions on
X1 and X2 are (-.2)^2 = .04 and (.3)^2 = .09, but Sangdon says that the
R-sq for the
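For two standardized predictors, the multiple R^2 is fully determined by the
three pairwise correlations, which makes Einsporn's comparison easy to check.
A quick sketch using the figures quoted in this thread (the closed-form
expression is the standard two-predictor result, not something from the
posts):

```python
# Multiple R^2 from the three pairwise correlations, for two predictors:
#   R^2 = (r1^2 + r2^2 - 2*r1*r2*r12) / (1 - r12^2)
r1, r2, r12 = -0.2, 0.3, 0.6    # Sangdon's figures
r_squared = (r1**2 + r2**2 - 2 * r1 * r2 * r12) / (1 - r12**2)
print(round(r_squared, 4))      # 0.3156
```

With these numbers R^2 = .3156, well above r1^2 + r2^2 = .13, because the
opposite-signed correlations combined with r12 = .6 produce a suppression
effect.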
Hi
On 23 Jan 2002, Sangdon Lee wrote:
> I have one Y and two Xs (X1 and X2), and am trying to perform multiple
> linear regression. All Xs and Y variables are standardized (zero mean
> and unit variance). X1 and X2 are moderately correlated (r=0.6) and
> the correlation of X1 and X2 to Y is -0.2 and 0.3, respectively. [...]
Greetings!
I have one Y and two Xs (X1 and X2), and am trying to perform multiple
linear regression. All Xs and Y variables are standardized (zero mean
and unit variance). X1 and X2 are moderately correlated (r=0.6) and
the correlation of X1 and X2 to Y is -0.2 and 0.3, respectively.
ANOVA sho
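Sangdon's setup can also be simulated directly. A sketch assuming
multivariate normal data with the stated correlations (numpy; the sample
size and seed are arbitrary choices, not from the post):

```python
import numpy as np

# Simulate standardized X1, X2, Y with corr(X1,X2)=0.6,
# corr(X1,Y)=-0.2, corr(X2,Y)=0.3, via a Cholesky factor.
rng = np.random.default_rng(1)
n = 200_000
C = np.array([[ 1.0, 0.6, -0.2],    # order: X1, X2, Y
              [ 0.6, 1.0,  0.3],
              [-0.2, 0.3,  1.0]])
L = np.linalg.cholesky(C)
x1, x2, y = L @ rng.standard_normal((3, n))

X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
r_squared = 1 - resid.var() / y.var()
print(round(r_squared, 3))  # close to the theoretical 0.3156
```

The fitted R^2 lands near the theoretical .3156, again exceeding the
.04 + .09 = .13 one would naively expect from the simple regressions.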