Hi

On 25 Mar 2004, Bin Zhou wrote:

> Hello, all,
> 
> In linear regression, Y1 is the dependent variable and Y2 is the
> predicted value. R is Pearson's r for Y1 and Y2, so I can get R
> square. I can also get R square from the following formula:
> R square = 1 - SSE/CSS
> where
> SSE = the sum of squares for error
> CSS = the corrected total sum of squares for the dependent variable.
> 
> I found the two values are different. So I think I can only use the
> second formula for nonlinear regression, and in linear regression I
> can only calculate Pearson's r. Is that right?

No, the two ways of computing r^2 should agree, no matter how
many predictors you have (as long as the model is fit by least
squares and includes an intercept).  Just to be clear, if

y^ = b0 + b1*x1 + b2*x2 + ...

where one or more of the xs could be polynomial predictors, then

R^2 = SS(y^) / SS(y)  (just another variation of your formula,
since SS(y^) = SS(y) - SSE)

will equal r(y,y^)^2.
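
A quick numerical check of the claim (a sketch in Python with NumPy;
the simulated data and variable names are illustrative, not from the
original post):

```python
import numpy as np

# Simulate a simple regression problem.
rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 2.0 + 3.0 * x + rng.normal(size=50)

# Fit y^ = b0 + b1*x by least squares; the column of ones is the intercept.
X = np.column_stack([np.ones_like(x), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ b

# Method 1: R^2 = 1 - SSE/CSS
sse = np.sum((y - yhat) ** 2)            # sum of squares for error
css = np.sum((y - y.mean()) ** 2)        # corrected total sum of squares
r2_anova = 1.0 - sse / css

# Method 2: squared Pearson correlation between y and y^
r = np.corrcoef(y, yhat)[0, 1]
r2_corr = r ** 2

print(abs(r2_anova - r2_corr) < 1e-10)   # the two values agree
```

If the two values differ in practice, the usual culprit is a model fit
without an intercept, where 1 - SSE/CSS and r(y,y^)^2 need not match.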

Best wishes
Jim

============================================================================
James M. Clark                          (204) 786-9757
Department of Psychology                (204) 774-4134 Fax
University of Winnipeg                  4L05D
Winnipeg, Manitoba  R3B 2E9             [EMAIL PROTECTED]
CANADA                                  http://www.uwinnipeg.ca/~clark
============================================================================

=================================================================
Instructions for joining and leaving this list, remarks about the
problem of INAPPROPRIATE MESSAGES, and archives are available at:
                   http://jse.stat.ncsu.edu/
=================================================================
