On Thu, Jan 29, 2015 at 6:41 PM, Yan Wu <yanwu1...@gmail.com> wrote:
> Hi,
>
> When I fit a regression model without an intercept term, the R-squared tends
> to be much larger than the R-squared from the model with an intercept. So in
> this case, what's a more reasonable measure of the goodness of fit for the
> model without an intercept?
>
> Thanks a lot!!
>
> Yan
>

I am going through the list archives and found your question. I guess
it went unanswered because it is not directly related to the R language
per se, but has more to do with regression analysis in general.

In general, R-squared tells only part of the story. You also need to
look at the t-statistics of the regression coefficients to see whether
the estimated betas are statistically significant.
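For example, here is a minimal sketch (using simulated data, not the
original poster's) of how to pull both the R-squared and the coefficient
t-statistics out of a fitted model in R:

```r
# Simulate a simple linear relationship and fit it with lm().
set.seed(42)
x <- rnorm(100)
y <- 2 + 3 * x + rnorm(100)

fit <- lm(y ~ x)   # model with an intercept
s   <- summary(fit)

s$r.squared        # goodness of fit
s$coefficients     # columns: Estimate, Std. Error, t value, Pr(>|t|)
```

The t value and Pr(>|t|) columns of `summary()$coefficients` are what
tell you whether each beta is distinguishable from zero.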

Further, IIRC R-squared never decreases as more variables are added to
a regression. That is why practitioners look at "adjusted R-squared"
instead of plain R-squared, since it accounts for this. So I am curious
as to why your data produces a smaller R-squared when you add the
constant. Is it possible to upload your data somewhere so others can
take a look at it?
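For what it's worth, this effect can be reproduced without any special
data: when the intercept is dropped, summary.lm() computes R-squared
against an *uncentered* total sum of squares (see ?summary.lm), so the
two numbers are not on the same scale. A small sketch with simulated
data:

```r
# When y has a large mean, the uncentered total sum of squares is huge,
# which inflates the no-intercept R-squared (see ?summary.lm).
set.seed(1)
x <- rnorm(50)
y <- 10 + 0.5 * x + rnorm(50)

r2_with    <- summary(lm(y ~ x))$r.squared      # TSS centered at mean(y)
r2_without <- summary(lm(y ~ x - 1))$r.squared  # TSS centered at zero

c(with_intercept = r2_with, without_intercept = r2_without)
```

So a larger R-squared in the no-intercept fit does not by itself mean a
better fit; comparing residual standard errors (or refitting with the
intercept and using adjusted R-squared) is more meaningful.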

thanks
-- 
Kamaraju S Kusumanchi | http://raju.shoutwiki.com/wiki/Blog

______________________________________________
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
