Thank you very much for your effort!
But is there a measure that can compare the goodness of fit of regression
models with and without the intercept? Or can I only compare them in terms of
the residual sum of squares?
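One option, sketched below on simulated data (the data are made up for illustration, not the poster's), is to compare both fits on the residual scale, or with an information criterion such as AIC, since those are computed the same way whether or not the intercept is included:

```r
set.seed(1)
x <- rnorm(100)
y <- 2 * x + rnorm(100)      # made-up data for illustration

fit0 <- lm(y ~ 0 + x)        # without intercept
fit1 <- lm(y ~ x)            # with intercept

# Residual sum of squares is comparable across both specifications;
# the no-intercept model is nested in the other, so rss1 <= rss0.
rss0 <- sum(residuals(fit0)^2)
rss1 <- sum(residuals(fit1)^2)

# AIC additionally penalises the extra intercept parameter.
aics <- c(AIC(fit0), AIC(fit1))
```

Note that the reported R-squared values are not comparable between the two fits, because summary.lm() computes them against different total sums of squares.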
--
View this message in context:
http://r.789695.n4.nabble.com/Correct-statistical
On Jul 20, 2010, at 11:41 AM, StatWM wrote:
>
> Dear R community,
>
> is there a way to get correct t- and p-values and R squared for linear
> regression models specified without an intercept?
>
> example model:
> summary(lm(y ~ 0 + x))
>
> This gives too low p-values and too high R squared.
Hi:
On Tue, Jul 20, 2010 at 2:41 AM, StatWM wrote:
>
> Dear R community,
>
> is there a way to get correct t- and p-values and R squared for linear
> regression models specified without an intercept?
>
> example model:
> summary(lm(y ~ 0 + x))
>
> This gives too low p-values and too high R squared.
What do x and y represent? Are they non-stationary or trending? Then you would
get a very high R2 (~97-99%) and very low p-values. Perhaps you have landed in
the world of spurious regression.
In this case forcing the intercept to zero would not help you. Work with the
differenced series instead of the raw data.
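A minimal sketch of that suggestion, using made-up series (two independent random walks, which are non-stationary and unrelated by construction; this is an assumption for illustration, not the poster's data):

```r
set.seed(2)
n <- 200
x <- cumsum(rnorm(n))        # independent random walks:
y <- cumsum(rnorm(n))        # non-stationary, unrelated by construction

# Regression in levels: R^2 is often spuriously large for such series
r2_levels <- summary(lm(y ~ x))$r.squared

# Regress the first differences instead, which are stationary here
dx <- diff(x)
dy <- diff(y)
r2_diff <- summary(lm(dy ~ dx))$r.squared
```

With differenced data the fit statistics reflect the (absent) relationship rather than the shared trends.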
Thanks and regards,
Let's assume x and y are stationary. It's not a spurious regression problem
here. I think lm() has to have an intercept to give correct t-, p- and
R-squared values. I wonder if you can correct the values in R, though?
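The inflation comes from how summary.lm() defines R-squared: without an intercept it uses the uncentered total sum of squares, sum(y^2), instead of sum((y - mean(y))^2). One workaround (a sketch, not a built-in feature) is to keep the no-intercept fit but recompute R-squared against the centered total sum of squares, as would be done for a model with an intercept; the data below are made up:

```r
set.seed(3)
x <- rnorm(50)
y <- 5 + 2 * x + rnorm(50)   # made-up data with a nonzero mean

fit0 <- lm(y ~ 0 + x)

# R's reported R^2 for a no-intercept model: 1 - RSS / sum(y^2)
r2_reported <- summary(fit0)$r.squared

# Recompute against the centered total sum of squares instead
rss <- sum(residuals(fit0)^2)
tss <- sum((y - mean(y))^2)
r2_centered <- 1 - rss / tss
```

The centered version can even be negative, which correctly signals that forcing the line through the origin fits worse than simply predicting mean(y).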
Dear R community,
is there a way to get correct t- and p-values and R squared for linear
regression models specified without an intercept?
example model:
summary(lm(y ~ 0 + x))
This gives too low p-values and too high R squared. Is there a way to
correct it? Or should I specify the model with an intercept instead?