Sorry for spamming, but I just had an idea; I'm not sure whether this might be
a way of doing it:

1. Calculate the standard error of, e.g., the intercept as the mean of the
   standard errors obtained from the upper/lower confidence limits of the
   intercept:

     # row 4 = RMA (as in the regression output below); column 2 of
     # regression.results holds the intercept estimate
     intercept <- allo.lmodel2$regression.results[4, 2]
     # columns 2 / 3 of confidence.intervals are the 2.5% / 97.5% intercept limits
     error_intercept1 <- (intercept - allo.lmodel2$confidence.intervals[4, 2]) / 1.96
     error_intercept2 <- (allo.lmodel2$confidence.intervals[4, 3] - intercept) / 1.96
     stderr_intercept <- round((error_intercept1 + error_intercept2) / 2, digits = 8)

2. Calculate the t-value as the intercept estimate divided by the standard
   error from step 1, and use the following to get a two-tailed p-value
   (both steps are pulled together in the sketch after this list):

     # two-sided p-value from a t distribution; residual df = n - 2 for a
     # regression with an intercept and a slope
     p_intercept <- 2 * (1 - pt(abs(intercept / stderr_intercept),
                                df = nrow(biomass_data) - 2))
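
In other words, pulling both steps together (a rough, self-contained sketch; I'm
assuming the 95% CI is roughly symmetric and normal-based, so SE is about
half-width / 1.96, and I'm using n - 2 residual degrees of freedom; the column
positions follow the output quoted below):

     library(lmodel2)

     allo.lmodel2 <- lmodel2(log(AGB) ~ log(BM_roots), data = biomass_data,
                             range.y = "interval", range.x = "interval",
                             nperm = 99)

     rma_row   <- which(allo.lmodel2$regression.results$Method == "RMA")
     intercept <- allo.lmodel2$regression.results[rma_row, 2]      # intercept estimate
     ci_lower  <- allo.lmodel2$confidence.intervals[rma_row, 2]    # 2.5%-Intercept
     ci_upper  <- allo.lmodel2$confidence.intervals[rma_row, 3]    # 97.5%-Intercept

     # SE back-calculated from the CI: mean of the two half-widths, each / 1.96
     stderr_intercept <- ((intercept - ci_lower) + (ci_upper - intercept)) / (2 * 1.96)

     # two-tailed p-value for H0: intercept = 0, with n - 2 residual df
     t_intercept <- intercept / stderr_intercept
     p_intercept <- 2 * pt(abs(t_intercept), df = nrow(biomass_data) - 2,
                           lower.tail = FALSE)
     p_intercept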

Might this be a reasonable approach for a 'rough' estimate of a p-value? I'd be
glad for any suggestions...
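
(And regarding the CI-based check from my original question quoted below:
whether a 95% interval excludes 0 can be read straight off the lmodel2
confidence intervals; column positions are taken from the printed output, so
this is just a sketch:)

     ci_rma <- allo.lmodel2$confidence.intervals[4, ]   # row 4 = RMA
     # columns 2/3 = intercept limits, 4/5 = slope limits (as printed below)
     intercept_excludes_zero <- ci_rma[[2]] > 0 || ci_rma[[3]] < 0
     slope_excludes_zero     <- ci_rma[[4]] > 0 || ci_rma[[5]] < 0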



2009/7/20 Katharina May <may.kathar...@googlemail.com>

> Hi *,
>
> is there a way to obtain some kind of p-value for a model fitted with RMA
> using the lmodel2 package?
> I know that p-values are discussed and criticized a lot, and as you can
> imagine from my question I'm not much of a statistician (I'm only writing
> my bachelor's thesis).
>
> As far as I understand confidence intervals, a coefficient is regarded as
> statistically significant if the corresponding CI does not include 0 (the
> null-hypothesis value). But can I obtain some kind of p-value to say that
> it is highly significant (< 0.01), significant (< 0.05), ... like in the
> output of lm?
>
> Sorry for bothering everybody with this, well, probably rather idiotic
> question, but I don't know where to
> continue from this point...
>
> Thanks,
>
>           Katharina
>
>
> Here the output of my lmodel2 regression:
>
>
> Model II regression
>
> Call: lmodel2(formula = log(AGB) ~ log(BM_roots), data = biomass_data,
> range.y = "interval", range.x = "interval", nperm = 99)
>
> n = 1969   r = 0.9752432   r-square = 0.9510993
> Parametric P-values:   2-tailed = 0    1-tailed = 0
> Angle between the two OLS regression lines = 1.433308 degrees
>
> Permutation tests of OLS, MA, RMA slopes: 1-tailed, tail corresponding to
> sign
> A permutation test of r is equivalent to a permutation test of the OLS
> slope
> P-perm for SMA = NA because the SMA slope cannot be tested
>
> Regression results
>   Method Intercept     Slope  Angle (degrees)  P-perm (1-tailed)
> 1    OLS 0.6122146  1.038792         46.09002               0.01
> 2     MA 0.5787299  1.066868         46.85300               0.01
> 3    SMA 0.5807645  1.065162         46.80725                 NA
> 4    RMA 0.5792123  1.066463         46.84216               0.01
>
> Confidence intervals
>   Method  2.5%-Intercept 97.5%-Intercept  2.5%-Slope 97.5%-Slope
> 1    OLS       0.5779465       0.6464828    1.028376    1.049207
> 2     MA       0.5659033       0.5914203    1.056227    1.077622
> 3    SMA       0.5682815       0.5931260    1.054797    1.075628
> 4    RMA       0.5663916       0.5918989    1.055826    1.077213
>
> Eigenvalues: 19.83213 0.2475542
>
> H statistic used for computing C.I. of MA: 2.502866e-05
>
>
>


-- 
Time flies like an arrow, fruit flies like bananas.

