Label: Behavioral Risk Factor Surveillance System (BRFSS)
Hi,
I want to load a dataset into R. This dataset is available in two formats:
.XPT and .ASC. The dataset is available at
http://www.cdc.gov/brfss/annual_data/annual_2006.htm.
They are about 40 MB zipped, and about 500 MB unzipped.
I can get the .xpt data to load, using:
> library(Hmisc)
> data <
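For the .XPT route, note that R package names are case-sensitive (Hmisc, not hmisc), and that base R's recommended foreign package reads SAS transport files as well. A minimal sketch, assuming the unzipped transport file is named CDBRFS06.XPT (the file name is hypothetical; use whatever the CDC zip actually contains):

```r
library(foreign)  # ships with R; read.xport() reads SAS XPORT (.XPT) files

xpt <- "CDBRFS06.XPT"  # hypothetical name of the unzipped transport file
if (file.exists(xpt)) {
  brfss <- read.xport(xpt)
  str(brfss)
}

# The Hmisc route from the original post works too -- note the capital H:
# library(Hmisc); brfss <- sasxport.get(xpt)
```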
Hello --
I am comparing two GLMs (binomial dependent variable); the models are the following:
> m1 <- glm(symptoms ~ phq_index, family = binomial, data = data2)
> m2 <- glm(symptoms ~ 1, family = binomial, data = data2)
Trying to compare these models using
> anova(m1, m2)
I do not obtain chi-square values or a chi-square difference test.
Jose Iparraguirre <jose.iparragui...@ageuk.org.uk> wrote:
> Hi Eiko,
>
> How about this?
>
> > anova (m1, m2, test="Chisq")
>
> See: ?anova.glm
>
> Regards,
> José
>
>
> Prof. José Iparraguirre
> Chief Economist
> Age UK
>
>
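For completeness, the suggested test="Chisq" call run end-to-end on simulated data (a sketch: the variable names follow the thread, the data are made up; note family = binomial, which the original glm() calls omitted):

```r
set.seed(1)
data2 <- data.frame(phq_index = rnorm(200))
data2$symptoms <- rbinom(200, 1, plogis(0.8 * data2$phq_index))

m1 <- glm(symptoms ~ phq_index, family = binomial, data = data2)
m2 <- glm(symptoms ~ 1,         family = binomial, data = data2)

# Likelihood-ratio (chi-square difference) test between the nested models
anova(m2, m1, test = "Chisq")
```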
I have not found a way to do this with several DVs at the same time (I am
interested in the time * DVs interaction term, so I need a multivariate
model).
Is this possible? Is there example syntax for this problem?
Thank you for your help
Torvon
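One base-R route to a single multivariate model is lm() with a matrix response, which anova() then tests with multivariate statistics (Pillai by default). A sketch on simulated data (all names made up):

```r
set.seed(2)
d <- data.frame(time = rep(0:2, each = 40))
d$dv1 <- 0.6 * d$time + rnorm(120)   # this DV changes over time
d$dv2 <- rnorm(120)                  # this one does not

# Matrix response: one model for both DVs at once
fit <- lm(cbind(dv1, dv2) ~ time, data = d)

anova(fit)  # multivariate (Pillai) test of the time effect across DVs
```

The time-by-DV question itself amounts to comparing the time slopes across the response columns, which can be probed directly, e.g. lm(I(dv1 - dv2) ~ time, data = d) tests whether the two slopes differ.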
[[alternative HTML version de
In linear regression, regression weights of x1 on Y given x2 and x3 should
be mathematically identical to the semipartial correlations between x1 and
Y, given x2 and x3.
However, I do not obtain identical results, so apparently I'm doing
something wrong in R.
Data preparation:
data<-read.csv("fil
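For reference, the two quantities are equal only up to a scale factor: the semipartial correlation of x1 with Y equals the *standardized* weight of x1 times the square root of x1's tolerance (1 - R² of x1 on the other predictors), which is likely why the numbers do not match. A numeric check on simulated data (all names made up):

```r
set.seed(3)
n  <- 500
x2 <- rnorm(n); x3 <- rnorm(n)
x1 <- 0.5 * x2 + rnorm(n)
y  <- x1 + x2 + 0.5 * x3 + rnorm(n)
d  <- data.frame(scale(cbind(y, x1, x2, x3)))   # standardize everything

b1  <- coef(lm(y ~ x1 + x2 + x3, data = d))["x1"]         # standardized beta
tol <- 1 - summary(lm(x1 ~ x2 + x3, data = d))$r.squared  # tolerance of x1
sr1 <- cor(resid(lm(x1 ~ x2 + x3, data = d)), d$y)        # semipartial r

all.equal(unname(b1) * sqrt(tol), sr1)  # TRUE: sr1 = beta1 * sqrt(tolerance)
```

The two quantities only coincide when x1 is uncorrelated with the other predictors (tolerance = 1).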
and the first column usually has
variable names. We have so far been unable to read such a file into
qgraph, and haven't found a way to manually assign names to variables in qgraph.
Would you know of a solution to this problem?
Thank you,
Torvon
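One workaround (a sketch; qgraph's labels argument sets node names, and the file name is hypothetical): read the matrix with the first column as row names, then pass those names explicitly:

```r
library(qgraph)

# Hypothetical file: a correlation matrix whose first column holds variable names
m <- as.matrix(read.csv("cormat.csv", row.names = 1))
colnames(m) <- rownames(m)

qgraph(m, labels = rownames(m))  # labels= assigns node names manually
```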
I'm sorry, no clue how I did not see that. Thank you!
On 12 February 2013 15:21, Uwe Ligges wrote:
>
>
> On 12.02.2013 15:15, Torvon wrote:
>
>> The code is quite long because I am running a WLS regression instead of an
>> OLS regression (due to heteroscedasticity)
Thank you, Uwe.
summary(m1) gives me p-value estimates of:
(Intercept) 2e-16
x1 6.9e-15
x2 1.9e-07
x3 2.7e-09
While coef(summary(m1))[,4] gives me:
(Intercept) 3.0e-23
x1 5.7e-13
x2 2.6e-07
x3 1.7e-17
While the first one confirms my suspicion (-23 instead of -16), the others
vary drastically. How do I obtain the "true"
unrounded p-values for these regressors?
m1 <- lm(y ~ x1+x2+x3+x4+x5, data=D)
Thank you
Torvon
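For the archive: the printed summary() rounds for display and shows anything below the machine epsilon as "< 2e-16", while coef(summary(m1))[, 4] returns the stored double-precision values, which can then be formatted to any number of digits. A self-contained sketch with made-up data:

```r
set.seed(4)
D <- data.frame(x1 = rnorm(100))
D$y <- 3 * D$x1 + rnorm(100)
m1 <- lm(y ~ x1, data = D)

p <- coef(summary(m1))[, "Pr(>|t|)"]  # unrounded, full double precision
format(p["x1"], digits = 3)           # display without summary()'s rounding
```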
[[alternative HTML version deleted]]
__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-
I am currently using the relaimpo package to estimate the relative
importance of regressors (N= 4000):
> m1 <- lm(y ~ x1+x2+x3+x4+x5, data=data)
> calc.relimp(m1, rela=TRUE)
> m2=boot.relimp(m1, boot = 500, rela=TRUE, type="lmg")
> booteval.relimp(m2)
> plot(booteval.relimp(m2))
In a new dataset
Dear R Mailinglist,
I want to understand how predictors are associated with a dependent
variable in a regression. I have 3 measurement points. I'm not interested
in understanding the associations of regressors and the outcome at each
measurement separately; instead I would like to use the whole
Hello.
Using the polychor function
> polychor(data[[s1]], data[[s2]])
for polychoric correlations of two ordinal variables in R takes a long time
for N=7000 (20 minutes+) and significantly slows down my computer.
Now, I have a pretty old computer, but it takes about 20 seconds for MPLUS
to print out the
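Two things that typically speed this up (a sketch, assuming the polycor and psych packages; s1/s2 follow the thread): the quick two-step estimator rather than full maximum likelihood, and psych::polychoric(), which computes a whole matrix of polychoric correlations at once and is generally much faster for large N:

```r
library(polycor)
# Two-step estimate (ML = FALSE, the default) is far cheaper than ML = TRUE
r12 <- polychor(data[[s1]], data[[s2]], ML = FALSE)

# For many ordinal variables at once, psych::polychoric is typically faster:
# library(psych)
# polychoric(data)$rho
```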
Rui,
Thank you very much. Are there other things I have to adjust other than
replacing "object" with the name of my model?
Torvon
On 29 November 2012 08:17, Rui Barradas wrote:
> ci_lm <- function(object, level = 0.95){
> summfit <- summary(object)
> beta <
find a way to get the
CIs for multiple independent variables.
Thank you
Torvon
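One simple route (a sketch, with made-up data): standardize all variables first, then the lm() coefficients are the standardized betas and confint() covers every predictor at once. This treats the sample SDs as fixed, so the intervals are approximate:

```r
set.seed(5)
d <- data.frame(x1 = rnorm(80), x2 = rnorm(80))
d$y <- d$x1 + 0.5 * d$x2 + rnorm(80)

zd <- data.frame(scale(d))        # standardize outcome and predictors
mz <- lm(y ~ x1 + x2, data = zd)  # slopes are standardized betas

confint(mz)  # 95% CIs for all predictors' standardized weights
```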
Thank you!
Torvon
I still don't know how to get the CIs for these values seeing that they are
manually computed.
Thanks
On 21 November 2012 19:10, Bert Gunter wrote:
> ?confint
>
> -- Bert
>
> On Wed, Nov 21, 2012 at 3:55 PM, Torvon wrote:
> > Bert,
> >
> > Please excuse me,
Bert,
Please excuse me, and let me rephrase:
How do I obtain the confidence intervals of the _standardized_ beta weights
for predictors in a linear regression in R?
Thank you.
Torvon
On 21 November 2012 16:10, Bert Gunter wrote:
> 1. This is a statistics, not an R, question. Post o
I run 9 WLS regressions in R, with 7 predictors each.
What I want to do now is compare:
(1) The strength of predictors within each model (assuming all predictors
are significant). That is, I want to say whether x1 is stronger than x2,
and also say whether it is significantly stronger. I compare st
I'm sorry, but I wasn't aware that I should attach data (and can't, because I
must not disclose them).
Thanks for helping me get rid of the individual points, I'm now trying to
get standard errors and - more importantly - parallel lines for different
values of a variable, e.g. history of depressi