Many thanks for your replies. I appreciate that.
I tried what you suggested and it did work for the Poisson model (glm,
"poisson" family). Unfortunately, the negative binomial (glm.nb) did not
work, as I got the following message:
Warning messages:
1: In ifelse(y > mu, d.res, -d.res) :
Reache
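If glm.nb is running out of memory here, one workaround is the subsampling idea from the original post below: fit on a random subset of rows to shrink the working matrices. A minimal sketch, assuming the data sit in a data frame `dat` with hypothetical columns `y`, `x1`, `x2` and an exposure offset `exposure`:

```r
# Sketch: fit MASS::glm.nb on a random subsample; `dat`, `y`, `x1`,
# `x2`, and `exposure` are hypothetical names, not from this thread.
library(MASS)

set.seed(1)
idx <- sample(nrow(dat), 50000)                  # e.g. 50k of 400k rows
fit <- glm.nb(y ~ x1 + x2 + offset(log(exposure)),
              data = dat[idx, ])
summary(fit)
```

Refitting on a few independent subsamples and comparing the coefficients is a cheap check that the subsample size is adequate.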
On 21.10.2011 23:14, Ken wrote:
> Your memory shouldn't be capped there,

Where? You cannot know from the output below.

> try ?memory.size and ?memory.limit, and run fewer background programs.
> Good luck,
> Ken Hutchison
>
> On Oct 21, 2554 BE, at 11:57 AM, D_Tomas wrote:
>
>> My apologies for my vague
Your memory shouldn't be capped there; try ?memory.size and ?memory.limit,
and run fewer background programs.
Good luck,
Ken Hutchison

On Oct 21, 2554 BE, at 11:57 AM, D_Tomas wrote:

> My apologies for my vague comment.
>
> My data comprises 400.000 x 21 (17 explanatory variables, plus response
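Ken's two pointers can be sketched as follows. Note these functions apply to R on Windows only (sizes in Mb), and they are no longer functional in recent R versions:

```r
# Windows-only sketch: inspect and try to raise R's memory cap.
memory.size()              # Mb currently in use by this R session
memory.size(max = TRUE)    # max Mb obtained from the OS so far
memory.limit()             # current cap in Mb
memory.limit(size = 8000)  # request a higher cap (needs 64-bit R/OS)
```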
My apologies for my vague comment.

My data comprises 400.000 x 21 (17 explanatory variables, plus response
variable, plus two offsets).

If I build the full model (only linear) I get:

Error: cannot allocate vector of size 112.3 Mb

I have a 4GB RAM laptop... Would I get any improvement on an 8GB machine?
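The 112.3 Mb is only the single allocation that failed; glm() holds several copies of the data at once (model frame, model matrix, QR decomposition), so peak usage is a multiple of the raw size. A rough back-of-envelope, assuming all-numeric columns:

```r
# One double-precision copy of a 400,000 x 21 data set:
400000 * 21 * 8 / 1024^2   # ~64 Mb per copy; glm() keeps several,
                           # and factor terms expand the model matrix
```

On a 4GB machine running 32-bit R, the process address space is capped well below 4GB regardless of installed RAM, so 64-bit R on an 8GB machine could well help.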
D_Tomas hotmail.com> writes:

> Hi,
>
> I am trying to fit a glm-poisson model to 400.000 records. I have tried
> biglm and glmulti but I have problems... can it really be the case that
> 400.000 are too many records???
>
> I am thinking of using random samples of my dataset.

"I hav
Hi,

I am trying to fit a glm-poisson model to 400.000 records. I have tried
biglm and glmulti but I have problems... can it really be the case that
400.000 are too many records???

I am thinking of using random samples of my dataset.

Many thanks,
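For a Poisson glm on 400.000 records in bounded memory, the biglm package's bigglm() processes the data in chunks rather than all at once. A minimal sketch with hypothetical column names:

```r
# Sketch: chunked Poisson fit with biglm::bigglm; `dat`, `y`, `x1`,
# `x2` are hypothetical names, not from this thread.
library(biglm)

fit <- bigglm(y ~ x1 + x2, data = dat,
              family = poisson(), chunksize = 10000)
summary(fit)
```

bigglm only needs one chunk of rows in memory at a time, which is why it can succeed where a plain glm() call fails to allocate.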