>> spend more time on this. I really don't mind using the
>> previous version.
Hello Derek,
or upgrade to R 2.5.0dev; the execution of your code snippet is not
hampered by memory issues:
> sessionInfo()
R version 2.5.0 Under development (unstable) (2006-10-10 r39600)
i386-pc-mingw32
locale:
…
Peter,
I ran the memory limit function you mention below and both versions provide
the same result:
> memory.limit(size=4095)
NULL
> memory.limit(NA)
[1] 4293918720
I do have 4 GB of RAM on my PC. As a more reproducible form of the test, I
have attached output that uses a randomly generated data…
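The attachment itself is not preserved in this archive. A minimal sketch of that kind of randomly generated test might look like the following (the variable names and column choices are assumptions, not Derek's actual script; the row count echoes the 650,000 records mentioned later in the thread):

```r
## Hypothetical stand-in for the attached test: build a random data
## frame and check how much memory the data alone occupies.
set.seed(1)
n  <- 650000                           # the thread mentions 650,000 records
df <- data.frame(x1 = rnorm(n), x2 = runif(n))
print(object.size(df), units = "Mb")   # approximate footprint of the data
gc()                                   # current R heap usage, for comparison
```

Comparing `gc()` output before and after the model fit, on both R versions, would show whether the newer version really allocates more.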
Thanks for the replies. Point taken regarding submission protocol. I have
included a text file attachment that shows the R output with versions 2.3.0
and 2.4.0. A label distinguishing the version is included in the comments.
A quick background on the attached example: the dataset has 650,000 records…
It would be helpful to produce a script that reproduces the error on
your system, and to include details on the size of your data set and
what you are doing with it. It is unclear which function is actually
causing the error. Really, in order to do something about it,
you need to show how…
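A self-contained reproduction script of the kind being asked for here might look like this sketch (the formula, variable names, and the scaled-down row count are assumptions; the thread's actual data has about 650,000 records):

```r
## Minimal reproducible example: random data plus a multinomial logit
## fit, mirroring the workflow described in the thread.
library(nnet)                      # multinom() lives in the nnet package
set.seed(42)
n  <- 5000                         # scaled down from 650,000 for speed
df <- data.frame(
  y  = factor(sample(c("a", "b", "c"), n, replace = TRUE)),  # outcome
  x1 = rnorm(n),
  x2 = runif(n)
)
fit <- multinom(y ~ x1 + x2, data = df, trace = FALSE)
coef(fit)                          # one coefficient row per non-baseline level
```

Scaling `n` back up to the real size, on both R versions, would pin down whether the failure is in data import or in the `multinom()` call itself.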
Thanks for the friendly reply. I think my description was fairly clear: I
import a large dataset and run a model. Using the same dataset, the
process worked previously and it doesn't work now. If the new version of R
requires more memory and this compromises some basic data analyses, I would
label…
Full_Name: Derek Elmerick
Version: 2.4.0
OS: Windows XP
Submission from: (NULL) (38.117.162.243)
Hello -
I have some code that I run regularly using R version 2.3.x. The final step of
the code is to build a multinomial logit model. The dataset is large; however, I
have not had issues in the past…