On Jun 30, 2008, at 2:38 PM, Kasper Daniel Hansen wrote:
> Thanks for the clarification. How did you get that output?
> Kasper

vmmap (and for 64-bit processes use vmmap64).

Cheers,
Simon
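(For anyone following along: vmmap is a command-line tool shipped with Mac OS X that takes a process id. One way to run it on the current R session, sketched from within R itself; the exact output depends on the OS version.)

## Mac OS X only: dump the memory map of this R process
## use "vmmap64" instead for a 64-bit R, as noted above
pid <- Sys.getpid()
system(paste("vmmap", pid))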
Thanks for the comments.
What I'm doing is very simple: I'm running a one-dimensional item
response model, similar to the ones used in psychology and educational
testing data, via Markov chain Monte Carlo methods.
model_m12 <- ideal(rollcall_m2, maxiter = 500000000, thin = 1000,
burnin = 5000,
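(For reference: a self-contained call of this shape, using rollcall() and ideal() from the pscl package, might look like the sketch below. The poster's rollcall_m2 object is not shown in the thread, so the data here are simulated and the iteration settings are scaled-down placeholders.)

library(pscl)
## simulated stand-in for the roll-call data (20 legislators, 50 votes)
set.seed(1)
votes <- matrix(sample(c(0, 1), 20 * 50, replace = TRUE), nrow = 20)
rc <- rollcall(votes)
## one-dimensional ideal-point model fitted by MCMC; maxiter, burnin and
## thin are illustrative only -- the post above uses far larger values
fit <- ideal(rc, d = 1, maxiter = 50000, burnin = 5000, thin = 100)
summary(fit)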
It might be worth reminding people that gc() reports the maximum (R heap)
memory use as well as current memory use.
The memory profiler might also be useful for tracking down what is
happening, but I don't think it's compiled into the CRAN binary of R.
-thomas
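(To make both suggestions concrete: gc() works in any build of R, while Rprofmem() records allocations only if R was configured with --enable-memory-profiling, which the stock CRAN binary typically is not. A rough sketch:)

gc()              # the "max used" columns report the peak R heap use so far
gc(reset = TRUE)  # reset the peak statistics before re-running the model

## only records anything if R was built with --enable-memory-profiling
Rprofmem("ideal_mem.out")   # start writing one line per large allocation
# ... run the model here ...
Rprofmem(NULL)              # turn memory profiling off again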
On Jun 30, 2008, at 1:04 PM, Kasper Daniel Hansen wrote:
> Like Sean is saying, you most likely are using _way_ more memory than
> 1.2 GB.
> However, if you are running 32bit R (which is the case if you use
> the CRAN binary) R can only access 2GB,

That's not true, a 32-bit process can use up to a
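(Related to the "you can check" suggestion below: whether a given R session is a 32-bit or 64-bit build can be checked from within R itself, for example:)

.Machine$sizeof.pointer   # 4 on a 32-bit build, 8 on a 64-bit build
R.version$arch            # e.g. "i386" (32-bit) vs "x86_64" (64-bit)
sessionInfo()             # also reports the platform the binary was built for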
Like Sean is saying, you most likely are using _way_ more memory than
1.2 GB.
However, if you are running 32bit R (which is the case if you use the
CRAN binary) R can only access 2GB, so you can squeeze a little more
out of your machine by switching to a 64bit version of R. You can
check
On Sun, Jun 29, 2008 at 6:35 AM, Antonio P. Ramos
<[EMAIL PROTECTED]> wrote:
Hi everybody,
I have a memory allocation problem while using R on my MacBook Pro,
which runs the latest Leopard. I'm trying to run a Monte Carlo
simulation with 500,000 iterations, but the machine failed:
Starting MCMC Iterations...
Error: cannot allocate vector of size 1.2 Gb
R(176,0xa0640fa
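(Back-of-the-envelope arithmetic for an allocation of this size, with assumed numbers since the dimensions of the roll-call data are not given in the thread: a 1.2 Gb numeric vector holds roughly 160 million doubles, which is about what keeping a few hundred parameters for each retained MCMC sample would need.)

## all figures below are assumptions for illustration only
kept   <- 500000            # e.g. 5e8 iterations thinned by 1000, as posted later
params <- 300               # assumed number of stored parameters per sample
bytes  <- kept * params * 8 # a double takes 8 bytes
bytes / 1024^3              # ~1.1 Gb, the same order as the failing vector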