Re: [R] memory problem

2017-05-03 Thread Anthoni, Peter (IMK)
Hi Amit,

Is the file gzipped or extracted?
If you are reading the plain-text file, try gzipping it and running read.table
on the gzipped file; read.table can handle gzipped files, at least on Linux and
Mac OS, not sure about Windows.
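
For illustration, a minimal sketch (the file name and separator are made up;
read.table opens .gz files through a compressed connection, so no manual
decompression is needed):

dat <- read.table("bigfile.txt.gz", header = TRUE, sep = "\t")
## Pre-specifying colClasses (and nrows, if known) cuts memory overhead, e.g.
## read.table("bigfile.txt.gz", header = TRUE, sep = "\t",
##            colClasses = "numeric", nrows = 2.5e7)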

cheers
Peter



> On 2. May 2017, at 18:59, Amit Sengupta via R-help  
> wrote:
> 
> Hi, I was unable to read a 2.4 GB file into an R object using read.table in a
> 64-bit R environment. Please let me have your suggestions. Amit Sengupta
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] memory problem

2017-05-02 Thread Amit Sengupta via R-help
Hi, I was unable to read a 2.4 GB file into an R object using read.table in a
64-bit R environment. Please let me have your suggestions.

Amit Sengupta

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory problem

2016-11-22 Thread Henrik Bengtsson
On 32-bit Windows I think (it's been a while) you can push it to 3 GB, but to
go beyond that you need to run R on 64-bit Windows (the same rule applies to
all software, not just R). I'm pretty sure this is already covered in the R
documentation.
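
A small sketch of how to check and raise the cap on Windows (memory.limit() is
Windows-only; the numbers are illustrative):

memory.limit()              # current limit in MB
memory.limit(size = 3000)   # ~3 GB is about the ceiling for 32-bit Windows
# On 64-bit Windows with 64-bit R the default limit is the physical RAM.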

Henrik

On Nov 22, 2016 19:49, "Ista Zahn"  wrote:

Not conveniently. Memory is cheap, you should buy more.

Best,
Ista

On Nov 22, 2016 12:19 PM, "Partha Sinha"  wrote:

>  I am using R 3.3.2 on win 7, 32 bit with 2gb Ram. Is it possible to use
> more than 2 Gb data set ?
>
> Regards
> Partha
>
> [[alternative HTML version deleted]]
>
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/
> posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory problem

2016-11-22 Thread Jeff Newmiller
Ah, you also need to use a 64-bit operating system. Depending on the age of 
your hardware this may also mean you need a new computer. 

There are ways to process data on disk for certain algorithms, but you will be 
glad to leave them behind once the opportunity arises, so you might as well do 
so now. 
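
One such way, as a rough sketch in base R (the file name, chunk size, and the
'amount' column are made up): read the file a block at a time and keep only a
running summary, so the full table never has to fit in memory.

con <- file("big.csv", open = "r")
header <- strsplit(readLines(con, n = 1), ",", fixed = TRUE)[[1]]
total <- 0
repeat {
  lines <- readLines(con, n = 100000)               # next block of rows
  if (length(lines) == 0) break                     # end of file
  chunk <- read.csv(text = lines, header = FALSE, col.names = header)
  total <- total + sum(chunk$amount)                # 'amount' is hypothetical
}
close(con)
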
-- 
Sent from my phone. Please excuse my brevity.

On November 22, 2016 10:47:29 AM PST, Ista Zahn  wrote:
>Not conveniently. Memory is cheap, you should buy more.
>
>Best,
>Ista
>
>On Nov 22, 2016 12:19 PM, "Partha Sinha"  wrote:
>
>>  I am using R 3.3.2 on win 7, 32 bit with 2gb Ram. Is it possible to
>use
>> more than 2 Gb data set ?
>>
>> Regards
>> Partha
>>
>> [[alternative HTML version deleted]]
>>
>> __
>> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
>> https://stat.ethz.ch/mailman/listinfo/r-help
>> PLEASE do read the posting guide http://www.R-project.org/
>> posting-guide.html
>> and provide commented, minimal, self-contained, reproducible code.
>>
>
>   [[alternative HTML version deleted]]
>
>__
>R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
>https://stat.ethz.ch/mailman/listinfo/r-help
>PLEASE do read the posting guide
>http://www.R-project.org/posting-guide.html
>and provide commented, minimal, self-contained, reproducible code.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory problem

2016-11-22 Thread Ista Zahn
Not conveniently. Memory is cheap, you should buy more.

Best,
Ista

On Nov 22, 2016 12:19 PM, "Partha Sinha"  wrote:

>  I am using R 3.3.2 on win 7, 32 bit with 2gb Ram. Is it possible to use
> more than 2 Gb data set ?
>
> Regards
> Partha
>
> [[alternative HTML version deleted]]
>
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/
> posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory problem

2016-11-22 Thread Marcus Nunes
Yes.

If you cannot read the dataset with the usual means, using functions like
read.table or read.csv, try the ff package:
https://cran.r-project.org/web/packages/ff/index.html
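
A minimal sketch of what that looks like (file and column names are
hypothetical; ff keeps the data in memory-mapped files on disk rather than in
RAM):

library(ff)
big <- read.csv.ffdf(file = "bigdata.csv", header = TRUE,
                     first.rows = 10000, next.rows = 100000)
dim(big)
summary(big$some_column[])   # '[]' pulls one column into RAM; name is made up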

Best,

On Tue, Nov 22, 2016 at 2:16 PM, Partha Sinha  wrote:

>  I am using R 3.3.2 on win 7, 32 bit with 2gb Ram. Is it possible to use
> more than 2 Gb data set ?
>
> Regards
> Partha
>
> [[alternative HTML version deleted]]
>
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/
> posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>



-- 
Marcus Nunes
http://marcusnunes.me/

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory problem

2016-11-22 Thread Bert Gunter
It depends on how you use it: e.g., it can be stored on disk and worked with
in pieces, or some packages work with virtual memory, I believe.

However, it is certainly not possible to read it all into memory. In fact, you
probably won't be able to handle more (and maybe much less) than about
500 MB in R on that machine.
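
A quick back-of-the-envelope check makes the point (the numbers are
illustrative): a numeric cell costs 8 bytes in R, so

10e6 * 25 * 8 / 2^30   # 10 million rows x 25 numeric columns ~ 1.86 GB

before any copies are made, and most operations copy. On a 2 GB, 32-bit
machine that leaves very little headroom, which is why a few hundred MB is a
realistic working limit.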

Cheers,
Bert
Bert Gunter

"The trouble with having an open mind is that people keep coming along
and sticking things into it."
-- Opus (aka Berkeley Breathed in his "Bloom County" comic strip )


On Tue, Nov 22, 2016 at 9:16 AM, Partha Sinha  wrote:
>  I am using R 3.3.2 on win 7, 32 bit with 2gb Ram. Is it possible to use
> more than 2 Gb data set ?
>
> Regards
> Partha
>
> [[alternative HTML version deleted]]
>
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Memory problem

2016-11-22 Thread Partha Sinha
I am using R 3.3.2 on Windows 7, 32-bit, with 2 GB of RAM. Is it possible to
use a data set of more than 2 GB?

Regards
Partha

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory problem

2016-04-07 Thread Amelia Marsh
Dear Sir,


Yes, I am using plyr, and at the end I write the output to a data.frame.
Earlier I had a problem with processing time, so I made some changes to the
code; now I fetch all the inputs needed for the valuation using ddply, store
the results in a data.frame, and once that is done I carry out the
calculations.

Here is part of my R code-


library(plyr)
library(reshape)


tx <- read.csv('transaction_fxdeal.csv')
tx$id  <-  as.character(tx$id)

n<- max(unique(simulated_exchange$id))

result <- NULL
current  <- 1
rcount   <- 0
current1 <- 1
rcount1  <- 0
current2 <- 1
rcount2  <- 0
for (env in 0:n) {
  
  if (rcount == 0) rcount <- nrow(subset(simulated_interest, id==env))
  temp <- current+rcount-1
  env_rates  <- simulated_interest[current:temp,]
  env_rates  <- env_rates[order(env_rates$curve, env_rates$day_count), ]
  if (rcount1 == 0)rcount1 <- nrow(subset(simulated_exchange, id==env))
  temp <- current1+rcount1-1
  exch_rates <- simulated_exchange[current1:temp,]
  if (rcount2 == 0)rcount2 <- nrow(subset(simulated_instruments, id==env))
  temp <- current2+rcount2-1
  instr_rates<- simulated_instruments[current2:temp,]
  current <- current+rcount
  current1 <- current1+rcount1
  current2 <- current2+rcount2
  
  curve   <- daply(env_rates, 'curve', function(x) {
return(approxfun(x$day_count, x$rate, rule = 2))
  })
  
result <- rbind(result, ddply(tx, 'id', function(x) {

intrate_from <- curve[[x$currency_from]](x$maturity_from)
intrate_to   <- curve[[x$currency_to]](x$maturity_to)
cross_rate   <- subset(exch_rates, key==paste(x$currency_from_exch, 
x$currency_to_exch, sep='_'))$rate
base_rate<- subset(exch_rates, key==paste(x$currency_to_exch, 
x$currency_base, sep='_'))$rate

return(data.frame(env=env, intrate_from=intrate_from, intrate_to=intrate_to, 
cross_rate=cross_rate, base_rate=base_rate))


  }))
}

sorted <- result[order(result$id, result$env),]

sorted$currency_from_exch <- rep(tx$currency_from_exch, each = 
length(unique(sorted$env)))
sorted$currency_to_exch <- rep(tx$currency_to_exch, each = 
length(unique(sorted$env)))
sorted$currency_base <- rep(tx$currency_base, each = length(unique(sorted$env)))
sorted$transaction_type <- rep(tx$transaction_type, each = 
length(unique(sorted$env)))
sorted$amount_fromccy <- rep(tx$amount_fromccy, each = 
length(unique(sorted$env)))
sorted$amount_toccy <- rep(tx$amount_toccy, each = length(unique(sorted$env)))
sorted$intbasis_fromccy <- rep(tx$intbasis_fromccy, each = 
length(unique(sorted$env)))
sorted$intbasis_toccy <- rep(tx$intbasis_toccy, each = 
length(unique(sorted$env)))
sorted$maturity_from <- rep(tx$maturity_from, each = length(unique(sorted$env)))
sorted$maturity_to <- rep(tx$maturity_to, each = length(unique(sorted$env)))
sorted$currency_from <- rep(tx$currency_from, each = 
length(unique(sorted$env))) 
sorted$currency_to <- rep(tx$currency_to, each = length(unique(sorted$env))) 

sorted$from_mtm <- sorted$cross_rate * (sorted$amount_fromccy / ((1 + 
(sorted$intrate_from/100))^(sorted$maturity_from / sorted$intbasis_fromccy)))

sorted$to_mtm   <- (sorted$amount_toccy   / ((1 + 
(sorted$intrate_to/100))^(sorted$maturity_to / sorted$intbasis_toccy)))

mtm_base <- function(from_mtm, to_mtm, base_rate)
{
mtm <- (from_mtm + to_mtm)
mtm_bc = mtm*base_rate[1]

return(data.frame(mtm_bc = mtm_bc))
}

sorted1 <- ddply(.data=sorted, .variables = "id", .fun=function(x) 
mtm_base(from_mtm = x$from_mtm, to_mtm = x$to_mtm, base_rate = x$base_rate))

sorted$mtm <- sorted1$mtm
sorted$mtm_bc <- sorted1$mtm_bc

sorted2 <- ddply(sorted, .(currency_from_exch, id), mutate, change_in_mtm_bc = 
mtm_bc - mtm_bc[1])

sorted$change_in_mtm_bc <- sorted2$change_in_mtm_bc

sorted <- sorted[order(sorted$id, sorted$env),]

write.csv(data.frame(sorted), file='MC_result_fxdeal.csv', row.names=FALSE)

# 

# END of Code



With regards

Amelia







On Wednesday, 6 April 2016 7:43 PM, Jeff Newmiller  
wrote:



As Jim has indicated, memory usage problems can require very specific 
diagnostics and code changes,  so generic help is tough to give. 

However, in most cases I have found the dplyr package to be more memory 
efficient than plyr, so you could consider that. Also, you can be explicit 
about only saving the minimum results you want to keep rather than making a 
list of complete results and extracting results later. 
-- 
Sent from my phone. Please excuse my brevity.


On April 6, 2016 4:39:59 AM PDT, Amelia Marsh via R-help  
wrote:
>Dear R Forum,
>
>I have about 2000+ FX forward transactions and I am trying to run 1000 
>simulations. If I use less no of simulations, I am able to get the desired 
>results. However, when I try to use more than 1000 simulations, I get 
>following error.
>
>
>sorted2 <- ddply(sorted, .(currency_from_exch, id), mutate, change_in_mtm_bc = 
>mtm_bc - mtm_bc[1]) 
>>

Re: [R] Memory problem

2016-04-06 Thread Amelia Marsh
Dear Sir,
Thanks for the guidance. Will check. And yes, at the end of each simulation, a 
large result is getting stored. 
Regards
Amelia 

On Wednesday, 6 April 2016 5:48 PM, jim holtman  wrote:
 

 It is hard to tell from the information that you have provided.  Do you have a 
list of the sizes of all the objects that you have in memory?  Are you 
releasing large objects at the end of each simulation run?  Are you using 'gc' 
to garbage collect any memory after deallocating objects?  Collect some 
additional information with a simple function like below:
f_mem_stats <- function(memo='') cat(memo, proc.time(), memory.size(), '\n')

> f_mem_stats(2)
2 2.85 11.59 85444.93 NA NA 39.08
This will print out what you pass in as a parameter, e.g., the iteration 
number, and then outputs the amount of CPU and memory used so far.  I use this 
all the time to keep track of resource consumption in long running scripts.

Jim Holtman
Data Munger Guru
 
What is the problem that you are trying to solve?
Tell me what you want to do, not how you want to do it.
On Wed, Apr 6, 2016 at 7:39 AM, Amelia Marsh via R-help  
wrote:

Dear R Forum,

I have about 2000+ FX forward transactions and I am trying to run 1000
simulations. If I use a smaller number of simulations, I am able to get the
desired results. However, when I try to use more than 1000 simulations, I get
the following error.

> sorted2 <- ddply(sorted, .(currency_from_exch, id), mutate, change_in_mtm_bc 
> = mtm_bc - mtm_bc[1])

Error: cannot allocate vector of size 15.6 Mb


In addition: Warning messages:
1: Reached total allocation of 3583Mb: see help(memory.size)
2: Reached total allocation of 3583Mb: see help(memory.size)
3: In output[[var]][rng] <- df[[var]] :
Reached total allocation of 3583Mb: see help(memory.size)
4: In output[[var]][rng] <- df[[var]] :
Reached total allocation of 3583Mb: see help(memory.size)
5: In output[[var]][rng] <- df[[var]] :
Reached total allocation of 3583Mb: see help(memory.size)
6: In output[[var]][rng] <- df[[var]] :
Reached total allocation of 3583Mb: see help(memory.size)
7: In output[[var]][rng] <- df[[var]] :
Reached total allocation of 3583Mb: see help(memory.size)
8: In output[[var]][rng] <- df[[var]] :
Reached total allocation of 3583Mb: see help(memory.size)


When I checked -

> memory.size()
[1] 846.83
> memory.limit()
[1] 3583


The code is a bit lengthy and unfortunately can't be shared.

Kindly guide me on how this memory problem can be tackled. I am using R x64 3.2.0

Regards

Amelia

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.




  
[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.

Re: [R] Memory problem

2016-04-06 Thread Jeff Newmiller
As Jim has indicated, memory usage problems can require very specific 
diagnostics and code changes,  so generic help is tough to give. 

However, in most cases I have found the dplyr package to be more memory 
efficient than plyr, so you could consider that. Also, you can be explicit 
about only saving the minimum results you want to keep rather than making a 
list of complete results and extracting results later. 
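
As an illustration only (dplyr syntax; the column names come from the earlier
code in this thread), the plyr call that fails could be written as:

library(dplyr)
sorted2 <- sorted %>%
  group_by(currency_from_exch, id) %>%
  mutate(change_in_mtm_bc = mtm_bc - mtm_bc[1]) %>%
  ungroup()
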
-- 
Sent from my phone. Please excuse my brevity.

On April 6, 2016 4:39:59 AM PDT, Amelia Marsh via R-help  
wrote:
>Dear R Forum,
>
>I have about 2000+ FX forward transactions and I am trying to run 1000
>simulations. If I use less no of simulations, I am able to get the
>desired results. However, when I try to use more than 1000 simulations,
>I get following error.
>
>> sorted2 <- ddply(sorted, .(currency_from_exch, id), mutate,
>change_in_mtm_bc = mtm_bc - mtm_bc[1]) 
>
>Error: cannot allocate vector of size 15.6 Mb 
>
>
>In addition: Warning messages: 
>1: Reached total allocation of 3583Mb: see help(memory.size) 
>2: Reached total allocation of 3583Mb: see help(memory.size) 
>3: In output[[var]][rng] <- df[[var]] : 
>Reached total allocation of 3583Mb: see help(memory.size) 
>4: In output[[var]][rng] <- df[[var]] : 
>Reached total allocation of 3583Mb: see help(memory.size) 
>5: In output[[var]][rng] <- df[[var]] : 
>Reached total allocation of 3583Mb: see help(memory.size) 
>6: In output[[var]][rng] <- df[[var]] : 
>Reached total allocation of 3583Mb: see help(memory.size) 
>7: In output[[var]][rng] <- df[[var]] : 
>Reached total allocation of 3583Mb: see help(memory.size) 
>8: In output[[var]][rng] <- df[[var]] : 
>Reached total allocation of 3583Mb: see help(memory.size)
>
>
>When I checked -
>
>> memory.size() 
>[1] 846.83 
>> memory.limit() 
>[1] 3583
>
>
>The code is bit lengthy and unfortunately can't be shared.
>
>Kindly guide how this memory probelm can be tackled? I am using R x64
>3.2.0
>
>Regards
>
>Amelia
>
>__
>R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
>https://stat.ethz.ch/mailman/listinfo/r-help
>PLEASE do read the posting guide
>http://www.R-project.org/posting-guide.html
>and provide commented, minimal, self-contained, reproducible code.

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory problem

2016-04-06 Thread jim holtman
You say it is "getting stored"; is this in memory or on disk?  How are you
processing the results of the 1,000 simulations?

So some more insight into the actual process would be useful.  For example,
how are the simulations being done, are the results stored in memory, or
out to a file, what are you doing with the results at the end, etc.
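
As a sketch of the file-based option (the function and file names here are
placeholders, not from the thread), each iteration's result can go straight to
disk and be released, instead of being rbind-ed into one growing object:

out_file <- "MC_result.csv"
for (i in 1:1000) {
  res <- run_one_simulation(i)      # placeholder for one simulation's result
  write.table(res, out_file, sep = ",", row.names = FALSE,
              col.names = (i == 1), append = (i > 1))
  rm(res); gc()                     # free the iteration's memory
}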


Jim Holtman
Data Munger Guru

What is the problem that you are trying to solve?
Tell me what you want to do, not how you want to do it.

On Wed, Apr 6, 2016 at 8:44 AM, Amelia Marsh 
wrote:

> Dear Sir,
>
> Thanks for the guidance. Will check. And yes, at the end of each
> simulation, a large result is getting stored.
>
> Regards
>
> Amelia
>
>
> On Wednesday, 6 April 2016 5:48 PM, jim holtman 
> wrote:
>
>
> It is hard to tell from the information that you have provided.  Do you
> have a list of the sizes of all the objects that you have in memory?  Are
> you releasing large objects at the end of each simulation run?  Are you
> using 'gc' to garbage collect any memory after deallocating objects?
> Collect some additional information with a simple function like below:
>
> f_mem_stats <- function(memo='') cat(memo, proc.time(), memory.size(),
> '\n')
>
>
> > f_mem_stats(2)
> 2 2.85 11.59 85444.93 NA NA 39.08
>
> This will print out what you pass in as a parameter, e.g., the iteration
> number, and then outputs the amount of CPU and memory used so far.  I use
> this all the time to keep track of resource consumption in long running
> scripts.
>
>
> Jim Holtman
> Data Munger Guru
>
> What is the problem that you are trying to solve?
> Tell me what you want to do, not how you want to do it.
>
> On Wed, Apr 6, 2016 at 7:39 AM, Amelia Marsh via R-help <
> r-help@r-project.org> wrote:
>
> Dear R Forum,
>
> I have about 2000+ FX forward transactions and I am trying to run 1000
> simulations. If I use less no of simulations, I am able to get the desired
> results. However, when I try to use more than 1000 simulations, I get
> following error.
>
> > sorted2 <- ddply(sorted, .(currency_from_exch, id), mutate,
> change_in_mtm_bc = mtm_bc - mtm_bc[1])
>
> Error: cannot allocate vector of size 15.6 Mb
>
>
> In addition: Warning messages:
> 1: Reached total allocation of 3583Mb: see help(memory.size)
> 2: Reached total allocation of 3583Mb: see help(memory.size)
> 3: In output[[var]][rng] <- df[[var]] :
> Reached total allocation of 3583Mb: see help(memory.size)
> 4: In output[[var]][rng] <- df[[var]] :
> Reached total allocation of 3583Mb: see help(memory.size)
> 5: In output[[var]][rng] <- df[[var]] :
> Reached total allocation of 3583Mb: see help(memory.size)
> 6: In output[[var]][rng] <- df[[var]] :
> Reached total allocation of 3583Mb: see help(memory.size)
> 7: In output[[var]][rng] <- df[[var]] :
> Reached total allocation of 3583Mb: see help(memory.size)
> 8: In output[[var]][rng] <- df[[var]] :
> Reached total allocation of 3583Mb: see help(memory.size)
>
>
> When I checked -
>
> > memory.size()
> [1] 846.83
> > memory.limit()
> [1] 3583
>
>
> The code is bit lengthy and unfortunately can't be shared.
>
> Kindly guide how this memory probelm can be tackled? I am using R x64 3.2.0
>
> Regards
>
> Amelia
>
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> http://www.R-project.org/posting-guide.html
> 
> and provide commented, minimal, self-contained, reproducible code.
>
>
>
>
>

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory problem

2016-04-06 Thread jim holtman
It is hard to tell from the information that you have provided.  Do you
have a list of the sizes of all the objects that you have in memory?  Are
you releasing large objects at the end of each simulation run?  Are you
using 'gc' to garbage collect any memory after deallocating objects?
Collect some additional information with a simple function like below:

f_mem_stats <- function(memo='') cat(memo, proc.time(), memory.size(), '\n')


> f_mem_stats(2)
2 2.85 11.59 85444.93 NA NA 39.08

This will print out what you pass in as a parameter, e.g., the iteration
number, and then outputs the amount of CPU and memory used so far.  I use
this all the time to keep track of resource consumption in long running
scripts.


Jim Holtman
Data Munger Guru

What is the problem that you are trying to solve?
Tell me what you want to do, not how you want to do it.

On Wed, Apr 6, 2016 at 7:39 AM, Amelia Marsh via R-help <
r-help@r-project.org> wrote:

> Dear R Forum,
>
> I have about 2000+ FX forward transactions and I am trying to run 1000
> simulations. If I use less no of simulations, I am able to get the desired
> results. However, when I try to use more than 1000 simulations, I get
> following error.
>
> > sorted2 <- ddply(sorted, .(currency_from_exch, id), mutate,
> change_in_mtm_bc = mtm_bc - mtm_bc[1])
>
> Error: cannot allocate vector of size 15.6 Mb
>
>
> In addition: Warning messages:
> 1: Reached total allocation of 3583Mb: see help(memory.size)
> 2: Reached total allocation of 3583Mb: see help(memory.size)
> 3: In output[[var]][rng] <- df[[var]] :
> Reached total allocation of 3583Mb: see help(memory.size)
> 4: In output[[var]][rng] <- df[[var]] :
> Reached total allocation of 3583Mb: see help(memory.size)
> 5: In output[[var]][rng] <- df[[var]] :
> Reached total allocation of 3583Mb: see help(memory.size)
> 6: In output[[var]][rng] <- df[[var]] :
> Reached total allocation of 3583Mb: see help(memory.size)
> 7: In output[[var]][rng] <- df[[var]] :
> Reached total allocation of 3583Mb: see help(memory.size)
> 8: In output[[var]][rng] <- df[[var]] :
> Reached total allocation of 3583Mb: see help(memory.size)
>
>
> When I checked -
>
> > memory.size()
> [1] 846.83
> > memory.limit()
> [1] 3583
>
>
> The code is bit lengthy and unfortunately can't be shared.
>
> Kindly guide how this memory probelm can be tackled? I am using R x64 3.2.0
>
> Regards
>
> Amelia
>
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Memory problem

2016-04-06 Thread Amelia Marsh via R-help
Dear R Forum,

I have about 2000+ FX forward transactions and I am trying to run 1000
simulations. If I use a smaller number of simulations, I am able to get the
desired results. However, when I try to use more than 1000 simulations, I get
the following error.

> sorted2 <- ddply(sorted, .(currency_from_exch, id), mutate, change_in_mtm_bc 
> = mtm_bc - mtm_bc[1]) 

Error: cannot allocate vector of size 15.6 Mb 


In addition: Warning messages: 
1: Reached total allocation of 3583Mb: see help(memory.size) 
2: Reached total allocation of 3583Mb: see help(memory.size) 
3: In output[[var]][rng] <- df[[var]] : 
Reached total allocation of 3583Mb: see help(memory.size) 
4: In output[[var]][rng] <- df[[var]] : 
Reached total allocation of 3583Mb: see help(memory.size) 
5: In output[[var]][rng] <- df[[var]] : 
Reached total allocation of 3583Mb: see help(memory.size) 
6: In output[[var]][rng] <- df[[var]] : 
Reached total allocation of 3583Mb: see help(memory.size) 
7: In output[[var]][rng] <- df[[var]] : 
Reached total allocation of 3583Mb: see help(memory.size) 
8: In output[[var]][rng] <- df[[var]] : 
Reached total allocation of 3583Mb: see help(memory.size)


When I checked -

> memory.size() 
[1] 846.83 
> memory.limit() 
[1] 3583


The code is a bit lengthy and unfortunately can't be shared.

Kindly guide me on how this memory problem can be tackled. I am using R x64 3.2.0

Regards

Amelia

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory problem when changing a function

2015-11-27 Thread Marwah Sabry Siam
I didn't write them out because I thought it would be too long. I am using the
HPbayes package. I changed the mp8.mle function. Two functions depend on it,
loop.optim and prior.likewts, so I changed and renamed them as well. The memory
problem arises when applying the new loop.optim function, named loop.optim_m.
The data is:
> dput(AUS)
structure(list(Year = c(2011L, 2011L, 2011L, 2011L, 2011L, 2011L,
2011L, 2011L, 2011L, 2011L, 2011L, 2011L, 2011L, 2011L, 2011L,
2011L, 2011L, 2011L, 2011L, 2011L, 2011L, 2011L, 2011L, 2011L
), Age = structure(c(1L, 2L, 3L, 7L, 8L, 9L, 10L, 11L, 12L, 13L,
14L, 15L, 16L, 17L, 18L, 19L, 20L, 21L, 22L, 23L, 24L, 4L, 5L,
6L), .Label = c("0", "04-Jan", "09-May", "100-104", "105-109",
"110+", "14-Oct", "15-19", "20-24", "25-29", "30-34", "35-39",
"40-44", "45-49", "50-54", "55-59", "60-64", "65-69", "70-74",
"75-79", "80-84", "85-89", "90-94", "95-99"), class = "factor"),
mx = c(0.00381, 0.00018, 1e-04, 9e-05, 0.00033, 0.00046,
0.00051, 0.00067, 0.00088, 0.00122, 0.00184, 0.00277, 0.00418,
0.00645, 0.01005, 0.01725, 0.02955, 0.05478, 0.10292, 0.18274,
0.30093, 0.45866, 0.62819, 0.75716), qx = c(0.0038, 0.00071,
5e-04, 0.00047, 0.00163, 0.00229, 0.00256, 0.00337, 0.00437,
0.00609, 0.00916, 0.01374, 0.02068, 0.03177, 0.04912, 0.08298,
0.13827, 0.24257, 0.41114, 0.61482, 0.80056, 0.91837, 0.9686,
1), ax = c(0.06, 1.56, 2.36, 2.79, 2.81, 2.47, 2.55, 2.59,
2.6, 2.62, 2.7, 2.67, 2.65, 2.64, 2.67, 2.7, 2.67, 2.64,
2.55, 2.34, 2.08, 1.74, 1.43, 1.32), lx = c(10L, 99620L,
99550L, 99500L, 99453L, 99291L, 99064L, 98811L, 98478L, 98048L,
97450L, 96558L, 95231L, 93262L, 90299L, 85864L, 78739L, 67852L,
51393L, 30263L, 11657L, 2325L, 190L, 6L), dx = c(380L, 70L,
50L, 47L, 162L, 227L, 253L, 333L, 430L, 598L, 893L, 1327L,
1969L, 2963L, 4436L, 7125L, 10887L, 16459L, 21130L, 18606L,
9332L, 2135L, 184L, 6L), Lx = c(99643L, 398308L, 497617L,
497397L, 496912L, 495882L, 494700L, 493254L, 491358L, 488818L,
485200L, 479691L, 471529L, 459313L, 441161L, 412941L, 368377L,
300433L, 205300L, 101820L, 31011L, 4655L, 293L, 8L), Tx = c(8215623L,
8115980L, 7717672L, 7220055L, 6722657L, 6225746L, 5729864L,
5235163L, 4741909L, 4250551L, 3761733L, 3276532L, 2796841L,
2325312L, 1865999L, 1424838L, 1011897L, 643520L, 343087L,
137787L, 35967L, 4956L, 301L, 8L), ex = c(82.16, 81.47, 77.53,
72.56, 67.6, 62.7, 57.84, 52.98, 48.15, 43.35, 38.6, 33.93,
29.37, 24.93, 20.66, 16.59, 12.85, 9.48, 6.68, 4.55, 3.09,
2.13, 1.58, 1.32)), .Names = c("Year", "Age", "mx", "qx",
"ax", "lx", "dx", "Lx", "Tx", "ex"), class = "data.frame", row.names = c(NA,
-24L))

loop.optim_m function is

function (prior, nrisk, ndeath, d = 10, theta.dim = 8, age = c(1e-05,
1, seq(5, 110, 5)))
{
lx <- nrisk
dx <- ndeath
H.k <- prior
pllwts <- prior.likewts_m(prior = prior, nrisk = lx, ndeath = dx)
log.like.0 <- pllwts$log.like.0
wts.0 <- pllwts$wts.0
B0 <- 1000 * theta.dim
q0 <- H.k
d.keep <- 0
theta.new <- H.k[wts.0 == max(wts.0), ]
keep <- H.k
ll.keep <- log.like.0
opt.mu.d <- matrix(NA, nrow = d, ncol = theta.dim)
opt.cov.d <- array(NA, dim = c(theta.dim, theta.dim, d))
prior.cov <- cov(q0)
opt.low <- apply(q0, 2, min)
opt.hi <- apply(q0, 2, max)
imp.keep <- theta.dim * 100
max.log.like.0 <- max(log.like.0)
mp8.mle <- function(theta, x.fit = age) {
p.hat <- mod8p(theta = q0, x = age)
ll = dmultinom(x = dx, size = NULL, prob = p.hat, log = FALSE)
return(ll)
}
for (i in 1:d) {
out <- optim(par = theta.new, fn = mp8.mle, method = "L-BFGS-B",
lower = opt.low, upper = opt.hi, control = list(fnscale = -1,
maxit = 1e+05))
out.mu <- out$par
if (out$value > max.log.like.0) {
d.keep <- d.keep + 1
opt.mu.d[i, ] <- out.mu
out.hess <- hessian(func = mp8.mle, x = out$par)
if (is.positive.definite(-out.hess)) {
out.cov <- try(solve(-out.hess))
opt.cov.d[, , i] <- out.cov
}
if (!is.positive.definite(-out.hess)) {
out.grad <- grad(func = mp8.mle, x = out.mu)
A <- out.grad %*% t(out.grad)
out.prec <- try(solve(prior.cov)) + A
if (!is.positive.definite(out.prec)) {
  out.prec <- solve(prior.cov)
}
out.cov <- try(solve(out.prec))
opt.cov.d[, , i] <- out.cov
}
}
if (i == 1 & out$value <= max.log.like.0) {
out.hess <- hessian(func = mp8.mle, x = out$par)
if (is.positive.definite(-out.hess)) {
out.cov <- solve(-out.hess)
}
if (!is.positive.definite(-out.hess)) {
out.grad <- grad(func = mp8.mle, x = out.mu)
A <- out.grad %*% t(out.grad)
out.prec <- 

[R] Memory problem when changing a function

2015-11-26 Thread Marwah Sabry Siam
I changed a function in a package and I want to run this new function.
It always gives the error "Error in memory: couldn't allocate a
vector of 15.3 Gb", although the built-in function doesn't give this
error.

My system is Windows 10, 8 GB RAM, AMD quad-core processor.
I've read about memory problems but I couldn't solve this one. I tried the
code on another system with 16 GB RAM but it didn't work there either. How can
I solve this problem, given that I can't change the code? Thank you.
Regards,
Marwah Sabry Siam,
Teaching Assistant at Faculty of Economics and Political Science,
Statistics Department,
01225875205

__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory problem when changing a function

2015-11-26 Thread John Kane
Perhaps you should tell us what package you were using, what the function was, 
and how you changed it. 

Please have a look at
http://stackoverflow.com/questions/5963269/how-to-make-a-great-r-reproducible-example
 and/or http://adv-r.had.co.nz/Reproducibility.html for some suggestions on how 
to ask a question here.

Note that sample data, preferably in dput() format as described in the links 
above, is likely to be very important.
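
A tiny illustration of the dput() idea (the object name is arbitrary):

dput(head(mydata, 20))   # paste the printed output into your post
# or build a small fake dataset that reproduces the problem:
set.seed(1)
mydata <- data.frame(x = rnorm(5), g = sample(c("a", "b"), 5, replace = TRUE))
dput(mydata)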

John Kane
Kingston ON Canada


> -Original Message-
> From: marwa.s...@feps.edu.eg
> Sent: Thu, 26 Nov 2015 21:57:12 +0200
> To: r-help@r-project.org
> Subject: [R] Memory problem when changing a function
> 
> I changed a function in a package and I want to run this new function.
> It always gives the error of "Error in memory: couldn't allocate a
> vector of 15.3 Gb" altough  the built in function doesn't give this
> error.
> 
> My system is window 10, 8 Ram, AMD Quad-Core processor.
> I've read about memory problems but I couldn't solve it. I tried the
> code on another system with 16 RAM but it didn't work also. How can I
> solve this problem given that i can't change the code?Thank you.
> Regards,
> Marwah Sabry Siam,
> Teaching Assistant at Faculty of Economics and Political Science,
> Statistics Department,
> 01225875205
> 
> __
> R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



__
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] memory problem of betadiver of vegan

2013-07-12 Thread Elaine Kuo
Hello List,

This is Elaine.
I am running betadiver on a dataset of 4873 rows and 2749 columns
(4873 rows = 4873 grid cells of the study region, and 2749 columns for the
bird species).
The dataset was produced by combining 5 dbf files.

When running the code o, an error message jumped out, saying
Error: cannot allocate vector of size 90.6 Mb

I posted the issue on r-help and checked the previous mails about R memory
problems.
gc() and saving .Rdata were tried but did not work.
Also, I tried to transform the dataset into a matrix, using the code m
below.
However, an error also appeared, saying
Error in ifelse(x > 0, 1, 0) :
  (list) object cannot be coerced to type 'double'

Please kindly advise how to alleviate the memory problem, particularly in
modifying the code of betadiver of vegan.
Thank you.

Elaine

code m
matrixR <- matrix(data = dataR, nrow = 4873, ncol = 2749)
d <- betadiver(matrixR, "sim")
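
The coercion error happens because wrapping a data.frame in matrix() produces
a matrix of list elements. A minimal sketch of one way around it (assuming GID
is the first column of dataR, the remaining columns are the species, and that
absences left as NA by merge() should become 0 -- none of this is from the
original post):

library(vegan)
matrixR <- as.matrix(dataR[, -1])        # drop GID, keep species columns
storage.mode(matrixR) <- "double"        # force a plain numeric matrix
matrixR[is.na(matrixR)] <- 0             # merge(all = TRUE) fills gaps with NA
d <- betadiver(matrixR, "sim")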



code o
# Non-Passerine table
dataNP_1 <- read.dbf("H:/temp_D/stage_4_R_2748/NP_1-10.dbf", as.is = FALSE)
dataNP_2 <- read.dbf("H:/temp_D/stage_4_R_2748/NP_11-19.dbf", as.is = FALSE)
dataNP <- merge(dataNP_1, dataNP_2, by = c("GID"), all = TRUE)

.. skip...

# Non-Passerine and Passerine table (2748 species)
dataR <- merge(dataP, dataNP, by = c("GID"), all = TRUE)
dim(dataR)
str(dataR)

library(vegan)

  ## The "beta sim" index (Lennon 2001)
  d <- betadiver(dataR, "sim")

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] memory problem of betadiver of vegan

2013-07-12 Thread Elaine Kuo
Hello List,

I solved the problem by using the approach in the answer with 31 votes at
http://stackoverflow.com/questions/1358003/tricks-to-manage-the-available-memory-in-an-r-session


On Sat, Jul 13, 2013 at 6:15 AM, Elaine Kuo elaine.kuo...@gmail.com wrote:

 Hello List,

 This is Elaine.
 I am running betadiver for a dataset of 4873 rows and 2749 columns.
 (4873 rows = 4873 gridcell of the study region and 2749 columns for the
 bird species)
 The dataset was produced by combing 5 dbf.

 When running the code o, an error message jumped out, saying
 Error: cannot allocate vector of size 90.6 Mb

 I posted the issue in r-help and checked the previous mails about R memory
 problems.
 gc() and save .Rdata were tried but did not work.
 Also, I tried to transformed the dataset into a matrix, using the code m
 below.
 However, an error also appeared, saying
 Error in ifelse(x  0, 1, 0) :
   (list) object cannot be coerced to type 'double'.

 Please kindly advise how to alleviate the memory problem, particularly in
 modifying the code of betadiver of vegan.
 Thank you.

 Elaine

 code m
 matrixR-matrix(data = dataR, nrow = 4873, ncol = 2749)
 d  -  betadiver(matrixR,  sim)



 code o
 # Non-Passerine table
 dataNP_1 -read.dbf(H:/temp_D/stage_4_R_2748/NP_1-10.dbf, as.is = FALSE)
 dataNP_2 -read.dbf(H:/temp_D/stage_4_R_2748/NP_11-19.dbf, as.is =
 FALSE)
 dataNP-merge(dataNP_1,dataNP_2,by=c(GID),all=T)

 .. skip...

 # Non-Passerine and Passerine table (2748 species)
 dataR-merge(dataP,dataNP,by=c(GID),all=T)
 dim(dataR)
 str(dataR)

 library(vegan)

   ##  The  beta sim  index (Lennon 2001)
   d  -  betadiver(dataR,  sim)


[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Memory problem in R

2012-03-01 Thread saqlain raza
Hi all, I am running a multinomial probit model in R using the MNP package. It
gives me the following error instead of the results:
Erreur : impossible d'allouer un vecteur de taille 137.9 Mo (in English: cannot
allocate a vector of size 137.9 Mb).

I have already increased the memory size up to 2047 Mb. This problem was
discussed in 2008 (in the archives) but no useful answers were given. I am
sending the code and a bit of the data. Thanks in advance for your help.

model <- mnp(choice~1+Asso+FP+BEV+IAA+CER+FrtVeg+Meat+Others+lnADH+lnTO+Emp+LT50inReg+lnEXinEU+lnEXoutEU+GDist+RET+WS+LT50SC+Others_hotel,
base=4, n.draws=1, burnin=2000, thin=3, verbose=TRUE, trace=FALSE,
p.scale=1, coef.start=0, invcdf=FALSE, cov.start=1)

Asso FP CG IAA BEV CER FrtVeg Meat Milk Oil Others lnADH lnTO Emp LT50inReg lnEXinEU lnEXoutEU GDist RET WS LT50SC Others_hotel choice
0001103,68897,55331,,0,010004
0001104,86759,01681,,0,14
0001104,90538,98720,,0,010004
0001104,41888,03530,,0,000104
0001105,634815,19545,,69310,000101
10100010003,401214,59170,0010,0,001001
0001104,499815,12016,,0,14
1101103,784213,49204,,0,6931001001
1001101,38638,86052,,0,02
 
Saqlain RAZA
PhD Student
[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] memory problem; Error: cannot allocate vector of size 915.5 Mb

2011-08-01 Thread Dimitris.Kapetanakis
Thanks a lot for the help. 

Actually, I am using a Mac (R for Mac OS X GUI 1.40-devel Leopard build,
32-bit (5751)), but I think I can get access to 64-bit Windows 7. What I am
trying to do is a maximization through a grid search, because I am not sure
that any of the optim() methods works well enough for my case; at least, they
all give quite different results. The reason I want the optimization is that I
want to use it in a Monte Carlo analysis of the Smoothed Maximum Score
estimator, so I want it to be as efficient as possible. But given that I am
something of an amateur at R and at programming in general, I doubt that I can
do that well enough.

Thanks again for your help

Dimitris

--
View this message in context: 
http://r.789695.n4.nabble.com/memory-problem-Error-cannot-allocate-vector-of-size-915-5-Mb-tp3707943p3709002.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] memory problem; Error: cannot allocate vector of size 915.5 Mb

2011-08-01 Thread David Winsemius


On Aug 1, 2011, at 3:04 AM, Dimitris.Kapetanakis wrote:


Thanks a lot for the help.

Actually, I am using a mac which (R for Mac OS X GUI 1.40-devel  
Leopard
build 32-bit (5751)) but I think I can find access on windows 7 64- 
bit.


I don't think that was what Holtman was advising. You just need more
available memory, no need to use Win7. The Mac platform has been 64-bit
capable longer than the Windoze OS, anyway. The way you get there might be as
simple as rebooting, not starting any other applications, and re-running your
code. Success depends upon how much addressable memory you have, which you did
not state. All of the stuff below is immaterial to these considerations.



What
I am trying to do is a maximization through grid search (because I  
am not
sure that any of the optim() methods works sufficiently to my case,  
at least
all of them provide quite different results), the reason that I want  
the

optimizing is because I want to use it for a Monte Carlo analysis for
Smoothed Maximum Score estimator, and for that reason I want the
optimization to be the most efficient possible, but given that I am  
kind of

amateur on R and on programming in general, I doubt that I can do that
sufficiently.


Your code ran without problem on my Mac running Leopard using an R64  
GUI session with 32 GB RAM (R.app GUI 1.41 (5866)).


 str(G.search)
 num [1:4000, 1:3] 1 1 1 1 1 1 1 1 1 1 ...

I have no idea whether it produced meaningful results, but a 120-million-item
matrix is not a problem with enough physical memory. It's only around a gig.
Your error indicated a problem with allocating 915.5 Mb. That should be
possible (although borderline) on a 4 GB Mac running 32-bit R. (32-bit R is
more memory efficient when working with physical memory of 4 GB or less
because the pointer size is smaller.)


--
david.


--
View this message in context: 
http://r.789695.n4.nabble.com/memory-problem-Error-cannot-allocate-vector-of-size-915-5-Mb-tp3707943p3709002.html
Sent from the R help mailing list archive at Nabble.com.




David Winsemius, MD
West Hartford, CT

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] memory problem; Error: cannot allocate vector of size 915.5 Mb

2011-07-31 Thread Dimitris.Kapetanakis
Dear all,

I am trying to do some matrix operations (whose size I think is smaller than
what R allows), but the operations are not feasible when they run in one
session, although each is feasible if run separately, and each operation is
totally independent of the others. When I run the code in one session, the
error that appears is:

Error: cannot allocate vector of size 915.5 Mb
R(16467,0xa0421540) malloc: *** mmap(size=960004096) failed (error code=12)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug
R(16467,0xa0421540) malloc: *** mmap(size=960004096) failed (error code=12)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug

In the code that I run (next lines), if I do not include the last three
lines it runs perfectly; if I instead exclude the operations that create xMax,
it again runs perfectly; but if I include both G.search and xMax, the error
appears. Does anyone know the solution to this problem, or why it happens?

The code that I run is:

N <- 250
x <- matrix(c(rnorm(N,-1.5,1), rnorm(N,1,1), rbinom(N,1,0.5)), ncol=3)
start <- (-1)
end <- 3
step <- 10^(-2)
n.steps <- (end-start)/step
steps2 <- n.steps^2
grids <- seq(from=start+step, to=end, by=step)
xMax <- matrix(0,N*steps2,3)
xMax[,1] <- rep(x[,1],steps2)
xMax[,2] <- rep(x[,2],steps2)
xMax[,3] <- rep(x[,3],steps2)
G.search1 <- as.matrix(rep(grids, n.steps, each=N))
G.search2 <- as.matrix(rep(grids, N, each=n.steps))
G.search <- cbind(1,G.search1, G.search2)
 
Thank you

Dimitris


--
View this message in context: 
http://r.789695.n4.nabble.com/memory-problem-Error-cannot-allocate-vector-of-size-915-5-Mb-tp3707943p3707943.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] memory problem; Error: cannot allocate vector of size 915.5 Mb

2011-07-31 Thread jim holtman
My advice to you is to get a 64-bit version of R.  Here is what it
does on my 64-bit Windows 7 version:

> N <- 250
> x <- matrix(c(rnorm(N,-1.5,1), rnorm(N,1,1), rbinom(N,1,0.5)), ncol=3)
> my.stats(1)
1 (1) - Rgui : 22:30:20 0.7 78.6 78.6 : 20.5MB
> start <- (-1)
> end <- 3
> step <- 10^(-2)
> n.steps <- (end-start)/step
> steps2 <- n.steps^2
> grids <- seq(from=start+step, to=end, by=step)
> xMax <- matrix(0,N*steps2,3)
> my.stats(2)
2 (1) - Rgui : 22:30:23 4.1 82.1 82.1 : 935.5MB
> xMax[,1] <- rep(x[,1],steps2)
> xMax[,2] <- rep(x[,2],steps2)
> xMax[,3] <- rep(x[,3],steps2)
> my.stats(3)
3 (1) - Rgui : 22:30:35 16.0 94.3 94.3 : 1998.9MB
> G.search1 <- as.matrix(rep(grids, n.steps, each=N))
> G.search2 <- as.matrix(rep(grids, N, each=n.steps))
> G.search <- cbind(1,G.search1, G.search2)
> my.stats(3)
3 (1) - Rgui : 22:30:45 25.2 103.7 103.7 : 2456.6MB

> gc()
used   (Mb) gc trigger   (Mb)  max used   (Mb)
Ncells1437267.7 35   18.735   18.7
Vcells 320137296 2442.5  353288723 2695.4 320138039 2442.5
> my.ls()
                   Size        Mode
.my.env 56 environment
.Random.seed 2,544 numeric
end 48 numeric
G.search   960,000,200 numeric
G.search1  320,000,200 numeric
G.search2  320,000,200 numeric
grids3,240 numeric
N   48 numeric
n.steps 48 numeric
start   48 numeric
step48 numeric
steps2  48 numeric
x                6,200   character
xMax   960,000,200 numeric
**Total  2,560,013,128 ---


You have objects totaling 2.5GB which is probably larger than can be
handled on a 32-bit version, especially when copies have to be made.
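
For reference, the 915.5 Mb in the error message is exactly the size of one of
the three-column matrices (doubles are 8 bytes each):

n.steps <- (3 - (-1)) / 10^(-2)   # 400
steps2  <- n.steps^2              # 160,000
250 * steps2                      # 40,000,000 rows in xMax and G.search
250 * steps2 * 3 * 8 / 2^20       # ~915.5 Mb for each 3-column matrix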

On Sun, Jul 31, 2011 at 11:53 AM, Dimitris.Kapetanakis
dimitrios.kapetana...@gmail.com wrote:
 Dear all,

 I am trying to make some matrix operations (whose size I think is smaller
 than what R allows) but the operations are not feasible when they run in one
 session but it is feasible if they run separately while each operation is
 totally independent of the other. I run the code in one session the error
 that appears is:

 Error: cannot allocate vector of size 915.5 Mb
 R(16467,0xa0421540) malloc: *** mmap(size=960004096) failed (error code=12)
 *** error: can't allocate region
 *** set a breakpoint in malloc_error_break to debug
 R(16467,0xa0421540) malloc: *** mmap(size=960004096) failed (error code=12)
 *** error: can't allocate region
 *** set a breakpoint in malloc_error_break to debug

 In the code that I run (next lines), if I do not include the last three
 lines it runs perfectly, if I exclude operations to create the xMax again it
 runs perfectly, if I include both G.search and xMax appears the error term.
 Does anyone knows the solution of this problem or why this problem happens?

 The code that I run is:

 N-250
 x-matrix(c(rnorm(N,-1.5,1), rnorm(N,1,1), rbinom(N,1,0.5)), ncol=3)
 start-(-1)
 end-3
 step-10^(-2)
 n.steps-(end-start)/step
 steps2  -n.steps^2
 grids-seq(from=start+step, to=end, by=step)
 xMax    -matrix(0,N*steps2,3)
 xMax[,1]-rep(x[,1],steps2)
 xMax[,2]-rep(x[,2],steps2)
 xMax[,3]-rep(x[,3],steps2)
 G.search1-as.matrix(rep(grids, n.steps, each=N))
 G.search2-as.matrix(rep(grids, N, each=n.steps))
 G.search-cbind(1,G.search1, G.search2)

 Thank you

 Dimitris


 --
 View this message in context: 
 http://r.789695.n4.nabble.com/memory-problem-Error-cannot-allocate-vector-of-size-915-5-Mb-tp3707943p3707943.html
 Sent from the R help mailing list archive at Nabble.com.

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.




-- 
Jim Holtman
Data Munger Guru

What is the problem that you are trying to solve?

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] memory problem

2010-08-19 Thread Duncan Murdoch

avsha38 wrote:

Hi,
when i run the following code i get this massege:

The instruction at 0x reference memory at 

0x###, the memory cannot be read. 
and then i have to close R.


what is the problem and how can i solve it?
  


The problem is a bug in the underlying C (or other) code.  To solve it, 
put together a minimal example that produces it reliably.  If it 
requires the frailtypack package to work, then send your example to the 
maintainer of that package.  If the minimal example uses only standard R 
packages, then submit an R bug report about it.


There's no point reporting it unless someone can reproduce it, and 
people might not follow up if the example is too complicated, so you 
should make an effort to simplify as much as you can before reporting.


Duncan Murdoch

thanks in advance
Avi

my code

# frailtypack
library(frailtypack)
cgd.ag <- read.csv("C:/rfiles/RE/cgd.csv")
cgd.nfm <- frailtyPenal(Surv(TStart, TStop, Status) ~ cluster(Center) + subcluster(ID) +
                        Treatment, data=cgd.ag, Frailty=TRUE, n.knots=8, kappa1=5,
                        cross.validation=TRUE, recurrentAG=TRUE)


  



__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] memory problem

2010-08-18 Thread avsha38

Hi,
when I run the following code I get this message:

The instruction at 0x referenced memory at 0x###. The memory cannot be read.

and then I have to close R.

What is the problem and how can I solve it?

thanks in advance
Avi

my code

# frailtypack
library(frailtypack)
cgd.ag <- read.csv("C:/rfiles/RE/cgd.csv")
cgd.nfm <- frailtyPenal(Surv(TStart, TStop, Status) ~ cluster(Center) + subcluster(ID) +
                        Treatment, data=cgd.ag, Frailty=TRUE, n.knots=8, kappa1=5,
                        cross.validation=TRUE, recurrentAG=TRUE)

  
-- 
View this message in context: 
http://r.789695.n4.nabble.com/memory-problem-tp2330510p2330510.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] memory problem for scatterplot using ggplot

2010-07-28 Thread Edwin Husni Sutanudjaja
Dear all,

I have a memory problem in making a scatter plot of my dataset of 17.5 million
pairs. My intention is to use the ggplot package and its bin2d. Please find the
attached script for more details.

Could somebody please give me any clues or tips to solve my problem?? please ...
Just for additional information: I'm running my R script on my 32-bit machine: 
Ubuntu 9.10, hardware: AMD Athlon Dual Core Processor 5200B, memory: 1.7GB.

Many thanks in advance.
Kind Regards, 

-- 
Ir. Edwin H. Sutanudjaja
Dept. of Physical Geography, Faculty of Geosciences, Utrecht University



  __
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] memory problem for scatterplot using ggplot

2010-07-28 Thread Brandon Hurr
It was my understanding that R wasn't really the best thing for absolutely
huge datasets. 17.5 million points would probably fall under the category of
absolutely huge.

I'm on a little netbook right now (atom/R32) and it failed, but I'll try it
on my macbookPro/R64 later and see if it's able to handle the size better.
For more information, my error is the following:

Error: cannot allocate vector of size 66.8 Mb
R(6725,0xa016e500) malloc: *** mmap(size=7640) failed (error code=12)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug
R(6725,0xa016e500) malloc: *** mmap(size=7640) failed (error code=12)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug

 sessionInfo()
R version 2.11.1 (2010-05-31)
i386-apple-darwin9.8.0

locale:
[1] en_US.UTF-8/en_US.UTF-8/C/C/en_US.UTF-8/en_US.UTF-8

attached base packages:
[1] grid  stats graphics  grDevices utils datasets  methods
base

other attached packages:
[1] sp_0.9-65   mapproj_1.1-8.2 maps_2.1-4  mgcv_1.6-2
 ggplot2_0.8.8
[6] reshape_0.8.3   plyr_1.0.2  proto_0.3-8

loaded via a namespace (and not attached):
[1] digest_0.4.2   lattice_0.18-8 Matrix_0.999375-39 nlme_3.1-96

[5] tools_2.11.1

On Wed, Jul 28, 2010 at 11:13, Edwin Husni Sutanudjaja 
hsutanudjajacchm...@yahoo.com wrote:

 Dear all,

 I have a memory problem in making a scatter plot of my 17.5 million-pair
 datasets.
 My intention to use the ggplot package and use the bin2d. Please find
 the
 attached script for more details.

 Could somebody please give me any clues or tips to solve my problem??
 please ...
 Just for additional information: I'm running my R script on my 32-bit
 machine:
 Ubuntu 9.10, hardware: AMD Athlon Dual Core Processor 5200B, memory: 1.7GB.

 Many thanks in advance.
 Kind Regards,

 --
 Ir. Edwin H. Sutanudjaja
 Dept. of Physical Geography, Faculty of Geosciences, Utrecht University





 --
 You received this message because you are subscribed to the ggplot2 mailing
 list.
 Please provide a reproducible example: http://gist.github.com/270442

 To post: email ggpl...@googlegroups.com
 To unsubscribe: email 
 ggplot2+unsubscr...@googlegroups.comggplot2%2bunsubscr...@googlegroups.com
 More options: http://groups.google.com/group/ggplot2


[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] memory problem for scatterplot using ggplot

2010-07-28 Thread Mark Connolly

On 07/28/2010 06:13 AM, Edwin Husni Sutanudjaja wrote:

Dear all,

I have a memory problem in making a scatter plot of my 17.5 million-pair
datasets.
My intention to use the ggplot package and use the bin2d. Please find the
attached script for more details.

Could somebody please give me any clues or tips to solve my problem?? please ...
Just for additional information: I'm running my R script on my 32-bit machine:
Ubuntu 9.10, hardware: AMD Athlon Dual Core Processor 5200B, memory: 1.7GB.

Many thanks in advance.
Kind Regards,

   
You should try to get access to a fairly robust 64-bit machine, say in
the range of >= 8 GiB real memory, and see what you can do.  No chance on a
32-bit machine.  No chance on a 64-bit machine without sufficient real
memory (you will be doomed to die by swap).  Does your institution have 
a virtualization lab with the ability to allocate machines with large 
memory footprints?  There is always Amazon EC2.  You could experiment 
with sizing before buying that new workstation you've had your eye on.


Alternatively, you might take much smaller samples of your data and 
massively decrease the size of the working set.  I assume this is not 
what you want, though.
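
A rough sketch of that fallback, assuming a recent ggplot2 and that the 17.5 
million pairs sit in a data frame named dat with columns x and y (names made 
up here); plotting a subsample, or letting stat_bin2d() reduce the points to 
bin counts, keeps the working set small:

library(ggplot2)
set.seed(1)
idx <- sample(nrow(dat), 1e5)                    # roughly 100k of the 17.5M pairs
ggplot(dat[idx, ], aes(x = x, y = y)) + geom_point(alpha = 0.1)
ggplot(dat[idx, ], aes(x = x, y = y)) + stat_bin2d(bins = 100)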


Mark

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] memory problem for scatterplot using ggplot

2010-07-28 Thread David Winsemius


On Jul 28, 2010, at 9:53 AM, Brandon Hurr wrote:

It was my understanding that R wasn't really the best thing for
absolutely huge datasets. 17.5 million points would probably fall under
the category of absolutely huge.

I'm on a little netbook right now (atom/R32) and it failed, but I'll try it
on my macbookPro/R64 later and see if it's able to handle the size better.


With 24GB on a Mac with 64-bit R, I routinely work with objects that
are, let's see... 3,969,086,272 bytes, roughly 4GB in size (about 4.5 million
records with about 100 columns). Thank you Simon and all the others
doing R core and Mac development. As far as I am concerned 64-bit R
_IS_ the best thing.


--
David.



For more information, my error is the following:

Error: cannot allocate vector of size 66.8 Mb
R(6725,0xa016e500) malloc: *** mmap(size=7640) failed (error  
code=12)

*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug
R(6725,0xa016e500) malloc: *** mmap(size=7640) failed (error  
code=12)

*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug


sessionInfo()

R version 2.11.1 (2010-05-31)
i386-apple-darwin9.8.0

locale:
[1] en_US.UTF-8/en_US.UTF-8/C/C/en_US.UTF-8/en_US.UTF-8

attached base packages:
[1] grid  stats graphics  grDevices utils datasets   
methods

base

other attached packages:
[1] sp_0.9-65   mapproj_1.1-8.2 maps_2.1-4  mgcv_1.6-2
ggplot2_0.8.8
[6] reshape_0.8.3   plyr_1.0.2  proto_0.3-8

loaded via a namespace (and not attached):
[1] digest_0.4.2   lattice_0.18-8 Matrix_0.999375-39  
nlme_3.1-96


[5] tools_2.11.1

On Wed, Jul 28, 2010 at 11:13, Edwin Husni Sutanudjaja 
hsutanudjajacchm...@yahoo.com wrote:


Dear all,

I have a memory problem in making a scatter plot of my 17.5 million- 
pair

datasets.
My intention to use the ggplot package and use the bin2d.  
Please find

the
attached script for more details.

Could somebody please give me any clues or tips to solve my problem??
please ...
Just for additional information: I'm running my R script on my 32-bit
machine:
Ubuntu 9.10, hardware: AMD Athlon Dual Core Processor 5200B,  
memory: 1.7GB.


Many thanks in advance.
Kind Regards,

--
Ir. Edwin H. Sutanudjaja
Dept. of Physical Geography, Faculty of Geosciences, Utrecht  
University






--
You received this message because you are subscribed to the ggplot2  
mailing

list.
Please provide a reproducible example: http://gist.github.com/270442

To post: email ggpl...@googlegroups.com
To unsubscribe: email ggplot2+unsubscr...@googlegroups.com


More options: http://groups.google.com/group/ggplot2



[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


David Winsemius, MD
West Hartford, CT

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Memory problem in multinomial logistic regression

2010-07-05 Thread Daniel Wiesmann
Dear All

I am trying to fit a multinomial logistic regression to a data set with a size 
of 94279 by 14 entries. The data frame has one sample column which is the 
categorical variable, and the number of different categories is 9. The size of 
the data set (as a csv file) is less than 10 MB.

I tried to fit a multinomial logistic regression, either using vglm() from the 
VGAM package or mlogit() from the mlogit package.

In both cases the estimation crashes because I do not have enough memory, 
although the free memory before starting the regression is more than 2GB. The 
regression functions eat up all of my memory.

Does anyone know why this relatively small data set leads to memory problems, 
and how I could work around my problem?

thank you for your help,

Daniel

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory problem in multinomial logistic regression

2010-07-05 Thread Charles C. Berry

On Mon, 5 Jul 2010, Daniel Wiesmann wrote:


Dear All

I am trying to fit a multinomial logistic regression to a data set with 
a size of 94279 by 14 entries. The data frame has one sample column 
which is the categorical variable, and the number of different 
categories is 9. The size of the data set (as a csv file) is less than 
10 MB.



First, do

str( your.data.frame )

so we can be sure that you do not have a factor lurking among your 
regressors.


Then report the calls you used for vglm() and mlogit().

It might not hurt to construct the model.matrix() first and check on it 
with object.size()
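
For instance, a quick check along these lines (a sketch only, using the
your.data.frame and sample names from the post) shows how large the expanded
design matrix really is before any fitting is attempted:

X <- model.matrix(~ . - sample, data = your.data.frame)
print(object.size(X), units = "Mb")
dim(X)   # a stray factor or character column can inflate the column count badly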


Also try

for (i in levels(your.data.frame$sample)) {
    print(
        glm(I(sample == i) ~ ., data = your.data.frame, family = binomial)
    )
}

just to check on your data. If that loop fails all bets are off.


HTH,

Chuck



I tried to fit a multinomial logistic regression, either using vglm() 
from the VGAM package or mlogit() from the mlogit package.


In both cases the estimation crashes because I do not have enough 
memory, although the free memory before starting the regression is more 
than 2GB. The regression functions eat up all of my memory.


Does anyone know why this relatively small data set leads to memory 
problems, and how I could work around my problem?


thank you for your help,

Daniel

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.



Charles C. Berry(858) 534-2098
Dept of Family/Preventive Medicine
E mailto:cbe...@tajo.ucsd.edu   UC San Diego
http://famprevmed.ucsd.edu/faculty/cberry/  La Jolla, San Diego 92093-0901

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory Problem

2010-02-08 Thread Meenakshi

Hi,

Can I use macro variables in R? If we can use macro variables in R,
where can I find such programs, or macros, in R books?

-- 
View this message in context: 
http://n4.nabble.com/Memory-Problem-tp1459740p1472700.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory Problem

2010-02-08 Thread jim holtman
What exactly is your definition of macro?  What do you want to do?
What is the problem that you are trying to solve?  Why do you think
macros will help?  Typically R does not have macros; I assume that
idea is a holdover from SAS.

On Mon, Feb 8, 2010 at 4:30 AM, Meenakshi
meenakshichidamba...@gmail.com wrote:

 Hi,

 Can I use macro variables in R. If we can use macro variables in R,
 where i can get that programs or macro in R books.

 --
 View this message in context: 
 http://n4.nabble.com/Memory-Problem-tp1459740p1472700.html
 Sent from the R help mailing list archive at Nabble.com.

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.




-- 
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem that you are trying to solve?

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory Problem

2010-02-08 Thread S Ellison
 jim holtman jholt...@gmail.com 08/02/2010 14:09:52 
Typically R does not have macros; 

I know exactly why Jim Holtman said that; R doesn't have a separate
'macro' construct with separate 'macro variables'.

But it is perhaps a bit misleading to say that R doesn't have macros
without saying a bit more about what it _does_ have. Most of R _is_ a
collection of R 'macros' and writing R functions or text files full of R
commands - for batch or interactive use - is a common part of day-to-day
R use; so much so, perhaps, that noone would bother giving R programs
and functions a separate label like 'macro'.

So while you won't have an excel or SAS 'macro' you most certainly _do_
have at least equivalent capabilities.
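
A minimal sketch of that point (all names made up): the job a SAS macro with
a couple of macro variables would do is just an ordinary R function with
arguments:

fit_one <- function(dat, response, predictors) {
  f <- reformulate(predictors, response = response)   # build the formula
  lm(f, data = dat)
}
# e.g. fit_one(houses, "price", c("condition", "quality"))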

Steve Ellison

 jim holtman jholt...@gmail.com 08/02/2010 14:09:52 
What exactly is your definition of macro?  What do you want to do?
What is the problem that you are trying to solve?  Why do you think
macros will help?  Typically R does not have macros; I assume that
idea is a holdover from SAS.

On Mon, Feb 8, 2010 at 4:30 AM, Meenakshi
meenakshichidamba...@gmail.com wrote:

 Hi,

 Can I use macro variables in R. If we can use macro variables in R,
 where i can get that programs or macro in R books.

 --
 View this message in context:
http://n4.nabble.com/Memory-Problem-tp1459740p1472700.html 
 Sent from the R help mailing list archive at Nabble.com.

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help 
 PLEASE do read the posting guide
http://www.R-project.org/posting-guide.html 
 and provide commented, minimal, self-contained, reproducible code.




-- 
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem that you are trying to solve?

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help 
PLEASE do read the posting guide
http://www.R-project.org/posting-guide.html 
and provide commented, minimal, self-contained, reproducible code.

***
This email and any attachments are confidential. Any use...{{dropped:8}}

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory Problem

2010-02-06 Thread Meenakshi

Hi,
I am using R version 2.10.1.

Before running any statements/functions the gc report is:
 used (Mb) gc trigger (Mb) max used (Mb)
Ncells 124352  3.4 35  9.4   35  9.4
Vcells  81237  0.7 786432  6.0   310883  2.4

After I ran the repeat statement, I got the following error message:

Error: cannot allocate vector of size 100 Kb
In addition: There were 50 or more warnings (use warnings() to see the first
50)

Finally I have 22 objects. All have 3 columns and at most 50 rows. I
don't know their sizes.

I give the final gc report below (that is, after I got the error message):
            used   (Mb) gc trigger   (Mb)  max used   (Mb)
Ncells    322451    8.7     597831   16.0    597831   16.0
Vcells 194014676 1480.3  285240685 2176.3 198652226 1515.6

Please give solution to me.

 

-- 
View this message in context: 
http://n4.nabble.com/Memory-Problem-tp1459740p1471138.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory Problem

2010-02-06 Thread Meenakshi

Hi,
After getting the error message,
my main file size is 1.05 MB.
Other objects are within 400 bytes only.
Thanks.

-- 
View this message in context: 
http://n4.nabble.com/Memory-Problem-tp1459740p1471153.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory Problem

2010-02-06 Thread jim holtman
Here is a function I use to get the size of the objects in my
workspace.  Let us know the output of this command

my.object.size <- function (pos = 1, sorted = FALSE)
{
    .result <- sapply(ls(pos = pos, all.names = TRUE), function(..x)
        object.size(eval(as.symbol(..x))))
    if (sorted) {
        .result <- rev(sort(.result))
    }
    .ls <- as.data.frame(rbind(as.matrix(.result), `**Total` = sum(.result)))
    names(.ls) <- "Size"
    .ls$Size <- formatC(.ls$Size, big.mark = ",", digits = 0,
        format = "f")
    .ls$Mode <- c(unlist(lapply(rownames(.ls)[-nrow(.ls)], function(x)
        mode(eval(as.symbol(x))))), "---")
    .ls
}


You will get something like this:

 my.object.size()
                  Size        Mode
.my.env             28 environment
.Random.seed     2,528     numeric
.required           72   character
my.object.size   6,712    function
x                6,712   character
**Total         16,052         ---





On Sat, Feb 6, 2010 at 4:51 AM, Meenakshi
meenakshichidamba...@gmail.com wrote:

 Hi,
 I am using R 10.2.1 version.

 Before run any statement/functions the gc report is:
         used (Mb) gc trigger (Mb) max used (Mb)
 Ncells 124352  3.4     35  9.4   35  9.4
 Vcells  81237  0.7     786432  6.0   310883  2.4

 After I run the repeat statement, I got the following error message:

 Error: cannot allocate vector of size 100 Kb
 In addition: There were 50 or more warnings (use warnings() to see the first
 50)

 Finally I have 22 objects. All are 3 columns and within 50 rows only. I
 don't know its size.

 I gave final gc report below:(That meas after got error messange)
            used   (Mb) gc trigger   (Mb)  max used   (Mb)
 Ncells    322451    8.7     597831   16.0    597831   16.0
 Vcells 194014676 1480.3  285240685 2176.3 198652226 1515.6

 Please give solution to me.



 --
 View this message in context: 
 http://n4.nabble.com/Memory-Problem-tp1459740p1471138.html
 Sent from the R help mailing list archive at Nabble.com.

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.




-- 
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem that you are trying to solve?

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory Problem

2010-02-06 Thread Meenakshi

This is my objects size:   

   Size Mode
asa_Condition   912 list
asa_GatedCommunity9,912 list
asa_Neighbourhood 2,872 list
asa_Security832 list
asa_Storeys 800 list
Condition_adju  560 list
final_Condition 672 list
final_GatedCommunity  3,936 list
final_Neighbourhood   1,376 list
final_Security  608 list
final_Storeys   616 list
GatedCommunity_adju   3,000 list
model_Condition 648 list
model_GatedCommunity648 list
model_Neighbourhood 648 list
model_Security  648 list
model_Storeys   640 list
modeling1 9,157,856 list
mult  3,613,576 list
my.object.size6,912 function
Neighbourhood_adju1,080 list
Security_adju   512 list
Storeys_adju520 list
**Total  12,809,784  ---
Warning message:
In structure(.Internal(object.size(x)), class = object_size) :
  Reached total allocation of 1535Mb: see help(memory.size)
-- 
View this message in context: 
http://n4.nabble.com/Memory-Problem-tp1459740p1471251.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory Problem

2010-02-06 Thread jim holtman
Have you tried gc() to see if any memory is released?  How big was the
file that you read in?  I don't see any large objects that appear in
your workspace.  Is there some type of processing that you did after
reading in the data?  You might want to intersperse the following
command in your script so that you can track where memory utilization
is going up:

print(memory.size())

I would try this with a smaller dataset size to see what happens.
Take a set of metrics and determine what happens as the size of the
data file is increased.  It is hard to tell without the actual script
to see what, and how, the processing is done.

Again, are there other alternatives that you might want to consider:
using a database, reading in only the columns of data you need,
preprocessing the data into smaller files, etc.  Besides reading in
the data, exactly what do you want to do with it and how much of it is
actually required for the processing?  For example, I have scripts
that only read in the data and then write out the object for later
processing since it is usually the reading and initial processing that
takes a lot of time.  This is another way of partitioning the work.
Anytime I have problems with processing data, I always take a smaller
chunk (cutting it half each time) till I can at least read it in in a
reasonable time.  One of the skills that you have to learn is to how
to debug your programs; not only actual bugs in your script, but
workarounds that may have to be created due to some constraint in the
system(s) that you are using.  This is a good place to practice design
of experiments.
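
A rough sketch of that last pattern (file names made up): read and clean once,
save the parsed object, let later runs start from the small binary file, and
print memory.size() between steps to see where the usage jumps:

dat <- read.csv("big_input.csv")
print(memory.size())
save(dat, file = "big_input.RData")   # later sessions just do load("big_input.RData")
print(memory.size())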

On Sat, Feb 6, 2010 at 8:09 AM, Meenakshi
meenakshichidamba...@gmail.com wrote:

 This is my objects size:

                                   Size     Mode
 asa_Condition               912     list
 asa_GatedCommunity        9,912     list
 asa_Neighbourhood         2,872     list
 asa_Security                832     list
 asa_Storeys                 800     list
 Condition_adju              560     list
 final_Condition             672     list
 final_GatedCommunity      3,936     list
 final_Neighbourhood       1,376     list
 final_Security              608     list
 final_Storeys               616     list
 GatedCommunity_adju       3,000     list
 model_Condition             648     list
 model_GatedCommunity        648     list
 model_Neighbourhood         648     list
 model_Security              648     list
 model_Storeys               640     list
 modeling1             9,157,856     list
 mult                  3,613,576     list
 my.object.size            6,912 function
 Neighbourhood_adju        1,080     list
 Security_adju               512     list
 Storeys_adju                520     list
 **Total              12,809,784  ---
 Warning message:
 In structure(.Internal(object.size(x)), class = object_size) :
  Reached total allocation of 1535Mb: see help(memory.size)
 --
 View this message in context: 
 http://n4.nabble.com/Memory-Problem-tp1459740p1471251.html
 Sent from the R help mailing list archive at Nabble.com.

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.




-- 
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem that you are trying to solve?

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory Problem

2010-02-04 Thread Meenakshi

Hi,

I have to run the repeat loop more than 50 times continuously, but it runs
only 20 to 30 times. After that the memory problem appears. My
dataset is only 6321 KB. How can I solve this problem?

Meenakshi
-- 
View this message in context: 
http://n4.nabble.com/Memory-Problem-tp1459740p1468737.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory Problem

2010-02-04 Thread jim holtman
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.

How about providing information on your operating system and version
of R.  Also provide a list of all the objects in your workspace and
the size of them.  You may be making copies or storing results in a
list.  This information would be helpful to understand where your
problem is.  Also provide the output of gc() so that we can see how
memory is being used.  To get helpful suggestions, you have to provide
useful information.  Saying you have memory problems is not
sufficient.
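
For example, the kind of listing being asked for can be produced in the
failing session with something like:

sort(sapply(ls(), function(x) object.size(get(x))), decreasing = TRUE)
gc()
sessionInfo()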

On Thu, Feb 4, 2010 at 7:43 AM, Meenakshi
meenakshichidamba...@gmail.com wrote:

 Hi,

 I have to run the repeat loop more than 50 times continuously. But it runs
 only 20 to 30 times only. After that the memory problem is coming. My
 dataset has 6321kb only. Then how to solve this problem.

 Meenakshi
 --
 View this message in context: 
 http://n4.nabble.com/Memory-Problem-tp1459740p1468737.html
 Sent from the R help mailing list archive at Nabble.com.

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.




-- 
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem that you are trying to solve?

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory Problem

2010-02-03 Thread Jim Lemon

On 02/02/2010 09:33 PM, Meenakshi wrote:


Hi,

When I run the repeat loop in R for large dataset, I got Memory problem.
How can I solve these problem.


1) Wait 2^m years, where m is the power of 2 that approximates the 
multiple of your current amount of RAM that would accommodate your 
problem (Moore, 1965).


2) Post some code that will give us an inkling of your problem (Plate, 
2006).


Jim

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Memory Problem

2010-02-02 Thread Meenakshi

Hi,

When I run a repeat loop in R on a large dataset, I get a memory problem.
How can I solve this problem?
-- 
View this message in context: 
http://n4.nabble.com/Memory-Problem-tp1459740p1459740.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory Problem

2010-02-02 Thread Uwe Ligges



On 02.02.2010 11:33, Meenakshi wrote:


Hi,

When I run the repeat loop in R for large dataset, I got Memory problem.
How can I solve these problem.


buy more memory, bigger machine, more efficient programming, import of 
only relevant data, use of specific tools, .. or in other words: 
Depends on your problem and please do read the posting guide.


Uwe Ligges

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] R Memory Problem

2010-01-27 Thread David Winsemius

You were asked to provide details, but so far have not.

--  
David.


On Jan 27, 2010, at 2:17 AM, prem_R wrote:



Yes, I think this is the explanation of the problem I faced. Could you
please help me to solve this?

--
View this message in context: 
http://n4.nabble.com/R-Memory-Problem-tp1289221p1311291.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] R Memory Problem

2010-01-26 Thread prem_R

Yes, I think this is the explanation of the problem I faced. Could you please
help me to solve this?

-- 
View this message in context: 
http://n4.nabble.com/R-Memory-Problem-tp1289221p1311291.html
Sent from the R help mailing list archive at Nabble.com.

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] R Memory Problem

2010-01-25 Thread jim holtman
How big is your data set (use object.size on the object and 'str').
Exactly what statements are you executing?  Exactly what error message
are you getting?
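
In other words, something like the following (with mydata replaced by the
actual object) answers those first questions:

print(object.size(mydata), units = "Mb")   # how big the data set really is in R
str(mydata)                                # dimensions and column types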

On Mon, Jan 25, 2010 at 5:44 AM, prem_R mtechp...@gmail.com wrote:

 Could anyone help me to resolve this problem? I'm presently an SAS user for
 my application and was exploring R to use it for my application. I have
 already posted this question about using my 32-bit machine with 2GB RAM, and
 from what I understood I should use a 64-bit machine. I tried using a 64-bit
 machine with 4GB RAM. I'm running predictive analytics using R, and to
 calibrate my model I adjust the variables used in the model; the
 problem happens here. R just runs out of memory. I tried garbage cleaning
 also.
 data
 APN    condition  quality  site_zip  sale_date  sale_price  estimate
 1.1-1  good       good     10201     1/1/07     $234,000    $254,000
 1.5-1  average    good     10201     1/1/08     $254,000    $276,000
 1.6-1  poor       poor     10202     1/1/06     $192,000    $199,000
 1.7-1  good       good     10202     1/1/07     $300,000    $305,000

 Regression equation

 Sale_price ~ condition + quality + site_zip

 After running the above equation I get the estimates, and
 then I calibrate the model using the dependent variables.

 For that purpose separate datasets are created and run for 50 iterations.
 The problem occurs here: after running a few iterations it shows out of space.

 I'm using R 2.10.0

 If you need any other clarifications I shall provide them. Help me to
 solve this.
 --
 View this message in context: 
 http://n4.nabble.com/R-Memory-Problem-tp1289221p1289221.html
 Sent from the R help mailing list archive at Nabble.com.

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.




-- 
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem that you are trying to solve?

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] R Memory Problem

2010-01-25 Thread Johann Hibschman
prem_R mtechp...@gmail.com writes:

 I'm running predictive analytics using R, and to calibrate my model I
 adjust the variables used in the model; the problem happens
 here. R just runs out of memory. I tried garbage cleaning also.

I'm analyzing a 8 GB data set using R, so it can certainly handle large
data sets.  It tends to copy data very often, however, so you have to be
very careful with it.

For example, if you modify a single column in a data frame, R will copy
the entire data frame, rather than just replace the modified column.  If
you are running a regression that saves the input data in the model
result object, and you are modifying the data frame between runs, then
it would be very easy to have many copies of your data in memory at
once.

One solution would be not to keep the model result objects around.
Another would be to manually modify them to strip out the data object.
This can be tricky, however, since copies of the data may live on in the
environments of saved functions; I had this problem with 'mgcv::gam'
fits.
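
A minimal sketch of that second suggestion, with made-up data (calib) and the
variables from the earlier example: ask the fitting function not to embed
copies of the data in the first place, and keep only the small pieces needed
between iterations:

fit <- glm(sale_price ~ condition + quality + site_zip, data = calib,
           model = FALSE, x = FALSE, y = FALSE)   # do not keep the model frame
keep <- list(coef = coef(fit), aic = AIC(fit))    # the small bits worth keeping
rm(fit); gc()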

I hope that helps.

Regards,
Johann

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] memory problem on Suse

2009-12-16 Thread Ambrosi Alessandro
Dear all and dear Marc, it seems you hit the target.
I checked as you suggested, and... it is a 32 bit version! 
Now I'm  fixing it. Thank you very much.
Alessandro


From: Marc Schwartz [marc_schwa...@me.com]
Sent: 11 December 2009 17:02
To: Ambrosi Alessandro
Cc: r-help@r-project.org
Subject: Re: [R] memory problem on Suse

On Dec 11, 2009, at 6:24 AM, Ambrosi Alessandro wrote:


 Dear all, I am meeting some problems with  memory allocation. I know
 it is an old issue, I'm sorry.
 I looked for a solution in the FAQs and manuals, mails, but without
 finding the working answer.
 I really hope you can help me.
 For instance, if I try to read microarray data I get:

 mab = ReadAffy(cdfname = "hgu133plus2cdf")
 Error: cannot allocate vector of size 858.0 Mb


 I get similar errors with smaller objects, smaller data sets or
 other procedures
 (Error: cannot allocate vector of size 123.0 Mb).
 I'm running R with Suse 11.1 Linux OS, on two Xeon processors (8
 cores), 32 GB RAM.
 I suppose I have enough resources to manage these objects and data
 files

 Any suggestions or hints will be really appreciated!
 Many thanks in advance.
 Alessandro

Well, you are running into a situation where there is not a contiguous
chunk of RAM available in the sizes referenced, for allocation to the
vector.

Presuming that you are running a 64 bit version of SUSE (what does
'uname -a' show in a system console), you should also check to be sure
that you are also running a 64 bit version of R. What does:

   .Machine$sizeof.pointer

show?

If it returns 4, then you are running a 32 bit version of R, which
cannot take advantage of your 64 bit platform. You should install a 64
bit version of R.
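
For instance (output is illustrative only), both checks can be run from the R
prompt:

system("uname -a")        # a 64 bit SUSE install reports x86_64 here
.Machine$sizeof.pointer   # 8 for a 64 bit build of R, 4 for a 32 bit build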

HTH,

Marc Schwartz
__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] memory problem on Suse

2009-12-11 Thread Ambrosi Alessandro

Dear all, I am meeting some problems with  memory allocation. I know it is an 
old issue, I'm sorry. 
I looked for a solution in the FAQs and manuals, mails, but without finding the 
working answer. 
I really hope you can help me. 
For instance, if I try to read microarray data I get:

 mab = ReadAffy(cdfname = "hgu133plus2cdf")
Error: cannot allocate vector of size 858.0 Mb
 

I get similar errors with smaller objects, smaller data sets or other 
procedures 
(Error: cannot allocate vector of size 123.0 Mb).
I'm running R with Suse 11.1 Linux OS, on two Xeon processors (8 cores), 32 GB 
RAM.
I suppose I have enough resources to manage these objects and data files

Any suggestions or hints will be really appreciated!
Many thanks in advance.
Alessandro 
__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] memory problem on Suse

2009-12-11 Thread Marc Schwartz

On Dec 11, 2009, at 6:24 AM, Ambrosi Alessandro wrote:



Dear all, I am meeting some problems with  memory allocation. I know  
it is an old issue, I'm sorry.
I looked for a solution in the FAQs and manuals, mails, but without  
finding the working answer.

I really hope you can help me.
For instance, if I try to read microarray data I get:


mab=ReadAffy(cdfname=hgu133plus2cdf)

Error: cannot allocate vector of size 858.0 Mb




I get similar errors with smaller objects, smaller data sets or  
other procedures

(Error: cannot allocate vector of size 123.0 Mb).
I'm running R with Suse 11.1 Linux OS, on two Xeon processors (8  
cores), 32 GB RAM.
I suppose I have enough resources to manage these objects and data  
files


Any suggestions or hints will be really appreciated!
Many thanks in advance.
Alessandro


Well, you are running into a situation where there is not a contiguous  
chunk of RAM available in the sizes referenced, for allocation to the  
vector.


Presuming that you are running a 64 bit version of SUSE (what does  
'uname -a' show in a system console), you should also check to be sure  
that you are also running a 64 bit version of R. What does:


  .Machine$sizeof.pointer

show?

If it returns 4, then you are running a 32 bit version of R, which  
cannot take advantage of your 64 bit platform. You should install a 64  
bit version of R.


HTH,

Marc Schwartz

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] memory problem on Suse

2009-12-11 Thread Martin Morgan
Ask on the Bioconductor mailing list, where you will be directed to
several solutions for analyzing what I guess are 100's of CEL files


http://bioconductor.org

--
Martin Morgan

On Dec 11, 2009, at 8:02 AM, Marc Schwartz marc_schwa...@me.com wrote:


On Dec 11, 2009, at 6:24 AM, Ambrosi Alessandro wrote:



Dear all, I am meeting some problems with  memory allocation. I  
know it is an old issue, I'm sorry.
I looked for a solution in the FAQs and manuals, mails, but without  
finding the working answer.

I really hope you can help me.
For instance, if I try to read microarray data I get:


mab=ReadAffy(cdfname=hgu133plus2cdf)

Error: cannot allocate vector of size 858.0 Mb




I get similar errors with smaller objects, smaller data sets or  
other procedures

(Error: cannot allocate vector of size 123.0 Mb).
I'm running R with Suse 11.1 Linux OS, on two Xeon processors (8  
cores), 32 GB RAM.
I suppose I have enough resources to manage these objects and data  
files


Any suggestions or hints will be really appreciated!
Many thanks in advance.
Alessandro


Well, you are running into a situation where there is not a  
contiguous chunk of RAM available in the sizes referenced, for  
allocation to the vector.


Presuming that you are running a 64 bit version of SUSE (what does  
'uname -a' show in a system console), you should also check to be  
sure that you are also running a 64 bit version of R. What does:


 .Machine$sizeof.pointer

show?

If it returns 4, then you are running a 32 bit version of R, which  
cannot take advantage of your 64 bit platform. You should install a  
64 bit version of R.


HTH,

Marc Schwartz

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] memory problem - failing to load rgl in R 2.7.1 patched

2008-08-04 Thread Monica Pisica

 Hi,

 Yesterday I had the surprise of not being able to load the package 'ca' on R
 2.7.0; it said it cannot find the required package 'rgl' although it was there. So
 today I've upgraded to 2.7.1 patched and I got the following error:

 local({pkg <- select.list(sort(.packages(all.available = TRUE)))
 + if(nchar(pkg)) library(pkg, character.only=TRUE)})
 Loading required package: rgl
 Error : cannot allocate vector of size 2.0 Gb
 Error: package 'rgl' could not be loaded

 I've upped the memory to the max, which is 4 Gb for my computer, and tried again
 ... same result. I know how to install packages and load them, or at
 least I thought so. Do you know what is wrong??


 sysname release
 Windows XP
 version nodename
 build 2600, Service Pack 2
 machine
 x86

 sessionInfo()
 R version 2.7.1 Patched (2008-07-31 r46185)
 i386-pc-mingw32

 locale:
 LC_COLLATE=English_United States.1252;LC_CTYPE=English_United 
 States.1252;LC_MONETARY=English_United 
 States.1252;LC_NUMERIC=C;LC_TIME=English_United States.1252

 attached base packages:
 [1] stats graphics grDevices utils datasets methods base

 other attached packages:
 [1] e1071_1.5-18 class_7.2-42

 loaded via a namespace (and not attached):
 [1] tools_2.7.1

 Thanks for any advice,

 Monica


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Memory Problem

2008-03-21 Thread Georgios Marentakis
Dear all,
I am having a memory problem when analyzing a rather large data set with
nested factors in R.
The model is of the form X ~ A*B*(C/D/F), with A, B, C, D, F being the independent
variables, some of which are nested.
The problem occurs when using aov but also when using glm or lme.
In particular I get the following response,

Error: cannot allocate vector of size 1.6 Gb
R(311,0xa000d000) malloc: *** vm_allocate(size=1733365760) failed (error
code=3)
R(311,0xa000d000) malloc: *** error: can't allocate region
R(311,0xa000d000) malloc: *** set a breakpoint in szone_error to debug
R(311,0xa000d000) malloc: *** vm_allocate(size=1733365760) failed (error
code=3)
R(311,0xa000d000) malloc: *** error: can't allocate region
R(311,0xa000d000) malloc: *** set a breakpoint in szone_error to debug

This is on an Intel Mac with 2 GBytes of RAM running MacOS X vs. 10.4.11
The very same result appears on an 8 core Intel Mac with 6 Gbytes of RAM and
on a Linux Box with 2 GBytes of RAM.
Is there a way to bypass this and let R allocate the necessary memory? Is
this a system problem? Would it be resolved in a mainframe for example?

thank you very much for your time,
georgios

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory Problem

2008-03-21 Thread Prof Brian Ripley
On Fri, 21 Mar 2008, Georgios Marentakis wrote:

 Dear all,
 I am having a memory problem when analyzing a rather large data set with
 nested factors in R.
 The model is of the form X~A*B*(C/D/F) A,B,C,D,F being the independent
 variables some of which are nested.
 The problem occurs when using aov but also when using glm or lme.
 In particular I get the following response,

 Error: cannot allocate vector of size 1.6 Gb
 R(311,0xa000d000) malloc: *** vm_allocate(size=1733365760) failed (error
 code=3)
 R(311,0xa000d000) malloc: *** error: can't allocate region
 R(311,0xa000d000) malloc: *** set a breakpoint in szone_error to debug
 R(311,0xa000d000) malloc: *** vm_allocate(size=1733365760) failed (error
 code=3)
 R(311,0xa000d000) malloc: *** error: can't allocate region
 R(311,0xa000d000) malloc: *** set a breakpoint in szone_error to debug

 This is on an Intel Mac with 2 GBytes of RAM running MacOS X vs. 10.4.11
 The very same result appears on an 8 core Intel Mac with 6 Gbytes of RAM and
 on a Linux Box with 2 GBytes of RAM.
 Is there a way to bypass this and let R allocate the necessary memory? Is
 this a system problem? Would it be resolved in a mainframe for example?

It is not R which is failing to allocate the memory, but the OS.

Most likely it is a 32-bit address space issue (1.6Gb is a large hole to 
find in a (probably) 3Gb address space), so you need a 64-bit OS 
(e.g. Mac OS 10.5) and a 64-bit version of R.


 thank you very much for your time,
 georgios

   [[alternative HTML version deleted]]

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.


-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UKFax:  +44 1865 272595

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory problem?

2008-01-31 Thread Jay Emerson
Elena,

Page 23 of the R Installation Guide provides some memory guidelines
that you might find helpful.

There are a few things you could try using R, at least to get up and running:

- Look at fewer tumors at a time using standard R as you have been.
- Look at the ff package, which leaves the data in flat files with
memory mapped pages.
- It may be that package filehash does something similar using a
database (I'm less familiar with this).
- Wait for the upcoming bigmemoRy package, which is designed
to place large objects like this in RAM (using C++) but gives you a
close-to-seamless interaction with it from R.  Caveat below.

With any of these options, you are still very much restricted by the
type of analysis you are attempting.  Almost any existing procedure
(e.g. a cox model) would need a regular R object (probably impossible)
and you are back to square one.  An exception to this is Thomas
Lumley's biglm package, which processes the data in chunks.  We need
more tools like these.  Ultimately, you'll need to find some method of
analysis that is pretty smart memory-wise, and this may not be easy.
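
A rough sketch of that chunked style, with an ordinary linear model standing
in for the Cox fit and with the file and column names made up (it assumes a
plain comma-separated file with an unquoted header); biglm() builds the fit
from the first chunk and update() folds in the rest:

library(biglm)
con  <- file("expression.csv", open = "r")
cols <- strsplit(readLines(con, n = 1), ",")[[1]]   # header line
f    <- relapse ~ gene1 + gene2 + gene3             # made-up formula
fit  <- NULL
repeat {
  block <- readLines(con, n = 10000)
  if (length(block) == 0) break
  tc    <- textConnection(block)
  chunk <- read.csv(tc, header = FALSE, col.names = cols)
  close(tc)
  fit <- if (is.null(fit)) biglm(f, data = chunk) else update(fit, chunk)
}
close(con)
summary(fit)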

Best of luck,

Jay

-
Original message:

I am trying to run a cox model for the prediction of relapse of 80 cancer
tumors, taking into account the expression of 17000 genes. The data are
large and I retrieve an error:
Cannot allocate vector of 2.4 Mb. I increase the memory.limit to 4000
(which is the largest supported by my computer) but I still retrieve the
error because of other big variables that I have in the workspace. Does
anyone know how to overcome this problem?

Many thanks in advance,
Eleni


-- 
John W. Emerson (Jay)
Assistant Professor of Statistics
Director of Graduate Studies
Department of Statistics
Yale University
http://www.stat.yale.edu/~jay
Statistical Consultant, REvolution Computing

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Memory problem?

2008-01-30 Thread Eleni Christodoulou
Hello R users,

I am trying to run a cox model for the prediction of relapse of 80 cancer
tumors, taking into account the expression of 17000 genes. The data are
large and I retrieve an error:
Cannot allocate vector of 2.4 Mb. I increase the memory.limit to 4000
(which is the largest supported by my computer) but I still retrieve the
error because of other big variables that I have in the workspace. Does
anyone know how to overcome this problem?

Many thanks in advance,
Eleni

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory problem?

2008-01-30 Thread 宋时歌
I have a similar problem, saying cannot allocate vector size of
300MB. I would also appreciate if someone can offer some suggestion
on this.

Best,
Shige

On Jan 31, 2008 2:48 PM, Eleni Christodoulou [EMAIL PROTECTED] wrote:
 Hello R users,

 I am trying to run a cox model for the prediction of relapse of 80 cancer
 tumors, taking into account the expression of 17000 genes. The data are
 large and I retrieve an error:
 Cannot allocate vector of 2.4 Mb. I increase the memory.limit to 4000
 (which is the largest supported by my computer) but I still retrieve the
 error because of other big variables that I have in the workspace. Does
 anyone know how to overcome this problem?

 Many thanks in advance,
 Eleni

 [[alternative HTML version deleted]]

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.


__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


Re: [R] Memory problem?

2008-01-30 Thread Prof Brian Ripley
On Thu, 31 Jan 2008, Eleni Christodoulou wrote:

 Hello R users,

 I am trying to run a cox model for the prediction of relapse of 80 cancer
 tumors, taking into account the expression of 17000 genes. The data are
 large and I retrieve an error:
 Cannot allocate vector of 2.4 Mb. I increase the memory.limit to 4000
 (which is the largest supported by my computer) but I still retrieve the
 error because of other big variables that I have in the workspace. Does
 anyone know how to overcome this problem?

Use a 64-bit version of R.

(The 'minimal information' asked for in the posting guide would have 
helped us give a more informative answer, but likely the problem is too 
big for a 32-bit OS.)


 Many thanks in advance,
 Eleni

   [[alternative HTML version deleted]]

 __
 R-help@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-help
 PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
 and provide commented, minimal, self-contained, reproducible code.


-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UKFax:  +44 1865 272595

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Memory problem

2008-01-24 Thread Yoni Stoffman
Hi All, 

 

There is something I don't quite understand about R memory management. 

 

I have the following function

 

function (AdGroupId) 
{
  print(memory.size())
  channel <- odbcConnect("RDsn", uid = "", case = "tolower", pwd = "xx")
  Tree1 <- sqlQuery(channel, "exec SelectAdgroups 20,0", as.is =
    c(FALSE, FALSE, FALSE, FALSE, TRUE))
  Tree2 <- sqlQuery(channel, "exec SelectAdgroups 200337,0", as.is =
    c(FALSE, FALSE, FALSE, FALSE, TRUE))
  gc()
  print(memory.size())
  odbcClose(channel)
  rm(channel); rm(Tree1); rm(Tree2)
  gc()
  print(memory.size())
  return(NULL)
}

 

I ran the function twice and I got the following results:

> prop(1000)
[1] 11.0589
[1] 15.97034
[1] 13.43737
NULL
> prop(1000)
[1] 13.43737
[1] 17.97294
[1] 17.42295
NULL

 

As you can see, the memory size increases with each call to
the function. When I call the function using apply, the memory usage keeps
growing until I get an out-of-memory error.

 

It was tested on 4 different environments:

R:  2.4.0  and 2.6.1

OS: WinXp 64/32

 

Any ideas? 

 

Thanks,

Yoni.

 

 


[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


[R] Memory problem using predict function

2007-12-17 Thread Brad Timm
I am trying to make a predicted vegetation map using the predict ( )
function and am running into an issue with memory size

Specifically I am building a random forest classification (dataframe = 
vegmap.rf) using the randomForest library and then am trying to apply
results from that to construct a predicted map (dataframe =testvegmap.pred
):

  testvegmap.pred <- predict(vegmap.rf, veg)

And when I try to run this I get a message of:  cannot allocate vector of
size 88.0Mb

I have used the series of commands below to increase the memory size to
4000Mb (the largest I seemingly can expand to):

  memory.size(max=FALSE)
  memory.limit(size=4000)

Any suggestions?  Is my only option to reduce the size of the area I am
trying to make a predicted map of?

Thanks
Brad

[[alternative HTML version deleted]]

__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
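
One memory-lighter variant of the predict() step above, sketched under the
assumption that veg is an ordinary data frame and the forest is a
classification fit (the block size is arbitrary): score the rows in blocks so
no single call has to build the full set of temporaries at once:

block  <- 100000
starts <- seq(1, nrow(veg), by = block)
pred.list <- lapply(starts, function(s) {
  rows <- s:min(s + block - 1, nrow(veg))
  as.character(predict(vegmap.rf, veg[rows, ]))
})
testvegmap.pred <- factor(unlist(pred.list))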