Re: [R] memory problem

2017-05-03 Thread Anthoni, Peter (IMK)
Hi Amit, Is the file gzipped or already extracted? If you are reading the plain-text file, try gzipping it and running read.table on the gzipped file; read.table can handle gzipped files, at least on Linux and macOS, not sure about Windows. Cheers, Peter > On 2. May 2017, at 18:59, Amit Sengupta via
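Peter's suggestion can be sketched in a few lines of base R (the file name here is hypothetical): write a small table through a gzfile() connection, then read it back; read.table() decompresses gzip input transparently.

```r
# Round-trip a small table through a gzip-compressed file.
# read.table() recognizes gzip-compressed input and decompresses
# it transparently, so no manual gunzip step is needed.
df <- data.frame(x = 1:3, y = c("a", "b", "c"))
write.table(df, gzfile("sample.txt.gz"), row.names = FALSE)
back <- read.table("sample.txt.gz", header = TRUE)
stopifnot(identical(back$x, df$x))
```

The compressed file is also smaller on disk, though note that decompression happens while reading, so this saves disk space and I/O rather than peak memory.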

[R] memory problem

2017-05-02 Thread Amit Sengupta via R-help
Hi, I was unable to read a 2.4 GB file into an R object using read.table in a 64-bit R environment. Please let me have your suggestions. Amit Sengupta [[alternative HTML version deleted]] __ R-help@r-project.org mailing list -- To UNSUBSCRIBE and

Re: [R] Memory problem

2016-11-22 Thread Henrik Bengtsson
On 32-bit Windows I think (it's been a while) you can push it to 3 GB, but to go beyond that you need to run R on 64-bit Windows (the same rule applies to all software, not just R). I'm pretty sure this is already documented in the R documentation. Henrik On Nov 22, 2016 19:49, "Ista Zahn"

Re: [R] Memory problem

2016-11-22 Thread Jeff Newmiller
Ah, you also need to use a 64-bit operating system. Depending on the age of your hardware this may also mean you need a new computer. There are ways to process data on disk for certain algorithms, but you will be glad to leave them behind once the opportunity arises, so you might as well do

Re: [R] Memory problem

2016-11-22 Thread Ista Zahn
Not conveniently. Memory is cheap, you should buy more. Best, Ista On Nov 22, 2016 12:19 PM, "Partha Sinha" wrote: > I am using R 3.3.2 on win 7, 32 bit with 2gb Ram. Is it possible to use > more than 2 Gb data set ? > > Regards > Partha > > [[alternative HTML

Re: [R] Memory problem

2016-11-22 Thread Marcus Nunes
Yes. If you cannot read the dataset with the usual means, using functions like read.table or read.csv, try the ff package: https://cran.r-project.org/web/packages/ff/index.html. Best, On Tue, Nov 22, 2016 at 2:16 PM, Partha Sinha wrote: > I am using R 3.3.2 on win 7, 32
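A hedged sketch of the ff route (assuming the ff package has been installed; the file name and columns are made up for illustration): read.table.ffdf pulls the file in chunk by chunk and keeps the result on disk rather than in RAM.

```r
# Sketch, assuming install.packages("ff") has been run.
# read.table.ffdf reads the file chunk by chunk into an on-disk
# ffdf object, so the whole table never needs to fit in memory.
library(ff)
write.csv(data.frame(a = 1:1000, b = rnorm(1000)), "big.csv",
          row.names = FALSE)
big <- read.table.ffdf(file = "big.csv", header = TRUE, sep = ",",
                       first.rows = 100, next.rows = 250)
nrow(big)   # row count is available without loading the data
```

The `first.rows`/`next.rows` arguments control the chunk sizes used while importing, which is the main knob for bounding peak memory.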

Re: [R] Memory problem

2016-11-22 Thread Bert Gunter
Depends how you use it, e.g. it can be stored on disk and worked with in pieces. Or some packages work with virtual memory, I believe. However, it is certainly not possible to read it all into R. In fact, you probably won't be able to handle more (and maybe much less) than about 500 MB in R. Cheers,
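The "worked with in pieces" approach can be sketched in base R alone (the file name and column are illustrative): keep a connection open and read a fixed block of rows at a time, accumulating only a small summary.

```r
# Process a large text file in fixed-size row blocks through an
# open connection; only one block is ever held in memory.
write.csv(data.frame(v = 1:100), "big.csv", row.names = FALSE)
con <- file("big.csv", open = "r")
invisible(readLines(con, n = 1))          # consume the header row
total <- 0
repeat {
  chunk <- read.csv(con, header = FALSE, nrows = 30, col.names = "v")
  total <- total + sum(chunk$v)
  if (nrow(chunk) < 30) break             # short block = end of file
}
close(con)
total                                      # 5050, same as sum(1:100)
```

Because the connection stays open between calls, each read.csv picks up where the previous block ended; the peak memory cost is one block, not the whole file.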

[R] Memory problem

2016-11-22 Thread Partha Sinha
I am using R 3.3.2 on Win 7, 32-bit with 2 GB RAM. Is it possible to use a data set of more than 2 GB? Regards Partha [[alternative HTML version deleted]]

Re: [R] Memory problem

2016-04-07 Thread Amelia Marsh
Dear Sir, Yes, I am using plyr, and in the end I am writing the output to a data.frame. Earlier I had a problem with processing time, so I made some changes in the code; now I am fetching all the required inputs needed for valuation purposes using ddply, store the results in a

Re: [R] Memory problem

2016-04-06 Thread Amelia Marsh
Dear Sir, Thanks for the guidance. Will check. And yes, at the end of each simulation, a large result is getting stored.  Regards Amelia On Wednesday, 6 April 2016 5:48 PM, jim holtman wrote: It is hard to tell from the information that you have provided.  Do you

Re: [R] Memory problem

2016-04-06 Thread Jeff Newmiller
As Jim has indicated, memory usage problems can require very specific diagnostics and code changes, so generic help is tough to give. However, in most cases I have found the dplyr package to be more memory efficient than plyr, so you could consider that. Also, you can be explicit about only
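As a hedged illustration of the dplyr suggestion (assuming the dplyr package is installed), here is the same split-apply-combine that plyr::ddply would perform, written with grouped summarise:

```r
# Sketch, assuming install.packages("dplyr") has been run.
# group_by() + summarise() performs the split-apply-combine that
# plyr::ddply does, typically with less intermediate copying.
library(dplyr)
res <- mtcars %>%
  group_by(cyl) %>%
  summarise(mean_mpg = mean(mpg))
res   # one row per cylinder count (4, 6, 8)
```

dplyr computes grouped summaries without materializing a full copy of each group's sub-data-frame the way ddply's split step does, which is where the memory savings usually come from.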

Re: [R] Memory problem

2016-04-06 Thread jim holtman
You say it is "getting stored"; is this in memory or on disk? How are you processing the results of the 1,000 simulations? So some more insight into the actual process would be useful. For example, how are the simulations being done, are the results stored in memory, or out to a file, what are

Re: [R] Memory problem

2016-04-06 Thread jim holtman
It is hard to tell from the information that you have provided. Do you have a list of the sizes of all the objects that you have in memory? Are you releasing large objects at the end of each simulation run? Are you using 'gc' to garbage collect any memory after deallocating objects? Collect
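Jim's diagnostic questions can be answered with a few lines of base R: list every object's size, release the large ones at the end of each run, and garbage-collect. A minimal sketch:

```r
# Report workspace objects by size, largest first, then free one.
big <- numeric(1e6)                        # ~8 MB of doubles
sizes <- sort(sapply(ls(), function(nm) object.size(get(nm))),
              decreasing = TRUE)
print(sizes)                               # 'big' should top the list
rm(big)                                    # drop the only reference ...
invisible(gc())                            # ... and reclaim the memory
```

Calling gc() after rm() is what actually returns the freed pages; running this at the end of each simulation iteration keeps peak usage from ratcheting upward.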

[R] Memory problem

2016-04-06 Thread Amelia Marsh via R-help
Dear R Forum, I have about 2000+ FX forward transactions and I am trying to run 1000 simulations. If I use a smaller number of simulations, I am able to get the desired results. However, when I try to use more than 1000 simulations, I get the following error. > sorted2 <- ddply(sorted,

Re: [R] Memory problem when changing a function

2015-11-27 Thread Marwah Sabry Siam
I didn't write them out because I thought it would be too long. I am using the HPbayes package. I changed the mp8.mle function. Two functions depend on this one, loop.optim and prior.likewts, so I changed and renamed them as well. The memory problem arises when applying the new loop.optim function, named loop.optim_m.

[R] Memory problem when changing a function

2015-11-26 Thread Marwah Sabry Siam
I changed a function in a package and I want to run this new function. It always gives the error "Error in memory: couldn't allocate a vector of 15.3 Gb", although the built-in function doesn't give this error. My system is Windows 10, 8 GB RAM, AMD quad-core processor. I've read about memory

Re: [R] Memory problem when changing a function

2015-11-26 Thread John Kane
-project.org > Subject: [R] Memory problem when changing a function > > I changed a function in a package and I want to run this new function. > It always gives the error "Error in memory: couldn't allocate a > vector of 15.3 Gb", although the built-in function does

[R] memory problem of betadiver of vegan

2013-07-12 Thread Elaine Kuo
Hello List, This is Elaine. I am running betadiver on a dataset of 4873 rows and 2749 columns. (4873 rows = 4873 grid cells of the study region, and 2749 columns for the bird species.) The dataset was produced by combining 5 dbf files. When running the code o, an error message popped up, saying Error:

Re: [R] memory problem of betadiver of vegan

2013-07-12 Thread Elaine Kuo
Hello List, I solved the problem by using the code from the answer with 31 votes at http://stackoverflow.com/questions/1358003/tricks-to-manage-the-available-memory-in-an-r-session On Sat, Jul 13, 2013 at 6:15 AM, Elaine Kuo elaine.kuo...@gmail.com wrote: Hello List, This is Elaine. I am running betadiver

[R] Memory problem in R

2012-03-01 Thread saqlain raza
Hi all, I am running the MNP multinomial probit model package in R. It gives me the following error instead of giving me the results: "Erreur : impossible d'allouer un vecteur de taille 137.9 Mo" (in English: cannot allocate a vector of size 137.9 MB). I have already increased the memory

Re: [R] memory problem; Error: cannot allocate vector of size 915.5 Mb

2011-08-01 Thread Dimitris.Kapetanakis
Thanks a lot for the help. Actually, I am using a Mac (R for Mac OS X GUI 1.40-devel Leopard build 32-bit (5751)), but I think I can get access to Windows 7 64-bit. What I am trying to do is a maximization through grid search (because I am not sure that any of the optim() methods works

Re: [R] memory problem; Error: cannot allocate vector of size 915.5 Mb

2011-08-01 Thread David Winsemius
On Aug 1, 2011, at 3:04 AM, Dimitris.Kapetanakis wrote: Thanks a lot for the help. Actually, I am using a Mac (R for Mac OS X GUI 1.40-devel Leopard build 32-bit (5751)) but I think I can get access to Windows 7 64-bit. I don't think that was what Holtman was advising. You just

[R] memory problem; Error: cannot allocate vector of size 915.5 Mb

2011-07-31 Thread Dimitris.Kapetanakis
Dear all, I am trying to perform some matrix operations (whose size I think is smaller than what R allows), but the operations are not feasible when they run in one session, yet each is feasible if run separately, even though each operation is totally independent of the others. I run the code in one

Re: [R] memory problem; Error: cannot allocate vector of size 915.5 Mb

2011-07-31 Thread jim holtman
My advice to you is to get a 64-bit version of R. Here is what it does on my 64-bit Windows 7 version: N <- 250 x <- matrix(c(rnorm(N,-1.5,1), rnorm(N,1,1), rbinom(N,1,0.5)), ncol=3) my.stats(1) 1 (1) - Rgui : 22:30:20 0.7 78.6 78.6 : 20.5MB start <- (-1) end <- 3 step <- 10^(-2)

Re: [R] memory problem

2010-08-19 Thread Duncan Murdoch
avsha38 wrote: Hi, when I run the following code I get this message: "The instruction at 0x referenced memory at 0x###; the memory could not be read", and then I have to close R. What is the problem and how can I solve it? The problem is a bug in the underlying C (or other)

[R] memory problem

2010-08-18 Thread avsha38
Hi, when I run the following code I get this message: "The instruction at 0x referenced memory at 0x###; the memory could not be read", and then I have to close R. What is the problem and how can I solve it? Thanks in advance, Avi my code # frailtypack library(frailtypack) cgd.ag <-

[R] memory problem for scatterplot using ggplot

2010-07-28 Thread Edwin Husni Sutanudjaja
Dear all, I have a memory problem in making a scatter plot of my 17.5 million-pair dataset. My intention is to use the ggplot package with bin2d. Please find the attached script for more details. Could somebody please give me any clues or tips to solve my problem? please ... Just for

Re: [R] memory problem for scatterplot using ggplot

2010-07-28 Thread Brandon Hurr
It was my understanding that R wasn't really the best thing for absolutely huge datasets. 17.5 million points would probably fall under the category of absolutely huge. I'm on a little netbook right now (atom/R32) and it failed, but I'll try it on my macbookPro/R64 later and see if it's able to

Re: [R] memory problem for scatterplot using ggplot

2010-07-28 Thread Mark Connolly
On 07/28/2010 06:13 AM, Edwin Husni Sutanudjaja wrote: Dear all, I have a memory problem in making a scatter plot of my 17.5 million-pair datasets. My intention to use the ggplot package and use the bin2d. Please find the attached script for more details. Could somebody please give me any

Re: [R] memory problem for scatterplot using ggplot

2010-07-28 Thread David Winsemius
On Jul 28, 2010, at 9:53 AM, Brandon Hurr wrote: It was my understanding that R wasn't really the best thing for absolutely huge datasets. 17.5 million points would probably fall under the category of absolutely huge. I'm on a little netbook right now (atom/R32) and it failed, but I'll

[R] Memory problem in multinomial logistic regression

2010-07-05 Thread Daniel Wiesmann
Dear All I am trying to fit a multinomial logistic regression to a data set with a size of 94279 by 14 entries. The data frame has one sample column which is the categorical variable, and the number of different categories is 9. The size of the data set (as a csv file) is less than 10 MB. I

Re: [R] Memory problem in multinomial logistic regression

2010-07-05 Thread Charles C. Berry
On Mon, 5 Jul 2010, Daniel Wiesmann wrote: Dear All I am trying to fit a multinomial logistic regression to a data set with a size of 94279 by 14 entries. The data frame has one sample column which is the categorical variable, and the number of different categories is 9. The size of the

Re: [R] Memory Problem

2010-02-08 Thread Meenakshi
Hi, Can I use macro variables in R? If we can use macro variables in R, where can I find such programs or macros in R books? -- View this message in context: http://n4.nabble.com/Memory-Problem-tp1459740p1472700.html Sent from the R help mailing list archive at Nabble.com.

Re: [R] Memory Problem

2010-02-08 Thread jim holtman
What exactly is your definition of a macro? What do you want to do? What is the problem that you are trying to solve? Why do you think macros will help? Typically R does not have macros; I assume that idea is a holdover from SAS. On Mon, Feb 8, 2010 at 4:30 AM, Meenakshi

Re: [R] Memory Problem

2010-02-08 Thread S Ellison
jim holtman jholt...@gmail.com 08/02/2010 14:09:52 Typically R does not have macros; I know exactly why Jim Holtman said that; R doesn't have a separate 'macro' construct with separate 'macro variables'. But it is perhaps a bit misleading to say that R doesn't have macros without saying a bit
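S Ellison's point, that R gets macro-like behaviour from ordinary language objects rather than a separate macro construct, can be sketched with bquote() and eval(); the helper function name below is made up for illustration:

```r
# Build an unevaluated call with a name spliced in (macro-style),
# then evaluate it. bquote() substitutes .(var) before evaluation.
make_summary_call <- function(var) {
  bquote(summary(mtcars[[.(var)]]))
}
expr <- make_summary_call("mpg")
print(expr)                      # summary(mtcars[["mpg"]])
result <- eval(expr)
stopifnot(abs(result[["Mean"]] - mean(mtcars$mpg)) < 0.01)
```

Unlike a SAS macro, the "macro variable" here is just an R value spliced into an expression object, so the usual scoping and debugging tools still apply.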

Re: [R] Memory Problem

2010-02-06 Thread Meenakshi
Hi, I am using R 2.10.1. Before running any statement/function the gc report is: used (Mb) gc trigger (Mb) max used (Mb) Ncells 124352 3.4 35 9.4 35 9.4 Vcells 81237 0.7 786432 6.0 310883 2.4 After I run the repeat statement, I got the following error

Re: [R] Memory Problem

2010-02-06 Thread Meenakshi
Hi, after getting the error message, my main file size is 1.05 MB. Other objects are within 400 bytes only. Thanks. -- View this message in context: http://n4.nabble.com/Memory-Problem-tp1471153p1471153.html Sent from the R help mailing list archive at Nabble.com.

Re: [R] Memory Problem

2010-02-06 Thread jim holtman
Here is a function I use to get the size of the objects in my workspace. Let us know the output of this command: my.object.size <- function (pos = 1, sorted = F) { .result <- sapply(ls(pos = pos, all.names = TRUE), function(..x) object.size(eval(as.symbol(..x)))) if (sorted) {

Re: [R] Memory Problem

2010-02-06 Thread Meenakshi
This is my objects' size:

                      Size  Mode
asa_Condition          912  list
asa_GatedCommunity   9,912  list
asa_Neighbourhood    2,872  list
asa_Security           832  list
asa_Storeys            800  list

Re: [R] Memory Problem

2010-02-06 Thread jim holtman
Have you tried gc() to see if any memory is released? How big was the file that you read in? I don't see any large objects that appear in your workspace. Is there some type of processing that you did after reading in the data? You might want to intersperse the following command in your script

Re: [R] Memory Problem

2010-02-04 Thread Meenakshi
Hi, I have to run the repeat loop more than 50 times continuously, but it runs only 20 to 30 times. After that the memory problem appears. My dataset is only 6321 KB. How can I solve this problem? Meenakshi -- View this message in context:

Re: [R] Memory Problem

2010-02-04 Thread jim holtman
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html and provide commented, minimal, self-contained, reproducible code. How about providing information on your operating system and version of R. Also provide a list of all the objects in your workspace and the size of

Re: [R] Memory Problem

2010-02-03 Thread Jim Lemon
On 02/02/2010 09:33 PM, Meenakshi wrote: Hi, When I run the repeat loop in R for large dataset, I got Memory problem. How can I solve these problem. 1) Wait 2^m years, where m is the power of 2 that approximates the multiple of your current amount of RAM that would accommodate your problem

[R] Memory Problem

2010-02-02 Thread Meenakshi
Hi, When I run a repeat loop in R on a large dataset, I get a memory problem. How can I solve this problem? -- View this message in context: http://n4.nabble.com/Memory-Problem-tp1459740p1459740.html Sent from the R help mailing list archive at Nabble.com.

Re: [R] Memory Problem

2010-02-02 Thread Uwe Ligges
On 02.02.2010 11:33, Meenakshi wrote: Hi, When I run the repeat loop in R for large dataset, I got Memory problem. How can I solve these problem. buy more memory, bigger machine, more efficient programming, import of only relevant data, use of specific tools, .. or in other words:

Re: [R] R Memory Problem

2010-01-27 Thread David Winsemius
You were asked to provide details, but so far have not. -- David. On Jan 27, 2010, at 2:17 AM, prem_R wrote: Yes, I think this is an explanation of the problem faced. Could you please help me to solve this? -- View this message in context: http://n4.nabble.com/R-Memory-Problem

Re: [R] R Memory Problem

2010-01-26 Thread prem_R
Yes, I think this is an explanation of the problem faced. Could you please help me to solve this? -- View this message in context: http://n4.nabble.com/R-Memory-Problem-tp1289221p1311291.html Sent from the R help mailing list archive at Nabble.com

Re: [R] R Memory Problem

2010-01-25 Thread jim holtman
R 2.10.0 If you need any other clarifications, I shall provide them. Help me to solve this -- View this message in context: http://n4.nabble.com/R-Memory-Problem-tp1289221p1289221.html Sent from the R help mailing list archive at Nabble.com

Re: [R] R Memory Problem

2010-01-25 Thread Johann Hibschman
prem_R mtechp...@gmail.com writes: I'm running predictive analytics using R, and to calibrate my model I adjust the variables used in the model, and the problem happens here: R just runs out of memory. I tried garbage collection also. I'm analyzing an 8 GB data set using R, so it can

Re: [R] memory problem on Suse

2009-12-16 Thread Ambrosi Alessandro
Alessandro Cc: r-help@r-project.org Subject: Re: [R] memory problem on Suse On Dec 11, 2009, at 6:24 AM, Ambrosi Alessandro wrote: Dear all, I am meeting some problems with memory allocation. I know it is an old issue, I'm sorry. I looked for a solution in the FAQs and manuals, mails, but without

[R] memory problem on Suse

2009-12-11 Thread Ambrosi Alessandro
Dear all, I am meeting some problems with memory allocation. I know it is an old issue, I'm sorry. I looked for a solution in the FAQs, manuals, and mails, but without finding a working answer. I really hope you can help me. For instance, if I try to read microarray data I get:

Re: [R] memory problem on Suse

2009-12-11 Thread Marc Schwartz
On Dec 11, 2009, at 6:24 AM, Ambrosi Alessandro wrote: Dear all, I am meeting some problems with memory allocation. I know it is an old issue, I'm sorry. I looked for a solution in the FAQs and manuals, mails, but without finding the working answer. I really hope you can help me. For

Re: [R] memory problem on Suse

2009-12-11 Thread Martin Morgan
Ask on the Bioconductor mailing list, where you will be directed to several solutions for analyzing what I guess are 100's of CEL files http://bioconductor.org -- Martin Morgan On Dec 11, 2009, at 8:02 AM, Marc Schwartz marc_schwa...@me.com wrote: On Dec 11, 2009, at 6:24 AM, Ambrosi

[R] memory problem - failing to load rgl in R 2.7.1 patched

2008-08-04 Thread Monica Pisica
Hi, yesterday I had the surprise of not being able to load the package ca on R 2.7.0, which said it cannot find the required package rgl although it was there. So today I upgraded to 2.7.1 patched and got the following error: local({pkg <- select.list(sort(.packages(all.available = TRUE))) +

[R] Memory Problem

2008-03-21 Thread Georgios Marentakis
Dear all, I am having a memory problem when analyzing a rather large data set with nested factors in R. The model is of the form X~A*B*(C/D/F) A,B,C,D,F being the independent variables some of which are nested. The problem occurs when using aov but also when using glm or lme. In particular I get

Re: [R] Memory Problem

2008-03-21 Thread Prof Brian Ripley
On Fri, 21 Mar 2008, Georgios Marentakis wrote: Dear all, I am having a memory problem when analyzing a rather large data set with nested factors in R. The model is of the form X~A*B*(C/D/F) A,B,C,D,F being the independent variables some of which are nested. The problem occurs when using

Re: [R] Memory problem?

2008-01-31 Thread Jay Emerson
Elena, Page 23 of the R Installation Guide provides some memory guidelines that you might find helpful. There are a few things you could try using R, at least to get up and running: - Look at fewer tumors at a time using standard R as you have been. - Look at the ff package, which leaves the

[R] Memory problem?

2008-01-30 Thread Eleni Christodoulou
Hello R users, I am trying to run a Cox model for the prediction of relapse of 80 cancer tumors, taking into account the expression of 17000 genes. The data are large and I get an error: Cannot allocate vector of 2.4 Mb. I increased the memory.limit to 4000 (which is the largest supported by

Re: [R] Memory problem?

2008-01-30 Thread 宋时歌
I have a similar problem, saying cannot allocate a vector of size 300 MB. I would also appreciate it if someone could offer some suggestions on this. Best, Shige On Jan 31, 2008 2:48 PM, Eleni Christodoulou [EMAIL PROTECTED] wrote: Hello R users, I am trying to run a cox model for the prediction of

Re: [R] Memory problem?

2008-01-30 Thread Prof Brian Ripley
On Thu, 31 Jan 2008, Eleni Christodoulou wrote: Hello R users, I am trying to run a cox model for the prediction of relapse of 80 cancer tumors, taking into account the expression of 17000 genes. The data are large and I retrieve an error: Cannot allocate vector of 2.4 Mb. I increase the

[R] Memory problem

2008-01-24 Thread Yoni Stoffman
Hi All, There is something I don't quite understand about R memory management. I have the following function: function (AdGroupId) { print(memory.size()) channel <- odbcConnect(RDsn, uid = , case = tolower, pwd = xx) Tree1 <- sqlQuery(channel, exec SelectAdgroups

[R] Memory problem using predict function

2007-12-17 Thread Brad Timm
I am trying to make a predicted vegetation map using the predict() function and am running into an issue with memory size. Specifically, I am building a random forest classification (dataframe = vegmap.rf) using the randomForest library and then am trying to apply the results from that to construct
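One common workaround for this kind of predict() memory blow-up is to score the new data in row blocks, so only one block of covariates is in memory at a time. A sketch, assuming the randomForest package is installed; iris and the helper name stand in for the poster's vegmap.rf model and raster covariates:

```r
# Sketch, assuming install.packages("randomForest") has been run.
library(randomForest)
set.seed(1)
fit <- randomForest(Species ~ ., data = iris)   # stand-in for vegmap.rf

# Score newdata in fixed-size row blocks to bound peak memory use.
predict_in_blocks <- function(model, newdata, block = 50) {
  idx <- split(seq_len(nrow(newdata)),
               (seq_len(nrow(newdata)) - 1) %/% block)
  unlist(lapply(idx, function(i)
    as.character(predict(model, newdata[i, , drop = FALSE]))))
}

p <- predict_in_blocks(fit, iris[, 1:4])
stopifnot(length(p) == nrow(iris))
```

For a real raster map, each block of pixel covariates would be read from disk, predicted, and written back out before the next block is loaded, so the full prediction grid never sits in RAM.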