You need more RAM to load this file than your original session used. While 
the objects were in memory in your original session, certain lower-level 
objects (such as numeric column vectors) were shared among different 
higher-level objects (such as data frames). When the workspace was 
serialized into the .RData file, that sharing was lost, so those columns 
are now stored, and will be loaded, as separate copies.
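
For illustration, a minimal sketch of the effect (not from the original 
thread; it assumes the lobstr package is installed, since 
lobstr::obj_size() accounts for sharing):

  x   <- runif(1e6)           # ~8 MB of doubles
  df1 <- data.frame(x = x)
  df2 <- df1                  # df2's column points at the same vector as df1's

  lobstr::obj_size(df1)       # ~8 MB
  lobstr::obj_size(df1, df2)  # still ~8 MB: the column is shared, not copied

  f <- tempfile(fileext = ".RData")
  save(df1, df2, file = f)
  # In a fresh session, load(f) recreates df1$x and df2$x as separate
  # vectors, so the two data frames then need roughly twice the memory
  # they shared before saving.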

Search [1] for "shared" to learn more about measuring object memory 
requirements.

[1] http://adv-r.had.co.nz/memory.html
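
To see why sharing matters when measuring object sizes (again just a 
sketch, assuming lobstr is installed; object.size() is in base utils):

  x <- runif(1e6)
  l <- list(a = x, b = x)   # both elements reference the same vector

  utils::object.size(l)     # ~16 MB: object.size() does not detect sharing
  lobstr::obj_size(l)       # ~8 MB: shared memory is counted once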

On September 2, 2020 2:31:53 PM PDT, David Jones <david.tn.jo...@gmail.com> 
wrote:
>Thank you Uwe, John, and Bert - this is very helpful context.
>
>If it helps inform the discussion, to address John and Bert's
>questions - I actually had less memory free when I originally ran the
>analyses and saved the workspace than when I read the data back in
>later on (I rebooted in an attempt to free all possible memory before
>reading the workspace back in).
>
>
>
>On Wed, Sep 2, 2020 at 1:27 PM John via R-help <r-help using
>r-project.org> wrote:
>
>>> On Wed, 2 Sep 2020 13:36:43 +0200
>>> Uwe Ligges <ligges using statistik.tu-dortmund.de> wrote:
>>>
>>> > On 02.09.2020 04:44, David Jones wrote:
>>> > > I ran a number of analyses in R and saved the workspace, which
>>> > > resulted in a 2GB .RData file. When I try to read the file back
>>> > > into R
>>> >
>>> > Compressed in RData but uncompressed in main memory....
>>> >
>>> >
>>> > > later, it won't read into R and provides the error: "Error:
>>> > > cannot allocate vector of size 37 Kb"
>>> > >
>>> > > This error comes after 1 minute of trying to read things in - I
>>> > > presume a single vector sends it over the memory limit. But,
>>> > > memory.limit() shows that I have access to a full 16 GB of RAM
>>> > > on my machine (12 GB are free when I try to load the RData file).
>>> >
>>> > But the data may need more....
>>> >
>>> >
>>> > > gc() shows the following after I receive this error:
>>> > >
>>> > >            used (Mb) gc trigger   (Mb)   max used    (Mb)
>>> > > Ncells   623130 33.3    4134347  220.8    5715387   305.3
>>> > > Vcells  1535682 11.8  883084810 6737.5 2100594002 16026.3
>>> >
>>> > So 16GB were used when R gave up.
>>> >
>>> > Best,
>>> > Uwe Ligges
>>>
>>> For my own part, looking at the OP's question, it does seem curious
>>> that R could write that .RData file, but on the same system not be
>>> able to reload something it created.  How would that work?  Wouldn't
>>> the memory limit have been exceeded BEFORE the .RData file was
>>> written the FIRST time?
>>>
>>> JDougherty
>
>
>>R experts may give you a detailed explanation, but it is certainly
>>possible that the memory available to R when it wrote the file was
>>different than when it tried to read it, is it not?
>
>>Bert Gunter
>
>>"The trouble with having an open mind is that people keep coming along
>>and sticking things into it."
>>-- Opus (aka Berkeley Breathed in his "Bloom County" comic strip )
>

-- 
Sent from my phone. Please excuse my brevity.

