Hi,
You can try dbLoad() from the filehash package. Not sure whether it will be successful.
A.K.
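A minimal sketch of the filehash approach suggested above (assuming the filehash package is installed from CRAN; the database file name and the object `big` are hypothetical):

```r
# Sketch only: keep a large object on disk instead of in RAM, then
# attach it to the workspace so it is fetched lazily when first used.
library(filehash)                     # assumed installed from CRAN

dbfile <- tempfile(fileext = ".db")   # hypothetical database file
dbCreate(dbfile)
db <- dbInit(dbfile)

# Store a (potentially large) object on disk instead of in memory
dbInsert(db, "big", data.frame(x = 1:10, y = rnorm(10)))

dbLoad(db)     # bind the stored objects into the workspace by name
head(big)      # 'big' is read from disk on first use
```

The point of dbLoad() is that the bindings are lazy: nothing is pulled into RAM until an object is actually touched.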
- Original Message -
From: Lorcan Treanor
To: r-help@r-project.org
Cc:
Sent: Monday, July 23, 2012 8:02 AM
Subject: [R] Large data set
First of all, try to determine the smallest file you can read with an
empty workspace. Once you have done that, then break up your file
into sets of that size and read them in. The next question is what you
want to do with 112M rows of data. Can you process them a set at a
time and then aggregate the results?
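A minimal base-R sketch of that chunk-and-aggregate idea (the chunk size and the running-sum aggregation are illustrative; a small demo CSV is written first so the sketch is self-contained):

```r
# Write a small demo CSV so the sketch runs as-is
tmp <- tempfile(fileext = ".csv")
write.csv(data.frame(x = 1:10, y = 11:20), tmp, row.names = FALSE)

chunk_size <- 3                       # hypothetical chunk size
con <- file(tmp, open = "r")          # open a connection once, read from it repeatedly
first <- TRUE
total <- 0
repeat {
  chunk <- tryCatch(
    read.csv(con, header = first, nrows = chunk_size, as.is = TRUE),
    error = function(e) NULL)         # read.csv errors at end of file
  first <- FALSE                      # only the first chunk carries the header
  if (is.null(chunk) || nrow(chunk) == 0) break
  total <- total + sum(chunk[[1]])    # aggregate each chunk: running sum of column 1
}
close(con)
total   # 55
```

Because only one chunk is in memory at a time, the same loop works whether the file has ten rows or 112 million.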
Hi all,
Have a problem. Trying to read in a data set that has about 112,000,000
rows and 8 columns and obviously enough it was too big for R to handle. The
columns are made up of 2 integer columns and 6 logical columns. The text
file is about 4.2 Gb in size. Also I have 4 Gb of RAM and 218 Gb of
a
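For sizing purposes, a quick back-of-the-envelope check (integer and logical vectors each take 4 bytes per element in R) shows why this data set cannot fit in 4 Gb of RAM; the file name in the comment is hypothetical:

```r
# Rough in-memory size of the full data set (sketch; ignores object overhead).
rows <- 112e6
cols <- 8                    # 2 integer + 6 logical columns
bytes_per_cell <- 4          # both integer and logical use 4 bytes per element
gb <- rows * cols * bytes_per_cell / 1024^3
round(gb, 1)                 # about 3.3 Gb for the data alone, before any copies

# If a subset does fit, declaring column types up front makes read.table
# much faster and leaner, e.g. (hypothetical file name):
# read.table("bigfile.txt",
#            colClasses = c(rep("integer", 2), rep("logical", 6)),
#            nrows = 1e6)
```

Since R routinely makes temporary copies during manipulation, the practical requirement is a multiple of that 3.3 Gb, which is why chunked processing or an on-disk backend is needed here.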
Works perfectly well with R-2.14.1 32-bit on a Windows device. Since you
have not followed the posting guide and forgot to give details about
your platform, there is not much we can do.
Uwe Ligges
On 22.12.2011 23:08, Karen Liu wrote:
When I use the image() function for a relatively small matrix it works
perfectly, e.g.
> x <- 1:100
> z <- matrix(rnorm(10^4),10^2,10^2)
> image(x=x,y=x,z=z,col=rainbow(3))
but when I want to plot a larger matrix, it doesn't really work. Most of
the time it just plots a few intermittent points.
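One common remedy for large matrices (not stated in this thread, so treat it as a suggestion) is to let image() draw the matrix as a single bitmap via useRaster = TRUE, instead of thousands of tiny rectangles; a sketch, plotting to a file so it runs anywhere:

```r
# Sketch: image() on a larger matrix, rasterized for speed and fidelity.
n <- 1000
x <- 1:n
z <- matrix(rnorm(n^2), n, n)

f <- tempfile(fileext = ".png")
png(f)                                   # plot to a file so the sketch is self-contained
image(x = x, y = x, z = z, col = rainbow(3), useRaster = TRUE)
dev.off()
file.exists(f)                           # TRUE: the plot was written
```

useRaster requires a regularly spaced grid (as here) and a device that supports raster images, which includes png().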
Thanks Kjetil. This is exactly what I wanted.
Hardi
From: Kjetil Halvorsen
Cc: r-help
Sent: Monday, March 2, 2009 9:45:43 PM
Subject: Re: [R] Large data set in R
install.packages("biglm", dep=TRUE)
library(help=biglm)
kjetil
On Mon, Mar 2, 2009 at 7:06 AM, Hardi wrote:
>
> Hello,
>
> I'm trying to use R statistical packages to do ANOVA analysis using aov()
> and lm().
> I'm having a problem when I have a large data set for input data from Full
> Factorial Design Experiment with replications.
Hello,
I'm trying to use R statistical packages to do ANOVA analysis using aov() and
lm().
I'm having a problem when I have a large data set for input data from Full
Factorial Design Experiment with replications.
R seems to store everything in the memory and it fails when memory is not
enough.
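A minimal sketch of the chunked-regression approach that biglm enables (assuming the biglm package is installed; the data here are simulated, and `make_chunk` is a hypothetical helper standing in for reading one block of the real data):

```r
# Sketch: fit a linear model incrementally so the full data set
# never has to be in RAM at once.
library(biglm)   # assumed installed via install.packages("biglm")

set.seed(1)
make_chunk <- function(n) {            # hypothetical stand-in for one data chunk
  a <- factor(sample(c("lo", "hi"), n, replace = TRUE), levels = c("lo", "hi"))
  data.frame(a = a, x = rnorm(n), y = rnorm(n))
}

fit <- biglm(y ~ a + x, data = make_chunk(1000))   # fit on the first chunk
fit <- update(fit, make_chunk(1000))               # stream in further chunks
coef(fit)                                          # intercept, ahi, x
```

biglm keeps only the model's sufficient statistics between updates, so memory use is bounded by the chunk size, not the total number of rows. It fits lm()-style formulas; a plain aov() decomposition would need to be recovered from the fitted model afterwards.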
On Mon, 25 Aug 2008, Roland Rau wrote:
Hi,
Jason Thibodeau wrote:
I am attempting to perform some simple data manipulation on a large data
set. I have a snippet of the whole data set, and my small snippet is 2GB
in
CSV.
Is there a way I can read my csv, select a few columns, and write it to an
output file in real time?
Establish a "connection" with the file you want to read, read in 1,000
rows (or whatever you want). If you are using read.csv and there is a
header, you might want to skip it initially since there will be no
header when you read the next 1,000 rows. Also put as.is=TRUE so
that character fields are not converted to factors.
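The advice above can be sketched as a read-select-write loop (the kept columns are illustrative, and a small demo input file is written first so the sketch is self-contained):

```r
# Sketch: stream a CSV through R, keeping only a few columns.
infile  <- tempfile(fileext = ".csv")          # demo input so the sketch runs
outfile <- tempfile(fileext = ".csv")
write.csv(data.frame(a = 1:10, b = letters[1:10], c = runif(10)),
          infile, row.names = FALSE)

con <- file(infile, open = "r")
# Read the header line once; later chunks have no header
hdr  <- scan(con, what = "", sep = ",", nlines = 1, quiet = TRUE)
keep <- c("a", "b")                            # columns to keep (hypothetical)
first <- TRUE
repeat {
  chunk <- tryCatch(
    read.csv(con, header = FALSE, nrows = 1000, as.is = TRUE, col.names = hdr),
    error = function(e) NULL)                  # read.csv errors at end of file
  if (is.null(chunk) || nrow(chunk) == 0) break
  write.table(chunk[keep], outfile, sep = ",", row.names = FALSE,
              col.names = first, append = !first)   # header only on first write
  first <- FALSE
}
close(con)
nrow(read.csv(outfile))   # 10 rows, columns a and b only
```

Only one 1,000-row chunk is in memory at any moment, so the same loop handles the full 2 Gb file; the output is written incrementally ("in real time") as each chunk is processed.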