Thank you, Jim, for your reply.
I could figure out that readLines works fine up to 35,841,335 lines
(records).
When the next line is to be read, a window appears with the message "R for
Windows GUI front-end has stopped working", with the options to close the
program or check online for a solution.
Hi,
On Fri, Feb 3, 2012 at 1:12 PM, HC wrote:
> Thank you.
>
> The readLines command is working fine and I am able to read 10^6 lines in
> one go and write them using the write.table command.
>
> Does this readLines command use a block concept to optimize, or does it go
> line by line?
>
> Steve has mentioned *nix and split commands.
Exactly what does "crashed" mean? What was the error message? Have
you tried to put:
rm(Lines)
gc()
at the end of the loop to free up and compact memory? If you watch
the performance, does the R process seem to be growing in terms of the
amount of memory that is being used? You can add:
memor
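Jim's suggestion can be sketched as a small loop; the file name and chunk size below are illustrative, and printing the result of gc() is one way to watch memory use from inside R:

```r
con <- file("Test.txt", "r")             # illustrative file name
N <- 1e6                                 # lines per chunk (assumption)
while (length(Lines <- readLines(con, n = N)) > 0) {
  # ... process the chunk ...
  rm(Lines)      # release the chunk
  print(gc())    # compact memory and report current usage
}
close(con)
```

If the numbers reported by gc() keep climbing from one iteration to the next, something in the loop body is still holding references to the old chunks.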
Bad news!
The readLines command works fine up to a certain limit. Once a few files
have been written, the R program crashes.
I used the following code:
iFile<-"Test.txt"
con <- file(iFile, "r")
N<-125;
iLoop<-1
while(length(Lines <- readLines(con, n = N)) > 0 & iLo
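The while condition above is cut off at "iLo"; a self-contained sketch of what such a chunked read-and-write loop could look like is below. The chunk size, the iteration guard, and the output file names are assumptions, not HC's actual values:

```r
iFile <- "Test.txt"
con   <- file(iFile, "r")
N     <- 125                # lines per chunk
iLoop <- 1
maxLoop <- 1000             # assumed upper bound on the number of pieces
while (length(Lines <- readLines(con, n = N)) > 0 && iLoop <= maxLoop) {
  # hypothetical naming scheme for the output pieces
  writeLines(Lines, sprintf("piece_%04d.txt", iLoop))
  iLoop <- iLoop + 1
}
close(con)
```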
Thank you.
The readLines command is working fine and I am able to read 10^6 lines in
one go and write them using the write.table command.
Does this readLines command use a block concept to optimize, or does it go
line by line?
Steve has mentioned *nix and split commands. Would there be any speed
On Fri, Feb 3, 2012 at 8:08 AM, HC wrote:
> This is a 160 GB tab-separated .txt file. It has 9 columns and 3.25x10^9
> rows.
>
> Can R handle it?
>
You can process a file N lines at a time like this:
con <- file("myfile.dat", "r")
while (length(Lines <- readLines(con, n = N)) > 0) {
    # ... whatever ...
}
close(con)
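For the station-subsetting task that started the thread, the loop body might, as one illustration, keep only the lines whose first tab-delimited field matches one station (the station ID "S1" and the file names are assumptions):

```r
con <- file("myfile.dat", "r")
out <- file("station_S1.txt", "w")       # hypothetical output file
N   <- 1e6
while (length(Lines <- readLines(con, n = N)) > 0) {
  ids  <- sub("\t.*$", "", Lines)        # first column = station ID
  keep <- Lines[ids == "S1"]
  if (length(keep) > 0) writeLines(keep, out)
}
close(out)
close(con)
```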
This is a 160 GB tab-separated .txt file. It has 9 columns and 3.25x10^9
rows.
Can R handle it?
Thank you.
HC
--
View this message in context:
http://r.789695.n4.nabble.com/sqldf-for-Very-Large-Tab-Delimited-Files-tp4350555p4354556.html
Sent from the R help mailing list archive at Nabble.com.
On Fri, Feb 3, 2012 at 7:37 AM, Gabor Grothendieck
wrote:
> On Fri, Feb 3, 2012 at 6:03 AM, HC wrote:
>> Thank you for indicating that SQLite may not handle a file as big as 160 GB.
>>
>> Would you know of any utility for *physically splitting* the 160 GB text
>> file into pieces? And if one can control the splitting at the end of a
>> record.
On Fri, Feb 3, 2012 at 6:03 AM, HC wrote:
> Thank you for indicating that SQLite may not handle a file as big as 160 GB.
>
> Would you know of any utility for *physically splitting* the 160 GB text
> file into pieces? And if one can control the splitting at the end of a
> record.
>
If they are c
Thank you for indicating that SQLite may not handle a file as big as 160 GB.
Would you know of any utility for *physically splitting* the 160 GB text
file into pieces? And if one can control the splitting at the end of a
record.
Thank you again.
HC
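On *nix the standard split utility does exactly this: splitting with -l cuts only after whole lines, so each piece ends at a record boundary (records here being lines). A small self-contained demonstration with a toy file standing in for the 160 GB one (file names are illustrative):

```shell
# make a toy 10-line tab-delimited file standing in for the 160 GB one
seq 1 10 | awk '{print "S" $1 "\t" $1*10}' > big.txt
# -l splits only after whole lines, so every piece ends at a record boundary
split -l 4 big.txt piece_
wc -l piece_*
# reassembling the pieces reproduces the original byte for byte
cat piece_aa piece_ab piece_ac | cmp -s - big.txt && echo "records intact"
```

Each resulting piece is then small enough to read or import on its own.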
On Thu, Feb 2, 2012 at 8:07 PM, HC wrote:
> Hi Gabor,
>
> Thank you very much for your guidance and help.
>
> I could run the following code successfully on a 500 MB test data file. A
> snapshot of the data file is attached herewith.
>
> code start***
> library(sqldf)
> library
Hi Gabor,
Thank you very much for your guidance and help.
I could run the following code successfully on a 500 MB test data file. A
snapshot of the data file is attached herewith.
code start***
library(sqldf)
library(RSQLite)
iFile<-"Test100.txt"
con <- dbConnect(SQLite(),db
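HC's dbConnect call is cut off above. As one hedged sketch of the sqldf route: read.csv.sql stages the file in a temporary SQLite database and returns only the rows selected by the SQL, so the whole file never has to fit in R's memory. The file name, separator, and station value below are assumptions; with header = FALSE the columns are named V1, V2, ...:

```r
library(sqldf)

iFile <- "Test100.txt"    # illustrative file name
oneStation <- read.csv.sql(iFile,
  sql    = "select * from file where V1 = 'S1'",  # 'S1' is illustrative
  header = FALSE,
  sep    = "\t")
```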
On Thu, Feb 2, 2012 at 3:11 AM, Gabor Grothendieck
wrote:
> On Wed, Feb 1, 2012 at 11:57 PM, HC wrote:
>> Hi All,
>>
>> I have a very (very) large tab-delimited text file without headers. There
>> are only 8 columns and millions of rows. I want to make numerous pieces of
>> this file by sub-setting it for individual stations.
On Wed, Feb 1, 2012 at 11:57 PM, HC wrote:
> Hi All,
>
> I have a very (very) large tab-delimited text file without headers. There
> are only 8 columns and millions of rows. I want to make numerous pieces of
> this file by sub-setting it for individual stations. Station is given in
> the first
Hi All,
I have a very (very) large tab-delimited text file without headers. There
are only 8 columns and millions of rows. I want to make numerous pieces of
this file by sub-setting it for individual stations. Station is given in
the first column. I am trying to learn and use the sqldf package for