Re: large files

2013-03-05 Thread Dr.Ruud
On 2013-03-05 21:41, Chris Stinemetz wrote: I am working on a script to parse large files, by large I mean 4 million line+ in length and when splitting on the delimiter ( ; ) there are close to 300 fields per record, but I am only interested in the first 44. Try Text::CSV_XS. -- Ruud
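
A minimal sketch of the Text::CSV_XS suggestion, assuming a ';'-separated input (the filename here is illustrative):

  #!/usr/bin/perl
  use strict;
  use warnings;
  use Text::CSV_XS;

  # Parse ';'-delimited records and keep only the first 44 fields.
  my $csv = Text::CSV_XS->new({ sep_char => ';', binary => 1 })
      or die "Cannot construct Text::CSV_XS: " . Text::CSV_XS->error_diag;

  open my $fh, '<', 'records.txt' or die "Cannot open records.txt: $!";
  while (my $row = $csv->getline($fh)) {
      my @wanted = @{$row}[0 .. 43];    # fields 1..44 of the ~300
      # ... process @wanted here ...
  }
  close $fh;

If the data never contains quoted or embedded separators, a plain split with a LIMIT (sketched under the original post below) is an even lighter-weight option.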

Re: large files

2013-03-05 Thread John SJ Anderson
On Tue, Mar 5, 2013 at 12:41 PM, Chris Stinemetz wrote: > Hello List, > > I am working on a script to parse large files, by large I mean 4 million > line+ in length and when splitting on the delimiter ( ; ) there are close > to 300 fields per record, but I am only interested

Re: large files

2013-03-05 Thread Rob Dixon
On 05/03/2013 20:41, Chris Stinemetz wrote: Hello List, I am working on a script to parse large files, by large I mean 4 million line+ in length and when splitting on the delimiter ( ; ) there are close to 300 fields per record, but I am only interested in the first 44. I have begun testing to

large files

2013-03-05 Thread Chris Stinemetz
Hello List, I am working on a script to parse large files; by large I mean 4 million+ lines in length, and when splitting on the delimiter ( ; ) there are close to 300 fields per record, but I am only interested in the first 44. I have begun testing to see how fast the file can be read in a few
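
One common trick for this case is giving split a LIMIT so Perl stops splitting once the fields of interest are collected; a sketch under the post's assumptions (';' delimiter, first 44 of ~300 fields, hypothetical filename):

  #!/usr/bin/perl
  use strict;
  use warnings;

  open my $fh, '<', 'large_file.txt' or die "Cannot open large_file.txt: $!";
  while (my $line = <$fh>) {
      chomp $line;
      # A LIMIT of 45 yields fields 0..43 plus one element holding the
      # unsplit remainder, which we simply ignore.
      my @fields = ( split /;/, $line, 45 )[ 0 .. 43 ];
      # ... work with @fields ...
  }
  close $fh;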

Re: Processing Multiple Large Files

2008-12-12 Thread Rob Dixon
friend...@gmail.com wrote: > > I am analyzing some network log files. There are around 200-300 files and > each file has more than 2 million entries in it. > > Currently my script is reading each file line by line. So it will take > a lot of time to process all the files. > > Is there any efficient w

Re: Processing Multiple Large Files

2008-12-12 Thread Mr. Shawn H. Corey
On Thu, 2008-12-11 at 12:28 -0800, friend...@gmail.com wrote: > Hi, > > I am analyzing some network log files. There are around 200-300 files and > each file has more than 2 million entries in it. > > Currently my script is reading each file line by line. So it will take > a lot of time to process all

Processing Multiple Large Files

2008-12-12 Thread friend...@gmail.com
Hi, I am analyzing some network log files. There are around 200-300 files and each file has more than 2 million entries in it. Currently my script is reading each file line by line. So it will take a lot of time to process all the files. Is there any efficient way to do it? Maybe Multiprocessing, M
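
A hedged sketch of working through many large log files one line at a time, so memory use stays flat regardless of how many entries each file holds (the glob pattern is only an example):

  #!/usr/bin/perl
  use strict;
  use warnings;

  my @logs = glob('logs/*.log');    # hypothetical location of the 200-300 files
  for my $file (@logs) {
      open my $fh, '<', $file or die "Cannot open $file: $!";
      while (my $line = <$fh>) {
          chomp $line;
          # ... parse one entry at a time; nothing accumulates in memory ...
      }
      close $fh;
  }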

Re: Comparing fields in 2 large files

2008-06-11 Thread Rob Dixon
Ferry, Craig wrote: >> >> Please keep your responses to the perl.beginners group so that others >> can both >> provide input as well as learn from your experience. Thanks. >> >> I suggest you stick with Perl but process the data directly from the >> database. >> Take a look at the DBI module, whi

RE: Comparing fields in 2 large files

2008-06-11 Thread Ferry, Craig
Hi Craig Please keep your responses to the perl.beginners group so that others can both provide input as well as learn from your experience. Thanks. I suggest you stick with Perl but process the data directly from the database. Take a look at the DBI module, which isn't a standard one and so may
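
The DBI suggestion might look roughly like the sketch below; the DSN, credentials, table, and column names are placeholders, not anything taken from the thread:

  #!/usr/bin/perl
  use strict;
  use warnings;
  use DBI;

  # Connect to the database the data already lives in (DSN is illustrative).
  my $dbh = DBI->connect('dbi:Oracle:mydb', 'user', 'password',
                         { RaiseError => 1, AutoCommit => 1 });

  # Let the database do the selection instead of exporting flat files.
  my $sth = $dbh->prepare('SELECT field_1, field_2 FROM table_a');
  $sth->execute;
  while (my ($f1, $f2) = $sth->fetchrow_array) {
      # ... compare against the other table, or better, join in SQL ...
  }
  $dbh->disconnect;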

Re: Comparing fields in 2 large files

2008-06-11 Thread Rob Dixon
Ferry, Craig wrote: > > My original data is in a database. I did really mean that field 1 in > file a could be any part of field 1 in file b. I also forgot to > mention that in addition to it being in any part of field 1 of file b, I > have to strip out special characters from file b before do

Re: Comparing fields in 2 large files

2008-06-11 Thread Rob Dixon
Ferry, Craig wrote: > > I am new to perl and would appreciate any suggestions as to how to do > the following. > > I have two files, one with 3.5 million records, the other with almost a > million records. Basically here's what I need to do. > > See if field_1 in file_a is part of field_1 in fi

Comparing fields in 2 large files

2008-06-11 Thread Ferry, Craig
I am new to perl and would appreciate any suggestions as to how to do the following. I have two files, one with 3.5 million records, the other with almost a million records. Basically here's what I need to do. See if field_1 in file_a is part of field_1 in file_b If so, see if field_2 in file_a
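
If the test were exact equality on field_1, a hash keyed on the smaller file would make the pass over 3.5 million records cheap; here is a sketch under that simplifying assumption (filenames and the '|' delimiter are guesses). The thread's actual requirement, a substring match with special characters stripped, is harder, which is why the follow-ups suggest doing the work in the database instead.

  #!/usr/bin/perl
  use strict;
  use warnings;

  # Load field_1 of the smaller file into a hash for O(1) lookups.
  my %seen;
  open my $fh_b, '<', 'file_b.txt' or die "Cannot open file_b.txt: $!";
  while (my $line = <$fh_b>) {
      chomp $line;
      my ($field_1) = split /\|/, $line;    # '|' delimiter is an assumption
      $seen{$field_1} = 1;
  }
  close $fh_b;

  # Stream the larger file and check each record against the hash.
  open my $fh_a, '<', 'file_a.txt' or die "Cannot open file_a.txt: $!";
  while (my $line = <$fh_a>) {
      chomp $line;
      my ($field_1, $field_2) = split /\|/, $line;
      if ($seen{$field_1}) {
          # ... field_1 matched; go on to the field_2 check the post describes ...
      }
  }
  close $fh_a;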

Re: large Files management

2007-09-26 Thread Dr.Ruud
"Armin Garcia" schreef: > well i work with mbox files, but i have a huge mbox file, its 339 MB > and when i processing this file with Mail::MboxParser module, my > program breaks becouse send a message of out of memory Change-or-convert to maildir-type storage? -- Affijn, Ruud "Gewoon is

large Files management

2007-09-26 Thread Armin Garcia
Hi, well I work with mbox files, but I have a huge mbox file, it's 339 MB, and when I process this file with the Mail::MboxParser module, my program breaks because it gives an out-of-memory message. I think it is because the Mail::MboxParser module tries to cache the whole file, but because the fi

RE: large files

2006-01-26 Thread Wagner, David --- Senior Programmer Analyst --- WGO
Bill Peters wrote: > Hello, > I hope this question is appropriate to this list: > I am running ActiveState Perl 5.8 on Windows XP with > Spreadsheet::ParseExcel and Spreadsheet::WriteExcel::Big. I have 2 > programs I've written using these modules. One program reads from a > SQL DB and just writes

large files

2006-01-26 Thread Bill Peters
Hello, I hope this question is appropriate to this list: I am running ActiveState Perl 5.8 on Windows XP with Spreadsheet::ParseExcel and Spreadsheet::WriteExcel::Big. I have 2 programs I've written using these modules. One program reads from a SQL DB and just writes the data to a spreadsheet, abou

Efficiently search in large files?

2005-06-20 Thread Nan Jiang
Hi all, I'm developing a simple search engine under cgi-bin of a webserver. The problem is my database is a single 2GB xml file; when I use XML::Twig to maintain searching in the file, it usually takes a very long time to process and I can't terminate it by just closing the web browser (the perl proc

RE: Calling "more" in perl script for large files.

2004-06-08 Thread Raminder G
Hi, I could use the open(STDOUT,"| more") approach to page a big file. After this, I close STDOUT since, if I do not close it, readline prints the prompt even when more is not 100% done. ### open(STDOUT,"| more") || die "Cannot open stdout for more: $!"; $status = system("cat report.tx

RE: Calling "more" in perl script for large files.

2004-06-04 Thread Bob Showalter
[EMAIL PROTECTED] wrote: > Hi There, > Any pointers to how to call "more" on a file in perl script. > I am writing one script which displays one report to user and then > asks some inputs. > > User need to see complete file. To solve this, I thought I can call > `more in the script and continue.

Re: Calling "more" in perl script for large files.

2004-06-04 Thread Wiggins d Anconia
> Hi There, > Any pointers to how to call "more" on a file in a perl script. > I am writing one script which displays one report to the user and then asks > some inputs. > > The user needs to see the complete file. To solve this, I thought I could call `more > in the script and continue. > But it looks like it is no

Calling "more" in perl script for large files.

2004-06-04 Thread PerlDiscuss - Perl Newsgroups and mailing lists
Hi There, Any pointers to how to call "more" on a file in a perl script. I am writing one script which displays one report to the user and then asks some inputs. The user needs to see the complete file. To solve this, I thought I could call `more` in the script and continue. But it looks like it is not possible. Any
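
One approach that does work on Unix-like systems is to open a pipe to the pager, print the report through it, and only then prompt for input; a small sketch (the report filename is just an example):

  #!/usr/bin/perl
  use strict;
  use warnings;

  # Page the report through more(1); closing the pipe waits for the user.
  open my $pager,  '|-', 'more'       or die "Cannot run more: $!";
  open my $report, '<',  'report.txt' or die "Cannot open report.txt: $!";
  print {$pager} $_ while <$report>;
  close $report;
  close $pager;

  print "Enter your choice: ";
  chomp(my $answer = <STDIN>);
  print "You entered: $answer\n";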

Re: STATing large files

2004-03-11 Thread drieux
On Mar 11, 2004, at 3:41 PM, Phil Schaechter wrote: Folks, Does anyone know of a workaround for stat'ing large files on perl < 5.6.1 ? For large files, stat and print -M both return nothing. We cannot install any modules, or upgrade/recompile perl. The problem is that you will need

STATing large files

2004-03-11 Thread Phil Schaechter
Folks, Does anyone know of a workaround for stat'ing large files on perl < 5.6.1 ? For large files, stat and print -M both return nothing. We cannot install any modules, or upgrade/recompile perl. Thanks, Phil

Re: Uploading large files thru HTTP

2004-02-09 Thread Michael C. Davis
The CGI.pm module limits the largest file size with a variable called $CGI::POST_MAX. From the documentation: " [ ...] If set to a non-negative integer, this variable puts a ceiling on the size of POSTings, in bytes. If CGI.pm detects a POST that is greater than the ceiling, it will immediately exit w
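
For example, a ceiling could be set like this (the 10 MB figure is arbitrary):

  #!/usr/bin/perl
  use strict;
  use warnings;
  use CGI;

  # Refuse any POST larger than 10 MB before it is read into memory.
  $CGI::POST_MAX = 10 * 1024 * 1024;

  my $q = CGI->new;
  if (my $err = $q->cgi_error) {
      print $q->header(-status => $err);
      print "Upload rejected: $err\n";
      exit;
  }
  # ... normal request handling continues here ...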

RE: Uploading large files thru HTTP

2004-02-09 Thread Bob Showalter
Bob Showalter wrote: > Nilay Puri, Noida wrote: >> Hi all, >> >> I am uploading files from one server to another server using Perl >> HTTP post. >> >> But when the file size increases to 2 MB, I get an error. >> >> Is there any way I can specify the max file size ? > > You can only do this if yo

RE: Uploading large files thru HTTP

2004-02-09 Thread Bob Showalter
Nilay Puri, Noida wrote: > Hi all, > > I am uploading files from one server to another server using Perl > HTTP post. > > But when the file size increases to 2 MB, I get an error. > > Is there any way I can specify the max file size ? You can only do this if you control the server. Do you? --

Uploading large files thru HTTP

2004-02-09 Thread Nilay Puri, Noida
Hi all, I am uploading files from one server to another server using Perl HTTP post. But when the file size increases to 2 MB, I get an error. Is there any way I can specify the max file size ? My code is : #!/usr/local/bin/perl -w use LWP::Simple; use Data::Dumper; use LWP::UserAgent; use HTTP::R
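
The code is cut off above, but a multipart upload with LWP usually looks something like the sketch below; the URL and form field name are placeholders, and as the replies note, whether a 2 MB file is accepted ultimately depends on the receiving server's own limit:

  #!/usr/bin/perl
  use strict;
  use warnings;
  use LWP::UserAgent;
  use HTTP::Request::Common qw(POST);

  # Stream the file from disk instead of slurping it into the request body.
  local $HTTP::Request::Common::DYNAMIC_FILE_UPLOAD = 1;

  my $ua  = LWP::UserAgent->new;
  my $res = $ua->request(POST 'http://example.com/upload',
      Content_Type => 'form-data',
      Content      => [ upload => ['bigfile.dat'] ],
  );
  die 'Upload failed: ' . $res->status_line . "\n" unless $res->is_success;
  print $res->content;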

taskbar for console: 'tee'-like program for stats when copying large files.

2003-07-01 Thread Freek Kauffmann
Dear Reader, Since I just started learning perl I have been looking for some useless things to write just to learn perl... Maybe you consider yourself a newbie too. In that case it might be interesting for you to discuss this scenario. If you know where I could look in order to be less clueless

Re: How to Handle Large files in Perl ?

2003-02-07 Thread Rob Dixon
Madhu Reddy wrote: > Hi, > We are doing file operations on large files (more > than 5GB), > basically reading the file and validating each > record in the file. > > We used Tie::File > It took 35 min to read and count the number of lines >

How to Handle Large files in Perl ?

2003-02-07 Thread Madhu Reddy
Hi, We are doing file operations on large files (more than 5GB), basically reading the file and validating each record in the file. We used Tie::File. It took 35 min to read and count the number of lines in a file of 8 million lines (500MB). Following is the script. Does anybody have
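
Tie::File presents the file as an array, which is convenient for random access but slow for a single sequential pass; a plain read loop is usually much faster for counting and validating. A sketch (the filename is illustrative):

  #!/usr/bin/perl
  use strict;
  use warnings;

  my $file = 'data.txt';
  open my $fh, '<', $file or die "Cannot open $file: $!";

  my $count = 0;
  while (my $line = <$fh>) {
      $count++;
      # ... validate the record here ...
  }
  close $fh;
  print "$file contains $count lines\n";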

Re: Out of Memory Working With Large Files

2003-01-20 Thread Rob Dixon
John W. Krahn wrote: > Rob Dixon wrote: >> >> John W. Krahn wrote: >>> >>> You should _always_ verify that the file opened successfully. >> >> Sure, but that's not what the question was about. You should always >> add 'use strict' and 'use warnings' too, but I didn't put that in >> either. >> >>> m

Re: Out of Memory Working With Large Files

2003-01-20 Thread John W. Krahn
Rob Dixon wrote: > > John W. Krahn wrote: > > Rob Dixon wrote: > >> > >> open FILE, "< file.txt"; > > > > You should _always_ verify that the file opened successfully. > > Sure, but that's not what the question was about. You should always add > 'use strict' and 'use warnings' too, but I

Re: Out of Memory Working With Large Files

2003-01-19 Thread Rob Dixon
John W. Krahn wrote: > Rob Dixon wrote: >> >> Nelson Ray wrote: >>> Just as a little background, I am working on a BioInformatics >>> program that runs on large (about 300 meg) text files. I am using a >>> filehandle to open and load it into an array. Then I use the join >>> command to read the a

Re: Out of Memory Working With Large Files

2003-01-19 Thread John W. Krahn
Rob Dixon wrote: > > Nelson Ray wrote: > > Just as a little background, I am working on a BioInformatics program > > that runs on large (about 300 meg) text files. I am using a > > filehandle to open and load it into an array. Then I use the join > > command to read the array into a scalar varia

Re: Out of Memory Working With Large Files

2003-01-19 Thread John W. Krahn
Nelson Ray wrote: > > Just as a little background, I am working on a BioInformatics program that > runs on large (about 300 meg) text files. I am using a filehandle to open > and load it into an array. Then I use the join command to read the array > into a scalar variable in order to be in a wor

Re: Out of Memory Working With Large Files

2003-01-19 Thread Rob Dixon
Hi Nelson Nelson Ray wrote: > Just as a little background, I am working on a BioInformatics program > that runs on large (about 300 meg) text files. I am using a > filehandle to open and load it into an array. Then I use the join > command to read the array into a scalar variable in order to be

Out of Memory Working With Large Files

2003-01-19 Thread Nelson Ray
Just as a little background, I am working on a BioInformatics program that runs on large (about 300 meg) text files. I am using a filehandle to open and load it into an array. Then I use the join command to read the array into a scalar variable in order to be in a workable form for my computation
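
If the whole file really is needed as one scalar, slurping it directly avoids holding both the array of lines and the joined copy in memory at once; a minimal sketch (filename is illustrative):

  #!/usr/bin/perl
  use strict;
  use warnings;

  my $file = 'sequence.txt';
  open my $fh, '<', $file or die "Cannot open $file: $!";
  my $data = do { local $/; <$fh> };    # read the entire file into one scalar
  close $fh;

  printf "read %d bytes\n", length $data;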

Re: Large files

2002-09-11 Thread Paul Johnson
On Wed, Sep 11, 2002 at 11:58:12AM -0500, Frank Wiles wrote: > There aren't any limits in Perl on the size of the file, except of > course your operating system limits. Well, that's not strictly true. Large file support (>2Gb) came in 5.6.0. Now the limit is bigger ;-) -- Paul Johns

Re: Large files

2002-09-11 Thread Frank Wiles
.--[ Rob wrote (2002/09/11 at 12:56:10) ]-- | | Hi, I'm parsing a 16 meg file and my program dies 111 lines from the | bottom. The record that it stops isn't any different from the ones that | make it through. If I delete from there down it works ok and if I delete | 111 lines

Re: Large files

2002-09-11 Thread Rob
If I take 111 lines off from the top of the file it does make it through that line and finishes without a problem. Rob Good judgement comes from experience, and experience - well, that comes from poor judgement. On Wed, 11 Sep 2002, Frank Wiles wrote: > .--[ Rob wrote (2002/09/11 at 12:

Re: Large files

2002-09-11 Thread Joe Raube
When it dies, what error do you get? --- Rob <[EMAIL PROTECTED]> wrote: > Hi, I'm parsing a 16 meg file and my program dies 111 lines from > the > bottom. The record that it stops isn't any different from the ones > that > make it through. If I delete from there down it works ok and if I > dele

Large files

2002-09-11 Thread Rob
Hi, I'm parsing a 16 meg file and my program dies 111 lines from the bottom. The record that it stops isn't any different from the ones that make it through. If I delete from there down it works ok and if I delete 111 lines from the top it works ok. Is there a limit to how big a file can be in P

RE: Processing Large Files

2002-03-13 Thread RArul
I cut and pasted your code. Used a file 1070725103 bytes (~1 GB) in size with about 100 bytes per line, and it ran in a split second. Are the lines in your file giant sized (>1 MB/line)?

RE: Processing Large Files

2002-03-13 Thread Nikola Janceski
Friends, I need to process 300+ MB text files. I tried to open one such file; read 10 lines (line by line) and output those 10 lines to a new text file. Despite reading line by line, it was taking a very long time for processing. Is th

Processing Large Files

2002-03-13 Thread RArul
Friends, I need to process 300+ MB text files. I tried to open one such file; read 10 lines (line by line) and output those 10 lines to a new text file. Despite reading line by line, it was taking a very long time for processing. Is there anything that I can try to speed it up!! Here is my pront
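
Reading only the first 10 lines should be nearly instant if the loop exits as soon as they are written; a sketch of the task as described (filenames are illustrative):

  #!/usr/bin/perl
  use strict;
  use warnings;

  open my $in,  '<', 'big_input.txt' or die "Cannot open input: $!";
  open my $out, '>', 'first_ten.txt' or die "Cannot open output: $!";

  while (my $line = <$in>) {
      print {$out} $line;
      last if $. >= 10;    # $. is the current input line number
  }
  close $in;
  close $out;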

RE: Process large files quickly...how to?

2002-03-12 Thread Russ Foster
minutes to complete. Print this info up front, then a status line every so often... -rjf

RE: Process large files quickly...how to?

2002-03-11 Thread Timothy Johnson
If you gave some code with your question, I would have a better idea what is taking so long. I will venture a guess, only because I know what happened when I first started working with large files. The first thing to check is if you have any code that looks like this

Process large files quickly...how to?

2002-03-11 Thread Kevin Old
to read the data in and process it? Read each line of the file into an array and then process each line or just process the line directly? Anyone have any scripts that process large files quickly? I'd love to see examples of how you did it. Thanks, Kevin

Re: memory issues reading large files

2002-02-07 Thread Brett W. McCoy
On Thu, 7 Feb 2002, Brian Hayes wrote: > It appears the problem was using the foreach statement instead of while. > I have not tested this extensively, but using foreach the whole text > file (or output of pipe) is read into memory before continuing, but > using while (and probably for) each line

Re: memory issues reading large files

2002-02-07 Thread Brian Hayes
It appears the problem was using the foreach statement instead of while. I have not tested this extensively, but using foreach the whole text file (or output of a pipe) is read into memory before continuing, whereas using while (and probably for) each line is processed as it is read. Thanks for all y
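
The difference the poster found: foreach (<FH>) evaluates the readline in list context, so the entire file is pulled into a temporary list before the first iteration, while while (<FH>) reads one line per pass. A side-by-side sketch:

  #!/usr/bin/perl
  use strict;
  use warnings;

  open my $fh, '<', 'big.log' or die "Cannot open big.log: $!";

  # Loads the whole file into a list first -- memory grows with file size:
  # foreach my $line (<$fh>) { ... }

  # Reads one line at a time -- memory stays constant:
  while (my $line = <$fh>) {
      chomp $line;
      # ... process $line ...
  }
  close $fh;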

Re: memory issues reading large files

2002-02-07 Thread Brian Hayes
> You should be using something like > > open(FILE, $file) or die "$!\n"; > while (<FILE>) { > ## do something > } > close FILE; > __END__ This is what I am doing, but before any of the file is processed, the whole text file is moved into memory. The only solution I can think of is to break

Re: memory issues reading large files

2002-02-07 Thread Brett W. McCoy
On Thu, 7 Feb 2002, Brian Hayes wrote: > > You should be using something like > > > > open(FILE, $file) or die "$!\n"; > > while (<FILE>) { > > ## do something > > } > > close FILE; > > __END__ > > This is what I am doing, but before any of the file is processed, the > whole text file is moved in

RE: memory issues reading large files

2002-02-07 Thread Nikola Janceski
but quick to write it! ;) On Thu, 7 Feb 2002, Brian Hayes wrote: > Hello all. I need to

Re: memory issues reading large files

2002-02-07 Thread Brett W. McCoy
On Thu, 7 Feb 2002, Brian Hayes wrote: > Hello all. I need to read through a large (150 MB) text file line by > line. Does anyone know how to do this without my process swelling to > 300 megs? As long as you aren't reading that file into an array (which would be a foolish thing to do, IMHO), I

memory issues reading large files

2002-02-07 Thread Brian Hayes
Hello all. I need to read through a large (150 MB) text file line by line. Does anyone know how to do this without my process swelling to 300 megs? I have not been following the list, so sorry if this question has recently come up. I did not find it answered in the archives. Thanks, Brian

Re: Compare large files memory error

2001-08-08 Thread Will Crain
Perhaps the DOS FC (file compare) command will suffice for your application: Compares two files or sets of files and displays the differences between them FC [/A] [/C] [/L] [/LBn] [/N] [/T] [/U] [/W] [/] [drive1:] [path1]filename1 [drive2:][path2]filename2 FC /B [drive1:][path1]filename1 [

Re: Compare large files memory error

2001-08-07 Thread Mbedish
No, I am using Win NT. Regards, Mark Surrey,UK > Does your system have the 'cmp' program on it? IIRC this is a standard or > fairly standard Unix utility which does exactly what you want and you could > just call it from Perl... >Regards > >Mark Bedish >Surrey,UK > > >In a message dated Tue

Re: Compare large files memory error

2001-08-07 Thread Peter Scott
At 09:59 AM 8/7/01 -0400, [EMAIL PROTECTED] wrote: >Randal, > >Thanks for the file compare tip, it is incredibly fast! However it doesn't >tell me where the difference is. Can I get it to print out the first block >of data that is different? Does your system have the 'cmp' program on it? IIRC t

Re: Compare large files memory error

2001-08-07 Thread Jos I. Boumans
> Hello Mbedish, > > Tuesday, August 07, 2001, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote: > > Mac> Is there a better way to compare large files than thi

Re: Compare large files memory error

2001-08-07 Thread Maxim Berlin
Hello Mbedish, Tuesday, August 07, 2001, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote: Mac> Is there a better way to compare large files than this snippet, Mac> which runs out of memory if files > 30mb. Mac> It is also slow, about the same speed as comparing in a text edito

Compare large files memory error

2001-08-07 Thread Mbedish
Is there a better way to compare large files than this snippet, which runs out of memory if files > 30mb. It is also slow, about the same speed as comparing in a text editor! Thank you. __SNIP__ @file1 = (); @file2 = (); $are_equal = compare_arrays(\@file1, \@file2); if ($are_eq
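
A memory-friendlier alternative to loading both files into arrays is File::Compare, which ships with Perl and compares the two files in fixed-size chunks; a sketch (filenames are placeholders). Note it only reports whether the files differ, not where, which is why the thread also mentions cmp(1):

  #!/usr/bin/perl
  use strict;
  use warnings;
  use File::Compare;

  my ($file1, $file2) = ('file1.txt', 'file2.txt');
  my $result = compare($file1, $file2);    # 0 = equal, 1 = differ, -1 = error

  if    ($result == 0) { print "Files are identical\n"; }
  elsif ($result == 1) { print "Files differ\n"; }
  else                 { die "Error comparing files: $!"; }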