Re: Gigantic file size processing error

2014-01-03 Thread Uri Guttman
On 01/03/2014 02:28 PM, Rob Dixon wrote: On 02/01/2014 15:21, mani kandan wrote: Hi, We have a huge file of about 500MB; we need to manipulate it, make some replacements, and then write it back. I have used File::Slurp, which works for a 300MB file (thanks Uri), but for this huge 500MB file it

Re: Gigantic file size processing error

2014-01-03 Thread Rob Dixon
On 02/01/2014 15:21, mani kandan wrote: Hi, We have a huge file of about 500MB; we need to manipulate it, make some replacements, and then write it back. I have used File::Slurp, which works for a 300MB file (thanks Uri), but for this huge 500MB file it does not finish and exits with an error

Re: Gigantic file size processing error

2014-01-03 Thread Uri Guttman
On 01/03/2014 12:48 PM, Shawn H Corey wrote: On Fri, 03 Jan 2014 12:22:48 -0500 Uri Guttman wrote: I haven't seen that before but it was last touched in 2005. That means it has no bugs. A better metric of a module's quality is how many outstanding bugs it has. See https://rt.cpan.org//Dist/Displ

Re: Gigantic file size processing error

2014-01-03 Thread Shawn H Corey
On Fri, 03 Jan 2014 12:22:48 -0500 Uri Guttman wrote: > I haven't seen that before but it was last touched in 2005. That means it has no bugs. A better metric of a module's quality is how many outstanding bugs it has. See https://rt.cpan.org//Dist/Display.html?Queue=File-Inplace -- Don't stop whe

Re: Gigantic file size processing error

2014-01-03 Thread Uri Guttman
On 01/03/2014 12:10 PM, mani kandan wrote: Hi, Thanks for all your guidance. The error was "Perl Command Line Interpreter has encountered a problem and needs to close"; that isn't the real error. You need to run this in a command window that won't close after it fails so you can see the real er

Re: Gigantic file size processing error

2014-01-03 Thread Uri Guttman
On 01/03/2014 10:22 AM, Janek Schleicher wrote: A short look at CPAN turns up https://metacpan.org/pod/File::Inplace, which looks like it does what the OP wants. Honestly, I have never used it, and it may also have a performance problem, but at least I looked at its source code and it implements it

Re: Gigantic file size processing error

2014-01-03 Thread mani kandan
Hi, Thanks for all your guidance. The error was "Perl Command Line Interpreter has encountered a problem and needs to close". I also increased the virtual memory, to no avail. My system configuration: Windows XP SP3, Intel Core 2 Duo with 2 GB RAM. Regards, Manikandan N On Friday, 3 January 2014 9:06 PM,

Re: Gigantic file size processing error

2014-01-03 Thread Janek Schleicher
On 02.01.2014 18:08, David Precious wrote: Oh, I was thinking of a wrapper that would: (a) open a new temp file (b) iterate over the source file, line-by-line, calling the provided coderef for each line (c) write $_ (potentially modified by the coderef) to the temp file (d) finally, rename th

Re: Gigantic file size processing error

2014-01-03 Thread Jan Gruber
Hi List, On Friday, January 03, 2014 10:57:13 AM kurtz le pirate wrote: > have you tried this kind of command: > perl -p -i -e "s/oneThing/otherThing/g" yourFile I was about to post the same thing. My suggestion: create a backup file just in case something goes wrong. perl -pi.bak -e "s/oneTh

Re: Gigantic file size processing error

2014-01-03 Thread kurtz le pirate
In article <1388676082.98276.yahoomail...@web193403.mail.sg3.yahoo.com>, mani_nm...@yahoo.com (mani kandan) wrote: > Hi, We have a huge file of about 500MB; we need to manipulate it, make some replacements, and then write it back. I have used File::Slurp, which works for a 300MB file (T

Re: Gigantic file size processing error

2014-01-02 Thread David Precious
On Thu, 02 Jan 2014 12:19:16 -0500 Uri Guttman wrote: > On 01/02/2014 12:08 PM, David Precious wrote: > > Oh, I was thinking of a wrapper that would: > > > > (a) open a new temp file > > (b) iterate over the source file, line-by-line, calling the provided > > coderef for each line > > (c) write $_

Re: Gigantic file size processing error

2014-01-02 Thread Uri Guttman
On 01/02/2014 12:33 PM, David Precious wrote: On Thu, 02 Jan 2014 12:19:16 -0500 Uri Guttman wrote: On 01/02/2014 12:08 PM, David Precious wrote: Oh, I was thinking of a wrapper that would: (a) open a new temp file (b) iterate over the source file, line-by-line, calling the provided coderef f

Re: Gigantic file size processing error

2014-01-02 Thread Uri Guttman
On 01/02/2014 12:08 PM, David Precious wrote: On Thu, 02 Jan 2014 11:56:26 -0500 Uri Guttman wrote: Part of me wonders if File::Slurp should provide an in-place (not slurping into RAM) editing feature which works like edit_file_lines but line-by-line using a temp file, but that's probably feat

Re: Gigantic file size processing error

2014-01-02 Thread David Precious
On Thu, 02 Jan 2014 11:56:26 -0500 Uri Guttman wrote: > > Part of me wonders if File::Slurp should provide an in-place (not > > slurping into RAM) editing feature which works like edit_file_lines > > but line-by-line using a temp file, but that's probably feature > > creep :) > > that IS tie::

Re: Gigantic file size processing error

2014-01-02 Thread Uri Guttman
On 01/02/2014 11:48 AM, David Precious wrote: On Thu, 02 Jan 2014 11:18:31 -0500 Uri Guttman wrote: On 01/02/2014 10:39 AM, David Precious wrote: Secondly - do you need to work on the file as a whole, or can you just loop over it, making changes, and writing them back out? In other words, d

Re: Gigantic file size processing error

2014-01-02 Thread David Precious
On Thu, 02 Jan 2014 11:18:31 -0500 Uri Guttman wrote: > On 01/02/2014 10:39 AM, David Precious wrote: > > Secondly - do you need to work on the file as a whole, or can you > > just loop over it, making changes, and writing them back out? In > > other words, do you *need* to hold the whole file i

Re: Gigantic file size processing error

2014-01-02 Thread Uri Guttman
On 01/02/2014 10:39 AM, David Precious wrote: On Thu, 2 Jan 2014 23:21:22 +0800 (SGT) mani kandan wrote: Hi, We have a huge file of about 500MB; we need to manipulate it, make some replacements, and then write it back. I have used File::Slurp, which works for a 300MB file (thanks Uri) but f

Re: Gigantic file size processing error

2014-01-02 Thread David Precious
On Thu, 2 Jan 2014 23:21:22 +0800 (SGT) mani kandan wrote: > Hi, We have a huge file of about 500MB; we need to manipulate it, make > some replacements, and then write it back. I have used File::Slurp, which > works for a 300MB file (thanks Uri), but for this huge 500MB file it is not pro

Gigantic file size processing error

2014-01-02 Thread mani kandan
Hi, We have a huge file of about 500MB; we need to manipulate it, make some replacements, and then write it back. I have used File::Slurp, which works for a 300MB file (thanks Uri), but for this huge 500MB file it does not finish and exits with an error. I have also used the Tie::File module sam
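A slurp-free alternative to the failing approach described above: stream the file through a filter and write the result to a new file, so memory use stays constant regardless of file size. File names and the substitution are placeholders, not the OP's data:

```shell
# Stand-in for the 500MB input
printf 'replace me\nleave me\n' > huge.txt

# -n wraps the code in a while(<>) read loop: one line in memory at a time
perl -ne 's/replace/replaced/; print' huge.txt > huge.new
cat huge.new
```

Writing to a separate output file also sidesteps the risk of corrupting the source if the process is interrupted; the original can be replaced afterwards once the new file is verified.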