Friends.
I have to work on some huge text files that are around 2GB - 10GB.
I have to read some content in those files and rewrite them.
Mostly I have to cut/copy/paste/edit the contents in random areas.
sed, awk kind of utilities cannot be used, as the edits do not follow
a regex pattern and have to be made by hand.
Are there any other ways?
On Sat, 2010-10-30 at 01:06 +0530, Shrinivasan T wrote:
> Friends.
> I have to work on some huge text files that are around 2GB - 10GB.
> [...]
> Are there any other ways?

Until someone more knowledgeable answers, here goes: I've had
experience with ETL tools that can handle such data with amazing ...
On Sat, Oct 30, 2010 at 1:06 AM, Shrinivasan T tshriniva...@gmail.com wrote:
> I have to work on some huge text files that are around 2GB - 10GB.
> [...]
Thanks for the inputs.
I am working on corrupted svn dump files.
Will test the vim hack soon and update the results.
Will think on splitting the file into multiple parts.
Thanks a lot.
--
Regards,
T.Shrinivasan
My Life with GNU/Linux : http://goinggnu.wordpress.com
Free/Open Source Jobs :
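
The vim hack itself is not quoted in this digest, but the usual recipe
for multi-GB files is to start vim bare and switch off everything that
scales with file size; a rough sketch, with the file name as a
placeholder:

    vim -u NONE -N huge.dump        # skip vimrc and plugins entirely

    :set noswapfile                 " no multi-GB swap file on disk
    :set undolevels=-1              " keep no undo history in memory
    :syntax off                     " highlighting is the usual slowdown

Even with these, saving rewrites the whole buffer to disk, so expect
the write itself to take a while on a 10GB file.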
--- On Sat, 30/10/10, Raja Subramanian rajasuper...@gmail.com wrote:
> Without knowing anything about the layout of your text files, the
> simplest suggestion would be to split up the files into smaller
> chunks, perform your edits, and join them in the end.
> If possible you can split based on ...
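
Concretely, a rough sketch of that workflow with GNU split and cat
(file names and chunk size here are placeholders, not from the thread):

    # split into ~500MB chunks without cutting any line in half
    split -C 500M corrupted.dump part_     # writes part_aa, part_ab, ...

    # edit only the chunk(s) containing the bad region
    vim part_ab

    # rejoin in glob order and sanity-check the result
    cat part_* > repaired.dump
    ls -l corrupted.dump repaired.dump

For an svn dump, a chunk boundary can still fall inside a revision
record, so splitting at record boundaries, where possible, is safer
than splitting at an arbitrary size.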