Grant Kelly had this to say on 06/28/2006 01:01 PM:
Alright unix fans, who can answer this the best?

I have a text file, it's about 2.3 GB. I need to delete the first 300
lines, and I don't want to have to load the entire thing into an
editor.

I'm trying `sed '1,300d' inputfile > outputfile`  but it's taking a
long time (and disk space) to copy everything to the new file.

There has got to be a better way, a way that can do this in-place...


Grant

_______________________________________________
RLUG mailing list
[email protected]
http://lists.rlug.org/mailman/listinfo/rlug
Interesting discussion tonight...

If you have the disk space, you could do a quick python script for this task:

<snip>
# untested code -- trades disk space for RAM

import os

bigfile = 'myhugefile.txt'

fobj = open(bigfile, 'r')
fobjwrite = open(bigfile + '.tmp', 'w')

skip = 300  # number of leading lines to drop

for line in fobj:
    if skip > 0:
        skip -= 1  # still inside the first 300 lines, don't copy
    else:
        fobjwrite.write(line)

fobj.close()
fobjwrite.close()

os.rename(bigfile + '.tmp', bigfile)
</snip>
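If you really want in-place (no second 2.3 GB copy on disk), you can shift the tail of the file forward over the first 300 lines and then truncate. A minimal sketch in the same untested spirit -- the function name is mine, and it assumes nothing else is writing to the file while it runs (if it's interrupted partway, the file is left half-shifted):

```python
def delete_first_lines(path, n, bufsize=1 << 20):
    """Remove the first n lines of a file in place, using O(bufsize) memory."""
    with open(path, 'r+b') as f:
        # Skip the first n lines; f.tell() is then the byte offset of the cut.
        for _ in range(n):
            if not f.readline():
                break  # fewer than n lines in the file
        read_pos = f.tell()
        write_pos = 0
        # Copy the remainder of the file back toward the front, one chunk
        # at a time.  write_pos always trails read_pos, so nothing unread
        # is ever overwritten.
        while True:
            f.seek(read_pos)
            chunk = f.read(bufsize)
            if not chunk:
                break
            read_pos = f.tell()
            f.seek(write_pos)
            f.write(chunk)
            write_pos += len(chunk)
        # Chop off the now-duplicated tail.
        f.truncate(write_pos)
```

Same amount of I/O as the temp-file version (the whole tail still gets read and rewritten), but no extra disk space needed.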

Another thought is to use ex from vim (I think Brandon mentioned this already)...


ex bigfile
:1,300 del
:wq

Either approach is filesystem-I/O intensive (but easy on RAM), and you would probably get better performance from a filesystem tuned for large files, such as XFS or IBM's JFS.

Sounds like you solved your problem, but thought I would throw that in there...

Colin



-- 
^X^C
q
quit
:q
^C
end
x
exit
ZZ
^D
^Z
^K^B
?
help


