Magnus Lycka wrote:
To read the last x bytes of a file, you could do:
import os
x = 2000  # or whatever...
f = open('my_big_file')
l = os.fstat(f.fileno()).st_size
f.seek(l - x)
f.read()
You don't need fstat/st_size; you can ask seek to move to an offset
relative to the end of the file.
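A minimal sketch of that seek-from-the-end approach (the helper name `last_bytes` is mine, not from the thread); the file has to be opened in binary mode for a non-zero end-relative offset to work, and seeking back further than the file's size raises an error:

```python
import os

def last_bytes(filename, x):
    # Seek x bytes back from the end (whence=os.SEEK_END, i.e. 2);
    # no fstat/st_size needed. Binary mode is required: text-mode files
    # only accept seek(0, os.SEEK_END). Raises an OSError if x exceeds
    # the file size.
    with open(filename, 'rb') as f:
        f.seek(-x, os.SEEK_END)
        return f.read()
```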
On Thu, 08 Dec 2005 02:09:58 -0500, Mike Meyer [EMAIL PROTECTED] wrote:
[EMAIL PROTECTED] writes:
I have a file which is very large, e.g. over 200MB, and I am going to use
Python to code a tail command to get the last few lines of the file.
What is a good algorithm for this type of task in
[EMAIL PROTECTED] wrote:
hi
I have a file which is very large, e.g. over 200MB, and I am going to use
Python to code a tail command to get the last few lines of the file.
What is a good algorithm for this type of task in Python for very big files?
Initially, I thought of reading everything
As long as memory-mapped files are available, the fastest
method is to map the whole file into memory and use the
mapping's rfind method to search for an end of line.
The following code snippet may be useful:
reportFile = open( filename )
length = os.fstat( reportFile.fileno() ).st_size
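(mmap did gain an rfind method later, in Python 2.6, so on a modern Python the memory-mapped approach can be sketched roughly as below; the function name and the line-counting loop are my own, not from the post.)

```python
import mmap
import os

def tail_mmap(filename, n=10):
    # Map the file read-only and walk backwards with rfind to locate
    # the newline that precedes the last n lines.
    with open(filename, 'rb') as f:
        size = os.fstat(f.fileno()).st_size
        if size == 0:
            return []  # mmap cannot map an empty file
        mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
        try:
            # Skip a trailing newline so it doesn't count as a line break.
            pos = size - 1 if mm[size - 1:size] == b'\n' else size
            for _ in range(n):
                pos = mm.rfind(b'\n', 0, pos)
                if pos < 0:
                    break  # fewer than n lines: return them all
            return mm[pos + 1:].splitlines()
        finally:
            mm.close()
```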
Gerald Klix [EMAIL PROTECTED] wrote:
As long as memory-mapped files are available, the fastest
method is to map the whole file into memory and use the
mapping's rfind method to search for an end of line.
Excellent idea.
It'll blow up for files larger than 2GB on a 32-bit OS, though.
--
Nick
Gerald Klix [EMAIL PROTECTED] wrote:
As long as memory-mapped files are available, the fastest
method is to map the whole file into memory and use the
mapping's rfind method to search for an end of line.
Actually, mmap doesn't appear to have an rfind method :-(
Here is a tested solution
hi
I have a file which is very large, e.g. over 200MB, and I am going to use
Python to code a tail command to get the last few lines of the file.
What is a good algorithm for this type of task in Python for very big files?
Initially, I thought of reading everything into an array from the file
and
Mike Meyer wrote:
It would probably be more efficient to read blocks backwards and paste
them together, but I'm not going to get into that.
That actually is a pretty good idea: just reverse the buffer and do a
split, and the last line becomes the first line, and so on. The logic then
would be no
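A minimal sketch of the read-blocks-backwards idea (without the buffer-reversal trick; the function name and default block size are my own choices, not from the thread):

```python
import os

def tail_blocks(filename, n=10, blocksize=4096):
    # Read fixed-size blocks from the end of the file, prepending each
    # to a buffer, until the buffer holds more than n newlines or we
    # reach the start; the last n lines are then complete in the buffer.
    with open(filename, 'rb') as f:
        f.seek(0, os.SEEK_END)
        pos = f.tell()
        buf = b''
        while pos > 0 and buf.count(b'\n') <= n:
            step = min(blocksize, pos)
            pos -= step
            f.seek(pos)
            buf = f.read(step) + buf
        return buf.splitlines()[-n:]
```

For a 200MB file this touches only the last few blocks, so the cost is independent of file size (at the price of re-concatenating the buffer each pass, which is fine for small n).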