Christos TZOTZIOY Georgiou <[EMAIL PROTECTED]> wrote:
> On 09 Feb 2005 10:31:22 GMT, rumours say that Nick Craig-Wood
> <[EMAIL PROTECTED]> might have written:
>
> >But you won't be able to md5sum a file bigger than about 4 GB if using
> >a 32-bit processor (like x86), will you? (I don't know how [...]
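One way around the 32-bit address-space limit (a sketch of my own, not something posted in the thread) is to map the file in fixed-size windows using mmap's offset argument, so no single mapping ever needs more address space than one window. The offset must be a multiple of mmap.ALLOCATIONGRANULARITY; I'm using hashlib here since the old md5 module is long gone:

```python
import hashlib
import mmap
import os

def md5_windowed(fn, window=64 * 1024 * 1024):
    """Hash a file by mapping it in 64 MB windows, so even a file
    far bigger than 4 GB never needs more than one window of
    address space at a time."""
    # mmap offsets must be multiples of the allocation granularity
    assert window % mmap.ALLOCATIONGRANULARITY == 0
    h = hashlib.md5()
    size = os.path.getsize(fn)
    with open(fn, "rb") as f:
        offset = 0
        while offset < size:
            length = min(window, size - offset)
            with mmap.mmap(f.fileno(), length,
                           access=mmap.ACCESS_READ, offset=offset) as m:
                h.update(m)
            offset += length
    return h.hexdigest()
```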
Thomas Heller <[EMAIL PROTECTED]> wrote:
> Nick Craig-Wood <[EMAIL PROTECTED]> writes:
> > Here is an implementation of md5sum in python. It's the same speed,
> > give or take, as md5sum itself. This isn't surprising, since md5sum is
> > dominated by CPU usage of the MD5 routine (in C in both cases).
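Nick's script itself is cut off in this archive fragment; a minimal sketch of such an md5sum-alike, reading the file in fixed-size blocks (with hashlib standing in for the old md5 module), might look like:

```python
import hashlib

def md5sum(fn, blocksize=1024 * 1024):
    """Return the MD5 hex digest of a file, reading it in 1 MB blocks
    so memory use stays constant regardless of file size."""
    h = hashlib.md5()
    with open(fn, "rb") as f:
        while True:
            block = f.read(blocksize)
            if not block:
                break
            h.update(block)
    return h.hexdigest()

# md5sum-style output line: "<digest>  <filename>"
# print(f"{md5sum(fn)}  {fn}")
```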
Fredrik Lundh <[EMAIL PROTECTED]> wrote:
> on my machine, Python's md5+mmap is a little bit faster than
> subprocess+md5sum:
>
> import os, md5, mmap
>
> file = open(fn, "r+")
> size = os.path.getsize(fn)
> hash = md5.md5(mmap.mmap(file.fileno(), size)).hexdigest()
>
> (I [...]
On Tue, 8 Feb 2005 17:26:07 +0100, rumours say that "Fredrik Lundh"
<[EMAIL PROTECTED]> might have written:
>on my machine, Python's md5+mmap is a little bit faster than
>subprocess+md5sum:
>
>import os, md5, mmap
>
>file = open(fn, "r+")
[snip]
My first reaction was that "r+" should be "[...]
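For what it's worth, one way to sidestep the open-mode question entirely (a hedged variant of my own, not from the original post) is to ask mmap for a read-only mapping, which lets the file be opened "rb"; note that mmap refuses to map an empty file, so that case needs a guard:

```python
import hashlib
import mmap
import os

def md5_mmap(fn):
    """Hash a whole file through a read-only memory map; the file can
    be opened "rb" because ACCESS_READ never writes back to it."""
    size = os.path.getsize(fn)
    if size == 0:
        # mmap raises ValueError on empty files, so hash directly
        return hashlib.md5(b"").hexdigest()
    with open(fn, "rb") as f:
        with mmap.mmap(f.fileno(), size, access=mmap.ACCESS_READ) as m:
            return hashlib.md5(m).hexdigest()
```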
Michael Hoffman wrote:
Is there a reason you can't use the sha module?
BTW, I'm using SHA-1 instead of MD5 because of the reported vulnerabilities
in MD5, which may not be important for your application, but I consider it
best to just avoid MD5 entirely in the future.
--
Michael Hoffman
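A sketch of Michael's suggestion: the sha module he mentions is the Python 2 name, and hashlib.sha1 is its modern equivalent. Reading block-wise keeps it workable for large files:

```python
import hashlib

def sha1_etag(fn, blocksize=1024 * 1024):
    """SHA-1 hex digest of a file, read block by block, suitable
    for use as an ETag value."""
    h = hashlib.sha1()
    with open(fn, "rb") as f:
        # iter() with a sentinel stops cleanly at end-of-file
        for block in iter(lambda: f.read(blocksize), b""):
            h.update(block)
    return h.hexdigest()
```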
Hi all

Does anyone know of a fast way to calculate checksums for a large file?
I need a way to generate ETag keys for a webserver; the ETags of large
files are not really necessary, but it would be nice if I could do it. I'm
using the python hash function on the dynamically generated strings (like in [...]
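A cheaper alternative for large files (my own suggestion, not something Ola stated): many servers build the ETag from file metadata rather than hashing the contents, in the spirit of Apache's FileETag directive, which costs the same regardless of file size:

```python
import os

def metadata_etag(fn):
    """Build a weak ETag from file size and modification time; it
    changes whenever the file does, as long as mtime is updated."""
    st = os.stat(fn)
    return 'W/"%x-%x"' % (st.st_size, st.st_mtime_ns)
```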