John Craig wrote:

> Richard Gaskin wrote:
>> Turns out my test wasn't all that useful, since the OS has a
>> bit of a bottleneck grabbing the info from 12,000+ files in
>> a single directory.
>>
>> Running the same test on a folder that has only a few hundred files
>> gives a per-file speed more on par with what we might expect:
>>
>> # File: 329  Total: 9ms  Per file: 0.027356ms
>>
>> MacBook Pro, 2.16 GHz, 2 GB RAM
>
> For a single folder containing 34,782 files on MY machine (3 GHz,
> 512 MB RAM), which is not running any services:
>
> Time taken to get 'the detailed files':
> 26888 millisecs = 26.888 seconds
>
> Size of output generated by rev for 'the detailed files':
> 2543957 bytes ≈ 2.5 MB
>
> On a busy server, the results could be considerably greater.  The
> fact that it amounts to just under 0.8 millisecs per file is
> irrelevant: if I need a few (or a few hundred) file sizes, I
> still need to wait for the entire output to be generated.
> Economical?

Given that the biggest bottleneck here appears to be the OS itself, what do you propose that would be more economical?
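
To be clear about what's being measured: today, pulling even a single size out of that listing means paying for the whole call. Here is a rough sketch of the kind of helper that ends up being written ("singleFileSize" is just an illustrative name; it assumes each detailed line holds the URL-encoded name in item 1 and the size in bytes in item 2):

  function singleFileSize pFolder, pFileName
    -- sketch only: still builds the full listing just to read one size
    put the defaultFolder into tSavedFolder
    set the defaultFolder to pFolder
    put the detailed files into tList
    set the defaultFolder to tSavedFolder
    -- each line: URL-encoded name, size in bytes, dates, owner, ...
    repeat for each line tLine in tList
      if urlDecode(item 1 of tLine) is pFileName then
        return item 2 of tLine  -- size in bytes
      end if
    end repeat
    return empty  -- no such file
  end singleFileSize

The caller only wants one number, but the full listing still has to be built and scanned, which is exactly the cost being questioned above.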

--
 Richard Gaskin
 Managing Editor, revJournal
 _______________________________________________________
 Rev tips, tutorials and more: http://www.revJournal.com
