Hi,

That's what I'm doing, but it's *way* too slow and crashes if you read too much data. I've got a 60+ GB file and need to read at "random" positions in the file. Moving from somewhere near the start to somewhere near the end takes minutes! The ironic thing is that the data I want to read when I get there is quite small!
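[For readers following along: what Dave seems to want is a true seek — jumping straight to an offset in O(1) instead of streaming through all the intervening bytes. A minimal Python sketch of the idea; the file name and sizes are made up for illustration, and a 1 MB stand-in is used in place of the 60+ GB file:]

```python
import os

path = "sample.bin"  # hypothetical stand-in for the large data file

# Build a small demo file: lots of filler, then the tiny record we want.
with open(path, "wb") as f:
    f.write(b"\x00" * 1_000_000)   # filler standing in for gigabytes of data
    f.write(b"PAYLOAD")            # the small record near the "end"

# Seek directly to the offset -- no intervening reads, regardless of file size.
with open(path, "rb") as f:
    f.seek(1_000_000)
    data = f.read(7)

print(data)                        # b'PAYLOAD'
os.remove(path)
```

[On most platforms the seek itself costs the same whether the offset is 1 MB or 60 GB in, which is why reading chunk-by-chunk to reach a position feels so much slower.]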

All the Best
Dave

On 17 May 2007, at 18:49, Richard Gaskin wrote:


While much slower, as a workaround in the meantime you could read the file in chunks until you get to the part you're interested in.
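[The chunked-read workaround Richard describes can be sketched as follows — read and discard fixed-size chunks until the offset of interest is reached, then read the small piece of data wanted. The chunk size and helper name are illustrative, not from the thread:]

```python
CHUNK = 64 * 1024  # illustrative chunk size

def read_at(path, offset, length):
    """Reach `offset` by reading and discarding chunks, then read `length` bytes.

    This is the slow workaround: cost grows with `offset`, since every
    intervening byte is read off disk, which is why it takes minutes on
    a 60+ GB file even when the data wanted at the end is tiny.
    """
    with open(path, "rb") as f:
        remaining = offset
        while remaining > 0:
            chunk = f.read(min(CHUNK, remaining))
            if not chunk:          # hit end of file before reaching the offset
                return b""
            remaining -= len(chunk)
        return f.read(length)
```

[Usage: `read_at("data.bin", 200_000, 6)` streams 200,000 bytes before returning the 6 wanted — fine as a stopgap, but linear in the offset.]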

--
 Richard Gaskin
 Fourth World Media Corporation
 ___________________________________________________________
 [EMAIL PROTECTED]       http://www.FourthWorld.com
_______________________________________________
use-revolution mailing list
use-revolution@lists.runrev.com
Please visit this url to subscribe, unsubscribe and manage your subscription preferences:
http://lists.runrev.com/mailman/listinfo/use-revolution
