Hi Colin,

The biggest are about 70000 bytes/characters...in one "session" there may be as many as 10 or 20 of these files to import and convert. Speed in the conversion is essential.

Overnight, I've adjusted my code so that bytes from the files are read in (and converted) in 5000-byte chunks. It's maybe about twice as fast this way (bringing it in in smaller chunks)...but still not fast enough. I might play around and see if I can't squeeze out a little more speed by making the chunks even smaller, though my guess is that cumulatively I won't get much of a boost doing that. Anybody care to make a prediction?
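
To give an idea of what I mean by chunking, here's a simplified sketch (not my actual handler -- names like splitAndConvert and convertChunk are just placeholders, and I've left the FileIO part out entirely):

on splitAndConvert bigString
  kChunkSize = 5000
  results = []
  charCount = bigString.length
  startPos = 1
  repeat while startPos <= charCount
    endPos = min(startPos + kChunkSize - 1, charCount)
    -- grab the next 5000-character slice and convert it
    convertChunk(bigString.char[startPos..endPos], results)
    startPos = endPos + 1
  end repeat
  return results
end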

Funny, when I started this, I figured it'd be the I/O and the read/write that would be time-consuming...but that part's just fine. It's converting the imported strings/chars to integers that's really taking "forever."
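
The conversion part is, in spirit, just a per-item loop like the one below (again a simplified sketch rather than my real handler -- I'm showing charToNum() on each character here, but the same point applies if you're doing value() on each numeric token instead):

on convertChunk chunk, results
  -- one integer per character: charToNum() maps a char to its code
  repeat with i = 1 to chunk.length
    results.append(charToNum(chunk.char[i]))
  end repeat
end

That inner loop, run tens of thousands of times per file, is where all the time seems to go.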

Gilles

On Saturday, May 1, 2004, at 08:33 US/Eastern, Colin Holgate wrote:

But again, OS X is just slow with text.

I'm handling fairly big chunks of text in OS X without it seeming slow. How big is the original string that needs to be converted to a list of numbers? Also, how many times do you have to do that conversion, and does it really matter if it takes a while?


[To remove yourself from this list, or to change to digest mode, go to http://www.penworks.com/lingo-l.cgi To post messages to the list, email [EMAIL PROTECTED] (Problems, email [EMAIL PROTECTED]). Lingo-L is for learning and helping with programming Lingo. Thanks!]

