Hi,

Since I discovered that my program spent most of its time reading and
writing files, I had a go at transforming it from using readFile to
using array IO.

Since my file consists of lines of length 18, I tried writing a
function similar to readFile that returns a lazy list of lines as
arrays, included below.  This works, but wasn't any more efficient.
I haven't toyed too much with this kind of thing before, so please
let me know if I'm doing anything obviously stupid.

Perhaps the whole strategy is wrong?  Should I wrap more of my program
in IO, and do things incrementally there?  (The files can be large,
typically in the 100M-2G range, so reading them strictly is not an
option.)

Or would it be more efficient to read larger chunks at a time?  If so,
what is a good chunk size? 1K? 4K? 1M?
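To make the comparison concrete, here is roughly what I had in mind --
the same lazy structure as below, but returning one array per chunk
instead of one per 18-byte line.  (readChunks and the chunk size are
my own invention; 3640 lines is just a guess at something near 64K.)

----------8<--------------------

  -- read the file lazily as ~64K chunks, each a whole number of lines
  readChunks :: FilePath -> IO [UArray Int Word8]
  readChunks f = openFile f ReadMode >>= go
      where
        chunkSize = 3640 * 18              -- about 64K, a multiple of 18
        go h = do
            (buf :: IOUArray Int Word8) <- newArray (0, chunkSize - 1) 0
            n <- hGetArray h buf chunkSize
            if n == 0
               then return []
               else do
                   a  <- unsafeFreeze buf
                   as <- unsafeInterleaveIO (go h)
                   -- NB: only the first n bytes of the last chunk are valid
                   return (a : as)

----------8<--------------------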

----------8<--------------------

  {-# LANGUAGE ScopedTypeVariables #-}

  import Data.Word (Word8)
  import Data.Array.Unboxed (UArray)
  import Data.Array.IO (IOUArray, newArray, hGetArray)
  import Data.Array.Unsafe (unsafeFreeze)
  import System.IO (openFile, hIsEOF, IOMode(ReadMode))
  import System.IO.Unsafe (unsafeInterleaveIO)

  -- read the external file as a lazy list of arrays, one per 18-byte line
  readXFile :: FilePath -> IO [UArray Int Word8]
  readXFile f = do
      h <- openFile f ReadMode
      getArrays h
    where
      getArrays h = do
          end <- hIsEOF h
          if end
             then return []
             else do
                 (a :: IOUArray Int Word8) <- newArray (0,17) 0
                 _ <- hGetArray h a 18    -- returns the byte count; ignored here
                 a' <- unsafeFreeze a
                 as <- unsafeInterleaveIO (getArrays h)
                 return (a' : as)

----------8<--------------------
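One thing I wasn't sure about while writing this: hGetArray returns
the number of bytes it actually read, which I discard above, so a
short final line at end of file would come back silently zero-padded.
I suppose the loop body should check it, something like:

----------8<--------------------

                 n <- hGetArray h a 18
                 if n < 18
                    then return []       -- drop a partial final line
                    else do
                        a' <- unsafeFreeze a
                        as <- unsafeInterleaveIO (getArrays h)
                        return (a' : as)

----------8<--------------------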

-kzm
-- 
If I haven't seen further, it is by standing in the footprints of giants
_______________________________________________
Haskell mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell
