On Friday 13 August 2010 21:32:12, Kevin Jardine wrote:
> Surely a lot of real world text processing programs are IO intensive?
> So if there is no native Text IO and everything needs to be read in /
> written out as ByteString data converted to/from Text this strikes me
> as a major performance sink.
>
> Or is there native Text IO but just not in your example?
Outdated information, sorry. Up to ghc-6.10, text's IO went through
ByteString; that is no longer the case. However, the native Text IO is
(of course) much slower than ByteString IO because of the need to
encode/decode.

> Kevin
>
> On Aug 13, 8:57 pm, Daniel Fischer <daniel.is.fisc...@web.de> wrote:
> > Just occurred to me, a lot of the difference is due to the fact that
> > text has to convert a ByteString to Text on reading the file, so I
> > timed that by reading the file and counting the chunks; that took text
> > 0.21s for big.txt vs. Data.ByteString.Lazy's 0.01s.
> > So for searching in-memory strings, subtract about 0.032s/MB from the
> > difference - it's still large.
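
For concreteness, a rough sketch of that kind of measurement (lazily reading a
file and forcing it by counting chunks) could look like the following. This is
only an illustration, not the exact benchmark from the thread: the file name
"big.txt" comes from the discussion above, while the getCPUTime-based timing
helper is an assumption.

import Control.Exception (evaluate)
import qualified Data.ByteString.Lazy as BL
import qualified Data.Text.Lazy as TL
import qualified Data.Text.Lazy.IO as TLIO
import System.CPUTime (getCPUTime)
import Text.Printf (printf)

-- Run an action that yields a chunk count, force the count, and
-- report the CPU time it took.
timed :: String -> IO Int -> IO ()
timed label act = do
    start <- getCPUTime
    n     <- act >>= evaluate      -- force the count before stopping the clock
    end   <- getCPUTime
    printf "%s: %d chunks, %.3fs\n" label n
           (fromIntegral (end - start) / 1e12 :: Double)

main :: IO ()
main = do
    -- ByteString: chunks come straight off the Handle, no decoding.
    timed "bytestring" $ do
        bs <- BL.readFile "big.txt"
        return (length (BL.toChunks bs))
    -- Text: every chunk is decoded from bytes to Text first, which is
    -- where the extra time per megabyte goes.
    timed "text" $ do
        t <- TLIO.readFile "big.txt"
        return (length (TL.toChunks t))

The difference between the two reported times should roughly match the
per-megabyte decoding overhead mentioned above, so it can be subtracted when
comparing in-memory search performance.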