Dear all,
can GHC compile huge tables into efficient code if they are constant at
compile time?
Two examples may clarify the question:
big1 :: UArray Int Char
big1 = array (0,1000) $! map (\i -> (i,toEnum i)) [0..1000]
big2 = sum [0..10000]::Int -- == 50005000 == n*(n+1)/2 where n = 10000
Both values are constant at compile time. As they are given by pure
functions, the compiler could evaluate them and write the *result* into the
object file 'foo.o'. This would save code size and run time.
I peeked into 'foo.hc', but I found neither 0x2fb0408 nor an array {0, 1, 2,
3, ..., 1000} nor anything similar.
The big2 example shows that computing the value can be very time
consuming. If the compiler does not compute it, the source file could be
generated by a helper program (or Template Haskell?).
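A sketch of what the Template Haskell route might look like (names are my
own, not from any existing library): build the literal at compile time in a
Q Exp and splice its *result* into the program, so only the number 50005000
ends up in the object file.

```haskell
{-# LANGUAGE TemplateHaskell #-}
module Big2TH (big2Exp) where

import Language.Haskell.TH

-- Evaluate the sum while compiling and return it as a plain
-- integer literal, instead of emitting code that sums at run time.
big2Exp :: Q Exp
big2Exp = litE (integerL (sum [0..10000]))
```

In another module one would then write `big2 = $(big2Exp) :: Int`, and the
splice expands to the literal 50005000.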
The big1 example shows that converting the data into an array can cost
run time, code size and heap size. (You need the list - perhaps written
out explicitly, think of 1000 fixed pseudo-random numbers - and must
convert it into an array.) If the compiler generated the unboxed array
directly, it could be rather efficient.
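This does not get the table into the object file, but as a run-time
mitigation one can at least avoid building the intermediate (index, value)
pair list by using listArray instead of array - a small sketch (big1' is my
name for the variant):

```haskell
module Big1 where

import Data.Array.Unboxed (UArray, listArray, (!))

-- Same table as big1, but built from the element list directly,
-- without allocating the list of (i, toEnum i) pairs first.
big1' :: UArray Int Char
big1' = listArray (0,1000) (map toEnum [0..1000])
```

The unboxed array itself is still filled at run time, which is exactly why
it would be nice if GHC evaluated it once at compile time.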
PS: I compiled with: ghc6 -c -O2 -keep-tmp-files -keep-hc-files foo.hs
Regards,
--
Stefan Karrmann
module Foo (module Foo) where
import Data.Array.IArray
import Data.Array.Unboxed
big1 :: UArray Int Char
big1 = array (0,1000) $! map (\i -> (i,toEnum i)) [0..1000]
big2 = sum [0..10000]::Int
_______________________________________________
Haskell mailing list
[email protected]
http://www.haskell.org/mailman/listinfo/haskell