Jan Kort writes:

> It seems that any record, no matter how trivial, can't be much
> longer than about 200 lines in Haskell. If I try to compile a
> 300 line record containing just:
> data X = X {
>         f1 :: String,
>         f2 :: String,
>         f3 :: String,
>         ...
>         f300 :: String
> }
> It needs about 90M heap in ghc4.06, whereas a 150 line record
> requires less than 6M heap. After this big gap it levels off
> to a somewhat more decent exponential increase: a 450 line
> record requires about 180M heap.
> 
> I could file a bug report, but it seems that all compilers
> (ghc4.06, nhc98, hbc0.9994 and hugs) have this problem. So,
> is this a fundamental problem?

Actually, the 150-line record needs about 20M, and the 300-line record needs
about 75M.  These figures are roughly double the actual residency, because
GHC's underlying collector is a copying collector rather than a compacting one.

GHC automatically increases the heap size up to a maximum of 64M unless you
tell it not to (with -optCrts-M32m, for example).  I'll bet this is the
source of the confusion.

The heap requirement is still non-linear, but I'm guessing that this is
because each field you add to the record requires the compiler not only to
generate a new selector function, but also to add a field to the record
pattern matched against in every existing selector, so the total amount of
generated code grows roughly quadratically in the number of fields.
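
For illustration, here is a hand-written sketch (not GHC's actual generated
code) of how the selectors for a hypothetical three-field record desugar;
each selector pattern matches the full constructor, which is where the
quadratic growth comes from:

    -- Hypothetical desugaring of a three-field record; the names X and
    -- f1..f3 mirror the example above, but the code is only a sketch.
    data X = X String String String

    -- Every selector pattern matches the whole constructor, so each one
    -- mentions all n fields.  Adding a field adds one new selector *and*
    -- widens every existing pattern: roughly n*n pattern positions in all.
    f1 :: X -> String
    f1 (X a _ _) = a

    f2 :: X -> String
    f2 (X _ b _) = b

    f3 :: X -> String
    f3 (X _ _ c) = c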

Cheers,
        Simon
