RE: stupid strictness question

2002-12-05 Thread Simon Marlow
 Now, we define:
 
  data SMaybe a = SNothing | SJust !a  deriving Show
 
 Now, we run:
 
 *Strict Just (undefined::Int)
 Just *** Exception: Prelude.undefined
 *Strict Just $! (undefined::Int)
 *** Exception: Prelude.undefined
 *Strict SJust $! (undefined::Int)
 *** Exception: Prelude.undefined
 *Strict SJust (undefined::Int)
 SJust *** Exception: Prelude.undefined

 I can't figure out why this last one is different from the 
 one before it, or the one before that.

This one is a GHCi (not GHC) bug.  You may have seen this message while
loading the source containing the strict constructor definition:

WARNING: ignoring polymorphic case in interpreted mode.
   Possibly due to strict polymorphic/functional constructor args.
   Your program may leak space unexpectedly.

which means that GHCi essentially ignored the strictness flag on the
polymorphic field of the SJust constructor.  To work around the bug, you
can compile that module with GHC.
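
A minimal sketch of that workaround (the file name and the exact ghc
invocation here are just illustrative, not something from the original
message): keep the strict type in its own module, compile it, and let GHCi
pick up the compiled code.

 -- Strict.hs (illustrative file name): compile first, e.g.  ghc -c Strict.hs
 -- When the resulting .o/.hi files are up to date, GHCi should use the
 -- compiled code on :load, so the strictness flag on SJust is honoured.
 module Strict where

 data SMaybe a = SNothing | SJust !a  deriving Show

With that in place, 'SJust undefined' and 'SJust $! undefined' should both
report '*** Exception: Prelude.undefined' straight away.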

The good news is that this bug will be fixed in the next major release.

Cheers,
Simon



stupid strictness question

2002-12-04 Thread Hal Daume III
I know, I know, this has been asked a million times, but I was searching
through the archives with no success, so I figured I'd bug you guys.

according to the report:

  A declaration of the form

data cx => T u1 ... uk = ... | K s1 ... sn | ...

  where each si is either of the form !ti or ti, replaces every
  occurrence of K in an expression by

(\x1 ... xn -> ( ((K op1 x1) op2 x2) ... ) opn xn)

  where opi is the non-strict apply function $ if si is of the form ti,
  and opi is the strict apply function $! (see Section 6.2) if si
  is of the form !ti. Pattern matching on K is not affected by
  strictness flags.

So, we define:

 data L = L  Int deriving Show
 data S = S !Int deriving Show

and, as expected, we get:

*Strict L undefined
L *** Exception: Prelude.undefined
*Strict L $! undefined
*** Exception: Prelude.undefined
*Strict S undefined
*** Exception: Prelude.undefined
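
That is just the report's translation at work: used in an expression, L
behaves like an application with the non-strict $, while S behaves like one
with $!, so S forces its argument before Show ever prints the constructor.
A small sketch spelling the rule out (applyL and applyS are names I made up
purely for illustration):

 -- What the translation makes the constructors behave like in expressions
 -- (illustrative helpers, not part of any real API):
 applyL :: Int -> L
 applyL x = L $  x   -- non-strict apply: the argument is not forced here
 applyS :: Int -> S
 applyS x = S $! x   -- strict apply: forces the argument first, so
                     -- applyS undefined fails before anything is printed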

Now, we define:

 data SMaybe a = SNothing | SJust !a  deriving Show

Now, we run:

*Strict Just (undefined::Int)
Just *** Exception: Prelude.undefined
*Strict Just $! (undefined::Int)
*** Exception: Prelude.undefined
*Strict SJust $! (undefined::Int)
*** Exception: Prelude.undefined
*Strict SJust (undefined::Int)
SJust *** Exception: Prelude.undefined

I can't figure out why this last one is different from the one before it,
or the one before that.  Interestingly, Hugs disagrees (the session above
was with GHC 5.04.1):

Strict Just (undefined::Int)
Just 
Program error: {undefined}
Strict Just $! (undefined::Int)
Program error: {undefined}
Strict SJust $! (undefined::Int)
Program error: {undefined}
Strict SJust (undefined::Int)
Program error: {undefined}

Which is what I expected.  Can someone clarify here?

 - Hal
