On Fri, May 14, 2010 at 02:18:42PM +0200, Julian Fleischer wrote:
> Hello,
>
> I'm playing around with GHC's Haskell and some extensions. I'm already
> aware that functional dependencies are "very very tricky", but there is
> something I don't understand about their implementation in GHC. I've
> constructed my own type class "Num" providing a signature for (+), with
> multiple parameters a, b and c. I'm then declaring a (flexible) instance
> for Prelude.Num, simply using (Prelude.+) for the definition of my (+) -
> and it does not work as I expect it to.
>
> First, this is the code:
>
> > {-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies,
> >              TypeSynonymInstances, FlexibleInstances #-}
> > import qualified Prelude
> >
> > class Num a b c | a b -> c where
> >     (+) :: a -> b -> c
> >
> > instance (Prelude.Num x) => Num x x x where
> >     (+) = (Prelude.+)
>
> Now, if I load it into GHCi and type "3 + 4", I get a whole bunch of
> error messages.
>
> I do understand that
>
> > (3 :: Prelude.Int) + (4 :: Prelude.Int)
>
> works, since I've explicitly declared 3 and 4 to be Prelude.Int, and
> there is a functional dependency stating that the argument types a and b
> of (+) determine the result type c; by the instance declaration, c will
> clearly be the same as a and b.
>
> Now, if I type
>
> > 3 + 4
>
> it does not work, and I really don't understand why. If I ask GHCi for
> 3's type (:t 3), it will answer "3 :: (Prelude.Num t) => t". But if 3 and
> 4 are Prelude.Nums, and there is an instance Num x x x for x in
> Prelude.Num - then why can't GHC deduce from the definitions that 3 and
> 4, both Prelude.Nums, can be used with (+), since there is an instance
> for Prelude.Num and my class Num - and the result will of course be
> something of Prelude.Num?
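For reference, here is the module I loaded to reproduce this. It is my own
reconstruction of your code, so the module name and the GHCi lines in the
comments are assumptions rather than your exact session:

> {-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies,
>              TypeSynonymInstances, FlexibleInstances #-}
> module MyNum where              -- module name chosen arbitrarily
>
> import qualified Prelude
>
> -- A three-parameter (+) whose result type c is determined by a and b.
> class Num a b c | a b -> c where
>     (+) :: a -> b -> c
>
> -- Any ordinary Prelude.Num type can be added to itself, giving itself.
> instance (Prelude.Num x) => Num x x x where
>     (+) = (Prelude.+)
>
> -- In GHCi, after loading this module:
> --   (3 :: Prelude.Int) + (4 :: Prelude.Int)   -- fine: resolves to Num Int Int Int
> --   3 + 4                                     -- ambiguous type variables, as you saw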
The reason "3 + 4" works in GHCi ordinarily but not with your redefined (+) has to do with the rules for type-defaulting. In the ordinary case, GHCi is really evaluating "show (3 + 4)", which has a type like (Num a, Show a) => String. We still have a free type variable a, and the resulting value depends on our choice for this type (consider Integer vs. Double). In this situation, there are rules (Haskell '98 Report section 4.3.4) for making this choice, but they only apply in very specific situations: in particular all of the relevant classes (here Num and Show) must be among those defined in the standard library. You can demonstrate that type-defaulting is at work by trying to load the following into GHCi: > default () > x = show (3 + 4) -- error: Ambiguous type variable (GHCi actually has slighly relaxed defaulting rules, see [1], and it seems to be impossible to turn off defaulting within GHCi, which is why the expression "show (3 + 4)" must be in a module for this demonstration.) There is no provision for extending the defaulting mechanism to your own type classes. Arguably this is a good thing, since defaulting can sometimes behave surprisingly already under GHCi's rules, as anyone who's run QuickCheck on a test with a type variable instantiated to () can attest to. [1]: http://www.haskell.org/ghc/docs/6.12.2/html/users_guide/interactive-evaluation.html#extended-default-rules Regards, Reid Barton _______________________________________________ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe