[For those with an interest in the intricacies of type classes]:
N K Anil ([EMAIL PROTECTED]) asks:
| The Haskell 1.2 report on page 33 says that
| > If the two instance declarations read like this
| >
| > instance Num a => Foo [a] where ...
| >
| > instance (Eq a, Text a) => Bar [a] where ...
| >
| > then the program would be illegal.
|
| I cannot find the exact reason for this restriction. If
| you allow this, then any pathological case can still be
| detected during type checking. I checked this in Gofer
| and it works.
Let's make this more concrete and fill out those definitions... and also
include the important class definition from p.32. The definitions are a
bit contrived to get the types right:
| class Foo a where foo :: a -> a
| class Foo a => Bar a where bar :: a -> a
|
| instance Num a => Foo [a] where
|     foo []     = []
|     foo (x:xs) = map (x+) xs
|
| instance (Eq a, Text a) => Bar [a] where
|     bar []     = []
|     bar (x:xs) = foo xs where u = x==x    -- u forces the Eq a constraint
|                               v = show x  -- v forces the Text a constraint
|
Now suppose that you wanted to evaluate bar "abcde". This would require that
you also have instances:
o Eq Char and Text Char (from the second instance declaration). Neither
of these is a problem.
o Foo [Char] (needed both because Foo is a superclass of Bar and because
bar calls foo). This isn't a problem at first, because there is a
matching instance declaration. But that declaration in turn requires an
instance Num Char, and no such instance exists!
So, when you try this in Gofer, you get:
| ? bar "abcde"
| ERROR: Cannot derive instance in expression
| *** Expression : bar d133 "abcde"
| *** Required instance : Bar [Char]
| *** No subdictionary : Num Char
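The declarations themselves could be made legal, by the way. Since Foo is a
superclass of Bar, the rule on p.33 requires the context of the Bar [a]
instance to imply the context of the Foo [a] instance. A sketch (my own
variation, not the report's example) that satisfies this just adds Num a:
| instance (Eq a, Text a, Num a) => Bar [a] where
|     bar []     = []
|     bar (x:xs) = foo xs where u = x==x
|                               v = show x
With this stronger context, an expression like bar "abcde" is rejected
statically, because it needs the nonexistent instance Num Char.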
The part of the report that you refer to is intended to describe the static
checks that will be performed in a proper Haskell system to detect this error
at an earlier stage. (Remember, Gofer is not a proper Haskell system.) On
the other hand, the Gofer approach has its advantages. For example, in Gofer
we can evaluate:
| ? bar [1..4]
| [5, 6]
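Just to spell out where that answer comes from, tracing the definitions above:
|   bar [1,2,3,4]
| = foo [2,3,4]       -- by the Bar [a] instance
| = map (2+) [3,4]    -- by the Foo [a] instance
| = [5, 6]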
The fact that instances Eq a and Text a do not, in general, imply the
existence of an instance Num a does not matter here: when a is instantiated
to the type Int of integers, all three instances are defined.
It's a bit like the observation that there are some programs that are not
type-correct and yet will run without a runtime error. There is always a
compromise between the size of the class of programs that will be accepted by
a type system and the degree of static checks that are actually performed.
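For example (an example of my own, not one from the report), a definition like
| f = if True then 0 else 'a'
is rejected by the type checker because the two branches have different types,
even though evaluating f could only ever yield 0.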
Hope that helps!
Mark (please ignore the return address above;
it should say [EMAIL PROTECTED])