Referring to the Serial instances in
http://hackage.haskell.org/package/smallcheck :
the idea is that  series d  gives all objects of depth <= d.

The depth of a term of an algebraic data type
is the standard "depth" concept for trees
(maximum nesting of constructors, i.e. the longest path from root to a node).
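For a concrete picture, here is a minimal sketch of that depth measure on a hypothetical binary-tree type (Tree, Leaf, Node are my own illustration, not from smallcheck; nullary constructors count as depth 0):

```haskell
-- A minimal binary tree type (hypothetical, for illustration only).
data Tree = Leaf | Node Tree Tree deriving Show

-- depth = maximum nesting of constructors,
-- i.e. the length of the longest path from the root to a node.
depth :: Tree -> Int
depth Leaf       = 0
depth (Node l r) = succ (max (depth l) (depth r))
```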

1. Why are the tuple constructors treated differently?
I'd expect  depth (x,y) = succ $ max (depth x) (depth y) ,
but the succ is missing.
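To make the asymmetry concrete, here are two hypothetical depth functions for pairs (names and signatures are mine, not smallcheck's): one where (,) counts as an ordinary constructor, and one where it is "free", matching the behaviour described above:

```haskell
-- What one would expect if (,) counted like any other constructor:
-- the pair adds one level of nesting.
depthPairExpected :: (a -> Int) -> (b -> Int) -> (a, b) -> Int
depthPairExpected dx dy (x, y) = succ (max (dx x) (dy y))

-- What the Serial instance effectively does: the (,) constructor
-- contributes no depth of its own.
depthPairActual :: (a -> Int) -> (b -> Int) -> (a, b) -> Int
depthPairActual dx dy (x, y) = max (dx x) (dy y)
```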

2. Why depth and not size (= total number of constructors)?
It seems that the number of objects (trees) of a given depth
rises much more drastically than the number of trees of a given size.
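A rough sketch of that comparison for binary trees (the counting functions are my own illustration): the count of trees of depth <= d satisfies T(d) = 1 + T(d-1)^2, which grows doubly exponentially, while the count of trees with exactly n inner nodes is the n-th Catalan number, which grows only like 4^n:

```haskell
-- Number of binary trees of depth <= d:
-- either Leaf, or a Node whose two subtrees each have depth <= d-1.
-- Values: 1, 2, 5, 26, 677, ...
treesUpToDepth :: Integer -> Integer
treesUpToDepth 0 = 1
treesUpToDepth d = 1 + treesUpToDepth (d - 1) ^ 2

-- Number of binary trees with exactly n Node constructors:
-- the Catalan numbers C(n) = (2n)! / ((n+1)! n!).
-- Values: 1, 1, 2, 5, 14, 42, ...
treesOfSize :: Integer -> Integer
treesOfSize n = product [n + 2 .. 2 * n] `div` product [2 .. n]
```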

Just wondering - J.W.

_______________________________________________
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe