On 09/09/2010 00:54, wren ng thornton wrote:
On 9/7/10 3:10 PM, Ben Millwood wrote:
So I wonder what people
think of the use of CPP in Haskell code, what alternatives people can
propose, or what people hope to see in future to make conditional
compilation of Haskell code more elegant and simple?
The only thing I ever use CPP for in Haskell is top-level conditional
definitions.
* That is, say I have a function foo which has a vanilla Haskell
definition, but also has a special definition for GHC performance
hackery, or which needs a special definition on some compilers in order
to correct compiler-specific bugs. I'll use #ifdef here to give the
different versions. I'll also occasionally do this for things like
choosing whether to use the FFI vs a native definition, for debugging
purposes.
* Another example is when using GeneralizedNewtypeDeriving in GHC, but
still wanting to give normal definitions for other compilers to use.
* The only other example I can think of is when defining Applicative
instances, since I only want to do that when linking against versions of
base which are new enough to have it. Occasionally you can get similar
issues re ByteString vs base.
In general, I think using CPP for actual macro processing is extremely
poor style and can easily make code inscrutable (and no doubt
bug-prone). If the Haskell spec were to add support for this sort of
top-level, compiler- or compile-time-flag-conditional definition, I'd
switch over.
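
The top-level #ifdef pattern described above can be sketched roughly as
follows. (foo is a hypothetical function and the GHC-only branch is
purely illustrative; the point is that the conditional sits at the
definition level, not inside expressions.)

```haskell
{-# LANGUAGE CPP #-}
module Foo (foo) where

#ifdef __GLASGOW_HASKELL__
import GHC.Exts (build)

-- GHC-only definition: written with 'build' so it can take part in
-- GHC's foldr/build list fusion.
foo :: Int -> [Int]
foo n = build (\cons nil ->
  let go i | i > n     = nil
           | otherwise = i `cons` go (i + 1)
  in go 0)
#else
-- Vanilla Haskell definition for every other compiler.
foo :: Int -> [Int]
foo n = [0 .. n]
#endif
```

Client code just imports foo; the incompatibility never leaks past the
defining module.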
This matches the style in most of the code I've looked at. And it also
means that the incompatibilities are localized and hidden from most
client code. Depending on the nature of your library API conflict, if
you can localize things into a few definitions of the core functions you
use in the rest of your code, then that'd be best. But that's not always
possible. I've yet to run into the case where I really need to support
incompatible versions of a library when it's that closely integrated, so
I don't have much advice there.
I'm afraid I instigated the CPP-vs-no-CPP issue with a patch I sent to
Ben that adds backwards compatibility for GHC 6.10 to the
haskell-src-meta package. Unfortunately the Template Haskell AST data
type changed between the version of the library that works with 6.10 and
the version that works with 6.12, and I used CPP and Cabal's
MIN_VERSION_* macros to conditionally compile the appropriate code.
For example, unlike the previous version of the library, the AST data
type for patterns in the 6.12-compatible version of the template-haskell
package includes a case for bang patterns. I worked around this with
CPP: the case that translates bang patterns between the
template-haskell and haskell-src-exts representations is only compiled
when the newer version of the template-haskell library is present. The
problem with having completely separate
implementations in different source code files for different versions of
the template-haskell library instead of using CPP is that the rest of
the pattern translation function then has to be duplicated.
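
A rough sketch of what I mean, with a hypothetical function describePat
standing in for the real translation code (note that the
MIN_VERSION_template_haskell macro is only defined when building
through Cabal):

```haskell
{-# LANGUAGE CPP #-}
module PatCompat (describePat) where

import Language.Haskell.TH.Syntax (Pat (..))

-- One function, shared across both library versions, with a single
-- conditionally compiled case instead of two whole copies.
describePat :: Pat -> String
describePat (VarP _)  = "variable pattern"
describePat WildP     = "wildcard pattern"
#if MIN_VERSION_template_haskell(2,4,0)
-- Only the template-haskell that ships with GHC 6.12 has a
-- constructor for bang patterns, so this case must be guarded.
describePat (BangP p) = "bang pattern around a " ++ describePat p
#endif
describePat _         = "some other pattern"
```

Without the #if, every other equation of the function would have to be
duplicated into a second source file just to add or remove that one
case.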
The question isn't whether or not to use CPP for macro expansion---I
think we all can agree that that is in bad taste ;). To quote Ben, the
choice is between the following two options:
1. Use CPP and the macros defined by Cabal to conditionally include or
exclude code for each version.
2. Use Cabal file conditionals to select hs-source-dirs containing
those parts of the code (or even TH to help generate those parts of
the code) that are specific to each configuration.
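
For reference, option 2 might look something like this in the .cabal
file (the directory names and version bounds here are hypothetical;
list-valued fields such as hs-source-dirs accumulate across
conditionals):

```
library
  hs-source-dirs:   src
  build-depends:    base

  -- Pick the compat directory matching the compiler being used.
  if impl(ghc >= 6.12)
    hs-source-dirs: src-compat-6.12
    build-depends:  template-haskell >= 2.4
  else
    hs-source-dirs: src-compat-6.10
    build-depends:  template-haskell < 2.4
```

Each src-compat-* directory then provides its own copy of the modules
that differ, which is exactly where the duplication I describe below
comes from.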
I too dislike CPP, but I dislike duplicated code more. Within reason, I
would prefer to be able to find a function's implementation in a single
file even if that definition has a few conditionally-compiled parts.
Geoff
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe