Re: [Haskell-cafe] workarounds for Codec.Compression.Zlib errors in darcs

2008-12-01 Thread Neil Mitchell
While that's true, Haskell also makes it easy to make the same sort of error with IO (or any other Monad) values, whether created with the FFI or not. If you say f = do { x; y; z } and y has type IO CInt then you won't get an error (and I don't think you can even ask for
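
A minimal sketch of the point (inflateStep and its return value are invented for illustration): the CInt result of a monadic action can be dropped in a do block without any complaint from the type checker, which is exactly how a miscast FFI status code goes unnoticed. (Later GHC releases added -fwarn-unused-do-bind to ask for a warning here.)

    import Foreign.C.Types (CInt)

    -- Hypothetical stand-in for an FFI call whose CInt result is a status code.
    inflateStep :: IO CInt
    inflateStep = return (-3)

    main :: IO ()
    main = do
      inflateStep        -- CInt result silently discarded: no type error,
                         -- so a failing status code can go unnoticed
      putStrLn "done"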

Re: [Haskell-cafe] workarounds for Codec.Compression.Zlib errors in darcs

2008-11-28 Thread Duncan Coutts
On Thu, 2008-11-27 at 17:20 +, Ian Lynagh wrote: On Wed, Nov 26, 2008 at 10:28:21PM +, Duncan Coutts wrote: I should note that one moral of this story is to check that your FFI imports are correct. That is, check they import the foreign functions at the right Haskell types. In this
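
Checking an import amounts to comparing the declared Haskell type with the C prototype in the header. A hedged sketch against zlib's crc32, whose prototype is uLong crc32(uLong crc, const Bytef *buf, uInt len) (the module assumes you link against libz):

    {-# LANGUAGE ForeignFunctionInterface #-}
    module Crc32 where

    import Data.Word (Word8)
    import Foreign.C.Types (CUInt, CULong)
    import Foreign.Ptr (Ptr)

    -- uLong crc32(uLong crc, const Bytef *buf, uInt len);
    -- so the argument and result types should be CULong/CUInt:
    foreign import ccall unsafe "zlib.h crc32"
      c_crc32 :: CULong -> Ptr Word8 -> CUInt -> IO CULong

    -- A declaration like
    --   c_crc32 :: CUInt -> Ptr Word8 -> CUInt -> IO CUInt
    -- would also compile, but it misdescribes the 64-bit uLong arguments on
    -- LP64 platforms: the kind of silent mismatch being warned about here.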

Re: [Haskell-cafe] workarounds for Codec.Compression.Zlib errors in darcs

2008-11-27 Thread Ketil Malde
Jason Dagit [EMAIL PROTECTED] writes: That is, if you use the optional specification of a header file for each foreign import, and if your Haskell compiler can compile via C, then any checking that types match between Haskell and C can be performed automatically, by the

Re: [Haskell-cafe] workarounds for Codec.Compression.Zlib errors in darcs

2008-11-27 Thread Duncan Coutts
On Wed, 2008-11-26 at 23:16 +, Malcolm Wallace wrote: ... to work out the C types and then map them to Haskell ones, to check they're the same as the declared types in the .hs files. I'd like to point out that the FFI specification already has such a mechanism. That is, if you use

Re: [Haskell-cafe] workarounds for Codec.Compression.Zlib errors in darcs

2008-11-27 Thread Ian Lynagh
On Wed, Nov 26, 2008 at 10:28:21PM +, Duncan Coutts wrote: On Wed, 2008-11-26 at 14:38 +, Eric Kow wrote: Older versions of darcs can produce gzipped files with broken CRCs. We never noticed this because our homegrown wrapper around the C libz library does not pick up these

Re: [Haskell-cafe] workarounds for Codec.Compression.Zlib errors in darcs

2008-11-27 Thread Luke Palmer
On Thu, Nov 27, 2008 at 10:20 AM, Ian Lynagh [EMAIL PROTECTED] wrote: On Wed, Nov 26, 2008 at 10:28:21PM +, Duncan Coutts wrote: On Wed, 2008-11-26 at 14:38 +, Eric Kow wrote: Older versions of darcs can produce gzipped files with broken CRCs. We never noticed this because our

[Haskell-cafe] workarounds for Codec.Compression.Zlib errors in darcs

2008-11-26 Thread Eric Kow
Hi everybody, This advisory is for people who have installed darcs 2.1.2 via the Cabal build method. As you may have noticed, the cabalised darcs sometimes fails with errors like Codec.Compression.Zlib: incorrect data check. Why this happens: Older versions of darcs can
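
For reference, the failure the advisory describes comes from the zlib package's checked decompression; a hypothetical reproduction (the file name is invented) looks like this, with the error only raised lazily when the output is forced:

    import qualified Codec.Compression.GZip as GZip
    import qualified Data.ByteString.Lazy as BL

    -- Decompress a gzip file whose stored checksum does not match its data.
    -- With the zlib package of this era the failure surfaces as a call to
    -- error with a message like "Codec.Compression.Zlib: incorrect data check".
    main :: IO ()
    main = do
      raw <- BL.readFile "patch-with-bad-crc.gz"   -- hypothetical input file
      BL.putStr (GZip.decompress raw)              -- error raised lazily here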

Re: [Haskell-cafe] workarounds for Codec.Compression.Zlib errors in darcs

2008-11-26 Thread Duncan Coutts
On Wed, 2008-11-26 at 14:38 +, Eric Kow wrote: Hi everybody, This advisory is for people who have installed darcs 2.1.2 via the Cabal build method. As you may have noticed, the cabalised darcs sometimes fails with errors like Codec.Compression.Zlib: incorrect data check. Why this

Re: [Haskell-cafe] workarounds for Codec.Compression.Zlib errors in darcs

2008-11-26 Thread Don Stewart
duncan.coutts: On Wed, 2008-11-26 at 14:38 +, Eric Kow wrote: Hi everybody, This advisory is for people who have installed darcs 2.1.2 via the Cabal build method. As you may have noticed, the cabalised darcs sometimes fails with errors like Codec.Compression.Zlib: incorrect

Re[2]: [Haskell-cafe] workarounds for Codec.Compression.Zlib errors in darcs

2008-11-26 Thread Bulat Ziganshin
Hello Duncan, Thursday, November 27, 2008, 1:28:21 AM, you wrote: checking mode rather than in a generating mode. It would use much of the same code as c2hs but it would read the C header files and the .hs file (via ghc api) and check that the FFI imports are using the right types. there is

Re: [Haskell-cafe] workarounds for Codec.Compression.Zlib errors in darcs

2008-11-26 Thread Duncan Coutts
On Wed, 2008-11-26 at 14:30 -0800, Don Stewart wrote: I think there is a need for a tool like c2hs but that works in a checking mode rather than in a generating mode. It would use much of the same code as c2hs but it would read the C header files and the .hs file (via ghc api) and check
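
Not the proposed tool, only a toy sketch of the comparison it would automate, with the C prototype and the Haskell import hard-coded instead of being parsed from zlib.h and the .hs source (all names here are illustrative):

    -- Map C types to the Haskell FFI types they should appear as, then flag
    -- any declared import type that disagrees with the C prototype.
    expectedHaskellType :: String -> String
    expectedHaskellType cty = case cty of
      "int"      -> "CInt"
      "uInt"     -> "CUInt"
      "uLong"    -> "CULong"
      "Bytef *"  -> "Ptr Word8"
      "uLongf *" -> "Ptr CULong"
      _          -> "<unknown C type: " ++ cty ++ ">"

    main :: IO ()
    main = do
      let -- int uncompress(Bytef *dest, uLongf *destLen,
          --                const Bytef *source, uLong sourceLen);
          cTypes     = ["Bytef *", "uLongf *", "Bytef *", "uLong", "int"]
          -- Types declared in a (deliberately wrong) foreign import:
          declared   = ["Ptr Word8", "Ptr CULong", "Ptr Word8", "CInt", "CInt"]
          mismatches = [ (want, got)
                       | (want, got) <- zip (map expectedHaskellType cTypes) declared
                       , want /= got ]
      case mismatches of
        [] -> putStrLn "uncompress: foreign import matches the C prototype"
        ms -> mapM_ (\(want, got) ->
                putStrLn ("uncompress: declared " ++ got ++ ", expected " ++ want)) ms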

Re: [Haskell-cafe] workarounds for Codec.Compression.Zlib errors in darcs

2008-11-26 Thread Malcolm Wallace
... to work out the C types and then map them to Haskell ones, to check they're the same as the declared types in the .hs files. I'd like to point out that the FFI specification already has such a mechanism. That is, if you use the optional specification of a header file for each foreign
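
The mechanism referred to is naming the header in the import string itself; as the message says, the checking relies on the compiler going via C, so that the C compiler sees both the real prototype and the call generated from the Haskell type. A minimal sketch against zlib's uncompress:

    {-# LANGUAGE ForeignFunctionInterface #-}
    module CheckedImport where

    import Data.Word (Word8)
    import Foreign.C.Types (CInt, CULong)
    import Foreign.Ptr (Ptr)

    -- Naming zlib.h here lets a compiler that goes via C #include the real
    -- prototype,
    --   int uncompress(Bytef *dest, uLongf *destLen,
    --                  const Bytef *source, uLong sourceLen);
    -- and have the C compiler compare it with the call generated from the
    -- Haskell type below.
    foreign import ccall unsafe "zlib.h uncompress"
      c_uncompress :: Ptr Word8 -> Ptr CULong -> Ptr Word8 -> CULong -> IO CInt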

Re: [Haskell-cafe] workarounds for Codec.Compression.Zlib errors in darcs

2008-11-26 Thread Jason Dagit
On Wed, Nov 26, 2008 at 3:16 PM, Malcolm Wallace [EMAIL PROTECTED] wrote: ... to work out the C types and then map them to Haskell ones, to check they're the same as the declared types in the .hs files. I'd like to point out that the FFI specification already has such a mechanism. That