Re: New validate failures
Hi,

On Thursday, 13.03.2014, 21:13 +0100, Johan Tibell wrote:
> Did something change in pretty-printing recently? I saw these new
> validate failures today:
>
> Unexpected failures:
>   ghci.debugger/scripts  break008  [bad stdout] (ghci)
>   ghci.debugger/scripts  break012  [bad stdout] (ghci)
>   ghci.debugger/scripts  break026  [bad stdout] (ghci)
>   ghci.debugger/scripts  hist001   [bad stdout] (ghci)
>   ghci/scripts           T5545     [bad stdout] (ghci)
>   ghci/scripts           ghci008   [bad stdout] (ghci)
>   ghci/scripts           ghci012   [bad stdout] (ghci)
>   ghci/scripts           ghci023   [bad stdout] (ghci)
>   ghci/scripts           ghci025   [bad stdout] (ghci)
>   ghci/scripts           ghci026   [bad stdout] (ghci)
>   ghci/scripts           ghci055   [bad stdout] (ghci)

Confirmed by Travis. According to
https://travis-ci.org/nomeata/ghc-complete/builds
the breaking change was one of:

commit 065c35a9d6d48060c8fac8d755833349ce58b35b
Author: Dr. ERDI Gergo ge...@erdi.hu
Date:   Thu Mar 13 21:18:39 2014 +0800

    Pretty-print the following TyThings via their IfaceDecl counterpart:
     * AnId
     * ACoAxiom
     * AConLike

commit 24eea38c70eae90d166de26d71a178fb0c1ffc30
Author: Dr. ERDI Gergo ge...@erdi.hu
Date:   Wed Mar 12 20:38:54 2014 +0800

    pprIfaceDecl for IfacePatSyn: use pprPatSynSig

commit 23c0f1ec2cf06c0178c2ae7414fe57ea648689e7
Author: Dr. ERDI Gergo ge...@erdi.hu
Date:   Wed Mar 12 20:38:26 2014 +0800

    pprIfaceContextArr: print a context including the => arrow

commit 4d1b7b4a9b986e87755784478b4ea4883a5e203e
Author: Dr. ERDI Gergo ge...@erdi.hu
Date:   Wed Mar 12 20:37:22 2014 +0800

    Add OutputableBndr instance for OccName

Gergo, can you have a look and fix that?

Thanks,
Joachim

--
Joachim “nomeata” Breitner
m...@joachim-breitner.de • http://www.joachim-breitner.de/
Jabber: nome...@joachim-breitner.de • GPG-Key: 0x4743206C
Debian Developer: nome...@debian.org

___
ghc-devs mailing list
ghc-devs@haskell.org
http://www.haskell.org/mailman/listinfo/ghc-devs
haddock issue building ghc-7.8 from git
Hi,

When building from git (branch ghc-7.8 as of today), I run into a haddock issue because __GLASGOW_HASKELL__ apparently is not 709 on my system. Not sure whether this is a bug, and whether it should go to trac/haddock or trac/ghc, so I decided to post it here.

My fix is easy enough (even though I ran into other problems later, so I don't really know how well it works):

| ~/src/ghc/utils/haddock/src/Haddock$ git diff
| diff --git a/src/Haddock/InterfaceFile.hs b/src/Haddock/InterfaceFile.hs
| index 924829d..19a742f 100644
| --- a/src/Haddock/InterfaceFile.hs
| +++ b/src/Haddock/InterfaceFile.hs
| @@ -76,14 +76,14 @@ binaryInterfaceMagic = 0xD0Cface
|  -- (2) set `binaryInterfaceVersionCompatibility` to [binaryInterfaceVersion]
|  --
|  binaryInterfaceVersion :: Word16
| -#if __GLASGOW_HASKELL__ == 709
| +-- #if __GLASGOW_HASKELL__ == 709
|  binaryInterfaceVersion = 25
|
|  binaryInterfaceVersionCompatibility :: [Word16]
|  binaryInterfaceVersionCompatibility = [binaryInterfaceVersion]
| -#else
| -#error Unsupported GHC version
| -#endif
| +-- #else
| +-- #error Unsupported GHC version
| +-- #endif
|
|
|  initBinMemSize :: Int

thanks,
matthias
Re: haddock issue building ghc-7.8 from git
On 14/03/14 09:01, Matthias Fischmann wrote:
> Hi, When building from git (branch ghc-7.8 as of today), I run into a
> haddock issue because __GLASGOW_HASKELL__ apparently is not 709 on my
> system. Not sure whether this is a bug, and whether it should go to
> trac/haddock or trac/ghc, so I decided to post it here.
> [...]

The master branch of Haddock has that test for a reason: anyone working on Haddock will be doing so using GHC HEAD, which is at 7.9. Haddock has a separate branch (named ghc-7.8) which is the candidate that will go into 7.8. If you're building GHC 7.8, you should be on that branch for Haddock and all the other libraries.

IIRC you can pass some arguments to the sync-all script which will do all the switching for you, but I forgot what it was. I'm sure someone else can chime in.

--
Mateusz K.
Re: haddock issue building ghc-7.8 from git
My goodness am I clumsy sometimes :( It's actually:

$ git clone -b ghc-7.8 https://git.haskell.org/ghc ghc-7.8
$ cd ghc-7.8
$ ./sync-all get -b ghc-7.8

I messed up the -b argument. Again. :( As a rule, sync-all for 'get' will take 'git' arguments, so the 'get -b' translates to 'git clone -b ghc-7.8 ...'

Also, yes, there are already fingerprints. See the source code here:
https://github.com/nomeata/ghc-complete/tree/ghc-7.8
- note specifically it is on the 7.8 branch. Also see
https://ghc.haskell.org/trac/ghc/wiki/Building/GettingTheSources#Trackingthefullrepositorystate
to use the fingerprint utility.

Urgh. Sorry again!

On Fri, Mar 14, 2014 at 5:03 AM, Matthias Fischmann m...@zerobuzz.net wrote:
> On Fri, Mar 14, 2014 at 04:45:58AM -0500, Austin Seipp wrote:
> > Oh bother:
> >
> > $ ./sync-all -b ghc-7.8
> >
> > Should be:
> >
> > $ ./sync-all -b ghc-7.8 get
> >
> > of course.
>
> ok, that makes a lot of sense. (i was wondering about demanding
> version 709... :-) now i get this:
>
> [...]
> ~/src/ghc-7.8$ ./sync-all -b ghc-7.8 get
> Unrecognised flag: -b at ./sync-all line 850.
> == Checking for old haddock repo
> == Checking for old binary repo
> == Checking for old mtl repo
> == Checking for old Cabal repo
> == Checking for old time from tarball
> ATTENTION!
> You have an old time package in your GHC tree!
> Please remove it (e.g. rm -r libraries/time), and then run
> ./sync-all get to get the new repository.
> == Checking for obsolete Git repo URL
>
> ~/src/ghc-7.8$ rm -rf libraries/time/
> ~/src/ghc-7.8$ ./sync-all -b ghc-7.8 get
> Unrecognised flag: -b at ./sync-all line 850.
> == Checking for old haddock repo
> == Checking for old binary repo
> == Checking for old mtl repo
> == Checking for old Cabal repo
> == Checking for old time from tarball
> == Checking for obsolete Git repo URL
>
> ~/src/ghc-7.8$ perl boot
> Error: libffi-tarballs/LICENSE doesn't exist.
> [...]
i also tried to pull with -b ghc-7.8, but sync-all doesn't do anything in either case. './sync-all get' pulls master (which i don't want, apparently). './sync-all checkout ghc-7.8' cannot find branch 7.8 on several repos.

(how easy would it be to implement exact version pinning in sync-all? could i do it in a day or two? would there be any drawbacks from doing this?)

cheers,
matthias

On Fri, Mar 14, 2014 at 4:45 AM, Austin Seipp aus...@well-typed.com wrote:
> You need to say:
>
> $ git clone -b ghc-7.8 https://git.haskell.org/ghc ghc-7.8
> $ cd ghc-7.8
> $ ./sync-all -b ghc-7.8
>
> That should do the trick and get you a clean, working repository
> (sync-all is fiddly with checkout at the moment, if I remember
> correctly).
>
> On Fri, Mar 14, 2014 at 4:29 AM, Mateusz Kowalczyk
> fuuze...@fuuzetsu.co.uk wrote:
> > On 14/03/14 09:01, Matthias Fischmann wrote:
> > > Hi, When building from git (branch ghc-7.8 as of today), I run
> > > into a haddock issue because __GLASGOW_HASKELL__ apparently is
> > > not 709 on my system. [...]
> >
> > The master branch of Haddock has that test for a reason: anyone
> > working on Haddock will be doing so using GHC HEAD which is at 7.9.
> > Haddock has a separate branch (named ghc-7.8) which is the candidate
> > that will go into 7.8. If you're building GHC 7.8, you should be on
> > that branch for Haddock and all the other libraries. [...]
> >
> > --
> > Mateusz K.
testing ghc-7.8 rc2: linker issues
hi,

this time i am running 7.8 rc2 (binary for x64 on debian testing). after a successful '/usr/bin/cabal install cabal-install', i am running into different instances of this:

| [...]
| [25 of 25] Compiling Text.ParserCombinators.Parsec.Perm ( Text/ParserCombinators/Parsec/Perm.hs, dist/build/Text/ParserCombinators/Parsec/Perm.o )
| /usr/bin/ld: cannot find -lHSmtl-2.1.2-ghc7.8.0.20140228
| collect2: error: ld returned 1 exit status
| Failed to install parsec-3.1.5
| cabal: Error: some packages failed to install:
| network-2.4.2.2 depends on parsec-3.1.5 which failed to install.
| parsec-3.1.5 failed during the building phase. The exception was:
| ExitFailure 1

what makes me suspect this could be relevant to this list is that the symptoms are weirdly sporadic. my story:

1. cabal install string-conversions fails because -lHStext-1.1.0.1 cannot be found.
2. ghc-pkg unregister a few packages
3. cabal install network (which pulls text) => fails (see above)
4. cabal install string-conversions => works!

i will read up on dynamic linking in ghc now.

cheers,
matthias
Re: haddock issue building ghc-7.8 from git
On Fri, Mar 14, 2014 at 05:08:35AM -0500, Austin Seipp wrote:
> My goodness am I clumsy sometimes :(

i bet you i'm clumsier :)

> It's actually:
>
> $ git clone -b ghc-7.8 https://git.haskell.org/ghc ghc-7.8
> $ cd ghc-7.8
> $ ./sync-all get -b ghc-7.8

ok, that works. i had to ^C and re-try a number of times due to connect timeouts (i think there is a thread about this on this list a few days back), but now it's compiling smoothly, and the sub-repos are all on branch 7.8. also thanks for the fingerprint hint.

cheers,
matthias
Re: testing ghc-7.8 rc2: linker issues
Make sure you are building the dynamic / shared lib versions and using cabal-install 1.18.

On Friday, March 14, 2014, Matthias Fischmann m...@zerobuzz.net wrote:
> hi, this time i am running 7.8 rc2 (binary for x64 on debian testing).
> after a successful '/usr/bin/cabal install cabal-install', i am running
> into different instances of this:
>
> | [...]
> | /usr/bin/ld: cannot find -lHSmtl-2.1.2-ghc7.8.0.20140228
> | collect2: error: ld returned 1 exit status
> | Failed to install parsec-3.1.5
> [...]
>
> what makes me suspect this could be relevant to this list is that the
> symptoms are weirdly sporadic. [...] i will read up on dynamic linking
> in ghc now.
>
> cheers,
> matthias
Re: FFI: c/c++ struct on stack as an argument or return value
I spent some time hacking around on this from a library perspective when I had to interoperate with a bunch of Objective C on a 64-bit mac, as many of the core library functions you need to FFI out to pass around pairs of Int32s as a struct small enough by the x64 ABI to get shoehorned into one register, and as I was programmatically cloning Objective C APIs via Template Haskell I couldn't use the usual clunky C shims.

What I was doing was just using libffi, with a lot of work to cache the results of ffi_prep_cif for each signature. It worked reasonably well for my purposes, but my need for it vanished and I abandoned the code in the middle of refactoring it for grander things.

So if nothing else, you can at least take this as a vote of confidence that your idea isn't crazy. =) I'd also be happy to answer questions if you get stuck or need help.

-Edward

On Fri, Mar 14, 2014 at 7:50 AM, Yuras Shumovich shumovi...@gmail.com wrote:
> Hi,
>
> Right now ghc's FFI doesn't support c/c++ structures. Whenever we have
> a foreign function that accepts or returns a struct by value, we have
> to create a wrapper that accepts or returns a pointer to the struct.
> It is inconvenient, but actually not a big deal.
>
> But there is no easy workaround when you want to export a haskell
> function to use it with a c/c++ API that requires structures to be
> passed by value. (Usually it is a callback in a c/c++ API. You can't
> change its signature, and if it doesn't provide some kind of void*
> userdata, then you are stuck.)
>
> I'm interested in fixing that. I'm going to start with the
> 'foreign import wrapper ...' stuff. Calling conventions for passing
> c/c++ structures by value are pretty tricky and platform/compiler
> specific. So initially I'll use libffi for that (it will work when
> USE_LIBFFI_FOR_ADJUSTORS is defined, see rts/Adjustor.c). It will
> allow me to explore the design space without bothering about low level
> implementation details. Later it could be implemented for native
> (non-libffi) adjustors.
>
> Is anybody interested in that? I appreciate any comments/ideas. Right
> now I don't have a clear design. It would be nice to support plain
> haskell data types that are 1) not recursive, 2) have one constructor
> and 3) contain only c/c++ types. But it doesn't work with c/c++
> unions. Any ideas are welcome.
>
> An example how to use libffi with structures:
> http://www.atmark-techno.com/~yashi/libffi.html#Structures
>
> Thanks,
> Yuras
Re: Proposal: Partial Type Signatures
On 03/13/2014 09:56 PM, Richard Eisenberg wrote:
> First of all: Yay! I've been wanting this for some time. I'm sorry I
> missed your presentation at PADL about this. I, personally, rather
> like the design. There may be fine points of discussion as it all
> becomes reality, but I think this is a great approach -- much like
> what I've envisioned whenever thinking about the problem. Thanks so
> much for doing this!

You're welcome! Thank you too for the work you have done. I very much liked your presentation at POPL on closed type families :)

> I would allow _ only as a constraint extension and _a in a constraint
> only when it also appears in the mono-type.

Right, we are now convinced (thanks to SPJ's comment on the wiki page) that the small benefits constraint wildcards bring are not worth the trouble and confusion. This makes things easier for us too :)

> I think, contrary to the wiki page, that the scope of named wildcards
> should mirror the behavior of normal type variables. Yes, it's weird
> that scoped type variables don't work by default, but I think it's
> weirder still if some scoped type-variable-like things work and others
> don't. I don't imagine that this fine control is hard to implement.

If more people feel like this, we have no problem with mirroring the scoped type variables behaviour. The change will be simple.

> I think Austin's suggestion below is equally great. Just as 7.8 will
> now report informative errors when _ is used in an expression
> position, the implementation for partial type signatures can easily
> spit out informative errors when _ is used in a type. This would not
> be an extension, because it does not change the set of programs that
> compile nor the behavior of compiled programs. However, if a user
> specifies -XPartialTypeSignatures, then the errors are not reported --
> the inferred type is simply used.

We also like Austin's idea :) I updated the wiki page with a section about borrowing ideas from the Holes proposal:
https://ghc.haskell.org/trac/ghc/wiki/PartialTypeSignatures#holes

What we plan on doing next:

1. We will update the code to disallow constraint wildcards. Only named
   wildcards also occurring in the monotype and the extra-constraints
   wildcard will be allowed.
2. We will try to implement what we proposed in the Holes section on
   the wiki page.

Comments and feedback are still welcome. There are still some unanswered design questions on the wiki page, e.g. what about generalisation and the extra-constraints wildcard in local partial type signatures?

Cheers,
Thomas

On Mar 13, 2014, at 4:22 PM, Austin Seipp wrote:
> On Thu, Mar 13, 2014 at 7:18 AM, Thomas Winant
> thomas.win...@cs.kuleuven.be wrote:
> > However, we have the impression that no Hole should remain unfilled,
> > whereas using a partial type signature doesn't necessarily mean the
> > program is incomplete. A partial type signature can still be a
> > (stylistic) choice.
>
> Yes, this is the main hold up I was thinking of. Thinking about it a
> little more, one way to resolve it perhaps would be to do something
> similar to what we did to typed holes at the term level: make 'holes'
> at the type level an error by default, which when encountered spit out
> the error containing the type each hole was instantiated to, and what
> the partial type signatures would be inferred as. Then, if you enable
> -XPartialTypeSignatures, these would cease to be errors providing the
> compiler can infer the types sensibly (and it wouldn't alert you in
> that case).
>
> That might be a half-baked idea (I just sat here for about 5 minutes),
> but either way I say again I do support -XPartialTypeSignatures
> anyway, and it sounds like a step in the right direction. :)
>
> --
> Regards,
> Austin Seipp, Haskell Consultant
> Well-Typed LLP, http://www.well-typed.com/

Disclaimer: http://www.kuleuven.be/cwis/email_disclaimer.htm
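For readers following the thread, here is a minimal sketch of what the feature looks like in use. This reflects the extension as it eventually shipped in GHC (NamedWildCards and the warning flag may differ in detail from the 2014 proposal under discussion); `swap2` is a made-up example function:

```haskell
{-# LANGUAGE PartialTypeSignatures #-}
{-# LANGUAGE NamedWildCards #-}
-- Suppress the informative "inferred type" warnings, so the wildcards
-- are a stylistic choice rather than an error, as discussed above.
{-# OPTIONS_GHC -Wno-partial-type-signatures #-}

-- `_` is an anonymous wildcard; `_a` is a named wildcard that must
-- stand for the same type everywhere it occurs in the signature.
swap2 :: (_a, _) -> (_, _a)
swap2 (x, y) = (y, x)

main :: IO ()
main = print (swap2 (1 :: Int, "hi"))
```

Without -XPartialTypeSignatures, GHC rejects the signature but reports the type each wildcard was instantiated to, which is exactly the behaviour Austin's suggestion asks for.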
Re: FFI: c/c++ struct on stack as an argument or return value
On Fri, 2014-03-14 at 09:08 -0400, Edward Kmett wrote:
> I spent some time hacking around on this from a library perspective
> when I had to interoperate with a bunch of Objective C on a 64-bit
> mac [...] as I was programmatically cloning Objective C APIs via
> Template Haskell I couldn't use the usual clunky C shims.

Was it related to the language-c-inline package?

> So if nothing else, you can at least take this as a vote of confidence
> that your idea isn't crazy. =) I'd also be happy to answer questions
> if you get stuck or need help.

Thank you, Edward.

Since there is at least one person who is interested in it, I'll start asking questions. Please let me know when I become too noisy :)

For now I'm focused on the desugaring phase. Right now

  type Fn = CInt -> CInt -> IO ()
  foreign import ccall "wrapper" f :: Fn -> IO (FunPtr Fn)

is desugared into

  f :: Fn -> IO (FunPtr Fn)
  f hsFunc = do
    sPtr <- newStablePtr hsFunc
    createAdjustor sPtr staticWrapper ...

Here staticWrapper is the address of a C function. It will dereference the sPtr, cast it to StgClosure* and call it with the appropriate arguments. All the arguments are primitive C types (int, char, pointer, etc), so it is easy to convert them to corresponding haskell types via rts_mkInt, rts_mkChar etc.

But I want to allow an argument to be a C struct:

  data CStruct = CStruct { i :: CInt, j :: CInt }
  type Fn = CStruct -> IO ()
  foreign import ccall "wrapper" f :: Fn -> IO (FunPtr Fn)

Looks like it is impossible to instantiate CStruct from a C function. Is it true? Is it easy to add such functionality?

The only solution I see is to flatten CStruct before creating the StablePtr:

  f :: Fn -> IO (FunPtr Fn)
  f hsFunc = do
    sPtr <- newStablePtr $ \i j -> hsFunc (CStruct i j)
    createAdjustor sPtr staticWrapper ...

Does it make sense? It will add performance overhead because of the additional indirection. Better ideas are welcome.

Thanks,
Yuras
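The flattening Yuras describes can be prototyped at the library level today, for the (simplified) case where the struct's fields are simply handed over as separate arguments; the actual proposal would do this inside the RTS, with libffi handling the real by-value struct ABI. The names here (CStruct, wrapStructFn, callFlat) are illustrative, not part of any API:

```haskell
{-# LANGUAGE ForeignFunctionInterface #-}
import Foreign.C.Types (CInt)
import Foreign.Ptr (FunPtr, freeHaskellFunPtr)

-- Hypothetical record mirroring `struct { int i; int j; }`.
data CStruct = CStruct { fieldI :: CInt, fieldJ :: CInt }

-- The flattened callback type that GHC's FFI already supports.
type FlatFn = CInt -> CInt -> IO ()

foreign import ccall "wrapper" mkFlat   :: FlatFn -> IO (FunPtr FlatFn)
foreign import ccall "dynamic" callFlat :: FunPtr FlatFn -> FlatFn

-- The struct-taking type a user would like to export; we flatten the
-- struct before handing the closure to the adjustor machinery, exactly
-- as in the `\i j -> hsFunc (CStruct i j)` sketch above.
wrapStructFn :: (CStruct -> IO ()) -> IO (FunPtr FlatFn)
wrapStructFn hsFunc = mkFlat (\i j -> hsFunc (CStruct i j))

main :: IO ()
main = do
  fp <- wrapStructFn (\s -> print (fieldI s + fieldJ s))
  callFlat fp 3 4       -- stands in for the C side invoking the callback
  freeHaskellFunPtr fp
```

Note this sketch only matches the C ABI when the callee really receives two separate ints, which is precisely not the case for a genuine by-value struct on most platforms; that gap is what the RTS/libffi work is for.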
Re: FFI: c/c++ struct on stack as an argument or return value
On Fri, Mar 14, 2014 at 2:00 PM, Yuras Shumovich shumovi...@gmail.com wrote:
> Was it related to the language-c-inline package?

It was an exercise in serial yak shaving brought about by writing a realtime GPU-based Metropolis light transport raytracer... er.. nevermind. =)

> Since there is at least one person who is interested in it, I'll start
> asking questions. [...]
>
> The only solution I see is to flatten CStruct before creating the
> StablePtr [...] Does it make sense? It will add performance overhead
> because of the additional indirection. Better ideas are welcome.

Not sure. This is a much lower level (and more correct .. and likely faster) approach than I was taking. I'd just built all my functions in a way that would cache the resulting ffi_prep_cif for each signature using typeclass magic. I had to do some allocations on each ffi_call as well for struct avalues, so I'm guessing you'd have to do at least that much.

-Edward
Re: FFI: c/c++ struct on stack as an argument or return value
Yuras,

I’m not convinced that the compiler is the right place for this kind of functionality. In fact, when we designed the Haskell FFI, we explicitly decided against what you propose. There are a few reasons for this.

Firstly, compilers are complex beasts, and secondly, it takes a long time until a change in the compiler goes into production. Hence, as a general rule, it is advisable to move complexity from the compiler into libraries, as this reduces compiler complexity. Libraries are less complex and changes can be rolled out much more quickly (it’s essentially a Hackage upload versus waiting for the next GHC and Haskell Platform release). Thirdly, we have got the Haskell standard for a reason, and modifying the compiler implies a language extension.

The design goal for the Haskell FFI was to provide the absolute minimum as part of the language and compiler, and to layer additional conveniences on top of that in the form of libraries and tools.

Have you considered the library or tool route?

Manuel

Yuras Shumovich shumovi...@gmail.com:
> Hi,
>
> Right now ghc's FFI doesn't support c/c++ structures. Whenever we have
> a foreign function that accepts or returns a struct by value, we have
> to create a wrapper that accepts or returns a pointer to the struct.
> [...]
>
> An example how to use libffi with structures:
> http://www.atmark-techno.com/~yashi/libffi.html#Structures
>
> Thanks,
> Yuras
Re: FFI: c/c++ struct on stack as an argument or return value
I don't care enough to fight and try to win the battle, but I just want to point out that Storable structs are far more brittle and platform dependent than borrowing the already correct platform logic for struct passing from libffi. I do think the existing FFI extension made the right call under the 32 bit ABIs that were in use at the time it was defined. That said, with 64-bit ABIs saying that 2 32-bit ints should be passed in a single 64 bit register, you wind up with large chunks of third party APIs we just can't call out to directly any more, requiring many one-off manual C shims. -Edward On Sat, Mar 15, 2014 at 12:17 AM, Carter Schonwald carter.schonw...@gmail.com wrote: indeed, its very very easy to do storable instances that correspond to the struct type you want, the ``with`` function in http://hackage.haskell.org/package/base-4.6.0.1/docs/Foreign-Marshal-Utils.htmlforeigh.marshal.utils actually gets you most of the way there! On Sat, Mar 15, 2014 at 12:00 AM, Manuel M T Chakravarty c...@cse.unsw.edu.au wrote: Yuras, I’m not convinced that the compiler is the right place for this kind of functionality. In fact, when we designed the Haskell FFI, we explicit decided against what you propose. There are a few reasons for this. Firstly, compilers are complex beasts, and secondly, it takes a long time until a change in the compiler goes into production. Hence, as a general rule, it is advisable to move complexity from the compiler into libraries as this reduces compiler complexity. Libraries are less complex and changes can be rolled out much more quickly (it’s essentially a Hackage upload versus waiting for the next GHC and Haskell Platform release). Thirdly, we have got the Haskell standard for a reason and modifying the compiler implies a language extension. The design goal for the Haskell FFI was to provide the absolute minimum as part of the language and compiler, and to layer additional conveniences on top of that in the form of libraries and tools. 
Have you considered the library or tool route?

Manuel

Yuras Shumovich shumovi...@gmail.com:

Hi,

Right now GHC's FFI doesn't support C/C++ structures. Whenever we have a foreign function that accepts or returns a struct by value, we have to create a wrapper that accepts or returns a pointer to the struct. It is inconvenient, but actually not a big deal. But there is no easy workaround when you want to export a Haskell function for use with a C/C++ API that requires structures to be passed by value. (Usually it is a callback in a C/C++ API: you can't change its signature, and if it doesn't provide some kind of void* userdata, then you are stuck.)

I'm interested in fixing that. I'm going to start with the 'foreign import wrapper ...' stuff. Calling conventions for passing C/C++ structures by value are pretty tricky and platform/compiler-specific, so initially I'll use libffi for that (it will work when USE_LIBFFI_FOR_ADJUSTORS is defined; see rts/Adjustor.c). That will allow me to explore the design space without bothering about low-level implementation details. Later it could be implemented for native (non-libffi) adjustors.

Is anybody interested in that? I appreciate any comments/ideas. Right now I don't have a clear design. It would be nice to support plain Haskell data types that are 1) not recursive, 2) have one constructor, and 3) contain only C/C++ types. But that doesn't work with C/C++ unions. Any ideas are welcome.

An example of how to use libffi with structures: http://www.atmark-techno.com/~yashi/libffi.html#Structures

Thanks,
Yuras

___ ghc-devs mailing list ghc-devs@haskell.org http://www.haskell.org/mailman/listinfo/ghc-devs
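[Editor's illustration] The Storable-plus-``with`` approach Carter describes can be sketched as follows. The struct ``Point`` and its field offsets are hypothetical, hand-written to mirror a C ``struct { int x; int y; }``; hard-coding the layout like this is exactly the platform-dependent brittleness Edward warns about.

```haskell
import Foreign
import Foreign.C.Types

-- A Haskell mirror of a hypothetical C struct { int x; int y; }.
data Point = Point { px :: CInt, py :: CInt }
  deriving (Show, Eq)

-- Hand-written layout: two 4-byte CInts, assuming no padding.
-- Offsets and sizes like these are what make this approach brittle.
instance Storable Point where
  sizeOf    _ = 8
  alignment _ = 4
  peek p              = Point <$> peekByteOff p 0 <*> peekByteOff p 4
  poke p (Point x y)  = pokeByteOff p 0 x >> pokeByteOff p 4 y

-- Since the FFI cannot pass Point by value, a C shim taking a pointer
-- would be imported instead (hypothetical function name):
--   foreign import ccall "point_norm2" c_norm2 :: Ptr Point -> IO CInt
-- and invoked via: with (Point 3 4) c_norm2

main :: IO ()
main = with (Point 3 4) $ \ptr -> do
  p <- peek ptr               -- round-trip the struct through memory
  print p                     -- prints: Point {px = 3, py = 4}
```

``with`` allocates temporary memory, pokes the value in, runs the action on the pointer, and frees the memory afterwards, which is most of the marshalling boilerplate one would otherwise write by hand.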
Re: FFI: c/c++ struct on stack as an argument or return value
I'm not opposing that; in fact, there's a GHC ticket discussing some stuff related to this (related to complex numbers). I think the crux of Manuel's point is mainly that any good proposal has to at least give a roadmap to support on all the various platforms, etc.

On Sat, Mar 15, 2014 at 12:33 AM, Edward Kmett ekm...@gmail.com wrote:

[quoted text elided; see the previous message]

___ ghc-devs mailing list ghc-devs@haskell.org http://www.haskell.org/mailman/listinfo/ghc-devs
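[Editor's illustration] A minimal sketch of the adjustor machinery Yuras wants to extend. ``foreign import ccall "wrapper"`` is what GHC turns into a C-callable function pointer via rts/Adjustor.c (or libffi when USE_LIBFFI_FOR_ADJUSTORS is set); today it only works for callbacks whose arguments are simple types or pointers, which is why a struct-by-value C callback needs a hand-written shim. The ``PointCB`` type here is a hypothetical stand-in (a ``Ptr CInt`` in place of a real struct pointer), and the "dynamic" import is used only to demonstrate the round trip in-process.

```haskell
{-# LANGUAGE ForeignFunctionInterface #-}
import Foreign
import Foreign.C.Types

-- Hypothetical callback type: the C API would really want
-- `int (*cb)(struct point)`, but the FFI can only express a pointer.
type PointCB = Ptr CInt -> IO CInt

-- Build a C function pointer from a Haskell closure (the "adjustor").
foreign import ccall "wrapper"
  mkPointCB :: PointCB -> IO (FunPtr PointCB)

-- Call back through a FunPtr; stands in for the C side invoking us.
foreign import ccall "dynamic"
  dynPointCB :: FunPtr PointCB -> PointCB

-- If the C API insisted on by-value structs, a hand-written shim would
-- be needed to forward a pointer (hypothetical C):
--   static int (*hs_cb)(struct point *);
--   int shim_cb(struct point p) { return hs_cb(&p); }

main :: IO ()
main = do
  cb <- mkPointCB peek                  -- callback: read the field
  r  <- with (42 :: CInt) (dynPointCB cb)
  print r                               -- prints: 42
  freeHaskellFunPtr cb                  -- adjustors must be freed
```

Extending ``wrapper`` (and ``dynamic``) to accept struct types directly is precisely where libffi's ``ffi_type`` descriptions of aggregate layouts would do the platform-specific work.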