Visual Haskell: Cannot Create Project
I downloaded and installed the Visual Haskell add-in for Visual Studio 2003 today, but Visual Studio displays the following error message when I attempt to create my first Haskell Console Application project:

  The application for project C:\Program Files\Visual Haskell\Templates\Console Application.cabal is not installed. Make sure the application for the project type (.cabal) is installed.

-- I verified that Windows has the .cabal file extension mapped to the file type "Visual Haskell Project".
-- I verified that ghci.exe (in the Visual Haskell\bin directory) runs successfully.
-- I also tried adding the Visual Haskell\bin directory to my PATH environment variable (it had no effect).

Do I have to do something manually to install the Cabal libraries? (I am a complete beginner with Haskell.)

System info:
Windows XP SP1
Visual Studio 2003, version 7.1.3088
.NET Framework version 1.1.4322 SP1

Thanks,
Ed B.

___
Glasgow-haskell-bugs mailing list
Glasgow-haskell-bugs@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-bugs
ghc needing ghc is the biggest bug in ghc
Are you fully nerd? How can ghc expect an installed ghc for the first build stage? What is that? See how gcc does it: it creates a basic C compiler in the first stage and then compiles the further stages with that. Your strategy shows that you just don't care. So don't expect that admins are happy about your offer... Man, unbelievable
Re: ghc needing ghc is the biggest bug in ghc
On 10-mrt-2006, at 8:54, Dennis Heuer wrote:

> Are you fully nerd? How can ghc expect an installed ghc for the first
> build stage? What is that? See how gcc does it: it creates a basic C
> compiler in the first stage and then compiles the further stages with
> that. Your strategy shows that you just don't care. So don't expect
> that admins are happy about your offer... Man, unbelievable

Yes, and no. It is perfectly possible to bootstrap from .hc files, which are basically a partial compilation of the ghc sources to C. Unfortunately, these .hc files are (by necessity) machine dependent, so you are not normally encouraged to take that route. Even so, it is documented fairly well at
http://www.haskell.org/ghc/docs/latest/html/building/sec-porting-ghc.html#sec-booting-from-hc

With regards, Arthur van Leeuwen.

PS: you *do* need a C compiler to bootstrap GCC... in fact, to bootstrap the GCC Ada compiler, you already need an installation of same. Do you also consider that to be a serious bug?

--
 /\ /  | [EMAIL PROTECTED]             | Work like you don't need the money
/__\/  | A friend is someone with whom | Love like you have never been hurt
/\ /__ | you can dare to be yourself   | Dance like there's nobody watching
Re: ghc releasing memory during compilation
Bulat Ziganshin wrote:

> Sunday, March 12, 2006, 8:00:05 PM, you wrote:
>
>> Subject: ghc releasing memory during compilation
>>
>> Am I right in thinking that ghc's rts can free memory back to the
>> system when the heap pressure reduces (at least if it's doing a
>> compacting GC)? In this case if it can do so, it should be quite
>> dramatic. It ought to be able to go from 400Mb back down to just a
>> few Mb or so.
>
> i've suggested the same just 11 months ago. after long discussion
> Simon Marlow declined my proposal because this will raise gc times in
> the situations when memory is enough by whole 10%! when i suggested
> just to check available PHYSICAL memory he answered that he don't
> know what physical memory is, only virtual memory matters

I think that's a mischaracterisation of what I said. Actually I said that I didn't understand your comment about physical vs. virtual memory, and in fact, looking back at the message, I still don't understand it :)

http://www.haskell.org/pipermail/glasgow-haskell-users/2005-April/008373.html

I think what you're suggesting is that the runtime should detect the amount of physical memory on the system and auto-tune itself to switch to compacting collection when its residency reaches that amount. This is certainly something we could do. Bear in mind that GHC is not necessarily the only process running on the machine, though, and what about running multiple GHCs?

Also, you can do this by setting your GHC_RTS environment variable. Suppose you have 1G of physical mem. You want GHC to give up when it reaches 1.5G, and you want to switch to compacting collection when the residency reaches 300M. You could do this:

  export GHC_RTS='-M1.5G -c20'

because 300M is 20% of 1.5G. Perhaps a better interface would be to allow you to specify exactly the residency at which to switch to compaction.

Cheers,
Simon
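Simon's arithmetic above can be checked directly: -M sets the maximum heap, and -c<n> switches to compacting GC once residency reaches n% of that maximum, so -M1.5G with -c20 triggers compaction at 300M. A minimal shell sketch of that calculation (the variable names are illustrative; the GHC_RTS setting is taken verbatim from the message):

```shell
# -M sets the maximum heap; -c<n> switches to compacting GC when
# residency reaches n% of the maximum.
max_mb=1500   # corresponds to -M1.5G
pct=20        # corresponds to -c20
echo $(( max_mb * pct / 100 ))   # prints 300 (megabytes)

# The setting itself, as suggested in the message above:
export GHC_RTS='-M1.5G -c20'
```

So to pick a particular compaction threshold under a given heap cap, one works the percentage backwards: threshold / cap * 100.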
Re: ghc releasing memory during compilation
Duncan Coutts wrote:

> As some are aware, some Wash modules cause ghc to take a very large
> amount of memory. It also generates very large C and assembler files
> (the .raw_s file for one module is nearly 50Mb). So unsurprisingly it
> also makes gcc take a very large amount of memory. Unfortunately for
> people with weaker machines these happen at the same time. That is,
> at the same time that gcc starts taking 300Mb, ghc is still taking up
> 400Mb. Even on machines with 1Gb of ram this pushes everything else
> out into swap. Note that unless constrained, ghc will take even more
> than 400Mb to build wash (I noticed it using over 750Mb). The Gentoo
> ebuild already limits ghc to 400Mb on 64-bit machines (and 200Mb on
> 32-bit ones).
>
> What I was wondering is if ghc could do a major GC and free most of
> its memory back to the system just before it calls gcc to compile the
> .hc code that ghc has generated. That way the memory spikes of ghc and
> gcc would not coincide and our machines would not be brought to a
> crawl.
>
> Am I right in thinking that ghc's rts can free memory back to the
> system when the heap pressure reduces (at least if it's doing a
> compacting GC)?

No, not at the moment. One thing we planned to do but never got around to is to use madvise() to improve swapping behaviour when memory is tight (see the thread that Bulat referred to).

> In this case if it can do so, it should be quite dramatic. It ought
> to be able to go from 400Mb back down to just a few Mb or so.

Yes, it ought to. A related problem is that the block allocator is really stupid. It makes no attempt to reduce fragmentation, and in fact freeing blocks is O(n) because the freelist is kept sorted in address order. This isn't usually an issue, although it has been reported to be noticeable with very large residencies (500M+). It's on my list to fix at some point. I mention this because freeing all that memory might not be possible if it is highly fragmented.
Cheers,
Simon
Re: ghc releasing memory during compilation
On Mon, 2006-03-13 at 12:47, Simon Marlow wrote:

>> Am I right in thinking that ghc's rts can free memory back to the
>> system when the heap pressure reduces (at least if it's doing a
>> compacting GC)?
>
> No, not at the moment. One thing we planned to do but never got
> around to is to use madvise() to improve swapping behaviour when
> memory is tight (see the thread that Bulat referred to).
>
>> In this case if it can do so, it should be quite dramatic. It ought
>> to be able to go from 400Mb back down to just a few Mb or so.
>
> Yes, it ought to. A related problem is that the block allocator is
> really stupid. It makes no attempt to reduce fragmentation, and in
> fact freeing blocks is O(n) because the freelist is kept sorted in
> address order. This isn't usually an issue, although it has been
> reported to be noticeable with very large residencies (500M+). It's
> on my list to fix at some point. I mention this because freeing all
> that memory might not be possible if it is highly fragmented.

Ah, I hoped that if we were using the compacting GC then it might be able to defragment (since it is copying anyway) and thus free large contiguous blocks, e.g. munmap()ing whole MBlocks.

Duncan
[GHC] #723: The package database should be a directory of files instead of a single file
#723: The package database should be a directory of files instead of a single file

  Reporter:     simonmar  |  Owner:
  Type:         task      |  Status:     new
  Priority:     normal    |  Milestone:
  Component:    Compiler  |  Version:    6.4.1
  Severity:     normal    |  Keywords:
  Os:           Unknown   |  Difficulty: Moderate (1 day)
  Architecture: Unknown   |

This would help package systems that want to install packages by just unpacking a bunch of files onto the filesystem, amongst other things. See:
http://www.haskell.org/pipermail/glasgow-haskell-users/2006-March/009838.html

--
Ticket URL: http://hackage.haskell.org/trac/ghc/ticket/723
GHC http://www.haskell.org/ghc/
The Glasgow Haskell Compiler
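The idea in the ticket can be sketched in a few lines of shell. Everything here (the database path, file names, and field layout) is hypothetical, chosen only to illustrate why a directory of files makes unpack-style installation trivial:

```shell
# Hypothetical directory-based package database: one file per package.
db=$(mktemp -d)

# "Installing" a package is then just dropping its description file
# into the directory (e.g. as part of unpacking a binary package):
printf 'name: mypkg\nversion: 1.0\n' > "$db/mypkg-1.0.conf"
printf 'name: other\nversion: 2.1\n' > "$db/other-2.1.conf"

# A tool like ghc-pkg would enumerate the directory rather than
# rewriting one monolithic file; uninstalling is deleting one file.
ls "$db"
```

With a single-file database, by contrast, the same installation step requires parsing and rewriting the whole file, which is exactly what makes unpack-style package managers awkward.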
Re: ghc needing ghc is the biggest bug in ghc
Dennis Heuer [EMAIL PROTECTED] writes:

> Man, unbelievable

Indeed.

Immanuel
Re: ghc needing ghc is the biggest bug in ghc
dh:

> Are you fully nerd? How can ghc expect an installed ghc for the first
> build stage? What is that?

I know! What is wrong with all these functional programmers defining everything with recursion? I think they do it just to be fully nerd!!! ;)

-- Don