Re: 64-bit windows version?

2007-06-26 Thread Simon Marlow

Peter Tanski wrote:

I keep on referring to this as temporary because there are two different 
builds here:

(1) the build using the old mingw-GHC, without option support for CL; and,
(2) the build using the new Windows-native GHC.


Yes.  And what I'm suggesting is the following - what I've been suggesting all 
along, but we keep getting sidetracked into sub-discussions:


 - the initial bootstrap is done by hand. (your (1) above).  All you need to
   do is build stage 1 using an existing mingw-GHC.  The stage 1 that you
   build will be capable of creating native objects and binaries.  Any hacks
   you have to apply to do this part are certainly temporary.

 - we adapt the current build system to use the native GHC.  I really don't
   think this is hard, and it's way quicker than replacing significant chunks
   of the build system, as you seem to be suggesting.

So the result is a build system that can build a win-native GHC using another 
win-native GHC, but not necessarily build a win-native GHC using a mingw GHC.



I'm not against VS in particular - I'm against duplication.  Build systems rot 
quickly.  By all means discuss a wonderful replacement for the current build 
system -- I'm aware that the current system is far from perfect -- but I'm not 
at all convinced that it is a necessity for building win-native GHC.


I could be wrong.  If I am wrong, then constructing a convincing argument might 
be difficult, because it's a matter of details - the cumulative weight of things 
you have to hack is too great.  So go ahead and use VS or whatever; but please 
think very carefully before doing so, because it's not cost-free.  We're not 
going to drop the mingw port of GHC, and it's hard to keep everything working as 
it is.  We'll have to import new hackers who understand VS builds, because none 
of the current GHC maintainers do!



Use GHC as your C compiler, i.e. don't invoke CL directly from make,
and add the INCLUDE/LIB directories to the RTS's package.conf.


Certainly doable, but it does present a conundrum: for the old GHC
(without builtin cl support) the order for compilation seems to be:
    <compile/link command> <compile/link flags> <output> <source/object files> <other flags>
while for cl (which runs link.exe), or for link.exe itself, it is better
to put all the files at the end of the command line:
    <compile/link command> <compile/link flags> <output> <other flags> <source/object files>


Why is that a conundrum?  GHC can invoke CL with the arguments in whatever 
order it likes.  Sorry, but this just seems like a trivial detail to me.



It also adds one more layer of indirection at that delicate stage.


huh?  GHC already invokes gcc, you need to change it to invoke CL anyway.

Cheers,
Simon
___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-26 Thread Peter Tanski

On Jun 26, 2007, at 4:59 AM, Simon Marlow wrote:


Peter Tanski wrote:

I keep on referring to this as temporary because there are two  
different builds here:
(1) the build using the old mingw-GHC, without option support for  
CL; and,

(2) the build using the new Windows-native GHC.


Yes.  And what I'm suggesting is the following - what I've been
suggesting all along, but we keep getting sidetracked into
sub-discussions:


 - we adapt the current build system to use the native GHC.  I really
   don't think this is hard, and it's way quicker than replacing
   significant chunks of the build system, as you seem to be suggesting.


I don't have to replace large chunks of the system, although I have
added several separate makefiles--an mk/msvc_tools.mk and an
mk/build.mk.msvc (which configure will copy into mk/build.mk).  It is
almost done (the current system, I mean)--although I do have one
detail question: Clemens Fruhwirth sent a patch to add shared library
support for all architectures, i.e., MkDLL -> MkDSO (in
compiler/main/DriverPipeline.hs).  I haven't seen the patch since I did
my last pull yesterday, so I assume it has not been applied yet.  How
do you want autoconf to detect the shared library extension and libtool
support?  AC_PROG_LIBTOOL does not seem to work well on OS X: OS X
libtool is Apple's, not GNU's (it is also a binary, not a driver script
for libltdl); that macro failed the last time I built GMP and I had to
make the shared libraries manually.  This is pertinent because the
default build for Windows should be DLLs, but I want the configuration
(at least) to mesh with the rest of the system: I wanted to add
$(libext) and $(shlibext); as it is, I vary them with a simple case
over *windows*), *darwin*) and *) (unix), but this does not seem correct.
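
For concreteness, the kind of per-platform defaults I have in mind
would look something like this in make (a sketch only--the variable
names are illustrative, not what mk/config.mk actually contains):

# per-platform library suffixes chosen once, instead of ad-hoc cases
ifeq "$(TargetOS)" "windows"
libext   = lib
shlibext = dll
else
ifeq "$(TargetOS)" "darwin"
libext   = a
shlibext = dylib
else
libext   = a
shlibext = so
endif
endif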


So the result is a build system that can build a win-native GHC  
using another win-native GHC, but not necessarily build a win-native
GHC using a mingw GHC.


I could set it up so configure could detect which GHC is available
and build using that GHC (Mingw or Windows-native).  (Just add a
C-compiler string to 'ghc -v' or 'ghc --version' and grep for it.)


I'm not against VS in particular - I'm against duplication.  Build  
systems rot quickly.  By all means discuss a wonderful replacement  
for the current build system -- I'm aware that the current system  
is far from perfect -- but I'm not at all convinced that it is a  
necessity for building win-native GHC.


VS is not necessary; it is aesthetic and may offer other benefits for
those who wish to hack on GHC.  It would require many bits of glue
code and careful alignment of tasks, so the entire build would not be
transparent to any but the most experienced VS programmers.  It
would, however, be much easier for the more casual developer and it
may not be as brittle: shallow build settings, compiler settings, the
source files included, even a bureaucratic notion of 'ways', would be
available from a simple point and click.  If I have time I will at
least do a prototype (base compiler only) and see if people like it.


I could be wrong.  If I am wrong, then constructing a convincing  
argument might be difficult...
We'll have to import new hackers who understand VS builds, because  
none of the current GHC maintainers do!


New blood!  :)  I'm joking--there have been forks of GHC in the past  
but they generally don't last long because GHC moves too fast and  
that's because the Architects are still at work.  The only convincing  
argument here would be a prototype that even the GHC maintainers  
would be able to understand.


Certainly doable, but it does present a conundrum: for the old GHC
(without builtin cl support) the order for compilation seems to be:
    <compile/link command> <compile/link flags> <output> <source/object files> <other flags>
while for cl (which runs link.exe), or for link.exe itself, it is better
to put all the files at the end of the command line:
    <compile/link command> <compile/link flags> <output> <other flags> <source/object files>


Why is that a conundrum?  GHC can invoke CL with the arguments in  
whatever order it likes.  Sorry, but this just seems like a trivial  
detail to me.


Mingw GHC can't do that.  I simply added some conditional changes to  
the rules in mk/suffix.mk.


Cheers,
Pete




___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-25 Thread Simon Marlow

Peter Tanski wrote:


On Jun 22, 2007, at 11:42 AM, Simon Marlow wrote:


Peter Tanski wrote:
A bit invasive (it involves modifying the make rules so they take an 
object-suffix variable).  Instead of the current suffix.mk:

$(odir_)%.$(way_)o : %.hc
it should be:
$(odir_)%.$(way_)$(obj_sfx) : %.hc
or some such.  This may affect other builds, especially if for some 
reason autoconf can't determine the object-suffix for a platform, 
which is one reason I suggested a platform-specific settings file.  I 
could handle this by having autoconf set the target variable, put all 
the windows-specific settings in a settings.mk file (including a 
suffix.mk copy) and have make include that file.


Surely this isn't hard?

ifeq "$(TargetOS)" "windows"
osuf=obj
else
osuf=o
endif

and then use $(osuf) wherever necessary.


Yes it is easy but now all Makefiles must be changed to use $(osuf), 
such as this line in rts/Makefile:


378: %.$(way_)o : %.cmm $(H_FILES),

for what will be a (hopefully) temporary Windows build.


I bet there are only a few makefiles that explicitly refer to o as the 
object-file suffix.
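
For example, the rts/Makefile rule quoted above only needs its suffix
parameterised--a minimal sketch, with the recipe line invented for
illustration rather than copied from the real makefile:

# osuf is "o" everywhere except the MS-tools build, where it is "obj";
# the recipe shown is only illustrative
%.$(way_)$(osuf) : %.cmm $(H_FILES)
	$(HC) $(HC_OPTS) -c $< -o $@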


I don't understand why you see this as a temporary measure.  Surely we'll need a 
way to build GHC again for this platform?  Unless you intend to replace the 
whole build system?  (which I strongly recommend *not* doing, at least not yet)



 4. modify the core packages to use Win32 calls only (no mingw)
That is where a lot of preparation is going.  This is *much* harder 
to do from mingw than from VS tools since you have to set up all the 
paths manually.


I don't understand the last sentence - what paths?  Perhaps I wasn't 
clear here: I'm talking about the foreign calls made by the base 
package and the other core packages; we can't call any functions 
provided by the mingw C runtime, we can only call Win32 functions.  
Similarly for the RTS.  I have no idea how much needs to change here, 
but I hope not much.


To use the MS tools with the standard C libraries and include 
directories, I must either gather the environment variables separately 
and pass them to cl/link on the command line or I must manually add them 
to my system environment (i.e., modify msys.bat, or the windows 
environment) so msys will use them in its environment.


The other problem is the old no-pathnames-with-spaces problem in Make,
since Make must be made to quote all those environment variables when
passing them to cl.  I could use the Make trick of filling the spaces
with a character and removing it just before quoting, but that is a
real hack and not very reliable--it breaks $(word ...).


Use GHC as your C compiler, i.e. don't invoke CL directly from make, and add
the INCLUDE/LIB directories to the RTS's package.conf.
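
Roughly something like this, as a sketch only--GHC_STAGE1 and osuf are
illustrative names, and the real include/lib handling would live in the
package.conf rather than in make:

# compile C sources through the freshly built stage 1 GHC; GHC chooses
# the C compiler, and the INCLUDE/LIB paths come from the rts
# package.conf, so make never needs to know about cl
%.$(osuf) : %.c
	$(GHC_STAGE1) -c $< -o $@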


Altogether it is a pain to get going and barely reproducible.  That is 
why I suggested simply producing .hc files and building from .hc using VS.


Doing an unregisterised build, you mean?  Sounds a bit scary!

Cheers,
Simon
___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-25 Thread Peter Tanski


On Jun 25, 2007, at 5:19 AM, Simon Marlow wrote:
Yes it is easy but now all Makefiles must be changed to use $(osuf),
such as this line in rts/Makefile:

378: %.$(way_)o : %.cmm $(H_FILES),
for what will be a (hopefully) temporary Windows build.


I bet there are only a few makefiles that explicitly refer to o  
as the object-file suffix.


After poking around I found that my fears were unfounded.  Simply  
pass cl the /TC (or -TC) option--same as the gcc option '-x c'.   
Object files are also fine since cl assumes any file with an  
unrecognised suffix is an object file.


The environment variables problem is also solved: either have the  
environment set up automatically by placing a batch-script 'call' to  
the MS PSDK 'SetEnv.Cmd' before the shell login in msys.bat or start  
the DOS shell from the MS PSDK shortcut and log into the msys shell  
manually--or run the whole thing from DOS.  Shows how much I know of  
msys.  Passing flags to cl would be best in a command file (at least  
I have done _that_ before).
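
By a command file I mean roughly the following, sketched in make (the
target, variable and file names are made up for illustration):

# put the flags and every C source into a response file, then let cl
# read it back with @ -- one compiler invocation instead of one per file
CL_FLAGS = /nologo /c /TC

c-objs : $(C_SRCS)
	echo $(CL_FLAGS) $(C_SRCS) > cl.rsp
	cl @cl.rsp

cl expands the response file in one go and emits one .obj per source.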


I don't understand why you see this as a temporary measure.  Surely  
we'll need a way to build GHC again for this platform?  Unless you  
intend to replace the whole build system?  (which I strongly  
recommend *not* doing, at least not yet)


I keep on referring to this as temporary because there are two  
different builds here:
(1) the build using the old mingw-GHC, without option support for CL;  
and,

(2) the build using the new Windows-native GHC.

You will almost certainly keep mingw-GHC around but users should not  
have to download a mingw-GHC to build Windows-native from source  
(they can't start at a stage1 build), so the Windows-native requires  
a separate setup.  That might as well be Windows-native itself, in  
other words, use VS--it is the quickest and easiest build to put  
together.  I do not suggest CMake because CMake is a sledgehammer
when it comes to managing projects and sub-projects: all paths are
absolute (you cannot move the source directories around), there is
only one major Project in a system--it only really builds 'all', not
sub-targets--and build variants beyond the builtin Debug, MinSizeRel,
Release, etc. have to be custom-added; it would not
integrate well with the current $(way) system.  If you are heavily
against using VS, maybe an Eclipse/Ant-based build would do.  I might  
use Bakefile.


It would be much better to have a single build system.  I would  
gladly replace the whole thing for three reasons:
(1) it is a source of many build bugs and it makes them much more  
difficult to track down; and,
(2) it seems to be a serious hurdle for anyone who wants to build and  
hack on GHC--this is true for most other compiler systems that use
autoconf and Make; and,
(3) if GHC is ever going to have cross-compilation abilities itself,  
the current build system must go, while cross-compiling GHC with the  
current system requires access to the actual host-system hardware.

The reasons I don't are:
(1) time (parallel to money);
(2) I wouldn't undertake such an effort unless we were all pretty  
sure what you want to change the build system to;
(3) an inevitable side-effect of the move would be loss of old (or  
little-used) build settings, such as GranSim, and a change to the  
build system would propagate to parallel projects; and,
(4) it is a huge project: both the compiler and libraries must change  
and the change must integrate with the Cabal system.


Work on the mingw-make system is progressing fairly well.

The reasons to make a special VS build are:
(1) Windows programmer familiarity;
(2) reduction in the number of build bugs;
(3) ease of extension or integration with other VS tools, such
as .NET; and,
(4) speed--VS builds are much faster than Make.
I should also add that when building the RTS it is simply much easier
to have a build problem reported in VS than to search back through
Make output and manually go to the offending line in a source file.
The reason not to make a special VS build is that you would have to
support it--one more thing to check when new source files are added.
As I said before, this may be scripted and if Windows programmers  
have something familiar to work with there may be more of them to  
help.  (You probably have better reasons than that one.)


Use GHC as your C compiler, i.e. don't invoke CL directly from
make, and add the INCLUDE/LIB directories to the RTS's package.conf.


Certainly doable, but it does present a conundrum: for the old GHC
(without builtin cl support) the order for compilation seems to be:
    <compile/link command> <compile/link flags> <output> <source/object files> <other flags>
while for cl (which runs link.exe), or for link.exe itself, it is better
to put all the files at the end of the command line:
    <compile/link command> <compile/link flags> <output> <other flags> <source/object files>


It also adds one more layer of indirection at that delicate stage.

I am in the process of modifying and testing 

Re: 64-bit windows version?

2007-06-25 Thread skaller
On Mon, 2007-06-25 at 11:43 -0400, Peter Tanski wrote:

 It would be much better to have a single build system.  I would  
 gladly replace the whole thing for three reasons:

 (1) it is a source of many build bugs and it makes them much more  
 difficult to track down; and,
 (2) it seems to be a serious hurdle for anyone who wants to build and  
 hack on GHC--this is true for most other compiler systems that use  
 the autoconf and Make; and,
 (3) if GHC is ever going to have cross-compilation abilities itself,  
 the current build system must go, while cross-compiling GHC with the  
 current system requires access to the actual host-system hardware.
 The reasons I don't are:
 (1) time (parallel to money);
 (2) I wouldn't undertake such an effort unless we were all pretty  
 sure what you want to change the build system to;
 (3) an inevitable side-effect of the move would be loss of old (or  
 little-used) build settings, such as GranSim, and a change to the  
 build system would propagate to parallel projects; and,
 (4) it is a huge project: both the compiler and libraries must change  
 and the change must integrate with the Cabal system.

I am thinking of starting a new project (possibly at SourceForge)
to implement a new build system. I think Erick Tryzelaar might
also be interested. The rule would be: it isn't just for GHC.
So any interested people would have to thrash out what to
implement it in, and the overall requirements and design ideas.

My basic idea is that it should be generic and package based,
that is, it does NOT include special purpose tools as might
be required to build, say, Haskell programs: these are
represented by 'plugin' components.

A rough model of this: think Debian package manager, but
for source code not binaries.


-- 
John Skaller skaller at users dot sf dot net
Felix, successor to C++: http://felix.sf.net
___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-25 Thread Gour
On Tue, 26 Jun 2007 02:06:25 +1000
skaller [EMAIL PROTECTED] wrote:


 My basic idea is that it should be generic and package based,
 that is, it does NOT include special purpose tools as might
 be required to build, say, Haskell programs: these are
 represented by 'plugin' components.

Have you seen Aap (http://www.a-a-p.org/) ?


Sincerely,
Gour


___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-25 Thread Peter Tanski

On Jun 25, 2007, at 12:06 PM, skaller wrote:


On Mon, 2007-06-25 at 11:43 -0400, Peter Tanski wrote:


It would be much better to have a single build system.  I would
gladly replace the whole thing ...


I am thinking of starting a new project (possibly as sourceforge)
to implement a new build system. I think Erick Tryzelaar might
also be interested. The rule would be: it isn't just for GHC.
So any interested people would have to thrash out what to
implement it in, and the overall requirements and design ideas.

My basic idea is that it should be generic and package based,
that is, it does NOT include special purpose tools as might
be required to build, say, Haskell programs: these are
represented by 'plugin' components.

A rough model of this: think Debian package manager, but
for source code not binaries.


I have been considering the same thing for some time, partly because
the specification properties of most available build systems are
terrible: XML is not a language (it is better as a back-end for a
gui-based build system); current plug-in systems (similar to what WAF
uses) are object-oriented and require a deep knowledge of the build
system API; others are manual hacks.  One big thing to avoid is
cache files: CMake, SCons, WAF and autotools all use cache files and
all run into problems when the cache files aren't cleaned or contain
errors.  (WAF has the best cache-file system--its files are designed
to be human-readable.)  I will gladly lend support to this.


An idea I have been kicking around is a hybrid between the autoconf
strategy and the commercial-setup strategy: add support for a
specification of program requirements--e.g., stdint.h, gcc/cl/icl,
things like that--in a simple spec document with dead-simple syntax,
then let the build system handle the rest; it would know what to do
for each architecture.  That seems similar to the Debian package
manager, right?


What language are you thinking about using?  Scheme seems good, but
the build language itself might be different; gui support should be
available, which says to me (horrors!) Java AWT--a cross-platform,
gui-supported build system (not an IDE) would rock the world because
it doesn't exist.  There are tons of Python packages out there (A-A-P,
which hasn't been updated since 2003; SCons; WAF; Bakefile, which uses
Python).  I don't know if this is possible in Felix.


Other requirements might be:
(1) never alter anything in the source directories--everything is
managed through the build directory; and
(2) the ability to easily build separate targets from the command line,
similar to 'make test_shifts'--externally scriptable.


One thing other systems seem to fail at is building off the enormous  
trove of information in autoconf--it's right there, open source, and  
they reinvent the wheel when it comes to making configuration tests  
or finding platform information (config.guess is essentially a  
database of platform-specific information but is somewhat dated with  
regard to newer systems, including OS X).  On that note, a different  
approach to configuration tests might be clearer knowledge about the  
compilers: all these systems build small C programs to test certain  
compiler characteristics and test for a 0 exit value.  Well, CMake  
does actually 'read' the output files for some things, such as  
compiler version.
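
For reference, the usual probe amounts to something like this, written
here as a make fragment (a sketch only; real configure scripts do it in
shell, and the variable names are invented):

# make needs \# for a literal hash inside an assignment
HASH := \#
# try to compile a one-line program; HAVE_STDINT_H becomes YES or NO
HAVE_STDINT_H := $(shell printf '$(HASH)include <stdint.h>\nint main(void){return 0;}\n' > conftest.c; \
                   if $(CC) -c conftest.c -o conftest.o >/dev/null 2>&1; then echo YES; else echo NO; fi; \
                   rm -f conftest.c conftest.o)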


Cheers,
Pete

___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-25 Thread kyra
Certainly doable, but it does present a conundrum: for the old GHC
(without builtin cl support) the order for compilation seems to be:
    <compile/link command> <compile/link flags> <output> <source/object files> <other flags>
while for cl (which runs link.exe), or for link.exe itself, it is better
to put all the files at the end of the command line:
    <compile/link command> <compile/link flags> <output> <other flags> <source/object files>


It also adds one more layer of indirection at that delicate stage.


Maybe some gcc-mimicking cl wrapper tailored specifically for the GHC
build system could help? One more layer of indirection, but it could
leave the ghc driver relatively intact.


Cheers,
Kyra
___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-25 Thread Peter Tanski

On Jun 25, 2007, at 12:55 PM, kyra wrote:

Certainly doable, but it does present a conundrum: for the old GHC
(without builtin cl support) the order for compilation seems to be:
    <compile/link command> <compile/link flags> <output> <source/object files> <other flags>
while for cl (which runs link.exe), or for link.exe itself, it is better
to put all the files at the end of the command line:
    <compile/link command> <compile/link flags> <output> <other flags> <source/object files>

It also adds one more layer of indirection at that delicate stage.


Maybe some gcc-mimicking cl wrapper tailored specifically for the GHC
build system could help? One more layer of indirection, but it could
leave the ghc driver relatively intact.


That's a good idea!  Do you know if or how the mingw-gcc is able to
do that?  Does mingw-gcc wrap link.exe?  It sounds silly that someone
relatively inexperienced with mingw should be doing this, but it
_really_ needs doing and no one else seems to want it (besides, from
my perspective, once I get through the build-system drudgery it lets
me handle the fun stuff, like adding inline MASM to the RTS, such as
ghc/includes/SMP.h).


Cheers,
Pete

___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-25 Thread skaller
On Mon, 2007-06-25 at 13:35 -0400, Peter Tanski wrote:

  Maybe some gcc-mimicking cl wrapper tailored specifically for the GHC
  build system could help? One more layer of indirection, but it could
  leave the ghc driver relatively intact.
 
 That's a good idea!  Do you know if or how the mingw-gcc is able to  
 do that?  Does mingw-gcc wrap link.exe?  

There's more to portable building than the build system.
For example, for C code, you need a system of macros to support

void MYLIB_EXTERN f();

where MYLIB_EXTERN can be, say, empty, __declspec(dllexport)
on Windows when building a DLL, or __declspec(dllimport)
when using it. This is *mandatory*.

The build system controls the command line switches that
turn on the "We're building a DLL" flag. A distinct macro is needed
for every DLL.
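
In make terms the switch is roughly this (a sketch; the MYLIB names
follow the example above and the cl flags are only illustrative):

# when the objects are destined for mylib.dll, define BUILDING_MYLIB so
# the header expands MYLIB_EXTERN to __declspec(dllexport); clients
# compile without it and get __declspec(dllimport) instead
ifeq "$(BUILDING_MYLIB_DLL)" "YES"
MYLIB_DEFS = /DBUILDING_MYLIB
else
MYLIB_DEFS =
endif

%.obj : %.c
	cl /nologo /c $(MYLIB_DEFS) /Fo$@ $<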

In Felix, there is another switch which tells the source
if the code is being built for static linkage or not:
some macros change when you're linking symbols statically
compared to using dlsym().. it's messy: the build system
manages that too. 

Building Ocaml, you have a choice of native or bytecode,
and there are some differences. Probably many such things
for each and every language and variation of just about
anything .. eg OSX supports two kinds of dynamic libraries.

The point is that a 'Unix' oriented build script probably
can't be adapted: Unix is different to Windows. The best
way to adapt to Windows is to use Cygwin.. if you want
a Windows native system, you have to build in the Windows 
way and make Windows choices. A silly example of that
is that (at least in the past) Unix lets you link at
link time against a shared library, whereas Windows
requires you to link against a static thunk ..
so building a shared library produces TWO outputs
on Windows.

OTOH, Unix has this woeful habit of naming shared libraries
like libxxx.so.1.2 which really makes a complete mess
of build systems.

What I'm saying is you just can't wrap Windows tools
inside a Unix build script.

You have to write an abstract script, and implement
the abstractions for each platform.

-- 
John Skaller skaller at users dot sf dot net
Felix, successor to C++: http://felix.sf.net
___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-25 Thread Peter Tanski

On Jun 25, 2007, at 3:34 PM, skaller wrote:


On Mon, 2007-06-25 at 13:35 -0400, Peter Tanski wrote:


Maybe some gcc-mimicking cl wrapper tailored specifically for the GHC
build system could help? One more layer of indirection, but it could
leave the ghc driver relatively intact.


That's a good idea!  Do you know if or how the mingw-gcc is able to
do that?  Does mingw-gcc wrap link.exe?


There's more to portable building than the build system.
For example, for C code, you need a system of macros to support

void MYLIB_EXTERN f();

where MYLIB_EXTERN can be empty, say  __declspec(dllexport)
on Windows when building a DLL, and  __declspec(dllimport)
when using it. This is *mandatory*.


Of course--one thing I would add to a build system, instead of  
compiling little C files and testing the return value to detect some  
compiler functionality, is the ability to read builtin macros, say,  
by telling the compiler to dump all macros like 'gcc -E -dM'  and  
then read through the macros.
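
In make terms I mean something like this (a sketch; the __MINGW32__
test is just an example of the kind of question you could answer
this way):

# dump the compiler's predefined macros once and grep the result,
# instead of test-compiling a little C file per question
PREDEFINED_MACROS := $(shell echo | gcc -E -dM -x c -)

ifneq "$(findstring __MINGW32__,$(PREDEFINED_MACROS))" ""
USING_MINGW_GCC = YES
endif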


As for the Windows-native build, I am pretty far along with that, but
the idea was to hijack the gcc executable with a script that would
convert the gcc arguments to cl arguments.  The one thing such a
script would not do is compile everything at once.  That is
one thing I am adding to the Make system here: dependency
generation is good for Haskell files but is not necessary for C files,
since I can bunch the C sources together with the compiler flags and
pass them to cl all at once in a command file.  This should be faster
than driving cl one file at a time from Make.



The build system controls the command line switches that
turn on We're building a DLL flag. A distinct macro is needed
for every DLL.


That is part of the modifications to the runtime system (RTS).


In Felix, there is another switch which tells the source
if the code is being built for static linkage or not:
some macros change when you're linking symbols statically
compared to using dlsym().. it's messy: the build system
manages that too.


Sometimes this is better done in header files, changing the macros with
defines the build system passes to the C compiler, but Felix's system
is much more flexible than that (it builds the source files as
interscript extracts them, right?).



Building Ocaml, you have a choice of native or bytecode,
and there are some differences. Probably many such things
for each and every language and variation of just about
anything .. eg OSX supports two kinds of dynamic libraries.


GHC's interpreter (GHCi) does have to be built.  I have not found a  
libReadline DLL, but I am sure I can scrounge something--possibly  
from Python since they had this same problem back around 2000.



The point is that a 'Unix' oriented build script probably
can't be adapted: Unix is different to Windows. The best
way to adapt to Windows is to use Cygwin.. if you want
a Windows native system, you have to build in the Windows
way and make Windows choices. A silly example of that
is that (at least in the past) Unix lets you link at
link time against a shared library, whereas Windows
requires to link against a static thunk ..
so building a shared library produces TWO outputs
on Windows.


I am building with Mingw because that is better supported by the GHC  
build system (Cygwin is somewhat defunct); the end result should  
build from source in Visual Studio/Visual C++ Express.


Cheers,
Pete
___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-24 Thread Claus Reinke

 Don't forget .. Mingw has to be installed too .. and in fact
 that is much harder. I tried to install MSYS and gave up.

You're kidding right?  There's Windows installer .exes for MinGW and
MSYS.  You download it, run it, and click Next a few times.


It's far from that easy! It's loads of steps, and figuring out what to
download is quite challenging on its own.


I had a non-trivial problem understanding what parts of which versions of Mingw
and/or MSYS I should download to get ghc compiling on Windows.  I had to try at
least three times before I set up something that works.  And I assert that I am a
competent person.  I think their web site seriously under-explains things from
an outsider's point of view.  But the user base is made up of programmers, so
they obviously are not under pressure to make a clearer site.


so it is probably easier to install it than to figure out what to install?

yes, i remember the download site being rather confusing in its
variety of packages and versions on offer, the all-in-one package
being much older than the individual packages, because one is
meant to install the all-in-one, then update it by installing the
more frequently updated individual packages on top of that. and
then there were stable and unstable versions of those individual
packages. or something like that..

personally, i use the cygwin environment, so only needed to
install mingw gcc & co. but it would be nice if one of you
who have figured out what and how to install a minimal
mingw/msys sufficient for ghc builds could write it up in
a detailed log, similar to the existing one for mingw/cygwin:

http://hackage.haskell.org/trac/ghc/wiki/Building/Windows#AWindowsbuildlogusingCygwin

that would help others in the future, and they can send
updates to the log when the details change. if you don't
remember all the details, just write up what you remember,
and let others fill in when they are actually doing an install.

[simon: i recall sending a minor update to the cygwin log, 
including where to get darcs, and changing to the current 
mailing list setup; has that disappeared?]


claus

___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-24 Thread Matthew Danish
On Sun, Jun 24, 2007 at 12:45:33PM +0100, Claus Reinke wrote:
 http://hackage.haskell.org/trac/ghc/wiki/Building/Windows#AWindowsbuildlogusingCygwin
 
 that would help others in the future, and they can send 
 updates to the log when the details change. if you don't
 remember all the details, just write up what you remember,
 and let others fill in when they are actually doing an install.

Instead of grabbing Cygwin, grab two other files along with the MinGW
installer.

Presume you are at the SourceForge download page for MinGW (yes, I
know SF is annoying):

Under Current, unfold the full listing of packages:

Unfold MinGW, MSYS, and MSYS Developer Tool Kit.

Download the latest of 
  + MinGW-*.exe, 
  + MSYS-*.exe, 
  + msysDTK-*.exe

There aren't many .exe files, so it should be easy to pick them out.

Install MinGW, then MSYS, then msysDTK.  They are all ordinary Windows
installers (NSIS I think).  This installs your system in the
approved manner, so it should be easy to later grab individual
updates and simply unpack them into here.

That should take care of your non-Haskell related packages.

-- 
-- Matthew Danish -- user: mrd domain: cmu.edu
-- OpenPGP public key: C24B6010 on keyring.debian.org
___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-24 Thread skaller
On Sun, 2007-06-24 at 13:10 -0400, Matthew Danish wrote:

 Unfold MinGW, MSYS, and MSYS Developer Tool Kit.

Hmm .. well I'm not sure if this is still correct, but Mingw,
being a Windows program, has a 255 character limit on the command line
.. which makes it useless for building anything complex.

Ocaml had this problem: the MinGW version of Ocaml has to be
built using Cygwin, with gcc using the -mno-cygwin option.

The thing is .. Cygwin, despite being rather large,
is much easier to install than MSYS because of the 
package manager.

Also there are many emulations of Unix programs for Windows,
located at:

http://gnuwin32.sourceforge.net/

which are also fairly easy to install and work from
the CMD.EXE command prompt.

The problem with building on Windows is that many scripts
assume bash, and it just doesn't work 'right' outside
a well configured Unix environment. Cygwin does this quite
well .. MSYS etc doesn't.

I'm not intending to knock MSYS .. but I wouldn't rely on
it for building complex projects 'transparently' .. Cygwin
has enough problems doing that, and it's quite a sophisticated
environment.

The thing is .. Windows *.bat files, though a bit clumsy,
work better than trying to get a bash emulation .. but really,
designed-to-be-portable code written in Python, Perl, Scheme,
or even Haskell is better .. because it eliminates uncertainty
and gives you full control of how build actions are implemented
on each platform.

-- 
John Skaller skaller at users dot sf dot net
Felix, successor to C++: http://felix.sf.net
___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-23 Thread Neil Mitchell

Hi


 Don't forget .. Mingw has to be installed too .. and in fact
 that is much harder. I tried to install MSYS and gave up.

You're kidding right?  There's Windows installer .exes for MinGW and
MSYS.  You download it, run it, and click Next a few times.


It's far from that easy! It's loads of steps, and figuring out what to
download is quite challenging on its own.

Thanks

Neil
___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-23 Thread Chris Kuklewicz

Neil Mitchell wrote:

Hi


 Don't forget .. Mingw has to be installed too .. and in fact
 that is much harder. I tried to install MSYS and gave up.

You're kidding right?  There's Windows installer .exes for MinGW and
MSYS.  You download it, run it, and click Next a few times.


It's far from that easy! It's loads of steps, and figuring out what to
download is quite challenging on its own.

Thanks

Neil


I agree with Neil.

I had a non-trivial problem understanding what parts of which versions of Mingw
and/or MSYS I should download to get ghc compiling on Windows.  I had to try at
least three times before I set up something that works.  And I assert that I am a
competent person.  I think their web site seriously under-explains things from
an outsider's point of view.  But the user base is made up of programmers, so
they obviously are not under pressure to make a clearer site.

___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-22 Thread Simon Marlow

Peter Tanski wrote:

Maybe this depends on the type of convenience you want to offer 
GHC-developers.  With the autoconf system they are required (for 
Windows) to download and install: Mingw, perl, python (for the 
testsuite), flex, happy, alex and some others I can't remember right 
now.  Oh yeah, GMP.


In fact, to build a source distribution on Windows, there are only 3 
dependencies:  GHC, Mingw and (either MSYS or Cygwin).


To build from darcs, you also need: darcs, Happy, and Alex.  To build docs, you 
also need Haddock.  To run the testsuite you need Python.


 When I did that, autoconf gave me the convenience 
of having those programs somewhere in my $PATH and if autoconf found 
them I was good to go.  The way I would change things would not be so 
different, except that the developer would have to set up the build 
environment: run from under Mingw, etc.  The benefit would be that 
instead of testing all the extra things autoconf tests for--path 
separator, shell variables, big/little endian, the type of ld and the 
commands to execute compile--those would be hard-wired because the 
system is known.  When things break down you don't have to search 
through the long lines of output because you know what the initial 
settings are and can even rely on them to help debug Make.

 This is the
way it is done for several systems: NextStep (now Apple) project 
makefiles, Jam, and many of the recent build systems, including CMake, 
Scons and WAF.


Ok, you clearly have looked at a lot more build systems than I have.  So you
think there's a shift from autoconf-style "figure out the configuration by
running tests" to having a database of configuration settings for various
platforms?  I'm surprised - I thought conventional wisdom was that you should 
write your build system to be as independent as possible from the name of the 
build platform, so that the system is less sensitive to changes in its 
environment, and easier to port.  I can see how wiring-in the parameters can 
make the system more concrete, transparent and predictable, and perhaps that 
makes it easier to manage.  It's hard to predict whether this would improve our 
situation without actually doing it, though - it all comes down to the details.


On the other hand, we do hard-wire a lot of knowledge about Windows rather than 
autoconfing it.  This works because Windows is a fixed point; in contrast every 
Linux system is different in various ways.  So I guess I don't find it 
problematic to wire-in what happens on Windows, but I do try to avoid it where 
possible.



Getting back to the Windows native port, I'm pretty sure you're making more of a 
meal of this than necessary.  There's no need to port via HC files, unless I'm 
missing something.


 Whatever the end result is, GHC must be able to operate without Mingw
 and the GNU toolset.

That's the whole point of doing the port!

However, what I'm saying is that we can continue to use Cygwin as a set of tools 
for doing the build.  I don't see any problems with this (except that Cygwin is 
slow and clunky), and it keeps the changes to the current system to a minimum, 
and means we can continue to share the build system with Posixy systems.  Here's 
the plan I had in mind:


 1. modify GHC so that:
a) it can invoke CL instead of gcc to compile C files
b) its native code generator can be used to create native .obj files,
   I think you kept the syntax the same and used YASM, the other
   alternative is to generate Intel/MS syntax and use MASM.
c) it can link a binary using the MS linker
 2. modify Cabal so that it can use this GHC, and MS tools
 3. modify the build system where necessary to know about .obj .lib etc.
 4. modify the core packages to use Win32 calls only (no mingw)
 5. Use the stage 1 GHC to compile the RTS and libraries
 6. Build a stage 2 compiler: it will be a native binary
 7. Build a binary distribution

Regarding autoconf, for the time being, just supply ready-made output files 
(mk/config.h, libraries/base/include/HsBaseConfig.h, etc.).
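
i.e. something along these lines at the top level--a sketch only; the
*.winnt file names are invented:

# on Windows, skip configure and install canned copies of its outputs
ifeq "$(TargetOS)" "windows"
mk/config.h : mk/config.h.winnt
	cp $< $@

libraries/base/include/HsBaseConfig.h : mk/HsBaseConfig.h.winnt
	cp $< $@
endif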


Cheers,
Simon

___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-22 Thread skaller
On Fri, 2007-06-22 at 12:03 +0100, Simon Marlow wrote:

 
 Ok, you clearly have looked at a lot more build systems than I have.  So you 
 think there's a shift from autoconf-style figure out the configuration by 
 running tests to having a database of configuration settings for various 
 platforms?  I'm surprised - I thought conventional wisdom was that you should 
 write your build system to be as independent as possible from the name of the 
 build platform, so that the system is less sensitive to changes in its 
 environment, and easier to port.  I can see how wiring-in the parameters can 
 make the system more concrete, transparent and predictable, and perhaps that 
 makes it easier to manage.  It's hard to predict whether this would improve 
 our 
 situation without actually doing it, though - it all comes down to the 
 details.

This misses the point. The 'suck it and see' idea fails totally for
cross-compilation. It's a special case.

The right way to do things is to separate the steps:

(a) make a configuration
(b) select a configuration

logically. This is particularly important for developers who are using
the same code base to build for multiple 'platforms' on the 
same machine.

With the above design you can have your cake and eat it too .. :)

That's the easy part.. the HARD part is: every 'system' comes with
optional 'add-on' facilities. These add-ons may need configuration
data. Often, you want to add the 'add-on' after the system is built.
So integrating the configuration data is an issue.

Felix's build system allows add-on packages to have their own
configuration model. It happens to be executed on the fly,
and typically does your usual 'suck it and see' testing
(eg .. where are the SDL headers? Hmm ..) 

This design is wrong of course ;(

-- 
John Skaller skaller at users dot sf dot net
Felix, successor to C++: http://felix.sf.net
___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-22 Thread Simon Marlow

skaller wrote:

On Fri, 2007-06-22 at 12:03 +0100, Simon Marlow wrote:

Ok, you clearly have looked at a lot more build systems than I have.  So you 
think there's a shift from autoconf-style figure out the configuration by 
running tests to having a database of configuration settings for various 
platforms?  I'm surprised - I thought conventional wisdom was that you should 
write your build system to be as independent as possible from the name of the 
build platform, so that the system is less sensitive to changes in its 
environment, and easier to port.  I can see how wiring-in the parameters can 
make the system more concrete, transparent and predictable, and perhaps that 
makes it easier to manage.  It's hard to predict whether this would improve our 
situation without actually doing it, though - it all comes down to the details.


This misses the point. The 'suck it and see' idea fails totally for
cross-compilation. It's a special case.

The right way to do things is to separate the steps:

(a) make a configuration
(b) select a configuration

logically.


Hmm, I don't see how the approach fails totally for cross-compilation.  You 
simply have to create the configuration on the target machine, which is exactly 
what we do when we cross-compile GHC.  Admittedly the process is a bit ad-hoc, 
but it works.


Cheers,
Simon
___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-22 Thread Peter Tanski


On Jun 22, 2007, at 9:45 AM, Simon Marlow wrote:


skaller wrote:

On Fri, 2007-06-22 at 12:03 +0100, Simon Marlow wrote:
Ok, you clearly have looked at a lot more build systems than I  
have.  So you think there's a shift from autoconf-style figure  
out the configuration by running tests to having a database of  
configuration settings for various platforms?


I shouldn't overstate the situation: the other complete build
systems, CMake and SCons, do have autoconf capabilities in the way of
finding headers and programs and checking test-compiles--the basic
sanity checks--and CMake has many more autoconf-like checks than SCons.
Where they differ from the automake system seems to be their setup,
which, like Make, has hard-coded settings for compilers, linkers, etc.
(Some standard CMake settings are wrong for certain targets.)  I
don't know if you have any interest in pursuing or evaluating CMake
(certainly not now), but the standard setup is stored in a standard
directory on each platform, say,
/usr/share/cmake-2.4/Modules/Platform/$(platform).cmake, and may be
overridden by your own cmake file in, say,
$(srcdir)/cmake/UserOverride.cmake.


The preset-target-configuration build model I was referring to is a  
scaled-down version of the commercial practice which allows you to  
have a single system and simultaneously compile for many different  
architecture-platform combinations--once you have tested each and  
know how everything works.  For the initial exploration, it is a  
different (more anal) strategy: before invading, get all the  
intelligence you can and prepare thoroughly.  The GNU-Autoconf  
strategy is to keep a few troops who have already invaded many other  
places, adjust their all-purpose equipment a little for the mission  
and let them have at it.  My gripe is that their equipment isn't very  
good.


Cheers,
Pete


___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-22 Thread skaller
On Fri, 2007-06-22 at 14:45 +0100, Simon Marlow wrote:
 skaller wrote:

  This misses the point. The 'suck it and see' idea fails totally for
  cross-compilation. It's a special case.
  
  The right way to do things is to separate the steps:
  
  (a) make a configuration
  (b) select a configuration
  
  logically.
 
 Hmm, I don't see how the approach fails totally for cross-compilation.  You 
 simply have to create the configuration on the target machine, which is 
 exactly 
 what we do when we cross-compile GHC.  Admittedly the process is a bit 
 ad-hoc, 
 but it works.

But that consists of:

(a) make a configuration (on the target machine)
(b) select that configuration (on the host machine)

which is actually the model I suggest. To be more precise, the
idea is a 'database' of configurations, and building by selecting
one from that database as the parameter to the build process.

The database would perhaps consist of 

(a) definitions for common architectures
(b) personalised definitions

You would need a tool to copy and edit existing definitions
(eg .. a text editor) and a tool to 'autogenerate' prototype
definitions (autoconf for example).
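
In make terms the 'select a configuration' half could be as simple as
this (a sketch; the directory layout and target name are only examples):

# one file per known configuration, generated by a tool or hand-edited;
# a build just picks one by name
TARGET ?= i386-unknown-mingw32
include config/$(TARGET).mk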

What I meant failed utterly was simply building the sole
configuration by inspection of the properties of the
host machine (the one you will actually build on).

That does work if

(a) the auto-detect build scripts are smart and
(b) the host and target machines are the same

BTW: Felix has a 4 platform build model:

* build
* host
* target
* run

The build machine is the one you build on. Example: Debian
autobuilder.

The host is the one you intend to translate Felix code to
C++ code on, typically your workstation. In Windows environment
this might be Cygwin.

The target is the one you actually compile the C++ code on.
In Windows environment, this might be WIN32 native (MSVC++).

The run machine is where you actually execute the code.

The 'extra' step here is because it is a two stage compiler.
Some code has to be built twice: for example the GLR parser
elkhound executable runs on the host machine to generate
C++ and it uses a library. The same library is required
at run time, but has to be recompiled for the target.
EG: Elkhound built on cygwin to translate grammar to C++,
and Elkhound on MSVC++ for the run time automaton.

I'm not sure GHC as such needs a cross-cross compilation model,
but bootstrapping a cross-compiler version almost certainly does.


-- 
John Skaller skaller at users dot sf dot net
Felix, successor to C++: http://felix.sf.net
___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-22 Thread Peter Tanski

On Jun 22, 2007, at 7:03 AM, Simon Marlow wrote:
In fact, to build a source distribution on Windows, there are only  
3 dependencies:  GHC, Mingw and (either MSYS or Cygwin).


To build from darcs, you also need: darcs, Happy, and Alex.  To  
build docs, you also need Haddock.  To run the testsuite you need  
Python.


True, Mingw does come standard with perl and a version of flex.
There are Windows-native versions of Perl and flex available (e.g.,
ActivePerl).  Now, you are familiar with Mingw.  Imagine being a
standard Windows programmer, trying to choose which version of Mingw
to download--some are minimal installations--and going over the build
requirements: perl, flex, happy, alex and haddock are listed.  That
is quite a bit of preparation.  There are minimal-effort ways to go
about this (I will look into updating the wiki).


 Whatever the end result is, GHC must be able to operate without  
Mingw

 and the GNU toolset.

That's the whole point of doing the port!


For running GHC--how about being able to build a new version of GHC  
from source?



 1. modify GHC so that:
a) it can invoke CL instead of gcc to compile C files


Mostly done (not completely tested).

b) its native code generator can be used to create native .obj  
files,

   I think you kept the syntax the same and used YASM, the other
   alternative is to generate Intel/MS syntax and use MASM.


This is as easy as simply using Yasm--also mostly done (not  
completely tested).  By the way, by testing I mean doing more than  
a simple -optc... -optc... -optl... addition to the command line,  
although an initial build using a current mingw version of GHC may  
certainly do this.



c) it can link a binary using the MS linker
 2. modify Cabal so that it can use this GHC, and MS tools
 3. modify the build system where necessary to know about .obj .lib  
etc.


A bit invasive (it involves modifying the make rules so they take an  
object-suffix variable).  Instead of the current suffix.mk:


$(odir_)%.$(way_)o : %.hc

it should be:

$(odir_)%.$(way_)$(obj_sfx) : %.hc

or some such.  This may affect other builds, especially if for some  
reason autoconf can't determine the object-suffix for a platform,  
which is one reason I suggested a platform-specific settings file.  I  
could handle this by having autoconf set the target variable, put all  
the windows-specific settings in a settings.mk file (including a  
suffix.mk copy) and have make include that file.
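
Roughly, a sketch of what I mean (the windows-settings.mk file name is
a placeholder):

# pull in the platform-specific settings (object suffix, tool names,
# a copy of the suffix rules) chosen by configure
ifeq "$(TargetOS)" "windows"
include $(TOP)/mk/windows-settings.mk
endif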



 4. modify the core packages to use Win32 calls only (no mingw)


That is where a lot of preparation is going.  This is *much* harder  
to do from mingw than from VS tools since you have to set up all the  
paths manually.



 5. Use the stage 1 GHC to compile the RTS and libraries
 6. Build a stage 2 compiler: it will be a native binary
 7. Build a binary distribution


I told Torkil I would have a version of the replacement library  
available for him as soon as possible.  I'll shut up now.  It looks  
like a long weekend.


Cheers,
Pete

___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-22 Thread Simon Marlow

Peter Tanski wrote:

A bit invasive (it involves modifying the make rules so they take an 
object-suffix variable).  Instead of the current suffix.mk:


$(odir_)%.$(way_)o : %.hc

it should be:

$(odir_)%.$(way_)$(obj_sfx) : %.hc

or some such.  This may affect other builds, especially if for some 
reason autoconf can't determine the object-suffix for a platform, which 
is one reason I suggested a platform-specific settings file.  I could 
handle this by having autoconf set the target variable, put all the 
windows-specific settings in a settings.mk file (including a suffix.mk 
copy) and have make include that file.


Surely this isn't hard?

ifeq "$(TargetOS)" "windows"
osuf=obj
else
osuf=o
endif

and then use $(osuf) wherever necessary.


 4. modify the core packages to use Win32 calls only (no mingw)


That is where a lot of preparation is going.  This is *much* harder to 
do from mingw than from VS tools since you have to set up all the paths 
manually.


I don't understand the last sentence - what paths?  Perhaps I wasn't clear here: 
I'm talking about the foreign calls made by the base package and the other core 
packages; we can't call any functions provided by the mingw C runtime, we can 
only call Win32 functions.  Similarly for the RTS.  I have no idea how much 
needs to change here, but I hope not much.



 5. Use the stage 1 GHC to compile the RTS and libraries
 6. Build a stage 2 compiler: it will be a native binary
 7. Build a binary distribution


I told Torkil I would have a version of the replacement library 
available for him as soon as possible.  I'll shut up now.  It looks like 
a long weekend.


:-)

Cheers,
Simon

___
Glasgow-haskell-users mailing list
Glasgow-haskell-users@haskell.org
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users


Re: 64-bit windows version?

2007-06-22 Thread Peter Tanski


On Jun 22, 2007, at 11:42 AM, Simon Marlow wrote:


Peter Tanski wrote:
A bit invasive (it involves modifying the make rules so they take  
an object-suffix variable).  Instead of the current suffix.mk:

$(odir_)%.$(way_)o : %.hc
it should be:
$(odir_)%.$(way_)$(obj_sfx) : %.hc
or some such.  This may affect other builds, especially if for  
some reason autoconf can't determine the object-suffix for a  
platform, which is one reason I suggested a platform-specific  
settings file.  I could handle this by having autoconf set the  
target variable, put all the windows-specific settings in a  
settings.mk file (including a suffix.mk copy) and have make  
include that file.


Surely this isn't hard?

ifeq "$(TargetOS)" "windows"
osuf=obj
else
osuf=o
endif

and then use $(osuf) wherever necessary.


Yes it is easy but now all Makefiles must be changed to use $(osuf),  
such as this line in rts/Makefile:


378: %.$(way_)o : %.cmm $(H_FILES),

for what will be a (hopefully) temporary Windows build.


 4. modify the core packages to use Win32 calls only (no mingw)
That is where a lot of preparation is going.  This is *much*  
harder to do from mingw than from VS tools since you have to set  
up all the paths manually.


I don't understand the last sentence - what paths?  Perhaps I  
wasn't clear here: I'm talking about the foreign calls made by the  
base package and the other core packages; we can't call any  
functions provided by the mingw C runtime, we can only call Win32  
functions.  Similarly for the RTS.  I have no idea how much needs  
to change here, but I hope not much.


To use the MS tools with the standard C libraries and include  
directories, I must either gather the environment variables  
separately and pass them to cl/link on the command line or I must  
manually add them to my system environment (i.e., modify msys.bat, or  
the windows environment) so msys will use them in its environment.


The other problem is the old no-pathnames-with-spaces limitation in
Make: the build system would have to quote all those environment
variables when passing them to cl.  I could use the Make trick of
filling the spaces with a placeholder character and removing it just
before quoting, but that is a real hack and not very reliable--it
breaks $(word ...).


Altogether it is a pain to get going and barely reproducible.  That  
is why I suggested simply producing .hc files and building from .hc  
using VS.


Cheers,
Pete 


Re: 64-bit windows version?

2007-06-22 Thread Matthew Danish
On Thu, Jun 21, 2007 at 12:35:15AM +1000, skaller wrote:
 Don't forget .. Mingw has to be installed too .. and in fact
 that is much harder. I tried to install MSYS and gave up.

You're kidding right?  There's Windows installer .exes for MinGW and
MSYS.  You download it, run it, and click Next a few times.

-- 
-- Matthew Danish -- user: mrd domain: cmu.edu
-- OpenPGP public key: C24B6010 on keyring.debian.org


Re: 64-bit windows version?

2007-06-21 Thread Dinko Tenev

On 6/20/07, Isaac Dupree [EMAIL PROTECTED] wrote:

yes, binutils written in Haskell!  Will never happen!  :))


It's crossed my mind as well, once or twice -- maybe it's not such a bad idea.


Cheers,
   Dinko


Re: 64-bit windows version?

2007-06-21 Thread Simon Marlow

Peter Tanski wrote:




skaller wrote:


On Tue, 2007-06-19 at 12:23 +0100, Simon Marlow wrote:


Bulat Ziganshin wrote:


Hello glasgow-haskell-users,

are you plan to implement 64-bit windows GHC version?

The main thing standing in the way of this is the lack of a 64-bit port of mingw.



Why do you need mingw? What's wrong with MSVC++?


The largest problem is the build system: GHC uses autoconf with custom 
makefiles.


So autoconf won't work with MSVC++, that is indeed a problem.  But this doesn't 
mean we have to stop using Makefiles and GNU make - the rest of the build system 
will work fine, provided it's told about the different conventions for names of 
object files etc.  I don't see a compelling enough reason to stop using GNU 
make.  The build system doesn't even need to invoke CL directly, since we can 
use GHC as a driver (isn't this the way we agreed to do it before?).


We use autoconf in a pretty limited way (no automake), so I don't think it will 
be hard to work around, even if we have to just hard-code all the configuration 
results for Windows.


Cheers,
Simon


Re: 64-bit windows version?

2007-06-21 Thread Peter Tanski


On Jun 21, 2007, at 4:16 AM, Simon Marlow wrote:

Peter Tanski wrote:

skaller wrote:


Why do you need mingw? What's wrong with MSVC++?
The largest problem is the build system: GHC uses autoconf with  
custom makefiles.


So autoconf won't work with MSVC++, that is indeed a problem.  But  
this doesn't mean we have to stop using Makefiles and GNU make -  
the rest of the build system will work fine, provided it's told  
about the different conventions for names of object files etc.  I  
don't see a compelling enough reason to stop using GNU make.  The  
build system doesn't even need to invoke CL directly, since we can  
use GHC as a driver (isn't this the way we agreed to do it before?).


We use autoconf in a pretty limited way (no automake), so I don't  
think it will be hard to work around, even if we have to just hard- 
code all the configuration results for Windows.


The make system does work well and must be kept in order to port GHC  
to a new posix platform--too many parallel projects (pun intended)  
work with the current system.  I have not kept a good count of  
monthly configuration-based bugs but there are at least a few a  
month, for known platforms, including OS X (a significant user base)  
and Mingw.  If I could change one feature of the current system I  
would set up a wiki page with specific build requirements (I mean  
location, program/library with function declaration), and for known  
systems use autoconf only to determine what the $(build) system is  
and to ensure those programs are available, then jump into make which  
would call pre-set configuration makefiles for that system.


I spent a good amount of time writing the replacement library build  
system in GNU Make (min. 3.8--the current min is 3.79.1) to blend  
seamlessly with the current system.  It does use a custom configure  
script written in Python (more consistently portable, no temporary  
files of any kind in $(srcdir))--John, that is where I originally  
used Interscript: to bake configuration settings into the setup  
files.   The configuration determines what system it is on and the  
relative-path location of certain requirements if they are not  
already available--for testing the processor type and os support  
(when it can't read from something cool like /proc/cpuinfo) it does  
build small programs but all building is done in the build directory  
which may be located anywhere you want.  It then sets those  
parameters for configuration files that already contain other presets  
for that platform; general guesses may go into the main GHC autoconf  
and I will keep them very simple (new architectures get the generic C  
library by default).  I simply can't convince myself that it is  
better to use a guess-based system for architectures I already know,  
especially when it also makes cross-compiling more complex than  
necessary.  For Windows it uses a VS project and calls that from a  
DOS-batch file (for setup parameters) so you can run it from the  
command line.


What I hope you would agree on for Windows-GHC is a build that ran  
parallel to the autoconf-make system.  Of course that would require  
some maintenance when things change in the main system but I could  
write update scripts for trivial changes; I believe anything more  
complex should be carefully checked in any case.  VS is troublesome  
(its project files are written in XML, but that may be automated).   
If you would rather use a Make-like system I could do it in Jam and  
then you would add only a few extra Jamfiles to the current system.   
As a bonus either VS or Jam would reduce build times, especially
re-build times, and would probably reduce the number of configuration
bugs we see around here.  I would not suggest CMake, SCons or WAF;
John wisely advised against anything invasive.


Cheers,
Pete



Re: 64-bit windows version?

2007-06-21 Thread Simon Marlow

Peter Tanski wrote:

The make system does work well and must be kept in order to port GHC to 
a new posix platform--too many parallel projects (pun intended) work 
with the current system.  I have not kept a good count of monthly 
configuration-based bugs but there are at least a few a month, for known 
platforms, including OS X (a significant user base) and Mingw.  If I 
could change one feature of the current system I would set up a wiki 
page with specific build requirements (I mean location, program/library 
with function declaration), and for known systems use autoconf only to 
determine what the $(build) system is and to ensure those programs are 
available, then jump into make which would call pre-set configuration 
makefiles for that system.


So you'd hard-wire a bunch of things based on the platform name?  That sounds 
like entirely the wrong approach to me.  It makes the build system too brittle 
to changes in its environment: exactly the problem that autoconf was designed to 
solve.


I spent a good amount of time writing the replacement library build 
system in GNU Make (min. 3.8--the current min is 3.79.1) to blend 
seamlessly with the current system.  It does use a custom configure 
script written in Python (more consistently portable, no temporary files 
of any kind in $(srcdir))--John, that is where I originally used 
Interscript: to bake configuration settings into the setup files.   The 
configuration determines what system it is on and the relative-path 
location of certain requirements if they are not already available--for 
testing the processor type and os support (when it can't read from 
something cool like /proc/cpuinfo) it does build small programs but all 
building is done in the build directory which may be located anywhere 
you want.  It then sets those parameters for configuration files that 
already contain other presets for that platform; general guesses may go 
into the main GHC autoconf and I will keep them very simple (new 
architectures get the generic C library by default).  I simply can't 
convince myself that it is better to use a guess-based system for 
architectures I already know, especially when it also makes 
cross-compiling more complex than necessary.  For Windows it uses a VS 
project and calls that from a DOS-batch file (for setup parameters) so 
you can run it from the command line.


Adding a dependency on Python is already something I want to avoid.  One way we 
try to keep the GHC build system sane is by keeping the external dependencies to 
a minimum (yes I know the testsuite requires Python, but the build itself doesn't).


However, I admit I don't fully understand the problem you're trying to solve, 
not having tried to do this myself.  The GHC build system now uses Cabal to 
build libraries (actually Cabal + make + a bit of autoconf for some libraries). 
 Why can't this method work for building libraries on Windows native?  We must 
port Cabal to Windows native anyway, and then you have a library build system.
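
As a rough illustration of how little per-library machinery that needs,
a stock Setup.hs is just:

import Distribution.Simple
main :: IO ()
main = defaultMain

so the platform-specific knowledge can live in one place (Cabal) rather
than being duplicated in every library's build scripts.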


What I hope you would agree on for Windows-GHC is a build that ran 
parallel to the autoconf-make system.


What I hope is that we don't have to do this :-)

Of course that would require some 
maintenance when things change in the main system but I could write 
update scripts for trivial changes; I believe anything more complex 
should be carefully checked in any case.  VS is troublesome (its project 
files are written in XML, but that may be automated).  If you would 
rather use a Make-like system I could do it in Jam and then you would 
add only a few extra Jamfiles to the current system.


If we were to use something other than GNU make, we should do it wholesale, or 
not at all, IMO.


Cheers,
Simon


Re: 64-bit windows version?

2007-06-21 Thread skaller
On Thu, 2007-06-21 at 14:40 -0400, Peter Tanski wrote:
 On Jun 21, 2007, at 11:48 AM, Simon Marlow wrote:

  So you'd hard-wire a bunch of things based on the platform name?   
  That sounds like entirely the wrong approach to me. 

FYI: there is a rather nice set of platform data in the
ACE package.


-- 
John Skaller skaller at users dot sf dot net
Felix, successor to C++: http://felix.sf.net


Re: 64-bit windows version? (Haskell is a scripting language too!)

2007-06-21 Thread Brian Hulley

skaller wrote:

The key thing for the building portability is that the C and C++
compilers are represented by Python classes. There is a pre-programmed
class for gcc, and another for MSVC++.
  
I suggest (for GHC) a Haskell class with instances for the different
combinations of compilers and platforms and optimization levels.
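
A minimal sketch of what such a class might look like -- every name
here is invented purely for illustration, and a real version would need
many more methods (optimization levels, ways, and so on):

import System.Process (rawSystem)
import System.Exit (ExitCode)

class Toolchain tc where
  objSuffix :: tc -> String
  -- compile one C source file to one object file:
  ccCmd     :: tc -> FilePath -> FilePath -> (String, [String])
  -- link object files into an executable:
  linkCmd   :: tc -> [FilePath] -> FilePath -> (String, [String])

data MSVC  = MSVC    -- cl.exe / link.exe
data MinGW = MinGW   -- gcc

instance Toolchain MSVC where
  objSuffix _        = "obj"
  ccCmd   _ src obj  = ("cl",   ["/nologo", "/c", src, "/Fo" ++ obj])
  linkCmd _ objs exe = ("link", ["/nologo", "/OUT:" ++ exe] ++ objs)

instance Toolchain MinGW where
  objSuffix _        = "o"
  ccCmd   _ src obj  = ("gcc",  ["-c", src, "-o", obj])
  linkCmd _ objs exe = ("gcc",  objs ++ ["-o", exe])

-- the build driver then only ever runs (command, arguments) pairs and
-- never mentions a particular compiler by name:
compileVia :: Toolchain tc => tc -> FilePath -> FilePath -> IO ExitCode
compileVia tc src obj = uncurry rawSystem (ccCmd tc src obj)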

(a) Pick a portable scripting language which is readily available
on all platforms. I chose Python. Perl would also do.
If I had time to look into improving the GHC build system I'd definitely 
use Haskell as the scripting language. What's the point of having to 
learn more than one language to accomplish a task? Surely it is more 
productive to add functionality to a single language (especially when 
it's only a case of writing some libraries) rather than grappling with 
the different syntaxes and limitations and quirks of a multitude of 
languages/tools and the problems of trying to get them all to work 
harmoniously together on different platforms?


In concrete terms, for each sub-project within GHC ie each directory of 
source files I'd have a little Haskell program that was responsible for 
building that sub-project. Kind of like CM.make or the ML Basis system, 
except with the full power of Haskell so that complicated things such as 
building successive boot levels could easily be automated. (Ie instead 
of having a special tool like make that is invoked on dead data 
(supplied by autoconf etc) to drive the build process, instead use 
Haskell as a scripting language.)


The top level build would be done by another Haskell program which 
delegated responsibility to the sub-programs in each directory.


I'd include the complete source code for all the Haskell tools like Alex 
and Happy so they could be built the same way.


In other words, the entire GHC project would be built by running a 
single Haskell program, and every single piece of source would either be 
a Haskell module or a file that is the input to a Haskell program that 
the main build-driver has built first (from Haskell sources). Therefore 
there would be no dependency on anything except GHC itself.


To port GHC to a completely new platform, you'd of course need a Haskell 
compiler or interpreter already. However to bootstrap the process only a 
slow interpreter would be needed so as long as a portable pre-built 
bytecode version was available for download the only thing left to port 
would be a byte code interpreter which could just be a single file of c 
code.


Perhaps this is too long-term a vision for solving the short term 
problems, but maybe it could be the basis of some future projects?


Best regards,

Brian.


Re: 64-bit windows version? (Haskell is a scripting language too!)

2007-06-21 Thread Greg Fitzgerald

each sub-project...have a...Haskell program...building that sub-project


I was trying to build something like this recently but hit a roadblock.
Rather than execute the script in each directory, I wanted to import it as a
module instead.  This way you can, for example, pass functions, like a
logger, to a function in the imported module.  Also, rather than execute the
script immediately, you can combine many sub-projects into a big graph, have
a look at common dependencies and use Concurrent Haskell to parallelize the
build.
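
For what it's worth, the parallel part can be quite small once each
sub-project is just a value.  A rough sketch, with invented names and
no real error handling (a real driver would also serialise the logging):

module Main (main) where

import Control.Concurrent (forkIO)
import Control.Concurrent.MVar (MVar, newEmptyMVar, putMVar, readMVar)
import Control.Monad (forM, forM_)

data SubProject = SubProject
  { spName  :: String
  , spDeps  :: [String]
  , spBuild :: (String -> IO ()) -> IO ()  -- gets a logger from above
  }

-- one thread per sub-project; each blocks until its dependencies have
-- finished, so independent parts of the graph build in parallel
buildAll :: [SubProject] -> IO ()
buildAll sps = do
  finished <- forM sps $ \sp -> do
    v <- newEmptyMVar
    return (spName sp, v)
  let waitFor n = maybe (return ()) readMVar (lookup n finished)
  forM_ sps $ \sp -> forkIO $ do
    mapM_ waitFor (spDeps sp)
    spBuild sp (\msg -> putStrLn (spName sp ++ ": " ++ msg))
    maybe (return ()) (`putMVar` ()) (lookup (spName sp) finished)
  mapM_ (readMVar . snd) finished

main :: IO ()
main = buildAll
  [ SubProject "libraries/base" []                 (\logIt -> logIt "built")
  , SubProject "compiler"       ["libraries/base"] (\logIt -> logIt "built")
  ]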

Unfortunately, the imported module needs to have the line module
X.Y.Z where, which means the file needs to be aware of its parent
directories.  I think that's too harsh a constraint, and makes it a
pain to move things around (true in everyday Haskell projects with
local modules too!).

My plan for a workaround was to try to use a preprocessor, but would really
rather avoid that if anyone else has any ideas.

Thanks,
Greg


Re: 64-bit windows version? (Haskell is a scripting language too!)

2007-06-21 Thread skaller
On Fri, 2007-06-22 at 02:06 +0100, Brian Hulley wrote:
 skaller wrote:

  (a) Pick a portable scripting language which is readily available
  on all platforms. I chose Python. Perl would also do.
 If I had time to look into improving the GHC build system I'd definitely 
 use Haskell as the scripting language. 

Two difficulties. The first, obviously, is that this will
only work for building Haskell when you already have Haskell,
so no good for the initial bootstrap of a first port.

The second is simply that dynamic typing is generally
better for build systems, because it allows code to
'self-adapt'. To do this with a statically typed language
you would need to generate text, compile it, and run it,
which is not possible because by specification you don't
have a compiler yet. Even at the point you have bootstrapped
far enough that you do have one .. it's still very messy
to get programlets to communicate using shells in a portable
way.. in some sense that's the problem you're trying to solve!

An alternative is to implement the build system in, say,
Scheme, and then write a Scheme interpreter in Haskell.
Scheme can self-adapt internally because its compiler
is built-in.

This approach removes the dependency on external vendors,
and you might even make the initial bootstrap builder
simple enough you could use a drop-in replacement,
eg Guile (GNU scheme) on unix systems.


-- 
John Skaller skaller at users dot sf dot net
Felix, successor to C++: http://felix.sf.net


Re: 64-bit windows version? (Haskell is a scripting language too!)

2007-06-21 Thread Peter Tanski

Brian Hulley wrote:
To port GHC to a completely new platform, you'd of course need a
Haskell compiler or interpreter already. However to bootstrap the
process only a slow interpreter would be needed so as long as a
portable pre-built bytecode version was available for download the
only thing left to port would be a byte code interpreter which could
just be a single file of C code.


This was one void Yhc was designed to fill, especially by compiling  
to java bytecode.  At the rate I work--if I'm the only one  
deconstructing the current build system--by the time I'm done the Yhc  
team will have everything running.


Greg Fitzgerald wrote:
I was trying to build something like this recently but hit a  
roadblock.

...
Unfortunately, the imported module needs to have the line module
X.Y.Z where, which means the file needs to be aware of its parent
directories.  I think that's too harsh a constraint, and makes it a
pain to move things around (true in everyday Haskell projects with
local modules too!).


Have you looked at Neptune?  It is a scriptable build system based on  
Jaskell, which allows dynamic imports.
An example from the page at http://jaskell.codehaus.org/Jaskell+for+Neptune#JaskellforNeptune-90:


//in script1.jsl
1+x
where
  x = import {file=script2.jsl};
end

You may imagine that the string for script2.jsl may be computed.   
Of course this sort of thing breaks the type system in Haskell and  
the result is more Make-like, but that is the tradeoff.  Now why did  
I not try a build system using Neptune?  Probably because I had  
already spent the last three weeks learning CMake, the peculiarities
of SCons, WAF (weird bugs!), m4 (I never had to write tests in  
Autoconf before or debug my own configure files), and higher level  
Make (so it would do what I can do in Jam)--I guess it got lost by  
the wayside...  I was looking at things most people would either  
already know or would want to learn and that should already be  
available on new platforms.


skaller wrote:

The second is simply that dynamic typing is generally
better for build systems, because it allows code to
'self-adapt'.


There is a somewhat slow-going scheme-based-build-system project  
called Conjure, at http://home.gna.org/conjure/ but it only supports  
Linux and OS X.



An alternative is to implement the build system in, say,
Scheme, and then write a Scheme interpreter in Haskell.
Scheme can self-adapt internally because its compiler
is built-in.


That is why I was looking into using SISC--it is self-contained and  
may even be distributed along with the source code (SISC itself is  
GPLv2 but that doesn't matter for _using_ it)--by default it looks in  
the current directory.  The downside is the lack of a library with  
directed graphs.


Cheers,
Pete


Re: 64-bit windows version?

2007-06-20 Thread Simon Marlow

skaller wrote:

On Tue, 2007-06-19 at 12:23 +0100, Simon Marlow wrote:

Bulat Ziganshin wrote:

Hello glasgow-haskell-users,

are you plan to implement 64-bit windows GHC version?
The main thing standing in the way of this is the lack of a 64-bit port of 
mingw.  


Why do you need mingw? What's wrong with MSVC++?


We have talked (extensively) about doing a Windows port using the MS tools 
instead of mingw.  Check the archives of cvs-ghc, search for windows native. 
There's no problem in theory, but it's a lot of work.  Peter Tanski has done 
some work in this direction, he should be able to tell you more.


I don't think we'll be able to drop the mingw route either, mainly because while 
the MS tools are free to download, they're not properly free, and we want to 
retain the ability to have a completely free distribution with no dependencies.


There are people that want a Cygwin port too; personally I think this is heading 
in the wrong direction, we want to be more native on Windows, using the native 
object format and interoperating directly with the native Windows tools.


Cheers,
Simon


Re: 64-bit windows version?

2007-06-20 Thread Simon Marlow

Bulat Ziganshin wrote:

Hello skaller,

Tuesday, June 19, 2007, 8:15:19 PM, you wrote:

are you plan to implement 64-bit windows GHC version?



Why do you need mingw? What's wrong with MSVC++?


really! Simon, how about unregisterised build?


Unregisterised would still need a C compiler capable of generating 64-bit code. 
 Are you talking about using the MS compiler for that?  Certainly possible, but 
I'm not sure why you'd want to do it - you'd end up with much slower code than 
running the 32-bit compiler.


Cheers,
Simon


Re: 64-bit windows version?

2007-06-20 Thread skaller
On Wed, 2007-06-20 at 08:49 +0100, Simon Marlow wrote:

 I don't think we'll be able to drop the mingw route either, mainly because
 while the MS tools are free to download, they're not properly free, and we
 want to retain the ability to have a completely free distribution with no
 dependencies.

I'm not sure I understand this. MS tools are free to download
by anyone, but not redistributable. The binaries needed by
programs *built* by those tools are not only free to download,
they're free to redistribute, and they're less encumbered than
almost all so-called 'free software' products.

Don't forget -- Windows isn't a free operating system.
You're juggling some possible problem with a single source
vendor withdrawing supply (possible) against open source
products which are late to market (definite :)

64 bit Mingw .. will already be years out of date when
it turns up, since MS is focusing on .NET platform.
MSVC++ tools already support CLR, assemblies and .NET:
even if Mingw supported that .. you'd still need Mono
(does it work, really?) for a 'free' platform .. but .NET
is redistributable and available on most modern Windows
platforms already ..

I doubt the Open Source community is as reliable a supplier
for the Windows market as Microsoft. It's really a boutique 
market. Cygwin was a major platform in the past, for running
Unix software on Windows.

But now we're talking about a Windows *native* version of GHC,
there's no Unix in it. I see no real reason not to build
for the native toolchain .. and plenty of reasons not
to bother with others.

Hmm .. can't MS be coaxed into supplying some support to the
developers? After all, Haskell IS a major lazily evaluated
statically typed functional programming language. Why wouldn't
MS be interested  in bringing GHC on board? They have an
Ocaml (called F#) now..

-- 
John Skaller skaller at users dot sf dot net
Felix, successor to C++: http://felix.sf.net


Re: 64-bit windows version?

2007-06-20 Thread Simon Marlow

skaller wrote:

On Wed, 2007-06-20 at 08:49 +0100, Simon Marlow wrote:

I don't think we'll be able to drop the mingw route either, mainly because while 
the MS tools are free to download, they're not properly free, and we want to 
retain the ability to have a completely free distribution with no dependencies.


I'm not sure I understand this. MS tools are free to download
by anyone, but not redistributable. The binaries needed by
programs *built* by those tools are not only free to download,
they're free to redistribute, and they're less encumbered than
almost all so-called 'free software' products.


The binaries needed by programs built by these tools..., you're referring to 
the C runtime DLLs?  Why does that matter?


Note I said with no dependencies above.  A Windows native port of GHC would 
require you to go to MS and download the assembler and linker separately - we 
couldn't automate that, there are click-through licenses and stuff.



Hmm .. can't MS be coaxed into supplying some support to the
developers? After all, Haskell IS a major lazily evaluated
statically typed functional programming language. Why wouldn't
MS be interested  in bringing GHC on board? They have an
Ocaml (called F#) now..


MS pays for Ian Lynagh, who works full time on GHC as a contractor.  MS puts 
roughly as much money into GHC as it does into F#, FWIW.


Cheers,
Simon


Re: 64-bit windows version?

2007-06-20 Thread Neil Mitchell

Hi


 I'm not sure I understand this. MS tools are free to download
 by anyone, but not redistributable. The binaries needed by
 programs *built* by those tools are not only free to download,
 they're free to redistribute, and they're less encumbered than
 almost all so-called 'free software' products.

The binaries needed by programs built by these tools..., you're referring to
the C runtime DLLs?  Why does that matter?

Note I said with no dependencies above.  A Windows native port of GHC would
require you to go to MS and download the assembler and linker separately - we
couldn't automate that, there are click-through licenses and stuff.


I don't compile GHC on Windows, as it's kind of annoying to do, and the
binaries are usually sufficient for my needs. Typically MS tools are
well packaged and even if there is a click through license, it usually
involves checking a box and clicking next. I can't believe that anyone
is going to have any difficulty installing Visual Studio express.

Compare this to Cygwin/Mingw where the packaging is frankly awful, and
makes my head hurt every time I have to install it.

I'm looking forward to having GHC built with Visual Studio, but I can
understand why its not a priority - the advantages are relatively
minimal. What I keep hoping is that Microsoft will put some serious
thought into debugging Haskell - the MS tools for debugging blow away
everything else. (I realise a start is being made in GHCi, and am
looking forward to the end results!)

Thanks

Neil


Re: 64-bit windows version?

2007-06-20 Thread skaller
On Wed, 2007-06-20 at 14:42 +0100, Simon Marlow wrote:

 The binaries needed by programs built by these tools..., you're referring to
 the C runtime DLLs?  Why does that matter?
 
 Note I said with no dependencies above.  A Windows native port of GHC would 
 require you to go to MS and download the assembler and linker separately - we 
 couldn't automate that, there are click-through licenses and stuff.

So what? Felix requires:

(a) C/C++ compiler
(b) Python
(c) Ocaml

you have to download and install these tools on ANY platform,
including Ubuntu Linux. gcc isn't installed on a basic system.
True, with Debian, this can be automated, so you only have
to click on the main package.

I need THREE external tools. Is this a pain? YES!
[On Windows .. it's a breeze on Ubuntu .. :]

Is it too much effort to ask, for someone to use a major
advanced programming language like Haskell? 

Don't forget .. Mingw has to be installed too .. and in fact
that is much harder. I tried to install MSYS and gave up.

 MS pays for Ian Lynagh, who works full time on GHC as a contractor.  MS puts 
 roughly as much money into GHC as it does into F#, FWIW.

I'm happy to hear that!

Now let me turn the argument around. Mingw is a minor bit player.
The MS toolchain is the main toolchain to support. C++ can't
run on Mingw for example (MS and gcc C++ are incompatible).

GHC needs to target *professional windows programmers*.
They're going to have VS installed already. Haskell is far
too important a language (IMHO) not to have an entry in
the commercial programming arena.

Commercial programming is in a bad way! It NEEDS stuff like
Haskell available.

BTW: I don't really like Windows .. but I want to see Haskell
succeed. Trying to do Haskell on Windows without MSVC++ toolchain
is like trying to work on Linux without binutils... :)


-- 
John Skaller skaller at users dot sf dot net
Felix, successor to C++: http://felix.sf.net


Re: 64-bit windows version?

2007-06-20 Thread Simon Marlow

Neil Mitchell wrote:

Hi


 I'm not sure I understand this. MS tools are free to download
 by anyone, but not redistributable. The binaries needed by
 programs *built* by those tools are not only free to download,
 they're free to redistribute, and they're less encumbered than
 almost all so-called 'free software' products.

The binaries needed by programs built by these tools..., you're
referring to the C runtime DLLs?  Why does that matter?

Note I said with no dependencies above.  A Windows native port of
GHC would require you to go to MS and download the assembler and
linker separately - we couldn't automate that, there are
click-through licenses and stuff.


I don't compile GHC on Windows, as it's kind of annoying to do, and the
binaries are usually sufficient for my needs. Typically MS tools are
well packaged and even if there is a click through license, it usually
involves checking a box and clicking next. I can't believe that anyone
is going to have any difficulty installing Visual Studio express.

Compare this to Cygwin/Mingw where the packaging is frankly awful, and
makes my head hurt every time I have to install it.


Not a fair comparison - I'm talking about *users* of GHC, who currently do not 
have to download anything except GHC itself.  With a Windows native port they'd 
have to also get VS Express and the MASM package separately.


GHC *developers* wouldn't be any better off either.  You'd still need either 
Cygwin or MSYS for the build environment.  There's no way I'm using MS build 
tools, ugh.


Cheers,
Simon


Re: 64-bit windows version?

2007-06-20 Thread Simon Marlow

skaller wrote:


GHC needs to target *professional windows programmers*.
They're going to have VS installed already. Haskell is far
too important a language (IMHO) not to have an entry in
the commercial programming arena.

Commercial programming is in a bad way! It NEEDS stuff like
Haskell available.

BTW: I don't really like Windows .. but I want to see Haskell
succeed. Trying to do Haskell on Windows without MSVC++ toolchain
is like trying to work on Linux without binutils... :)


This is a fine point, and probably the biggest reason for doing a Windows native 
port.  I'd like to see it happen, but we need help!


Cheers,
Simon



Re: 64-bit windows version?

2007-06-20 Thread Isaac Dupree

Neil Mitchell wrote:
 Typically MS tools are
 well packaged and even if there is a click through license, it usually
 involves checking a box and clicking next. I can't believe that anyone
 is going to have any difficulty installing Visual Studio express.

I would have some difficulty, because I would feel obliged to read the
license first and decide whether it felt acceptable for me to agree to.
 That's the same reason I haven't started up iTunes on my MacBook -
reading the general Apple-software license was tiring enough!  This is
one place GNU/Linux/(other Free systems) really shine, even compared to
OS X: you don't have to explicitly accept a click-through license the
first time you start everything up (iterated for every new installation
and computer, and they don't tell you whether it's the same version of
the license that you read earlier). (Copyright doesn't require you to
click to agree; I've already read the GPL and a few other Free licenses;
and I trust the FSF's judgment in what freedoms the other miscellaneous
Free licenses grant.)

But I guess that doesn't matter to most Windows users... even if
they're developers of FOSS ... ?

Isaac


RE: 64-bit windows version?

2007-06-20 Thread Green Bryan - bgreen
I would be more than happy to help.  Maybe we need to get a sub-team
together and start plowing through this mine-field?

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Simon
Marlow
Sent: Wednesday, June 20, 2007 10:29 AM
To: skaller
Cc: glasgow-haskell-users@haskell.org; Bulat Ziganshin
Subject: Re: 64-bit windows version?

skaller wrote:

 GHC needs to target *professional windows programmers*.
 They're going to have VS installed already. Haskell is far
 too important a language (IMHO) not to have an entry in
 the commercial programming arena.
 
 Commercial programming is in a bad way! It NEEDS stuff like
 Haskell available.
 
 BTW: I don't really like Windows .. but I want to see Haskell
 succeed. Trying to do Haskell on Windows without MSVC++ toolchain
 is like trying to work on Linux without binutils... :)

This is a fine point, and probably the biggest reason for doing a
Windows native 
port.  I'd like to see it happen, but we need help!

Cheers,
Simon



Re: 64-bit windows version?

2007-06-20 Thread Isaac Dupree

skaller wrote:
 (MS and gcc C++ are incompatible).

is this still true? GCC has been standardizing its C++ ABI for a while,
and I think there actually weren't any ABI changes noted between 4.1 and
4.2 for most platforms (I don't know if MS C++ is compatible with that
common ABI though).  I could be confused here though.

 BTW: I don't really like Windows .. but I want to see Haskell
 succeed. Trying to do Haskell on Windows without MSVC++ toolchain
 is like trying to work on Linux without binutils... :)

yes, binutils written in Haskell!  Will never happen!  :))

Isaac


Re: 64-bit windows version?

2007-06-20 Thread Peter Tanski




skaller wrote:


On Tue, 2007-06-19 at 12:23 +0100, Simon Marlow wrote:


Bulat Ziganshin wrote:


Hello glasgow-haskell-users,

are you plan to implement 64-bit windows GHC version?

The main thing standing in the way of this is the lack of a 64-bit port of mingw.



Why do you need mingw? What's wrong with MSVC++?


The largest problem is the build system: GHC uses autoconf with  
custom makefiles.  I have looked into porting the whole thing to a  
Visual Studio project, using SCons (unreliable), CMake (limited  
command line abilities--good for a one-shot build but really just a  
safe lowest-common-denominator version of Make), Waf (another  
python-based build system that started as a fork of SCons for the  
KDevelop changeover from Autotools) and Jam.  I would prefer to use  
Jam but I'm afraid I would be the only one who would ever want to  
support it.  Nothing has the auto-configuration abilities you (John)  
built into the Felix Interscript-based system but I do not porting  
the build system (at least) to Interscript would go over well with  
anyone else who wanted to maintain it and the build itself would  
require heavy customisation.  I have tested all of these on a small  
scale (the replacement-Integer library).  The best option seems to be  
to create a VS project (not trivial--lots of glue) so a user may also  
call that from Make (if under Mingw) or pure DOS.


There is also some gcc-specific code in the RTS (inline assembler,
use of extern inline, etc.)  By the way, as of gcc-4.2 (I believe; I
know it is true for gcc-4.3) the use of 'extern inline' now conforms
strictly to the C99 standard, so we will have to add the option
'-fgnu89-inline' to get the old behaviour back--'extern inline' is
used in some of the headers.  Converting those 'extern inline's to
'static inline' or, better yet, plain 'inline' would also solve the
problem.  Ian Taylor's message at
http://gcc.gnu.org/ml/gcc/2006-11/msg6.html describes this in greater
detail; his proposal was implemented.


I don't think we'll be able to drop the mingw route either, mainly
because while the MS tools are free to download, they're not properly
free, and we want to retain the ability to have a completely free
distribution with no dependencies.


I don't know of any completely free 64-bit compilers for Windows.   
The Intel compilers are free for 30-day evaluation but everything  
else is for Win32.  For the base Win32-native port there are many  
compilers available but I have mostly worked on using CL and Yasm  
(assembler) as replacement back-end compilers for GHC.


There are people that want a Cygwin port too; personally I think
this is heading in the wrong direction, we want to be more native on
Windows, using the native object format and interoperating directly
with the native Windows tools.


Cygwin has a real problem with gcc: it is far behind everything else  
(gcc-3.4.4, though Mingw isn't much better) and it doesn't look like  
that will change anytime soon.  It  is also only 32-bit, I believe.


Cheers,
Pete
 


RE: 64-bit windows version?

2007-06-20 Thread Simon Peyton-Jones
|  BTW: I don't really like Windows .. but I want to see Haskell
|  succeed. Trying to do Haskell on Windows without MSVC++ toolchain
|  is like trying to work on Linux without binutils... :)
|
| This is a fine point, and probably the biggest reason for doing a
| Windows native
| port.  I'd like to see it happen, but we need help!


| I would be more than happy to help.  Maybe we need to get a sub-team
| together and start plowing through this mine-field?


That'd be great!  A good way to start might be to start GHC-Trac Wiki page, 
identify who wants to be involved, and sketch the challenges.

thanks!

Simon


Re: 64-bit windows version?

2007-06-20 Thread Peter Tanski

Simon Marlow wrote:
GHC *developers* wouldn't be any better off either.  You'd still
need either Cygwin or MSYS for the build environment.  There's no way
I'm using MS build tools, ugh.


The way I have it set up (so far) is as simple as running configure  
and make--all from the command line, under DOS or Mingw, although  
someone with VS tools may open up the VSproject in the IDE.  Would  
that be o.k.?


I am not particularly enamored with VS myself, but that may be a
consequence of having a small monitor for my Windows machine and  
constantly comparing it to the Xcode/Emacs combination I normally  
use.  The VS debugger *is* very good and helped me pick out some bugs  
in Yasm quickly--when I only really know how to use gdb.


Cheers,
Pete


Re: 64-bit windows version?

2007-06-20 Thread skaller
On Wed, 2007-06-20 at 11:39 -0400, Peter Tanski wrote:

 The largest problem is the build system: GHC uses autoconf with  
 custom makefiles. 

Well, that needs to be fixed. Autoconf and make are rubbish.

  I have looked into porting the whole thing to a  
 Visual Studio project, using SCons (unreliable), CMake (limited  
 command line abilities--good for a one-shot build but really just a  
 safe lowest-common-denominator version of Make), Waf (another  
 python-based build system that started as a fork of SCons for the  
 KDevelop changeover from Autotools) and Jam.  I would prefer to use  
 Jam but I'm afraid I would be the only one who would ever want to  
 support it.  Nothing has the auto-configuration abilities you (John)  
 built into the Felix Interscript-based system but I do not think porting
 the build system (at least) to Interscript would go over well with  
 anyone else who wanted to maintain it and the build itself would  
 require heavy customisation.

The Felix build system is more or less independent of Interscript.
There's NO WAY GHC should be ported to use Interscript.
We don't want to touch any of the source files.

For information of other readers: Felix uses two pieces of advanced
technology for building.

1. Interscript is a general purpose extensible literate programming (LP)
tool. The idea of LP is that code is embedded in documents. Interscript
documents are *.pak files, which when 'tangled' create the actual
sources. Interscript is different to other LP tools in that
you can embed arbitrary Python script inside a document and
use it to *generate* code (or documentation).

This just wipes out rubbish like autotools method of filling
in Makefile.am templates because it is (a) programmatic
and (b) applies to all sources the same way. I regularly use
tables to generate various parts of a program, eg a list of
tokens to generate lexing components as well as a pretty
printing function.

But LP is an invasive technology. You pay for using it:
it pervades the whole software base and it typically defeats
IDE's and syntax colouring etc.

2. The Felix build system is basically a 'debian like' package
manager. Although it hooks interscript, that's just another plugin
of the build system.

The key thing for the building portability is that the C and C++
compilers are represented by Python classes. There is a pre-programmed
class for gcc, and another for MSVC++. The way this works is we have
identified build abstractions like:

* make an object file for static linking
* make an object file for dynamic linking (-fPIC thing)
* make a dynamic link library from object files
* make a static link library from object files
* static link a program
* link a program for dynamic loading

plus some things peculiar to Felix. Each of these functionalities
is represented by a method of the Python class.

So the build scripts are portable, provided you use these methods
on an object of the appropriate compiler class (gcc or msvc).

Similarly, to manage files, scripts say stuff like:

fileparts = string.split(filename,"/")
osfilename = string.join(fileparts,os.sep)

which converts a 'unix' filename to your local OS convention.
I typically mandate Unix style filename literals even on Windows,
but it is tricky to get this right.

To build a library a package typically has a meta-description,
which is itself an executable Python script which is required
to set some variables, such as a list of source files to
be compiled. The build system compiles them using both
the static and dynamic methods, and makes both shared and
static archive libraries.

Clearly GHC will have different requirements to Felix.
I'm not suggesting copying the Felix build system verbatim!

What I actually recommend is:

(a) Pick a portable scripting language which is readily available
on all platforms. I chose Python. Perl would also do. I can't
recommend Tcl, it's too messy. Scheme may be an excellent choice;
I'm only just learning it and I'm not sure about availability,
but it seems really well suited if you can hack all the brackets :)

(b) Write the WHOLE build system using that language.
For parts that differ between OS, you use parameters.
These parameters can be simple things like 

EXE_EXT = .exe # Windows
EXE_EXT =  # Unix

etc, or they can be classes encapsulating complex behaviour.
Too much abstraction is bad .. environments are too quirky,
lots of cases with minor variations is the way to go unless
you're REALLY sure what you have is an abstraction.

(c) provide 'values' for the parameters for the platform
combinations you want to build on

(d) write configuration scripts to create a file of these
parameters  -- if that fails the user can edit it. You can
also supply 'prebuilt' configurations for common platforms.
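
For a driver written in Haskell, for instance, the generated file of
parameters could simply be a module the rest of the build imports --
hypothetical names, purely a sketch of point (d):

-- Platform.hs, written out by the configuration step
module Platform where

exeExt, objSuffix, ccProg :: String
exeExt    = ".exe"   -- ""    on Unix
objSuffix = "obj"    -- "o"   with the GNU toolchain
ccProg    = "cl"     -- "gcc" elsewhere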

BTW: doing all this was more or less mandatory for Felix,
since it is a cross-cross-compiler. The Felix build system
actually splits building 

Re: 64-bit windows version?

2007-06-19 Thread Simon Marlow

Bulat Ziganshin wrote:

Hello glasgow-haskell-users,

are you plan to implement 64-bit windows GHC version?


The main thing standing in the way of this is the lack of a 64-bit port of 
mingw.  The latest status update I could find is here:


http://sourceforge.net/mailarchive/message.php?msg_id=460D8FC1.64E689DB%40dessent.net

Cheers,
Simon



Re: 64-bit windows version?

2007-06-19 Thread skaller
On Tue, 2007-06-19 at 12:23 +0100, Simon Marlow wrote:
 Bulat Ziganshin wrote:
  Hello glasgow-haskell-users,
  
  are you plan to implement 64-bit windows GHC version?
 
 The main thing standing in the way of this is the lack of a 64-bit port of 
 mingw.  

Why do you need mingw? What's wrong with MSVC++?

-- 
John Skaller skaller at users dot sf dot net
Felix, successor to C++: http://felix.sf.net