Re: rpath command?

2000-11-27 Thread Marc van Woerkom

> > Yeah, that's how it works.  I realize this is less than good.  Can you
> > suggest how it ought to work?
> 
> I think the current behavior is fine, since it is indeed invalid to
> create a library or libtool archive out of no source files.  However,
> we might output a warning (or even an error) if we find a request to
> create a library out of no source files.

It should be made clear whether it is a bug or a feature.
A sentence or two in the .info documentation about this
situation would be welcome too - just to prevent people from
wasting time trying to achieve something that is not supposed
to work that way.

Regards,
Marc




Re: how to realize a medium coupled build environment?

2000-11-24 Thread Marc van Woerkom

Hello Stephan,

I looked at the build management in kdebase-2.0. 
Very interesting to see how you guys did it.

Two questions:

1. I am still not sure whether modules should have their
   own configure script.
   My position so far has been that there is only one
   configuration for the whole tree.
   What is the benefit of sub-configure scripts?
   
2. You use this sequence in your Makefile.cvs bootstrap:

@echo "*** Building Makefile templates (step one)"
@automake
@echo "*** Scanning for moc files and kde tags"
@perl admin/am_edit
@echo "*** Building Makefile templates (step two)"
@autoconf

   Why does one need an autoconf call after
   the automake call?
   My belief was that automake is responsible for the
   autoconf calls.
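
   For reference, my reading of what each of those steps generates
   (a commented sketch - please correct me if the am_edit part is off):

   automake             # Makefile.am  -> Makefile.in (one per directory)
   perl admin/am_edit   # KDE-specific: post-processes the generated Makefile.in files
   autoconf             # configure.in -> configure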


Regards,
Marc




Re: how to realize a medium coupled build environment?

2000-11-23 Thread Marc van Woerkom

> What you describe is exactly what KDE uses. 

Yea. Makes perfect sense for such a large project.
Thank you very much for that hint. 

Of the larger open projects I have only had a look at Mozilla
so far. I will check it out.


> We have configure.in.ins in every subdirectory that needs 
> configure checks. 

I suppose that is to generate the AC_OUTPUT list.

The scheme I arrived at lists every conceivable module

mod1/Makefile
mod2/Makefile
 .
 .

in the argument list of AC_OUTPUT.

This is ugly because it generates warnings for every module
that has not been checked out at configure time.

Furthermore, one has to provide

--without-FooBar

switches to cut SUBDIRS down internally to the list of
available modules, otherwise a "make all" will fail.
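
A minimal sketch of one such switch (the module name is from my setup;
how MODULE_SUBDIRS is then fed into SUBDIRS is a separate question I
have not fully solved):

dnl sketch: a --without-Dataserver switch feeding a module list
AC_ARG_WITH(Dataserver,
  [  --without-Dataserver    do not build the Dataserver module],
  [module_dataserver=$withval],
  [module_dataserver=yes])
if test "x$module_dataserver" != xno; then
  MODULE_SUBDIRS="$MODULE_SUBDIRS Dataserver"
fi
AC_SUBST(MODULE_SUBDIRS)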


> The Makefile.cvs you run after checking out creates one 
> configure.in with the used parts and then it all works from 
> there. 

Yes, adding another .in layer for generating configure.in
would have been my next evolution step too.

Together with some scanning of what is there (perhaps every
subdirectory it finds, except certain standard ones), that would
make it unnecessary to

a) provide a list of all possible modules in AC_OUTPUT in advance
b) provide "--with-FooBar" switch logic for all possible modules
   in advance.
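
Roughly, I imagine that scan as something like this (a sketch for a
bootstrap script; the list of directories to skip is made up):

MODULES=""
for d in *; do
  case $d in
    config|macros|CVS) ;;          # standard directories, skip
    *) test -f "$d/Makefile.am" && MODULES="$MODULES $d" ;;
  esac
done
echo "modules found: $MODULES"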

On the other hand, the switches solution has the advantage
that I can keep a full tree with lots of modules around
while telling configure which ones to actually use.

Otherwise I would have to remove the modules I don't need,
so I would have to work on a fresh copy for every
different module subset selection. OK, that sort of throw-away
tree is quite fitting for cvs based development.


> SUBDIRS are created from the existant subdirs with a Makefile.am, 
> and we also have COMPILE_FIRST and COMPILE_LAST to manipulate the 
> subdirs order in case you need libraries for example

From what you write, that seems to cover build order, which makes
sense because not every project can be built in the lexical
order that its directory names suggest.

You mention library dependencies.

I used this construct

dnl  treat module dependencies

if ! ${module_base}; then
  if ${module_dataserver}; then
    AC_MSG_WARN([module Dataserver depends on module Base => Base will be used])
    module_base=true;
  elif ${module_dataserver2}; then
    AC_MSG_WARN([module Dataserver2 depends on module Base => Base will be used])
    module_base=true;
  elif ${module_dataserver3}; then
    AC_MSG_WARN([module Dataserver3 depends on module Base => Base will be used])
    module_base=true;
  fi
fi

to ensure that the build of a Dataserver* module will trigger 
the build of a Base library first.

Again this is ugly, because all dependencies are coded into
configure.in, which might grow into a large list over time.

In a collection scheme I think it would make sense to 
deposit dependency hint files in the module subdirectories
that tell the module collection phase what modules they
depend on:

Base/
   ("I need nothing")
.
.
Dataserver2/
   DepHint ("I need Base")
Dataserver3/
   DepHint ("I need Base")

That way the information would not be kept centrally in one giant
configure.in section but would be kept distributed.
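
A sketch of how the collection phase could resolve such hints (the
DepHint format - one module name per line - is just an assumption,
and this only resolves one level of dependencies):

for m in $MODULES; do
  if test -f "$m/DepHint"; then
    for dep in `cat $m/DepHint`; do
      case " $MODULES " in
        *" $dep "*) ;;                      # already selected
        *) echo "module $m needs $dep - adding it"
           MODULES="$MODULES $dep" ;;
      esac
    done
  fi
done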

Regards,
Marc




Template instantiation and autotools

2000-11-22 Thread Marc van Woerkom

> > buildme.c:
> >   /* void do_really_nothing_useful() { } */
> 
> This is not a valid C program.  A C program must contain at least one
> valid declaration.  Try something like `static int i;'

Ok.

What still bothers me are C++ templates.

Before autoconfication, the project's foo.h slurped in a
foo.c file with the template code.
I renamed those to fooT.h, to have them treated at least
as headers (with automake <= 1.4, only headers may live in different dirs).
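
In Makefile.am terms that approach looks roughly like this (a sketch
with made-up file names; noinst_HEADERS keeps the template
implementations out of the compile list but still distributes them):

noinst_HEADERS    = fooT.h barT.h
libfoo_la_SOURCES = foo.cc bar.cc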

The goat book describes the problem in 15.3.1, but it does not
offer a solution.

The link to that Mozilla portable C++ doc is nice, but if 
I remember right, it basically says use nothing more
recent than Stroustrup 2nd Ed. (no templates, no rtti, no
exceptions..).

So back to all that bs about template instantiation.. 
this is soo depressing.. did anything at all improve since '95? 

Thanks again!
Marc




Re: rpath command?

2000-11-22 Thread Marc van Woerkom

> automake should define it to `$(LIBTOOL) $(CC/CXX)...'.  Maybe it only
> does this when there is at least one source file to compile in the
> current directory?

Bingo!

After introducing these additions

Makefile.am:

libSciFacBase_la_SOURCES = \
   buildme.c

buildme.c:
 
  /*
   void do_really_nothing_useful()
   {
   }
  */
  
it works. 
  
Bizarre, I can't believe it. :-)

This project now features a source tree with customizable modules,
handles module dependencies, builds in a separate build tree, installs
into a separate install tree..

[mvw@pcserver scifac.build]$ ../scifac/configure \
    --prefix=/home/mvw/work/scifac.install \
    --without-Dataserver --without-Dataserver2 --without-Dataserver3

.. and stumbles over an empty _SOURCES list. ROTFL.

Thanks for your remark, Alexandre!
Marc





variable substitution in AC_OUTPUT

2000-11-22 Thread Marc van Woerkom

I need to keep part of the file list for AC_OUTPUT in a variable, but I
can't make it work with automake.

This is the full list

AC_OUTPUT([
Makefile 
macros/Makefile
Base/Makefile
Dataserver/Makefile
Dataserver2/Makefile
Dataserver3/Makefile
src/Makefile
test/Makefile
])

From inetutils-1.3.2 I took this trick:

MY_MODULES="Base Dataserver2"
MY_MAKEFILES="`for D in $MY_MODULES; do echo $ac_n ' '$D/Makefile; done`"

AC_OUTPUT([
Makefile 
macros/Makefile
$MY_MAKEFILES
src/Makefile
test/Makefile
])

The interesting thing is that autoconf seems to be happy with this,
but automake is not:

configure.in: 229: required file `./$MY_MAKEFILES.in' not found

Is there a way to use a variable in AC_OUTPUT that automake
will be happy with when it scans configure.in?
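
In case it helps: the only workaround I can think of is to expand the
list before automake ever sees configure.in, along the lines of the
configure.in.in idea from the other thread (a sketch; the
@MY_MAKEFILES@ placeholder and the file names are made up):

MY_MODULES="Base Dataserver2"
MY_MAKEFILES=""
for D in $MY_MODULES; do
  MY_MAKEFILES="$MY_MAKEFILES $D/Makefile"
done
sed "s|@MY_MAKEFILES@|$MY_MAKEFILES|" configure.in.in > configure.in
aclocal && automake --add-missing && autoconf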

Regards,
Marc





how to realize a medium coupled build environment?

2000-11-22 Thread Marc van Woerkom

Hi!

I need some input (or perhaps you know of an existing example) on
how to set up what I would like to call a "medium coupled build environment",
for lack of a better name.

In my present view, the autotools provide either a "strong coupled" or a
"weak coupled" build environment with regard to subprojects.
Alas, it seems I need something in between.

What do I mean by that?


1. strong coupled modules

This is what typically goes into a project tarball.

Having some directory structure like this one:

  game/
    Makefile.am
    config/
    configure.in
    src/
    snd/
    www/

where we have central Makefile.am and configure.in files, the sources
of e.g. a game in the directory src/, the sources of a sound library in
snd/, and some web related code in www/.

So we have three modules (1 app, 2 libs) or subprojects.

The interesting bit about this is that there is one configuration and
Makefile generation step for the whole package.

In particular, make dependencies are generated among the modules.
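
A sketch of the central configure.in for this layout (reduced to the
bare bones; the file name in AC_INIT is illustrative), together with a
top-level Makefile.am that just says "SUBDIRS = snd www src":

dnl game/configure.in (sketch)
AC_INIT(src/main.c)
AM_INIT_AUTOMAKE(game, 0.1)
AC_PROG_CC
AC_PROG_RANLIB
AC_OUTPUT([Makefile snd/Makefile www/Makefile src/Makefile])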

If we change a line in www/foo.c and try to compile within the src/ 
subdirectory, the change in www/foo.c will be noticed and that source 
recompiled before compilation of stuff in src/ starts. 
This way I can change stuff in all module subdirectories, and still risk 
no getting out of sync (linking an outdated library/objects into the 
main app).

This project setup is typically used by a developer group that works 
on one package, ranging from small GNU hello to godzilla sized mozilla.


2. weak coupled modules

That kind of interaction is used by packages installed on a system.

A game for X11 would use the GUI via header files and libraries installed 
in certain standard locations like /usr/X11R6/include and /usr/X11R6/lib.

This is a rather loose coupling.

Either some header file foo.h is there or not.

With libraries, access is a bit more fine-tuned, via some library version
naming scheme (e.g. libfoo.so.x.y.z under Linux or just libfoo.so.x under
FreeBSD).

Returning to the example scenario from above, this would mean that we
could have three individual source trees for the game and its sound and web
libraries:

  game/
    Makefile.am
    config/
    configure.in
    src/

  snd/
    Makefile.am
    config/
    configure.in
    src/

  www/
    Makefile.am
    config/
    configure.in
    src/

Now, if www/src/foo.c were touched and one compiled in game/src,
there would be no automatic recompilation of the www/ project prior to
the compilation of src/.

What would be possible is that the configure script in game/ checks for
a libwww of a certain version installed in /usr/local/lib (or some other
predefined system wide library location) and stops if it is not there.
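
Such a check could be as simple as this (sketch; the library name and
the www_connect symbol are only placeholders):

dnl refuse to configure without an installed libwww
AC_CHECK_LIB(www, www_connect,
  [LIBS="-lwww $LIBS"],
  [AC_MSG_ERROR([libwww not found - please install it first])])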

Or some outer package management system, like FreeBSD's ports collection,
would know which libraries are required to build the game project, and
would thus fetch, configure, compile and install libwww if the proper
version has not been installed on the system.

Again, getting out of sync between game/ and www/ is prevented, to a
certain degree at least.

While in the strong coupled case a simple change would force recompilation,
now the only changes dramatic enough are bumping the required library version in
game/ or accidentally deinstalling libwww from /usr/local/lib
(or similar).

This is the typical build infrastructure when different developer groups
are involved.


3. medium coupled modules

I got the task of setting up the build infrastructure for a large project that
will consist of a dozen or so subprojects, with a group of developers working
on it.

Discussing this with those guys, I presented the traditional scenarios 1 and 2.
What they want seems to be something in between.

The loose coupled scenario 2 was liked because of the possibility of
checking out only parts of an eventually large source tree,
while a scenario 1 solution would force one to check out all modules.

On the other hand, they still want make dependencies among the
checked out subset of modules.

Tricky.

So what I am trying right now is another configuration step, something
like a source tree configuration, where parts of a larger tree
are molded into a simple scenario 1 type project.

The idea is to allow the user to check out only part of the
source tree, e.g.


  Makefile.am.in
  bootstrap
  config/
  configure.in.in
  game/
  www/

(note that snd/ has not been checked out)

and then have a bootstrap script inspect the tree for checked out
modules and, if that subset makes sense to compile, generate
Makefile.am and configure.in from templates, where e.g. the SUBDIRS variable
is set to

SUBDIRS = (standard stuff) game www

in this example case.
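
Roughly, the generation step of such a bootstrap could look like this
(a sketch; the @MODULE_SUBDIRS@ placeholder and the template names are
made up):

MODULES="game www"                    # result of the directory scan
sed "s|@MODULE_SUBDIRS@|$MODULES|" Makefile.am.in  > Makefile.am
sed "s|@MODULE_SUBDIRS@|$MODULES|" configure.in.in > configure.in
aclocal
automake --add-missing
autoconf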

The hope is that this can be done in such a way that the resulting generated
configure script has the desired dependencies among modules.

Perhaps it is even possible to use automake/autoconf in such a way that
I just list the modules ("game www") in some file, and only the
traditiona

Re: Attempting to unit test

2000-11-22 Thread Marc van Woerkom

> Pavel> Don't start lines in Makefile.am with spaces. Automake will not
> Pavel> recognize them, neither will it warn you.
> 
> Do you think it would be better to warn or to simply recognize lines
> like that?

At least warn.

Regards,
Marc




Re: libraries built from subdirectories

2000-11-22 Thread Marc van Woerkom

> This is possible with the cvs automake.

Have the behavioural differences between 1.4 and 1.4a (cvs automake)
been documented anywhere?

Just running my present project through cvs automake, I found that it
complains about AM_PROG_LIBTOOL (deprecation warning) and the build
broke off.. so I switched back to 1.4 because I did not have much
time for experiments.

Regards,
Marc




Re: documentation

2000-11-13 Thread Marc van Woerkom

> I'd like to know if there is any further documentation about automake.
> 
> I've looked at www.gnu.org/software/automake. Is there any other
> location?

There is a brand new book out about it.

You can read it online as well:

http://sources.redhat.com/autobook

Do yourself a favour and order a copy.

Regards,
Marc





libraries built from subdirectories

2000-11-03 Thread User Marc van Woerkom

Hello,

I have a question regarding automake 1.4/1.4a:
Is it possible to build a single library from source files
that reside in a couple of subdirectories beneath it?

Something like this:

   dir1/libfoo.a
   dir1/dir2/a.c
   dir1/dir3/b.c
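
What I have in mind for dir1/Makefile.am is something like this sketch
(it would require an automake that accepts directory components in
_SOURCES - I am not sure whether 1.4 or only the cvs version does):

noinst_LIBRARIES = libfoo.a
libfoo_a_SOURCES = dir2/a.c dir3/b.c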

Regards,
Marc