Re: Extended test suites

2009-08-29 Thread Ralf Wildenhues
* Ralf Wildenhues wrote on Fri, Aug 28, 2009 at 08:20:30PM CEST:
> * Martin Quinson wrote on Thu, Aug 27, 2009 at 06:33:05PM CEST:
> > TESTS?=regular mandatory tests
> > TESTS_EXTRA=other, more fragile tests.
> > check-extra: $(TESTS_EXTRA)
> >   for d in $(SUBDIRS) ; do $(MAKE) check-extra $$d ; done
> >   TESTS="$(TESTS) $(TESTS_EXTRA)" $(MAKE) check
> 
> Why does this not work?
> 
> TESTS = regular mandatory tests
> TESTS_EXTRA = other, more fragile tests.
> check-extra: $(TESTS_EXTRA)
>   for d in $(SUBDIRS) ; do $(MAKE) check-extra $$d ; done
>   $(MAKE) check TESTS="$(TESTS) $(TESTS_EXTRA)"

Hmm, now that I've slept on it, my suggestion will also override TESTS
in subdirectories, at least when you use GNU make.  You could avoid that
by running an internal rule that doesn't recurse the way check does, or
by overriding the variable for the called instance only; that should
work portably with the suggestion from
<http://thread.gmane.org/gmane.comp.sysutils.automake.general/10961/focus=10962>

Cheers,
Ralf




Re: Extended test suites

2009-08-29 Thread Ralf Wildenhues
[ removed subscriber-only list from Cc: ]

Hello Mike,

please don't top-post, and please lower the wrap column a bit, thank
you.

* Mike Zraly wrote on Fri, Aug 28, 2009 at 05:17:48PM CEST:
> (Hmm, if a test is listed in both XFAIL and TESTS, does it get run
> once with an expectation of failure?)

A test needs to be listed in TESTS to be run, so yes.  The variable to
list expected failures in is XFAIL_TESTS, however, not XFAIL.

>  I don't think generating macro values at runtime is recommended
>  though.

Not quite sure what you mean by this, but it sounds like you're
hinting at GNU make-specific functionality.

Cheers,
Ralf




Re: Extended test suites

2009-08-28 Thread Ralf Wildenhues
Hello Martin,

* Martin Quinson wrote on Thu, Aug 27, 2009 at 06:33:05PM CEST:
> I already know about the XFAIL variable and the possibility to mark that
> some tests are known to fail (and unfortunately, we also have to use
> this). But I'm here speaking of something like a TEST_EXTRA variable and
> a check-extra: make target testing the base tests plus the extra ones.

You could let some tests skip (exit 77) if some condition (in a file) is
or is not set.
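
For concreteness, a minimal sketch of such a test script (the stamp file
name is made up; exit status 77 is what the Automake test driver treats
as SKIP):

  #! /bin/sh
  # Skip this fragile test unless the user opted in, e.g. by creating a
  # stamp file from a configure switch or by hand.
  test -f enable-extra-tests || exit 77
  # ... run the actual, fragile test here ...
  exit 0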

> I tried to do it with something like the following Makefile.am chunk:
> TESTS?=regular mandatory tests
> TESTS_EXTRA=other, more fragile tests.
> check-extra: $(TESTS_EXTRA)
>   for d in $(SUBDIRS) ; do $(MAKE) check-extra $$d ; done
>   TESTS="$(TESTS) $(TESTS_EXTRA)" $(MAKE) check

Why does this not work?

TESTS = regular mandatory tests
TESTS_EXTRA = other, more fragile tests.
check-extra: $(TESTS_EXTRA)
  for d in $(SUBDIRS) ; do $(MAKE) check-extra $$d ; done
  $(MAKE) check TESTS="$(TESTS) $(TESTS_EXTRA)"

It should be portable.  If you want to override portably from
environment variables, you need to use 'make -e'.

Do you know the 'parallel-tests' option from Automake 1.11?  It allows
you to run tests in parallel, and allows some more features like recheck
and partial check runs based on dependencies.  You could use those too
to avoid checking some fragile tests.
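
A sketch of how that can look (test names made up); with the
parallel-tests driver you can then run "make -j4 check" to run tests
concurrently, "make recheck" to re-run only the tests that failed last
time, or "make check TESTS=basic.test" to check just a subset:

  # Makefile.am
  AUTOMAKE_OPTIONS = parallel-tests
  TESTS = basic.test fragile.test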

Hope that helps.

Cheers,
Ralf




Re: Fwd: Making only some targets on cross-compilation

2009-08-27 Thread Ralf Wildenhues
* Philip A. Prindeville wrote on Thu, Aug 27, 2009 at 01:32:33AM CEST:
> Ralf Wildenhues wrote:
> >   echo "SUBDIRS = src utils" | make -f Makefile -f -

> Yeah, I thought about that or some "makefile.wrapper" like:
> 
> SUBDIRS=src utils
> include Makefile

That wouldn't work.  You've got the order of your lines wrong.
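
To illustrate (a sketch, GNU make assumed): the override has to come
after the include, otherwise the SUBDIRS assignment from the included
Makefile wins again.

  # wrapper.mk (name made up)
  include Makefile
  SUBDIRS = src utils

That is also why the one-liner I gave reads the override as a *second*
makefile, after Makefile itself.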

> That said, I have to believe that this is a broad/generic enough issue
> that people must run into it elsewhere, and was wondering if it's worth
> including a fix in the generated Makefiles from automake to not inherit
> SUBDIRS into sub-makes from the parent.

Cross compilation is a broad/generic enough issue that package authors
adjust their packages to allow to build even in that case.  That said,
not building parts of the tree is usually chosen at configure time, not
at make time, precisely because portable make is such an inflexible
beast.

> Passing overriding values via the command-line *is* a basic capability
> in gmake, so if automake is to be robust is should probably do the right
> thing in all cases (or all reasonable cases).

I'm not sure you understand.  Passing overriding values is sufficiently
non-portable to non-GNU make implementations (see the Autoconf manual
section on make portability), and the most common need for an override
is one that affects *all* makefiles and not just the toplevel one, so
trying to support something from automake which isn't really supported
portably by make seems weird.  Besides, I just gave you a working
command line.

Cheers,
Ralf




Re: Fwd: Making only some targets on cross-compilation

2009-08-26 Thread Ralf Wildenhues
Hello Philip,

* Philip A. Prindeville wrote on Wed, Aug 26, 2009 at 08:23:06AM CEST:
> Sent this email out, but then didn't hear back... it struck me that it's
> not an ALSA bug so much as an automake bug.
> 
> The problem is simple: normally, targets like "all" use the
> "all-recursive" indirect target (or "install" and "install-recursive",
> etc), which then looks like:
[...]
> 
> so the problem is this. If you don't want to patch the Makefile,
> Makefile.in, or Makefile.am, but want to only build or install a subset
> of directories (say for instance your building binary only targets for a
> cross-compiled system), then the "include", "doc", and "test" targets
> are useless.

The most common way to address this would be for the developer to add
logic to the Makefile.am not to build these directories when the user
(you) enables some configure switch, whose default could be seeded by
the cross-compiling state.
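
A sketch of that approach (conditional and directory names made up):

  # configure.ac
  AM_CONDITIONAL([CROSS_COMPILING], [test "x$cross_compiling" = xyes])

  # Makefile.am
  if CROSS_COMPILING
  MAYBE_EXTRAS =
  else
  MAYBE_EXTRAS = include doc test
  endif
  SUBDIRS = src utils $(MAYBE_EXTRAS)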

That said, however,

> So you try to build as:
> 
> make all ... SUBDIRS='src utils'

if you're desperate to have it in a command line, you can also
just override SUBDIRS in your own little additional makefile:
  echo "SUBDIRS = src utils" | make -f Makefile -f -

Hope that helps.

Cheers,
Ralf




Re: How to specify a directory for INSTALL, NEWS & Co

2009-08-22 Thread Ralf Wildenhues
Hello Hilco,

* Hilco Wijbenga wrote on Sat, Aug 22, 2009 at 09:01:06AM CEST:
> project/INSTALL
[...]
> project/COPYING
> project/build/Makefile.am
> project/build/configure.ac
> project/src/main.c++
> 
> Automake complains about missing these files because it's looking in
> the current directory (build) instead of the root directory (..). I
> went through the Autoconf macros but didn't find anything obviously
> applicable.
> 
> How do I tell Automake to look in '..' instead of '.'?

You can't, really.  Automake assumes the location of the top
configure.ac file coincides with the toplevel directory of the package;
`foreign' is the way around that.
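
That is, a sketch of the relevant configure.ac line; the `foreign'
option turns off the checks for the GNU-mandated files (INSTALL, NEWS,
COPYING, and friends):

  AM_INIT_AUTOMAKE([foreign])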

Cheers,
Ralf




Re: Integration of git-based release workflow into "make dist"

2009-08-17 Thread Ralf Wildenhues
Hello,

* Peter Johansson wrote on Sat, Aug 15, 2009 at 10:48:44PM CEST:
> Bob Friesenhahn wrote:
> >I have not verified if referencing a file outside of the source
> >tree really works properly.
> >
> My experience is that inclusion of an external file will be ignored
> by Automake, in other words, the include statement ends up in the
> Makefile. It might still work locally, but quite an ugly
> Makefile(.in) to distribute.

In the Makefile.am,
  include $(srcdir)/fragment.am

or
  include $(top_srcdir)/fragment.am

will have the same effect as if the text in the fragment.am file were
copied verbatim into the Makefile.am file.  IOW, automake-special
variables are handled etc.  In addition to this, the fragment.am file
is made a prerequisite of the Makefile.in file, so that an update will
trigger a rerun of automake (except in disabled maintainer-mode of
course).

If you have experienced something different, then please report a bug to
bug-automake list with a small reproducible example.

Thanks,
Ralf




Re: AC_PROG_AS / AM_PROG_AS

2009-08-16 Thread Ralf Wildenhues
Hello,

* NightStrike wrote on Sun, Aug 16, 2009 at 12:08:49AM CEST:
> (autoconf 2.63, automake 1.11)
> 
> Why is AS found with AM_PROG_AS instead of AC_PROG_AS?  Why is this an
> automake thing and not an autoconf thing?

I can only guess that it's historical reasons, same with AM_PROG_GCJ.
Also, both of these macros do not really address portability issues in
the tools, they merely set some variables and adjust Automake's
dependency tracking accordingly.

One problem with doing a truly portable AC_PROG_AS is that the assembler
dialect cannot be abstracted away, so the developer will have to limit
herself to one dialect anyway, or provide several variants of a file.
In the end, the gains seem limited.

Cheers,
Ralf




Re: Integration of git-based release workflow into "make dist"

2009-08-15 Thread Ralf Wildenhues
* Roger Leigh wrote on Sat, Aug 15, 2009 at 12:13:21PM CEST:
> Additionally, VCSes don't typically store the results of "make dist",
> only the source for that.  Due to changing tool versions over the
> years, you can get in a situation where you can no longer rebootstrap
> checkouts of old versions.  By storing this in the VCS as well, you
> guarantee that you can get a usable tree even if those tools are no
> longer available or functional.  Plus, you can also diff the
> generated files between releases as an added bonus, which can help
> track down build infrastructure issues.

I think your solution is pretty cool, and goes to show again how nice it
is that git is so versatile.  I might consider using it for future
Automake releases.

But as to inclusion in Automake proper, I'm still slightly on the side
of "this is git-centric and probably a bit Debian-centric" in the sense
that other distros or other developers might want to choose to do it a
bit differently, and I'd rather have Automake offer generic solutions.
Are you coordinating with other distros?  Some feedback from others
would certainly not hurt here.

Thanks,
Ralf




Re: using color-tests backwards-portably

2009-08-14 Thread Ralf Wildenhues
Hello Ben,

* Ben Pfaff wrote on Fri, Aug 14, 2009 at 06:33:14PM CEST:
> As an alternative, could Automake provide an API that allows
> users to say "if feature X is supported, then expand this
> configure.ac code"?  For example:
> 
>   AM_FEATURE_PREREQ([color-tests],
>     [AM_INIT_AUTOMAKE([foreign color-tests])],
>     [AM_INIT_AUTOMAKE([foreign])])
> 
> This seems to me more in keeping with the Autoconf philosophy.

Yes, the idea sounds better.  But without also exposing something like
AM_SET_OPTION, it would not scale: with the above, you might have
exponentially many cases to cater to as user (e.g., 8 for 3 features,
if you really want to be fully version-agnostic by ignoring that feature
X was introduced after feature Y).

One major issue with this idea is that aclocal, or the macro code
supplied with Automake, would then need to know about the set of options
that are supported.  It doesn't do so now; only _process_option_list in
lib/Automake/Options.pm, used at automake run time, knows.  Also, the set
of options isn't just a set of fixed strings, but includes some regexps.

I would hate to have to duplicate this functionality in m4, esp. since
we definitely also need it at automake run time for those options listed
in the AUTOMAKE_OPTIONS Makefile.am variable.

Cheers,
Ralf




Re: Integration of git-based release workflow into "make dist"

2009-08-14 Thread Ralf Wildenhues
Hello Roger,

* Roger Leigh wrote on Fri, Aug 14, 2009 at 11:40:18AM CEST:
> 
> An initial implementation follows.  This works, but it does need
> further refinement (error checking, for example).  And probably
> review by a git expert.  I'm sure other people can make it much
> nicer, but this hopefully demonstrates the point.

Thanks.  Not to sound discouraging, but this seems to cope mostly
without any Automake internals (am__remove_distdir is a rather benign
issue) and without needing special help from Automake.  That means it can
easily live outside of Automake proper for a while at least, until it
is settled and useful for more than a couple of projects or distros,
as Bob already noted.  You can keep it synchronized across projects
easily by putting the code into a fragment.am file and including that
into your toplevel Makefile.am, and synchronizing the fragment, no?

You could consider submitting this fragment as a gnulib module (where
you just place your code into the Makefile.am part of the modules/..
file).  Or as part of maint.mk, if the gnulib crowd likes that better.

> The distributed release is put on a distribution branch, and its
> parents are both the previous release and the current head.  i.e.
> it's a merge of the old distributed release and the current release.
> This lets you do easy merging of changes between both branches, with
> a correct history.

Nice idea, thanks!

Cheers,
Ralf




Re: using color-tests backwards-portably

2009-08-13 Thread Ralf Wildenhues
Hi Jim,

* Jim Meyering wrote on Thu, Aug 13, 2009 at 08:29:21AM CEST:
[...]
> systems with automake older than 1.11.
> On those systems, it's fine to ignore or disable these
> two unsupported options.

Yes.  Often, automake's rejection of unsupported options is stricter
than it would need to be.  Unfortunately, I don't know of a general way
to solve that, as just ignoring (and warning about) them is definitely
not strict enough: some options cause incompatible behavior that may be
desired.

> I proposed a configure.ac-modifying "solution",
> 
> https://www.redhat.com/archives/libguestfs/2009-August/msg00160.html
> 
> but it's too ugly.
> 
> Can you think of another way?

I think Automake should provide an API to allow users to say "if the
Automake version is >= X, then expand this configure.ac code".  I think
that would be general enough (it could use Automake conditionals to
adjust Makefile.am files, it could check for >= X and not >= X+1 to
enforce exact versions (or we could just provide a general version
compare function) and it could then call AM_INIT_AUTOMAKE with the
appropriate options).  Also, it would be explicit enough for the
developer to be conscious about not using this by accident.

However, it would likely fool any tools that directly grep the
configure.ac files for AM_INIT_AUTOMAKE invocations, instead of using
  autom4te --language=Autoconf-without-aclocal-m4 --trace=AM_INIT_AUTOMAKE

and even that would probably not give the right answer then (as
aclocal.m4 contents will be needed in order to provide it).

The initial patch below ought to improve the situation for future
Automake versions (but unfortunately not for past ones).  With it,
you will be able to write code like

  AM_VERSION_PREREQ([1.12],
    [# this code assumes Automake 1.12
     AM_INIT_AUTOMAKE([foreign fancy-new-option])],
    [# this code does not assume 1.12
     AM_INIT_AUTOMAKE([foreign])])

Does that look like a viable approach to people?

Of course, just like m4_version_prereq, it violates the Autoconf
principle that you should test the actual feature you would like to use,
rather than some version number.  I don't know how to solve that without
another layer of indirection, which would amount to something you
outline in the URL above.  (The other rationale why it may not be so bad
here is that luckily, distros seem to provide pretty much plain upstream
Automake these days, without many patches tacked on and features added,
so that a version number actually retains some meaning.)


What you can do in the meantime to have things work well for you with
git Automake (once that provides AM_VERSION_PREREQ) is something like
  m4_ifdef([AM_VERSION_PREREQ],
   [ use AM_VERSION_PREREQ like above ... ],
   [ backward compat code ... ])

and you could even extend the idea to work back to 1.11 by testing for
some macro that was new in 1.11, such as AM_SILENT_RULES or AM_COND_IF.
However, there is a glitch: in case these macros are defined, i.e., the
version is >= 1.11, you have to also actually use them in your
configure.ac, so that not only aclocal (who searches in the
.../share/aclocal-X.Y directory by default) finds them as defined, but
also actually includes their definition into the aclocal.m4 file, so
that a subsequent autoconf (who doesn't search .../share/aclocal-X.Y)
finds them defined, too.  Clear?
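
A sketch of that 1.11 variant (the options passed to AM_INIT_AUTOMAKE
are just an example); AM_SILENT_RULES makes a convenient probe because
it is new in 1.11, and calling it with `no' keeps the traditional
verbose output as the default:

  m4_ifdef([AM_SILENT_RULES],
    [AM_INIT_AUTOMAKE([foreign color-tests])
     AM_SILENT_RULES([no])],
    [AM_INIT_AUTOMAKE([foreign])])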

As an unrelated side note, we don't use @defmac to define our macros,
which can cause bugs when used with @dvar on a @-continued definition
line, in the manual.  This should be fixed.

Cheers,
Ralf

New macro AM_VERSION_PREREQ.

* m4/amversion.in (_AM_AUTOMAKE_VERSION): New define.
* m4/prereq.m4 (AM_VERSION_PREREQ): New file, new macro,
with code taken from Autoconf's m4_version_prereq.
* tests/prereq.test: New test.
* tests/Makefile.am: Adjust.
* doc/automake.texi (@dvar): New texinfo macro, taken from
Autoconf.
(Public Macros): Document AM_VERSION_PREREQ.
* NEWS: Update.
Based on suggestion from Jim Meyering.

diff --git a/NEWS b/NEWS
index 7e14ed8..fe11327 100644
--- a/NEWS
+++ b/NEWS
@@ -1,5 +1,10 @@
 New in 1.11a:
 
+* Miscellaneous changes:
+
+  - New macro AM_VERSION_PREREQ to allow expanding different code at M4 run
+    time based upon the Automake version used.
+
 Bugs fixed in 1.11a:
 
 * Bugs introduced by 1.11:
diff --git a/doc/automake.texi b/doc/automake.texi
index b3f4a76..9a489ef 100644
--- a/doc/automake.texi
+++ b/doc/automake.texi
@@ -15,6 +15,14 @@
 @r{[}@var{\varname\}@r{]}
 @end macro
 
+@c @dvar(ARG, DEFAULT)
+@c ---
+@c The ARG is an optional argument, defaulting to DEFAULT.  To be used
+@c for macro arguments in their documentation (@defmac).
+@macro dvar{varname, default}
+@r{[}@var{\varname\} = @samp{\default\}@r{]}@c
+@end macro
+
 @copying
 
 This manual is for @acronym{GNU} Automake (version @value{VERSION},
@@ -3

Re: Integration of git-based release workflow into "make dist"

2009-08-13 Thread Ralf Wildenhues
Hello Roger,

* Roger Leigh wrote on Fri, Aug 14, 2009 at 01:07:39AM CEST:
> What can automake do?
> 
> This is a rough outline of what I'd like to do (unless someone beats
> me to it!)
> 
> • Add a dist-git option and Makefile target.
>   This will cause $distdir to be injected into git, rather than just
>   calling tar as for other git targets.

For those bits that are pure additions, it should be possible, as a
test, to just write these rules out in your toplevel Makefile.am, for
your specific project.  Doing so once, and looking at the result and its
semantics can greatly help eliminate design errors.

IOW, show exactly what rules you would like automake to generate for
you, and which other things change, that make it much easier to comment
on.  Then, as long as these changes are pure additions (rather than
changes to existing generated text), it is easy to try them out for a
while in a couple of projects and gain experience with them.

> • This will require some additional options in order to work correctly:
>   · A branch name (the head to append the new tree to)
>   · [optional] Tag name, could be a pattern such as dist/$(VERSION)
>   · [optional] Flag for signing the tag or not
>   · [optional] Template commit message
>   These could all just be variables in the top-level Makefile.am.

As to package versioning, see this discussion and references therein:

as well as the approach currently used in gnulib's GNUmakefile.
(I still haven't gotten very far with a nonworking patch on this.)

The gnulib machinery also has sample rules for distribution in maint.mk;
it is helpful to look at them and see what different semantics you need
etc.

Cheers,
Ralf




Re: distcheck warning: libtool --finish...

2009-08-13 Thread Ralf Wildenhues
Hello Peter,

* Peter Johansson wrote on Thu, Aug 13, 2009 at 08:27:56PM CEST:
> 
> I use automake and libtool in a project and when I run `make
> distcheck' I get the following warning sent to stderror:
> 
> libtool: install: warning: remember to run `libtool --finish
>   /home/peter/projects/pristine/yat-trunk/build/yat-0.6pre/_inst/lib'
> 
> Am I doing something wrong here or why do I get that warning? It seems
> I get the same warning when I issue:
> 
>  make install DESTDIR=`pwd`/tmp > stdout

Problem is, when you install like that, libtool cannot run $finish_cmds,
as that typically (e.g., 'ldconfig -n' on GNU/Linux) requires libraries
to show up at their final location, not in a temporary staging
directory.  So once the libraries are in their final location, you can
either wait for a reboot (or ldconfig) to happen by some system
mechanism, or you can run the --finish command line to achieve the same
goal.

You can safely ignore the warning during distcheck, as that just
exercises a DESTDIR install.

Cheers,
Ralf




Re: aclocal.m4 location

2009-08-11 Thread Ralf Wildenhues
Hello Sam,

* Sam Steingold wrote on Tue, Aug 04, 2009 at 06:06:03PM CEST:
> automake requires aclocal.m4 to be present in the current directory.
> this is not the case for me, I have one master aclocal.m4 and a few
> subprojects, each with its own Makefile.am and configure.in.
> this means that for automake to run, I have to create a temp symlink for
> aclocal.m4 and then the generated Makefile requires the symlink to be present
> (otherwise it calls automake which fails because aclocal.m4 is not in the
> current directory).

Hmm, that's not so easy, because aclocal, trying to optimize, puts only
those macros from the Automake package into the aclocal.m4 file which
are used for the Automake support in the current package.  So if one of
your subprojects needs another macro, it couldn't detect that, and your
configure script could not be generated correctly.

OTOH, aclocal.m4 files are typically not large any more if the macro
files they reference are in-tree (then they mostly contain a list of
m4_include()s for those files), so having a few of them around doesn't
hurt that much.

(Note that, if you are not using automake, then you can have a
hand-written aclocal.m4 file, and the above does not apply.)

> Would it be possible to add a command line option to automake to specify the
> location of the aclocal.m4?

Hmm, even having that (without the ability to share generated aclocal.m4
files between packages) could open at least a small can of worms, as a
few of the tools rely on the exact location of this file.  Haven't
checked in detail yet, though.

Hope that helps.

Cheers,
Ralf




Re: Tests fail due to argument list too long

2009-08-11 Thread Ralf Wildenhues
Hello Bob,

this was about:


* Bob Friesenhahn wrote on Sat, Aug 01, 2009 at 10:05:25PM CEST:
> Something surprising about the AIX 4.3.3 argument list length
> problem is that I am not seeing this problem reported anywhere else.
> Not even under MinGW and Cygwin.

I think one reason here might be that GNU make may create a batch file
under w32 in order to execute the actual command.

> Either the problem only occurs under this old AIX or else the
> problem occurs elsewhere but is not detected or reported.

What is the size of your environment on this AIX system?
How much does 'make' add to it?  Are you using GNU make or the system
one, and if the former, does the .NOEXPORT help at all, and if the
latter, does it have a similar feature?

If you control this AIX system, maybe you can increase the limit using
chdev as described on Sven's page (for AIX 5)?

Cheers,
Ralf




Re: Nonrecursive Makefile.am: Ensuring directory creation?

2009-08-11 Thread Ralf Wildenhues
Hello Jack,

* Jack Kelly wrote on Tue, Aug 11, 2009 at 02:37:22PM CEST:
> I am writing a nonrecursive Makefile.am, and cannot find the
> recommended way of requiring that directories in the source tree have
> had their equivalents created in the build tree. This has led me to
> write rules like this:
> 
> 8<---
> doc/fake437-primitive.texi: doc/$(am__dirstamp)
> include/fake437/primitive.h; $(EXTRACT_TEXI)
> --->8
> 
> but I feel that using anything starting with $(am__) is asking for
> trouble by relying on internal details.

Right you are in assuming that.

Unfortunately, Automake doesn't yet provide a good API for directory
stamping.  You can easily use your own stamps, though, which don't
interfere with automake's stamps and rules:

  subdir/generated-file: subdir/my-dirstamp
          commands ...
  subdir/my-dirstamp:
          @$(MKDIR_P) subdir
          @: > $@
  DISTCLEANFILES += subdir/my-dirstamp

Hope that helps.

Cheers,
Ralf




Re: .deps/foo.Po:1: *** multiple target patterns. Stop.

2009-08-10 Thread Ralf Wildenhues
* Sam Steingold wrote on Mon, Aug 10, 2009 at 11:42:57PM CEST:
> On Mon, Aug 10, 2009 at 5:25 PM, Ralf Wildenhues 
> wrote:
> > * Sam Steingold wrote on Mon, Aug 10, 2009 at 11:10:14PM CEST:
> >>
> >> this is a plain cygwin installation.
> >
> > Then you should use /cygdrive/c/... style paths, not C:/... ones.  Your
> > compiler will understand the former (assuming you are using the "cross"
> > compiler that creates pure w32 executables and comes with Cygwin).
> 
> this sounds reasonable, however, what happens if clisp is compiled
> using msys/mingw system instead?

Then you use the compiler that comes from the MinGW web site (it
understands C:/ style paths only) and the shell and runtime from
there (it translates /C/ paths on the command line to C:/ style)
and you use CPPFLAGS=-IC:/... and the like.  All consistent.

Of course you wouldn't use a clisp compiled for MSYS/MinGW on Cygwin,
or vice versa.  Just treat both systems as "very different", requiring
something like cross compilation, and things will look less surprising.

Cheers,
Ralf




Re: .deps/foo.Po:1: *** multiple target patterns. Stop.

2009-08-10 Thread Ralf Wildenhues
* Sam Steingold wrote on Mon, Aug 10, 2009 at 11:10:14PM CEST:
> On Mon, Aug 10, 2009 at 4:39 PM, Ralf Wildenhues wrote:
> > * Sam Steingold wrote on Mon, Aug 10, 2009 at 10:23:50PM CEST:
> >> <http://thread.gmane.org/gmane.comp.lib.gnulib.bugs/18253>
> >
> > Hmm, that doesn't tell which 'make' variant you're using, nor how you
> > invoked configure, or whether this is mixing Cygwin and MinGW tools,
> > or who added that -Ic:/ path, whether it was you or some part of the
> > configure script.  These information bits would help.
> 
> this is a plain cygwin installation.

Then you should use /cygdrive/c/... style paths, not C:/... ones.  Your
compiler will understand the former (assuming you are using the "cross"
compiler that creates pure w32 executables and comes with Cygwin).

Cheers,
Ralf




Re: .deps/foo.Po:1: *** multiple target patterns. Stop.

2009-08-10 Thread Ralf Wildenhues
* Sam Steingold wrote on Mon, Aug 10, 2009 at 10:23:50PM CEST:
> On Mon, Aug 10, 2009 at 4:20 PM, Ralf Wildenhues wrote:
> > If you had given a reference to the discussion where this occurred
> > initially (I know that I've skimmed it before, but as I'm still catching
> > up with mail backlog I am not eager to search for it), I might've even
> > been able to say what exactly your issue is, and whether it is an
> > Automake one.  ;-)
> 
> <http://thread.gmane.org/gmane.comp.lib.gnulib.bugs/18253>

Hmm, that doesn't tell which 'make' variant you're using, nor how you
invoked configure, or whether this is mixing Cygwin and MinGW tools,
or who added that -Ic:/ path, whether it was you or some part of the
configure script.  These information bits would help.

Cheers,
Ralf




Re: .deps/foo.Po:1: *** multiple target patterns. Stop.

2009-08-10 Thread Ralf Wildenhues
Hello Sam,

* Sam Steingold wrote on Mon, Aug 10, 2009 at 09:20:32PM CEST:
> I get the above message:
> .deps/regex.Po:1: *** multiple target patterns.  Stop.
> on mingw with many files.
> it does not seem to prevent building, but it looks scary.
> what does the message mean?

'make' stumbles over file names with a drive spec:

  regex.o regex.o:  \
  [...]
  c:/sds/dev/current/build-mingw-1/gllib/stdint.h \

It seems you are mixing w32 path-aware tools (GCC) with tools not w32
path-aware (make).  Typically, the solution is to use matching sets of
programs, either by using mingw32-make, or using MinGW/MSYS without
Cygwin tools, or using Cygwin fully.  My MSYS make has no problem with
these file names.

If you had given a reference to the discussion where this occurred
initially (I know that I've skimmed it before, but as I'm still catching
up with mail backlog I am not eager to search for it), I might've even
been able to say what exactly your issue is, and whether it is an
Automake one.  ;-)

Cheers,
Ralf




Re: Translation of automake docs into spanish

2009-08-10 Thread Ralf Wildenhues
Hello Santilín,

* santilistas wrote on Sat, Aug 08, 2009 at 01:01:03PM CEST:
> Hi, I'd like to know if there is a translation of the automake docs into
> Spanish.

See my answer regarding the Autoconf manual,
.

Cheers,
Ralf




Re: Tests fail due to argument list too long

2009-07-31 Thread Ralf Wildenhues
* Bob Friesenhahn wrote on Fri, Jul 31, 2009 at 07:29:31PM CEST:
> On Fri, 31 Jul 2009, Ralf Wildenhues wrote:
> 
> >You should be able to work around it by splitting tests in pieces,
> >and running things piecewise only for now; assuming the *_TESTS
> >variables are all nonempty:

> > check-wand:
> >   $(MAKE) $(AM_MAKEFLAGS) check-TESTS \
> >   TESTS='$(WAND_TESTS)' TEST_SUITE_LOG=test-suite-wand.log

> Are you saying that the user would then need to explicitly invoke
> 'check-wand' or is it possible to create a 'check' dependency so
> that all of these would be implicitly invoked in turn?

I think it should be possible to create a 'check' target that invokes
all bits in turn (ignore automake target override warnings with
-Wno-override).

> It looks
> like there would be a bunch of mini-test reports and so the user is
> likely to miss most of the reports since they would have scrolled on
> by.

You could error out after the first failure; or provide a summary at the
end, manually.  As I said, it's a hack for now (and it also uses
internal details like check-TESTS).  I still need to think of a way to
fix this issue properly in a future release.

> It is likely that the number of tests will quadruple over time since
> currently the tests are the bare minimum (only 785 tests) for my
> software.  I don't mind explicitly splitting internally if there is
> still a way for the user to do the standard 'make check' and 'make
> distcheck' (with reliable ultimate failure on test fail) and see a
> total report for all of the tests at the end.

You could use multiple Makefile.am files (gasp!) with one test suite in
each.  Or you could merge multiple tests in one test script/program of
yours.  All hacks, yes, but at least they should get you going for now.

Cheers,
Ralf




Re: Tests fail due to argument list too long

2009-07-31 Thread Ralf Wildenhues
Hello Bob,

* Bob Friesenhahn wrote on Fri, Jul 31, 2009 at 06:24:09PM CEST:
> On a system running the very latest modern release of AIX 4.3.3 I am
> encountering this problem at the end of 'make check':
> 
> PASS: utilities/tests/preview_threshold.sh
> PASS: utilities/tests/preview_wave.sh
> gmake[3]: execvp: /bin/sh: The parameter or environment lists are too long.
> gmake[3]: *** [test-suite.log] Error 127
> 
> The package uses Automake 1.11 with these Automake options:
> 
> AUTOMAKE_OPTIONS = 1.11 subdir-objects parallel-tests color-tests
> dist-bzip2 dist-lzma foreign

Does that error go away if you remove the parallel-tests option?

The list of test logs in GraphicsMagick has a length of 24132 characters
over here, so it is not surprising that it exceeds the AIX 4.3.3 limit
(see ).  But the old test suite driver should have had the same problem,
as it also expands $(TESTS), which isn't much shorter (23348 here).

So far we do not know of a portable way (both to different operating
systems and to different make implementations) to work around this limit
in a general way.  With many Automake constructs, you can use multiple
special variables (see the "Length Limitations" node in the manual), but
such a solution has not been implemented in Automake for tests yet.

You should be able to work around it by splitting tests in pieces,
and running things piecewise only for now; assuming the *_TESTS
variables are all nonempty:

  TESTS = $(WAND_TESTS) $(UTILITIES_TESTS) ...

  check-wand:
          $(MAKE) $(AM_MAKEFLAGS) check-TESTS \
            TESTS='$(WAND_TESTS)' TEST_SUITE_LOG=test-suite-wand.log

  check-utilities:
          ...

I agree that it's a bit of a kludge (and I haven't tested it yet).

Cheers,
Ralf




Re: EXTRA_DIST respects Automake conditionals?

2009-07-30 Thread Ralf Wildenhues
Hello Ben,

* Ben Pfaff wrote on Thu, Jul 30, 2009 at 04:57:08AM CEST:
> I was surprised today to discover that EXTRA_DIST respects
> Automake conditionals.

I think you are the first person to report this, but given that
conditionals generally do cover file creation and installation, while
they only partially cover distribution, I can see that this is a
perfectly natural thing to assume.  In fact, we have at least one open
bug report for the opposite case, where it was undesirable to
distribute a file that was conditionally not used.
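
For reference, a minimal sketch (file and conditional names made up) of
the behavior being discussed: extra-data.txt ends up in the tarball only
if WITH_EXTRA was true when "make dist" ran.

  if WITH_EXTRA
  EXTRA_DIST = extra-data.txt
  endif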

Generally, it might be necessary to introduce more than one kind of
conditionality here, or, in practice, introduce prefixes that declare
that some item is to be conditionalized not only for creation and
installation, but also for distribution.

Not sure whether that will suffice for all kinds of use cases.  For
example, could a user want to let COND1 govern creation and installation
only but COND2 also govern distribution?  If yes, then we'd need to
think about something more general, like another type of conditional.

In the particular case of EXTRA_DIST, there is only one semantic
attached to this variable: distribution.  So, barring the need for
another type of conditional, it would be sufficient to add another
variable that unconditionally distributes things.  Changing the current
semantics seems like way too dangerous a thing to do I think.

As a workaround, as you say, you could use noinst_HEADERS; can't think
of any better off the top of my head.

Cheers,
Ralf




Re: Can 'make ctags' create tags in top_builddir?

2009-07-25 Thread Ralf Wildenhues
* Rhys Ulerich wrote on Tue, Jul 21, 2009 at 02:56:37PM CEST:
> >> Could the ctags -a argument (or some other mechanism) be used to
> >> simulate the inclusion mechanism?
> >
> > AFAIK there is no such mechanism in ctags files, nor in vi or its clones
> > who read tags files.
> 
> It may be possible to use ctags' -a/--append option here.  Maybe having
> 'make ctags' run in $top_builddir do something like rm -f $top_builddir/ctags.
> After that, the 'make ctags' target in all subdirectories appends to
> $top_builddir/ctags using 'ctags -a $top_builddir/ctags'.
> 
> If that vague description sounds useful, I'll take a pass at getting it 
> working.
> Let me know.  If that's not something you think generally useful, I'm happy to
> stick to ctags -R for my own work.

TBH, I'm not sure either what would be the best way.

Can you try out if it works well with emacs' ctags implementation?
What if the package in question is so large that the user might not want
to point her editor at a package-wide tags file, but rather at one in a
subdirectory which only points to symbols within that tree?  This is
especially helpful if other parts of the source tree contain the same
set of symbols, too.

Thanks,
Ralf




Re: how to install library in a specific directory?

2009-07-24 Thread Ralf Wildenhues
Hello,

* John Calcote wrote on Thu, Jul 23, 2009 at 08:02:37AM CEST:
> On 7/22/2009 9:15 PM, A117 wrote:
> >Thank you. I've decided to put the library in /usr/local/lib, while its 
> >header files in /usr/local/include/ezproject.
> >It's strange though /usr/local/lib is in /etc/ld.so.conf (actually in 
> >another file it includes), and I can build other programs acting much as 
> >mine, I have difficulty with mine only. I run ldconfig manually and then it 
> >works. Now I'm releasing my software.
> 
> ldconfig updates the library cache. the /etc/ld.so.conf file is the
> file used by ldconfig to determine which directories to scan for
> libraries. When you add a new library to one of the directories in
> /etc/ld.so.conf, then you need to run ldconfig to ensure that the
> cache is aware of that new library.

FWIW, 'libtool --mode=install' should be running 'ldconfig -n' if you
don't happen to use DESTDIR; otherwise, there is --finish.

Cheers,
Ralf




Re: automake error with filenames

2009-07-21 Thread Ralf Wildenhues
Hello Murray,

* Murray S. Kucherawy wrote on Tue, Jul 21, 2009 at 05:44:48PM CEST:
> sbin_PROGRAMS = openfoo openfoo-util1 openfoo-util2
> 
> One of the source files for "openfoo" is a file called "stats.c".

> if STATS
> sbin_PROGRAMS += openfoo-stats
> openfoo_stats_SOURCES = openfoo-db.c openfoo-stats.c
> endif
> 
> There are indeed two different source files in the package, one called
> "openfoo-stats.c" (used by openfoo-stats) and one called "stats.c"
> (used by openfoo).
> 
> Running "automake" causes this complaint:
> 
> openfoo/Makefile.am: object `openfoo-stats.$(OBJEXT)' created by 
> `openfoo-stats.c' and `stats.c'
> 
> I thought maybe this has to do with filename similarity.  Indeed
> renaming the files, even by removing the hyphens, solves the problem.

The reason for this can be found in
  info Automake "Per-Object Flags"

and the previous node.

> Is this something I can mitigate without renaming files in my package?

You can add something like
  openfoo_stats_CPPFLAGS = $(AM_CPPFLAGS)

as described for a related issue in
  info Automake "Objects created both with libtool and without"

> Also, just to confirm, this is something I can safely ignore, correct?:

That would only be correct if you never intend to have your package
built with a compiler that does not grok the flags "-c -o" together.
These are primarily found on older platforms, and with some MSVC
uses on w32 (I recommend using a wrapper that avoids this issue).

> libfoo/Makefile.am:2: compiling `foo.c' with per-target flags requires 
> `AM_PROG_CC_C_O' in `configure.ac'
> 
> Adding that to my "configure.ac" as it suggests doesn't make the problem go 
> away.

It should, after you have regenerated all autotools files afterwards.

Cheers,
Ralf




Re: CLASSPATH_ENV warning

2009-07-21 Thread Ralf Wildenhues
Hello Braden,

* Braden McDaniel wrote on Fri, Jul 17, 2009 at 06:21:14AM CEST:
> When using -Wall, I get this warning from automake:
> 
> tests/Makefile.am:79: user variable `CLASSPATH_ENV' defined here...
> /usr/share/automake-1.11/am/java.am: ... overrides Automake variable 
> `CLASSPATH_ENV' defined here
> tests/Makefile.am:81:   while processing `dist_noinst_JAVA'
> 
> Is this really appropriate?

Nope.  Will fix.

> CLASSPATH_ENV is documented in the manual as something that is
> user-settable.

Thanks,
Ralf




Re: Can 'make ctags' create tags in top_builddir?

2009-07-21 Thread Ralf Wildenhues
Hello Rhys,

* Rhys Ulerich wrote on Thu, Jul 16, 2009 at 01:10:04AM CEST:
> I noticed that the default 'make tags' target creates a toplevel TAGS
> file that includes all the TAGS files in subdirectories.  This is
> goodness.  At least on automake 1.10.1 it seems that a similar,
> toplevel tags file is not created for the 'make ctags' target.  Is
> this because ctags does not support an inclusion mechanism?

Right.

> Could the ctags -a argument (or some other mechanism) be used to
> simulate the inclusion mechanism?

AFAIK there is no such mechanism in ctags files, nor in vi or its clones
who read tags files.

With Exuberant Ctags, I've come to just use 'ctags -R $(srcdir) .' in
the toplevel build tree.  But -R is not POSIX (and has a different
meaning for emacs ctags, for example).  I'm not sure how to make the
support in Automake more useful.

Hope that helps.

Cheers,
Ralf




Re: docdir with "packages" directory

2009-07-10 Thread Ralf Wildenhues
Hi John,

in addition to what Russ wrote:

* John Calcote wrote on Fri, Jul 10, 2009 at 07:52:39PM CEST:
> 
> 1) Why the controversy?

I wasn't aware of any distro using *packages*.  Surely Debian, its
derivatives like Ubuntu, and RHEL 4 don't use it.

> 2) Is there any movement within the Automake community to move
> toward the LSB directory structure as the default?

Automake just follows what the GNU Coding Standards says for matters
like this.  It's the entity that should be changed, for Automake to
change.  So you should involve the bug-standards list in this,
and otherwise this is mostly an Autoconf issue, too, rather than an
Automake one (but that's a minor detail).

However, so far I don't see a compelling reason for this.  If only some
distros use this, then using configure command line arguments (or a
config.site file) is enough to mandate local policy.

Cheers,
Ralf




Re: problem with make maintainer-clean while doing static linking

2009-07-09 Thread Ralf Wildenhues
Hi Vincent,

* Vincent Torri wrote on Thu, Jul 09, 2009 at 08:28:50AM CEST:
> We have a library that uses modules. These modules can be shared lib
> or can be statically linked to the library. We use that tree
> structure:
> 
> evas
>   src
> lib
>   modules
> loaders
>   png
> 
> the library itself is in evas/src/lib
> the modules (here the png loader) is in evas/src/modules/loaders/png
> We first compile the library, then the modules.

> In evas/src/lib/Makefile.am we have added:
> 
> if EVAS_STATIC_BUILD_PNG
> SUBDIRS += ../modules/loaders/png
[...]
> The problem is that, when doing 'make maintainer-clean', the clean
> is done first in evas/src/lib, hence, with the code in
> evas/src/lib/Makefile.am, in evas/src/modules/loaders/png too.

> 1) Is our use for doing static linking correct ?

Seems ok to me.

BTW, whenever you add @substed@ values to *_LIBADD, automake cannot know
whether these substituted values will contain references to other
uninstalled libraries or objects, to which it would be nice to add make
dependencies so the library is rebuilt when they are updated.  In that
case, what you can do is fill out the corresponding *_DEPENDENCIES
variable yourself, with all library dependencies of course.
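
A sketch of that (all names made up):

  # The module list is AC_SUBSTed by configure, so automake cannot see
  # what it will expand to:
  libevas_la_LIBADD = @STATIC_MODULE_LTLIBS@
  # Spell out the make-level dependencies yourself, so the library is
  # relinked when an in-tree module is updated:
  libevas_la_DEPENDENCIES = ../modules/loaders/png/module.la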

> 2) If yes, what should I do to correct the problem above (more
> precisely, I want 'make distcheck' to succeed, but it's the same
> problem than 'make maintainer-clean')

You can override DIST_SUBDIRS to avoid this.  In your particular case,
you should set it in evas/src/lib/Makefile.am, to the empty value.
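
In other words, something like this sketch (assuming the module
directory is distributed from its regular parent directory):

  # evas/src/lib/Makefile.am
  SUBDIRS =
  if EVAS_STATIC_BUILD_PNG
  SUBDIRS += ../modules/loaders/png
  endif
  # keep dist and the distclean/maintainer-clean recursion from
  # descending into the module directory from here:
  DIST_SUBDIRS =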

Hope that helps.

Cheers,
Ralf




Re: -I. in DEFAULT_INCLUDES

2009-07-06 Thread Ralf Wildenhues
Hello Bob,

* Bob Ham wrote on Mon, Jul 06, 2009 at 01:24:06PM CEST:
> I have a problem due to conflicts between local and system header
> filenames.  This problem comes about because of the addition of -I. to
> the CXXFLAGS of any objects.  I've traced this to a variable called
> DEFAULT_INCLUDES in every Makefile.in:
> 
>   DEFAULT_INCLUDES = -I.@am__isrc@ -I$(top_builddir)
> 
> 
> Why does this -I. exist?  How can I remove it?

See
  info Automake --index-search nostdinc

for information about the Automake option nostdinc.
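
In short, a sketch for the affected Makefile.am (the AM_CPPFLAGS line is
just an example of adding back only the -I flags you actually need):

  AUTOMAKE_OPTIONS = nostdinc
  AM_CPPFLAGS = -I$(top_builddir) -I$(top_srcdir)/include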

Cheers,
Ralf




Re: Coverage check for data files inclusion in dist

2009-07-03 Thread Ralf Wildenhues
Hello Christian,

* Christian Egli wrote on Thu, Jul 02, 2009 at 12:15:22PM CEST:
> I work in a project where we ship a lot of data files. These files sit
> in a subdirectory where Makefile.am lists them explicitly as follows:
[...]

> But as a compromise would it maybe be possible to have some form of
> coverage check that takes the table_files variable and checks if there
> are files in the directory that are not listed (aside from the obvious
> Makefile.am and Makefile)? Anyone have some code snippets to share? I
> checked gnulib but it doesn't seem to contain anything like that.

Well, how can your project tell whether they are complete or not?
That's the test I would cause "make check" to run.
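
If a simple listing check is all you need, a rough sketch (assuming the
data files share a suffix, here `.table', and are listed in
$(table_files)) could hook into "make check" via check-local:

  check-local:
          @cd $(srcdir) && for f in *.table; do \
            case ' $(table_files) ' in \
              *" $$f "*) ;; \
              *) echo "$$f is not listed in table_files" >&2; exit 1 ;; \
            esac; \
          done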

Cheers,
Ralf




Re: automake question

2009-07-01 Thread Ralf Wildenhues
Hello Jerry,

* Jerry Wang wrote on Wed, Jul 01, 2009 at 06:00:24PM CEST:
> Has the if/else statement always been supported by automake?

Conditionals first appeared in version 1.2 from 1997, and else/endif
in 1.5 from 2001.  However, several features need special handling for
being able to appear within a conditional, and for some, that has only
been added rather recently (or not yet at all, such as texinfo outputs).
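
For reference, a minimal sketch (conditional name made up); the
conditional itself must be declared in configure.ac:

  # configure.ac
  AM_CONDITIONAL([DEBUG], [test "x$enable_debug" = xyes])

  # Makefile.am
  if DEBUG
  AM_CPPFLAGS = -DDEBUG
  else
  AM_CPPFLAGS = -DNDEBUG
  endif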

> - I have 2
> machines with the same source code, 1 has an older version of automake, and
> it's failing to build at the if/else statement.

Easiest to say if you state both versions used, and post the Makefile.am
snippet that fails, or a small reproducible example.

Cheers,
Ralf




Re: Circular dependency dropped warning from make in relating to CUDA code

2009-06-29 Thread Ralf Wildenhues
* Adam Mercer wrote on Mon, Jun 29, 2009 at 10:19:59PM CEST:
> On Sat, Jun 27, 2009 at 05:33, Ralf Wildenhues wrote:
> 
> > (Note also that using $< in target rules is not portable to non-GNU
> > make).
> 
> Is there a more portable version?

Using $< in inference rules, as in my example, is portable.
In target rules, you can spell out the input file name
(including an eventual vpath prefix).

> > Here's what I would do:
> >
> > NVCC = nvcc
> > NVCFLAGS = -cuda --host-compilation=c
> > SUFFIXES = .cu .c
> > .cu.c:
> >        $(NVCC) $(NVCFLAGS) $(INCLUDES) $(CPPFLAGS) --output-file $@ $<
> >

Cheers,
Ralf




Re: problems with recursive make target

2009-06-29 Thread Ralf Wildenhues
Hello John,

* johnwohlb...@gmail.com wrote on Mon, Jun 29, 2009 at 09:36:09PM CEST:
> in top/lib/Makefile.am
> SUBDIRS = pika_comm pika_utilities
> # provide a separate recursive target for making tests
> tests : all
>         echo `pwd`;
>         for dir in $(SUBDIRS); do \
>           cd $$dir && $(MAKE) $(AM_MAKEFLAGS) $@ || exit 1; \
>         done
>         echo `pwd`;
> .PHONY : tests

You don't ever 'cd' back out of the first subdirectory, so you can't
find the second:

> make[2]: Leaving directory `/users/wohlbier/MPMC/pika/build/lib/pika_comm'
> /bin/sh: line 1: cd: pika_utilities: No such file or directory
> make[1]: *** [tests] Error 1
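
One way to fix it (a sketch): run each sub-make in a subshell, so the
`cd' does not leak into the next loop iteration:

  tests : all
          for dir in $(SUBDIRS); do \
            (cd $$dir && $(MAKE) $(AM_MAKEFLAGS) $@) || exit 1; \
          done
  .PHONY : tests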

Cheers,
Ralf




Re: Circular dependency dropped warning from make in relating to CUDA code

2009-06-27 Thread Ralf Wildenhues
Hello Adam,

* Adam Mercer wrote on Fri, Jun 26, 2009 at 09:23:03PM CEST:
> make: Circular CudaChisq.cu <- CudaChisq.cu.c dependency dropped.
> 
> The following rule is used to generate the .c file from the .cu source:
> 
> CudaChisq.cu.c: CudaChisq.cu
>   nvcc -cuda --host-compilation=c $(INCLUDES) $(CPPFLAGS) --output-file $@ $<
> 
> and I think that this is the source of the dependency warning. Does
> anyone know how I can "fix" this warning?

Why not name the product file CudaChisq.c rather than CudaChisq.cu.c?
The problem here is that the other half of the circle comes from the
make-internal rule `%: %.c' for compiling single-source programs.
I guess you can also avoid it with `make -r', but that may have other,
unwanted consequences.

(Note also that using $< in target rules is not portable to non-GNU
make).

Here's what I would do:

NVCC = nvcc
NVCFLAGS = -cuda --host-compilation=c
SUFFIXES = .cu .c
.cu.c:
        $(NVCC) $(NVCFLAGS) $(INCLUDES) $(CPPFLAGS) --output-file $@ $<

and then list *.cu files in automake *_SOURCES variables instead of the
*.c files.  You can list *.c files in EXTRA_DIST if you want to have
them distributed (which implies that the output of nvcc should be
portable, which I don't know whether it is the case).

Hope that helps.

Cheers,
Ralf




Re: dvi bug in distcheck target?

2009-06-24 Thread Ralf Wildenhues
* John Calcote wrote on Wed, Jun 24, 2009 at 09:15:10PM CEST:
> It never occurred to me that Automake would
> look inside the texi file to determine the name of the output file,
> but it makes sense.

I don't think it does.  Well, maybe it produces "expected results" for a
large number of use cases, but every time the output of automake depends
upon the contents (or even existence) of other files, we have dependency
issues and are liable for surprises: strictly speaking, in this case,
Makefile.in should depend upon the .texi file, and thus be regenerated
each time the .texi file changes.  However, that would certainly not be
expected by automake users just writing on their manual a bit.

Cheers,
Ralf




Re: dvi bug in distcheck target?

2009-06-24 Thread Ralf Wildenhues
Hi John,

thanks for the report.

* John Calcote wrote on Wed, Jun 24, 2009 at 08:10:02PM CEST:
> It appears that make clean is leaving the dvi file in place. In
> fact, when I manually execute make clean, after make dvi, I get the
> following output:
> 
> test -z "zardoz" || rm -f zardoz
> rm -rf zardoz.aux zardoz.cp zardoz.cps zardoz.fn zardoz.fns zardoz.ky \
>   zardoz.kys zardoz.log zardoz.pg zardoz.pgs zardoz.tmp \
>   zardoz.toc zardoz.tp zardoz.tps zardoz.vr zardoz.vrs \
>   sample.dvi sample.pdf sample.ps sample.html
> rm -f *.o
> 
> It looks like the last line should contain:
> 
>   zardoz.dvi zardoz.pdf zardoz.ps zardoz.html

It would if you had
  @setfilename zardoz.info

in your zardoz.texi file.  Hmm, this is probably a bug in Automake,
but from 'info texi2dvi', I cannot even infer whether it is intentional
that @setfilename does not decide the name of the DVI or PDF output,
and while I think it implies doing so for HTML, I'm not fully sure
either.

Cheers,
Ralf




Re: Dependencies bw LTLIBRARIES at install time

2009-06-23 Thread Ralf Wildenhues
Hi Akim,

* Akim Demaille wrote on Tue, Jun 23, 2009 at 11:52:43AM CEST:
> I might have missed the information, but I believe there are no
> instruction about handling the dependencies bw libraries are install
> time (they all live in the same Makefile.am, and I do not want to
> break it).

Well, we don't have an end-all be-all solution to this issue yet.
I babbled about an idea to solve this a while ago in

but never got around to an actual implementation.

> I have libfoo that depends on libbar, and both are
> installed in different directories.  Too bad, install goes in the
> wrong order, and libtool fails to relink libfoo which it tries to
> install first.  Had both been in the same directory, then according
> to the code, I just have to define then in lib_LTLIBRARIES in the
> right order.

Right.

> install-execenvLTLIBRARIES: install-libLTLIBRARIES

> But of course it does not work: with a simple warning, Automake
> throws away its own install rule, "overriden" by mine.  My work
> around: an
> 
>   @ADDITIONAL_DEPENDENCIES@
> 
> line in Makefile.am which I AC_SUBST with the right content.
> 
> Are there any recommendations on this regard?

Yeah, you can avoid changing configure.ac in your workaround
by hiding the target name in a variable:

  inst_execenv_libs = install-execenvLTLIBRARIES
  $(inst_execenv_libs): install-libLTLIBRARIES

But of course you are still invading Automake private namespace, and
exploiting the bug that automake does not "see" through variables for
rule target names.  I suppose that we might break your hack with the
Automake version that will finally provide a solution for this.

BTW, strictly speaking, you are also missing an opposite workaround for
the uninstall rule (just in case your "make uninstall" races with, or
uses, programs that use these libraries, and you have an older, say
incompatible, set of libraries lying around as well).

Cheers,
Ralf




support for lzip (was: GNU Automake 1.11 released)

2009-06-12 Thread Ralf Wildenhues
[ adding the automake list; as this is a topic of general interest ]

Hello Antonio, and sorry for the delay,

* Antonio Diaz Diaz wrote on Tue, May 19, 2009 at 06:26:53PM CEST:
> Congratulations on the new release.

Thank you!

> Ralf Wildenhues wrote:
>>
>>   - "make dist" can now create xz-compressed tarballs,
>> as well as (deprecated?) lzma-compressed tarballs.
>
> Do you plan to support the creation of lzip-compressed tarballs?
> http://lists.gnu.org/archive/html/automake/2008-11/msg00076.html

Well, to be honest, I regard the addition of lzma support as a mistake
in hindsight.  We should have waited and added xz support only, with a
stable xz release.  It was also a mistake that I did not deprecate lzma
support more prominently in the 1.11 release; will fix that for 1.11.1.

In general, adding support for a new compression format Foo to Automake
incurs several costs, at least with the current way we do things:

- it adds a 'dist-Foo' rule to the toplevel Makefile.in of each package,
- it adds a 'dist-Foo' option to the automake command, which controls
  whether the 'dist' rule also creates tarballs compressed with Foo.
- it tends to promote proliferation of compression formats.  Results of
  this are both increased load on file servers (when several formats are
  offered) and caches/mirrors, and increased pressure on end users (and
  maybe developers, too) to install several decompression tools.

Here's my own personal preference: I'd offer packages in two or at most
three formats only: one that is widely usable (gzip fits this well), and
one that offers a mixture of best compression and lowest extraction work
(so that, with many downloads, bandwidth and CPU cycles are minimized).
In practice, for me that would mean that, in addition to gzip, I'd offer
one other compression format.  zip would be a good fit for portability;
so far I've gone with bzip2 for Automake, but that should probably be
revisited in the light of recent improvements.

Of course, there is not much reason the priorities for Automake users
should equal my personal ones; rather, Automake should be usable for a
broad range of developers, and it should provide a reasonable range of
functionality.

So, where does this leave us for lzip support?  I'm not quite sure
myself.  From a totally non-objective internet search, I see that it
does not have a very large user base; but a comparison with lzma/xz user
base is not conclusive.  From feature set comparison, I don't quite see
the striking advantage that lzip offers over the choices already
available.  If anything, then I don't like encouraging fragmentation of
the API base.

What do other people think?

Thanks,
Ralf




Re: installation error with automake 1.11 but not with 1.10

2009-06-12 Thread Ralf Wildenhues
OK, I know what's happening now, it is an issue with trailing slashes.
Luckily, there is an easy workaround: Please use
  controllerdir = $(libdir)/eina/mp

instead of
  controllerdir = $(libdir)/eina/mp/

in the Makefile.am.  Be sure to run "make clean" once, so that
eina_chained_mempool.la is rebuilt.

I suppose we should make the libtool script be more forgiving about
this, and definitely document this new requirement in the Automake
manual.  I don't see how to easily fix it within Automake alone, without
also giving up the faster install.

Cheers, and thanks again for the report,
Ralf




Re: installation error with automake 1.11 but not with 1.10

2009-06-12 Thread Ralf Wildenhues
Never mind, I can reproduce it with an in-tree deplib.
Working on a fix.




Re: installation error with automake 1.11 but not with 1.10

2009-06-12 Thread Ralf Wildenhues
* Vincent Torri wrote on Fri, Jun 12, 2009 at 08:44:59AM CEST:
> >* Vincent Torri wrote on Thu, Jun 11, 2009 at 11:30:14PM CEST:
> >>test -z "/tmp/lib/eina/mp/" || /bin/mkdir -p "/tmp/lib/eina/mp/"
> >>/bin/sh ../../../../libtool   --mode=install /usr/bin/install -c   
> >>eina_chained_mempool.la '/tmp/lib/eina/mp/'
> >>libtool: install: error: cannot install `eina_chained_mempool.la' to a 
> >>directory not ending in /tmp/lib/eina/mp/

> >Ouch.  You've possibly found the first regression in Automake 1.11.
> >However, I cannot reproduce it with Libtool 2.2.6 nor with Libtool
> >1.5.26 (and I remember to have tried the faster install code with
> >these versions before the Automake 1.11 release).  So which libtool
> >version are you using here?
> 
> libtool git (2.2.7a)

Hmm, weird.

> Note that there is no problem with a "standard" library. Would it be
> possible that there is the bug because -module -avoid-version are
> passed to libtool 2.2.7a, using automake 1.11 ?

Well, I'm trying almost exactly what you posted, and cannot see it.
Which Autoconf version are you using, and how exactly did you invoke
configure, make, and make install?  Maybe it's a problem with trailing
slashes.

Thanks,
Ralf




Re: installation error with automake 1.11 but not with 1.10

2009-06-11 Thread Ralf Wildenhues
Hello Vincent,

thanks for the bug report.

* Vincent Torri wrote on Thu, Jun 11, 2009 at 11:30:14PM CEST:
> We are experiencing a strange error problem when doing 'make
> install' with automake 1.11. The error is:

> test -z "/tmp/lib/eina/mp/" || /bin/mkdir -p "/tmp/lib/eina/mp/"
> /bin/sh ../../../../libtool   --mode=install /usr/bin/install -c   
> eina_chained_mempool.la '/tmp/lib/eina/mp/'
> libtool: install: error: cannot install `eina_chained_mempool.la' to a 
> directory not ending in /tmp/lib/eina/mp/
> 
> If we use automake 1.10, there is not problem. The Makefile.am file is :

Ouch.  You've possibly found the first regression in Automake 1.11.
However, I cannot reproduce it with Libtool 2.2.6 nor with Libtool
1.5.26 (and I remember to have tried the faster install code with
these versions before the Automake 1.11 release).  So which libtool
version are you using here?

Thanks,
Ralf




Re: Making output of make more beautiful

2009-06-10 Thread Ralf Wildenhues
Hello Paul, Vincent,

* Paul Smith wrote on Wed, Jun 10, 2009 at 02:22:51PM CEST:
> On Wed, 2009-06-10 at 14:04 +0200, Vincent Torri wrote:
> > Is it possible to add 2 features :
> > 
> > 1) adding a percentage at the beginning of the line
> 
> People often ask for this, but it's not possible: a percentage of what?

I'll play devil's advocate on this one.

You can communicate to the toplevel `make' invocation, in the same
manner as the job server, the number of targets considered and the
number of sub-makes invoked.  That won't give you a percentage, of
course, as neither needs to be constant over two `make' invocations
nor with different MAKECMDGOALS.  But these two numbers can give a
rough estimate on the progress.

> > 2) displaying the lines with a color according to what is done 
> > (compilation, execution or error)

> The colorization of different types of rules (rather than output) is
> most easily accomplished by automake, by adding that directly to the
> output.

Yes; and in fact, Automake 1.11 has the option 'color-tests' which can
cause the output of 'make check' to be colorized, if you use the builtin
test driver (i.e., TESTS = ...).

For normal build rules, I would go this way only with another option,
and only if there was a useful and easy semantics for this colorization.

For highlighting compiler warnings, I'd just try out the 'silent-rules'
Automake option (Linux kernel-style build output).
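
As a concrete sketch (assuming Automake 1.11; whether you list the
options in AM_INIT_AUTOMAKE or in AUTOMAKE_OPTIONS is a matter of
taste), configure.ac could contain:

  AM_INIT_AUTOMAKE([1.11 color-tests silent-rules])
  AM_SILENT_RULES([yes])

With that, plain "make" prints terse per-file lines, and "make V=1"
restores the full command lines for debugging.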

> However, both of these could also be, and are perhaps most easily,
> accomplished via an extra "colorization" tool, where the output of make
> is piped into this tool which reads stdin, colorizes it, then displays
> it.  This saves a lot of effort and change in make and automake and
> provides a clean and much more easily modifiable colorization
> environment.

Of course; and in fact, editors have been doing this task very well for
years already.  But now that we're already on our way to the dark side,
see above, it's even more difficult to convince users that it's the
wrong path.  ;-)

Cheers,
Ralf




Re: running tests under in a tests/ directory

2009-06-09 Thread Ralf Wildenhues
* Bob Friesenhahn wrote on Tue, Jun 09, 2009 at 09:54:25PM CEST:
> On Tue, 9 Jun 2009, aaragon wrote:
>>
>> AM_LDFLAGS = -L$(top_srcdir)/yafeq -lyafeq
>>
>> and this works fine. Is this the right way to do this?
>
> Any in-tree library dependencies should be specified via LIBADD rather  
> than LDFLAGS.  Using LDFLAGS is a common mistake which results in  
> problems, such as the wrong library being used.  See the documentation.

Also, you should use `$(top_builddir)/yafeq/libyafeq.la' rather than
`-L$(top_srcdir)/yafeq -lyafeq', because the former will correctly
add the file to the dependencies of the targets it is linked to.
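
A sketch of what that could look like in the tests' Makefile.am (the
program name and the include path are made up for illustration):

  check_PROGRAMS = mytest
  mytest_SOURCES = mytest.cpp
  mytest_LDADD = $(top_builddir)/yafeq/libyafeq.la
  AM_CPPFLAGS = -I$(top_srcdir)/yafeq

This way a rebuild of libyafeq.la also triggers a relink of the test
program before it is run.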

Cheers,
Ralf




Re: vpath builds and nobase_include_HEADERS

2009-06-08 Thread Ralf Wildenhues
* Peter Johansson wrote on Mon, Jun 08, 2009 at 06:52:18PM CEST:
> Ralf Wildenhues wrote:
>> Oh.  It seems your first message in this thread didn't make it to the
>> list then.  Dunno whether it was eaten by the spam filter, but it is
>> not sitting in the moderation queue.  You might consider resending
>> (or bouncing) it.
>>
>>   
> I tried a second time on Friday, but seems like it did not get through  
> either. Hope for better luck this time.

Well, it reached me at least, but to get through to the list you'd
probably have to delete generated files from it, as the size of 400K
is a bit over the limit of this list.

Anyway, I think the whole misunderstanding that John had, AFAICS, is
around this:

If you write this in sub/Makefile.am:
  nobase_foobar_HEADERS = sub1/foo.h sub2/bar.h

then that means that in the source or the build tree, there exist files
  sub/sub1/foo.h sub/sub2/bar.h

and upon 'make install', they will be installed in
  $(foobardir)/sub1/foo.h
  $(foobardir)/sub2/bar.h

Thus it is wrong to specify
  nobase_include_HEADERS = \
comp1/comp1.hh \
comp1/processed.hh

in lib/comp1/Makefile.am, because that will then look for files named
lib/comp1/comp1/comp1.hh in the source or the build tree, for
installation into $(includedir).

You can easily fix that either by

- listing the files one level up (in lib/Makefile.am) and possibly not
  needing a Makefile.am in the lib/comp1 directory at all, or by

- just not using nobase_, and fixing up the installation directory,
  in lib/comp1/Makefile.am:
comp1includedir = $(includedir)/comp1
comp1include_HEADERS = comp1.hh processed.hh

Hope that helps.

Cheers,
Ralf




Re: Shared library location

2009-06-05 Thread Ralf Wildenhues
Hello Russell,

* Russell Shaw wrote on Fri, Jun 05, 2009 at 06:42:16PM CEST:
> In my code that isn't installed, i dlopen "my.so":
>
>   ~/home/russ/myproj/src/libdir/my.so
>
> When i "make install", i want the dlopen to get my.so
> from a system location:
>
>   /usr/lib/my.so
>
> What "make" variable can i utilize that has a value
> dependent on whether the package is installed or not?

Typically, projects that create modules use libtool to do so, and use
  libtool --mode=execute -dlopen libdir/my.la ./program ...

to run an uninstalled program, while they use
  program ...

to run an installed program.  The former ensures that dlopening my.so
will find the uninstalled library.

Should you not use libtool, you could still use wrapper scripts for your
uninstalled programs to prefer uninstalled libraries, config files, etc.
The Automake package does this too, with the tests/automa...@version@
and tests/acloc...@version@ wrappers.

Hope that helps.

Cheers,
Ralf




Re: running tests under in a tests/ directory

2009-06-05 Thread Ralf Wildenhues
* aaragon wrote on Fri, Jun 05, 2009 at 09:05:18PM CEST:
> When deliberately making the tests fail, I still get:
> 
> aara...@~/Documents/workspace/cpputils$make check
> Making check in cpputils
> make[1]: Nothing to be done for `check'.
> Making check in tests
> make  check-TESTS
> FAIL: test001
> PASS: testcppblas
> ==
> 1 of 2 tests failed
> See tests/test-suite.log
> Please report to Alejandro Aragon 
> ==
> make[3]: *** [test-suite.log] Error 1
> make[2]: *** [check-TESTS] Error 2
> make[1]: *** [check-am] Error 2
> make: *** [check-recursive] Error 1
> 
> and I removed the double quoting from the AC_INIT macro already.

This looks normal, and expected to me, given that you put your name and
address in the third argument of AC_INIT.  What do you consider
unexpected here?

> One more comment. You forgot to add the diff-driver file to the EXTRA_DIST
> variable.

Thanks!  I'm sure Martin will like to fix this on his web page then.

Cheers,
Ralf




Re: vpath builds and nobase_include_HEADERS

2009-06-05 Thread Ralf Wildenhues
Hi Peter,

* Peter Johansson wrote on Fri, Jun 05, 2009 at 07:50:56PM CEST:
> Ralf Wildenhues wrote:
>> could you please post this on-list unless it has huge size; that way,
>> others can profit (or comment, criticize, help, ...), too.
>>   
> Sorry, I don't understand. I attached the tarball and CC:d the automake  
> list.

Oh.  It seems your first message in this thread didn't make it to the
list then.  Dunno whether it was eaten by the spam filter, but it is
not sitting in the moderation queue.  You might consider resending
(or bouncing) it.

> Are attachments filtered away when automake list is in CC or what  
> do you mean post on-list?

No, attachments should not be filtered.  There is some size limit though
(but it's reasonably high IIRC).

Cheers,
Ralf




Re: Re: vpath builds and nobase_include_HEADERS

2009-06-05 Thread Ralf Wildenhues
Hello Peter,

> On Jun 5, 2009 8:42am, Peter Johansson  wrote:
>
>> I've attached a modified version of your package for which `make  
>> distcheck' works. I'm not sure what structure you desire for the  
>> installed header files, but that should be easy to adjust.

could you please post this on-list unless it has huge size; that way,
others can profit (or comment, criticize, help, ...), too.

Thanks!
Ralf




Re: RFE: allow for computed version number

2009-06-05 Thread Ralf Wildenhues
Hi Bob,

* Bob Friesenhahn wrote on Fri, Jun 05, 2009 at 06:54:16PM CEST:
> On Fri, 5 Jun 2009, Peter Johansson wrote:
>> Bob Friesenhahn wrote:
>>> Doesn't any approach which depends on an automatically generated file 
>>> assure that the version control system is one step out of date? Every 
>>> time you do a 'commit' the version file is one step newer and 
>>> therefore needs to be committed.
>>
>> Your analysis seems to assume that you commit the version file into the VCS.
>
> Yes.  It assumes the model where anyone who receives the package has the 
> ability to build and maintain it similar to the original maintainer.  

But whether the file is distributed in a package tarball or not is
independent of whether the file is put into the VCS or not.

> Some might say that the ability for independent maintenance is part of 
> the essence of free software.  There are complexities if an active 
> version control system (with particular access and user rights) needs to 
> be available in order to produce a release.  It can be easily argued that 
> if a particular instance of a particular version control system is 
> required, that satisfying GPL requirements becomes problematic since the 
> VCS becomes part of the "build scripts" which are required to be 
> delivered.

Well, this line of thought is actually an argument in favor of putting
the version in a file, not putting that file in VCS, but in a
distribution tarball.  That way, the user of the tarball does _not_ need
the VCS, because the file will be up to date, and `cat version-file`
will work even where `git describe` will not.

> If the version comes from a version control system change set ID, and  
> the version info is also distributed in a file, then the version in the 
> distributed file will always be behind what the value would be if it was 
> comitted.

Again, you are mixing up committing the file with distributing it here.
As long as the distribution tarball is generated by 'make', it is
trivially possible to ensure the version file is up to date before
making the tarball.
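
A minimal sketch of one way to arrange that, following the convention of
gnulib's git-version-gen script (file names are illustrative, and recipe
lines need tab indentation, of course):

  EXTRA_DIST = $(top_srcdir)/.version
  BUILT_SOURCES = $(top_srcdir)/.version
  $(top_srcdir)/.version:
        echo '$(VERSION)' > $@-t && mv $@-t $@
  dist-hook:
        echo '$(VERSION)' > $(distdir)/.tarball-version

The tarball then always carries a .tarball-version file recording the
version it was rolled with, whether or not the recipient has the VCS.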

Cheers,
Ralf




Re: failed to create dist targets

2009-06-04 Thread Ralf Wildenhues
* aaragon wrote on Thu, Jun 04, 2009 at 10:11:22PM CEST:
> 
> So I got all my code up and running and now I want to make a file so I can
> use my library in another computer. Now, when I try to make dist-zip, I get:
> 
> aara...@~/Documents/workspace/cpputils$make dist-all
> { test ! -d "-cpputils--0.001" || { find "-cpputils--0.001" -type d ! -perm
> -200 -exec chmod u+w {} ';' && rm -fr "-cpputils--0.001"; }; }
> test -d "-cpputils--0.001" || mkdir "-cpputils--0.001"
> mkdir: illegal option -- c
> usage: mkdir [-pv] [-m mode] directory ...
> make: *** [distdir] Error 64
> 
> It's trying to make a directory that starts with a hyphen. Is this a bug?
> Can I get around this and indicate that I don't want the filename of the
> directory to start with a hyphen?

You can name the package cpputils rather than -cpputils- (change the
first argument of AC_INIT in configure.ac) or override the tarname only
(add a fourth argument to AC_INIT).
See "info Autoconf 'Initializing configure'".

Cheers,
Ralf




Re: RFE: allow for computed version number

2009-06-04 Thread Ralf Wildenhues
Hi Bruno,

> * Bruno Haible wrote on Sun, May 31, 2009 at 12:50:13PM CEST:
> > - with a lazily computed version number (this is for projects where the 
> > version
> >   changes several times a day):
> > 
> > AM_INIT_VERSION([$$version], [version=`sh $(top_srcdir)/gen-version`])

Here's a slightly different idea.  The basic unit of granularity in
makefiles is a target, or a file.  The problem we're facing is that the
version is clumped together with all other information inside
config.status, config.h, Makefile files.  So let's separate it by
declaring a special file that contains the version (whether that be
.version or another name), say, by some AM_VERSION_FILE macro.

You could then experiment with suitable rules to keep the file up to
date, depending on the VCS.  VCS hooks could ensure to update (or
remove, or otherwise outdate) the file.  All rules that need the version
can just depend upon the file, and read from it afterwards, that nicely
avoids any duplicate work.  The file should conceptually live in the
source tree as it is a source property not a build one (though newline
encoding may make this a bit ugly for cross-platform mounts).
Maintainer mode could even be used to avoid updates to this file.

What do you think of this approach?

Thanks,
Ralf




Re: vpath builds and nobase_include_HEADERS

2009-06-03 Thread Ralf Wildenhues
Just because I think my reply is easily misunderstood:

* Ralf Wildenhues wrote on Thu, Jun 04, 2009 at 07:47:50AM CEST:
> * johnwohlb...@gmail.com wrote on Thu, Jun 04, 2009 at 12:24:29AM CEST:
> >
> > from lib1/Makefile.am
> >
> > nobase_include_HEADERS = \
> > $(header_file_list)
> 
> The above is ok, the below is not.

You should keep the above two lines and just drop the next 8 lines:

> > lib1/header1.hh : lib1
> > for file in $(header_file_list) ; do \
> > nobase_file="`echo $$file | sed -e 's?lib1/??'`" ; \
> > $(LN_S) -f @HOST_ACDIR@/src/lib1/$$nobase_file lib1/. ; \
> > done
> >
> > lib1 :
> > mkdir lib1;

and then adjust -I paths and then be happy.

Cheers,
Ralf




Re: vpath builds and nobase_include_HEADERS

2009-06-03 Thread Ralf Wildenhues
Hello John,

* johnwohlb...@gmail.com wrote on Thu, Jun 04, 2009 at 12:24:29AM CEST:
> I'm having troubles with out of tree (vpath) builds, using  
> nobase_include_HEADERS, and *.in files processed by configure.

I think you're making things more complicated than they need to be.

> I have a collection of components in a component library.

> I want to do out of tree (vpath) builds, and maintain the directory  
> structure of the components for the headers on install.

OK.

> First, in order to preserve the directory structure for the headers on  
> install, I added rules such as these to the Makefile.am's
>
> from lib1/Makefile.am
>
> nobase_include_HEADERS = \
> $(header_file_list)

The above is ok, the below is not.

> lib1/header1.hh : lib1
> for file in $(header_file_list) ; do \
> nobase_file="`echo $$file | sed -e 's?lib1/??'`" ; \

(no need for the double quotes in this line BTW)

> $(LN_S) -f @HOST_ACDIR@/src/lib1/$$nobase_file lib1/. ; \
> done
>
> lib1 :
> mkdir lib1;

(directory dependencies don't work like this; they are satisfied by
directories found by VPATH, too)

> where $(header_file_list) has the directory name in the file name and  
> @HOST_ACDIR@ points to the location of configure.

Don't do this.  Just accept that you have parts of the headers in the
build tree (namely those that are generated by or after configure) and
parts in the source tree.  All you need for that to work seamlessly is
to have -I. -I$(srcdir) in your cpp flags (most likely AM_CPPFLAGS).
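
For instance, a sketch for lib1/Makefile.am (the exact -I list depends
on where your headers live):

  AM_CPPFLAGS = -I. -I$(srcdir) -I$(top_builddir) -I$(top_srcdir)

so that generated headers are picked up from the build tree and plain
headers from the source tree.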

The install rules generated by automake will ensure to pick up all
needed files from the build or the source tree.

> This of course makes a directory lib1 in my current lib1 build dir and  
> makes soft links to the header files in the source tree. Then on a make  
> install it puts the headers in the place I want, ie, 
> $prefix/include/lib1.
> I have such a snippet for all of my components.

Just remove it.

> One additional aspect of this is that my source tree has a types.hh.in  
> which is processed by configure and placed in the build directory. I have 
> an additional rule that links the processed file into my lib1 
> subdirectory for install.

configure should just create it in the correct relative location in the
first place, namely in lib1/types.hh IIUC.
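
A sketch of what that means for configure.ac (paths are illustrative;
adjust to your layout):

  AC_CONFIG_FILES([lib1/types.hh])

with the template kept as lib1/types.hh.in in the source tree; configure
then writes lib1/types.hh into the build tree.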

> Now, the problem is that in lib2 headers/sources I want to have includes  
> such as
> #include 
> #include 
>
> And of course I want my -I include path to point to the build location of 
> these headers since some of them have been processed by configure, ie,  
> types.hh.in -> types.hh

Sure.

> The trouble is that as part of my effort to get the 
> nobase_include_HEADERS feature for installing headers in subdirectories, 
> is that those headers now live in lib1/lib1 due to the targets in my 
> Makefile.am and the code won't compile with -I$(top_builddir)/lib1.

Well, you need -I$(top_builddir).  Most likely that path is already
added for you by automake, too.

> This all seems very complicated and it most definitely seems like this  
> combination of vpath/nobase_include_HEADERS/types.hh.in is a pretty usual 
> combination that I shouldn't have to stand on my head for to make it 
> work.

Yep, you're doing too much.  :-)

If that still doesn't help, it is useful to post a small example
package.

Cheers,
Ralf




Re: running tests under in a tests/ directory

2009-06-03 Thread Ralf Wildenhues
* aaragon wrote on Wed, Jun 03, 2009 at 09:22:59PM CEST:
> Ralf Wildenhues wrote:
> > * aaragon wrote on Sat, May 30, 2009 at 07:34:13PM CEST:
> >> check:all
> >>   @echo "Running tests"
> >>   cd $(TSTDIR) && $(MAKE) $(AM_MAKEFLAGS) -s test

> >> Except that if I change 'check: all' to something else, then it
> >> doesn't work anymore.
> > 
> > "doesn't work"  is too vague.  What does not work, what error message do
> > you get, more details please?
> 
> In your first reply you suggested to use check-local, but if I put in the
> Makefile.am in the root director:
> 
> check:check-local

Ah, that's not what I meant, sorry for being imprecise.  I meant to
use

check-local:
@echo ...

IOW, replace 'check' with 'check-local'.  And then, because 'check' will
invoke 'check-local', and 'check' will recurse already anyway, you can
omit the recursion and just let it invoke the check-am target in
tests/Makefile.

> >> Now, from the code that I got from the internet, I really liked the
> >> fact that you don't need to specify the name of the tests in the
> >> Makefile.am file because of the use of wildcards. Is there a way to
> >> make something similar here that is really portable?
> > 
> > Not really; well, at the very least you will need to have a naming
> > scheme such that all tests have some specified suffix(es), for example
> > '.test'.  Your sources would then match '*.test.cpp'.

> Can you be more specific on this? All my files start with the word 'test',
> but I can change the file names so that they end in 'test.cpp' if needed.
> However, I tried to put 
> 
> TESTEXECS = 'test*.cpp'
> 
> in the Makefile.am under the tests directory and it gives me an error:

Yeah, currently it is not possible to use wildcards there.  Sorry for
giving that false impression, I didn't think it fully through.

> Ralf Wildenhues wrote:
> >> If I make one of the tests fail (by deliberately changing one of the
> >> verified files), I get a message saying that a binary operator is
> >> expected:
> >> 
> >> aara...@~/Documents/workspace/cpputils/tests$make check
> >> make  check-TESTS
> >> FAIL: test001
> >> PASS: testcppblas
> >> /bin/sh: line 0: test: Alejandro: binary operator expected
> >> 
> >> 1 of 2 tests failed
> >> See tests/test-suite.log
> >> 
> >> make[2]: *** [test-suite.log] Error 1
> >> make[1]: *** [check-TESTS] Error 2
> >> make: *** [check-am] Error 2
> >> 
> >> Is this normal?
> > 
> > Not sure, it could be a bug in my suggested code or in Automake.  Can
> > you post the output of
> > 
> >   make check SHELL=/bin/sh\ -x
> > 
> > for that, as well as the tests/Makefile.am and the automake version
> > you're using?  Thanks.

> I'm using:
> aara...@~/Documents/workspace/cpputils$automake --version
> automake (GNU automake) 1.11

> If I type 'make check' and the tests are successful, I get the following
> error at the end:
> 
> aara...@~/Documents/workspace/cpputils$make check

> ---
> Running tests
> cd tests && make  -s test
> make[1]: *** No rule to make target `test'.  Stop.
> make: *** [check] Error 2
> aara...@~/Documents/workspace/cpputils$
> 
> 
> I think that error is because of the following line in the Makefile.am in
> the root director:
> 
>   cd $(TSTDIR) && $(MAKE) $(AM_MAKEFLAGS) -s test
> 
> Shouldn't I just use
> 
>   cd $(TSTDIR) && $(MAKE) $(AM_MAKEFLAGS) -s
> 
> ?

Well, if there is no "test" rule defined in tests/Makefile.am, then it
cannot be run.  But automake already provides recursion for you, with
the check rule, so you can just use that and need not build your own.
See above.

> Here I make one of the tests fail deliberately. This is the output of "make
> check SHELL=/bin/sh\ -x" for this case, where you can see the error I was
> referring to in my post at the end:
> 
> aara...@~/Documents/workspace/cpputils$make check SHELL=/bin/sh\ -x
> + test '!' -f config.h

> + msg='1 of 2 tests failed.  See tests/test-suite.log.  '
> + test -n Alejandro 'Aragon '
> /bin/sh: line 0: test: Alejandro: binary operator expected

This indicates that your PACKAGE_BUGREPORT setting from configure.ac
contains a weird kind of quoting, namely both double and single quotes.
Can you omit the single quotes?
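
For example, something along these lines (the address is a placeholder)
avoids the problem, because the third argument then contains no stray
quote characters:

  AC_INIT([cpputils], [0.001], [alejandro@example.org])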

Cheers,
Ralf




Re: running tests under in a tests/ directory

2009-06-03 Thread Ralf Wildenhues
Hello Alejandro,

* aaragon wrote on Sat, May 30, 2009 at 07:34:13PM CEST:
> 
> Thank yo Ralf for replying to my post. The code that I showed in my post was
> indeed taken from that website. The code you provided me is way too advanced
> for me, since I consider myself just a user. The main idea that I got from
> your reply is that this new code won't work with older versions of Automake.

Right.  But as William already replied, you only need Automake on your
development system and not on systems where the code is configured and
built.

> Now, regarding the code that you wrote, I am trying to make it work, so I
> changed the Makefile.am in the root directory to what you suggested
> (almost):
> 
> # test rule
> TSTDIR = tests
> 
> check:all
>   @echo 
> "---"
>   @echo "Running tests"
>   cd $(TSTDIR) && $(MAKE) $(AM_MAKEFLAGS) -s test
>   @echo 
> "---"
> 
> Except that if I change 'check: all' to something else, then it doesn't work
> anymore.

"doesn't work"  is too vague.  What does not work, what error message do
you get, more details please?

> In the tests directory, I changed the Makefile.am to reflect that I use only
> cpp files. I also made the generate-verified-files silent, so that the
> output of that target is:

> Now, from the code that I got from the internet, I really liked the fact
> that you don't need to specify the name of the tests in the Makefile.am file
> because of the use of wildcards. Is there a way to make something similar
> here that is really portable?

Not really; well, at the very least you will need to have a naming
scheme such that all tests have some specified suffix(es), for example
'.test'.  Your sources would then match '*.test.cpp'.

> If I make one of the tests fail (by deliberately changing one of the
> verified files), I get a message saying that a binary operator is expected:
> 
> aara...@~/Documents/workspace/cpputils/tests$make check
> make  check-TESTS
> FAIL: test001
> PASS: testcppblas
> /bin/sh: line 0: test: Alejandro: binary operator expected
> 
> 1 of 2 tests failed
> See tests/test-suite.log
> 
> make[2]: *** [test-suite.log] Error 1
> make[1]: *** [check-TESTS] Error 2
> make: *** [check-am] Error 2
> 
> Is this normal?

Not sure, it could be a bug in my suggested code or in Automake.  Can
you post the output of

  make check SHELL=/bin/sh\ -x

for that, as well as the tests/Makefile.am and the automake version
you're using?  Thanks.

Cheers,
Ralf




Re: RFE: allow for computed version number

2009-06-02 Thread Ralf Wildenhues
Hello Bruno,

* Bruno Haible wrote on Sun, May 31, 2009 at 12:50:13PM CEST:
> Ralf Wildenhues wrote:
> > I have been playing with the idea of having two separate version strings.
> > ...
> > smells of the developer working around limitations of the tools, rather
> > than the tools providing adequate functionality
> 
> Yes, exactly.
> 
> This approach of two pieces is useful for the PACKAGE name - after all,
> even a "GNU Binutils for Fedora" is a "GNU Binutils", and the tarname
> remains 'binutils' -, but not for the version.

For the version strings generated by git, this holds as well.  Which
VCSes do not follow this form of version string append?

> > > Why AC_INIT(PACKAGE, VERSION) is bad
> > > 
> > 
> > I hear you.  When we go and revise newer interfaces, to find out they
> > are not actually better than their older counterparts, we may need to
> > ask ourselves if our strategy to move forward can be improved.  Such
> > moves are painful, and they should be minimized, which in practice may
> > mean that we should think harder when adding new interfaces.
> 
> Not "think harder", but "work in an early feedback loop with the users".
> It may mean announcing more frequent prereleases. Or it can be achieved by
> declaring some new features "experimental" (but documented!) and upgrading
> them to "stable" only at the next release.

Well, but we're doing exactly that.  It may not happen at the pace users
would like, but that's an issue of manpower not of intent.  Also, all
development is done in public on mailing lists, and the git version of
Automake is intended to be usable at all times as well as installable in
parallel with other released versions (though any development snapshot
between release X and X+1 will have the same name).

AC_INIT doesn't really match the early feedback issue, because the
AC_INIT(PACKAGE, VERSION) API is not new at all.  These changes come
from 2001 and before, and the Automake requirements in this area are
also several years old.

For examples of new and experimental interfaces: there is Autotest, or
the parallel-tests framework in Automake 1.11.  And that one did have a
test release and at least some feedback on that (thanks to all that
tested it!).

No, what I really mean is that we introduce new APIs, and half a decade
later users still haven't moved over to them fully, and in practice it
means we have to support both the new and the old API forever; and then
we learn that the old APIs are better suited for some common task.
The proliferation of different APIs for the same tasks is not good.

> for version.o and special CPPFLAGS for version.c. (It is a pity that adding
> special CPPFLAGS for a single file requires the creation of an intermediate
> library.)

Well, if after two years we haven't found a make implementation that
does not grok $(var1$(var2)) as used with silent-rules, we can not only
go ahead and let silent-rules be the default, but also we can start
using this idiom for other, less trivial issues such as per-object
overrides.  However, it would also require the user to acknowledge that
per-object overrides won't create two different objects, which again
makes the user interface so much more difficult conceptually, as that's
yet another leaky abstraction (cf. `info Automake true').

> - with a lazily computed version number (this is for projects where the 
> version
>   changes several times a day):
> 
> AM_INIT_VERSION([$$version], [version=`sh $(top_srcdir)/gen-version`])

These kind of quoting hacks are incredibly ugly and error-prone; they
are difficult to understand even by me, I don't see how this can be
communicated to less experienced users in any sane way.  Hidden in the
implementation is the side condition of disallowed single quotes, how
would someone changing this line later ever be able to infer that?
(Some projects use $(...) style command substitutions throughout, for
them the above will be very confusing.)

Leaky abstractions contribute very visibly to maintenance overhead,
as they require endless re-explaining to new users, on this mailing list
and elsewhere.

> "make dist" computes the version number once, then computes the 'distdir'
> variable from it, and passes it to a sub-make invocation, from where it
> and the 'top_distdir' variable will be propagated to subdirectories.

This, too, is quite error-prone, as non-GNU make implementations don't
propagate overridden variables to sub-makes.

> + noinst_LIBRARIES = libversion.a
> + libversion_a_SOURCES = version.c

FWIW, this line is not needed with recent Automake; but it is useful
for backward compatibility to older versions.

> + libversion_a_CPPFLAGS = $(AM_CPPFLAGS) 

git repo back to previous state

2009-06-02 Thread Ralf Wildenhues
Hello,

thanks to the creators of git and to the administrators of savannah!,
the restoration of the Automake repository was little more work than

  git push --dry-run origin \
   ad-parallel-tests branch-1-10 branch-1.11 master
  # inspect output
  git push origin \
   ad-parallel-tests branch-1-10 branch-1.11 master
  git push --dry-run --tags
  # inspect output
  git push --tags

and has been done now.  Note that I did not use --all, that would have
exposed a bunch of unfinished topics and broken patches.

Cheers,
Ralf




Re: RFE: allow for computed version number

2009-05-30 Thread Ralf Wildenhues
Hi Bruno,

thanks for the good writeup!

* Bruno Haible wrote on Sun, May 24, 2009 at 03:15:47AM CEST:
> It has been mentioned in a discussion [1][2][3]
>   "In the medium to long run, Autoconf should be changed to not depend
>at autoconf run time upon a volatile version string."
> and
>   "the goal is that the version string should _not_ appear in
>config.h, so there should be _no_ configure output that changes in content
>due to a version string change."

For the Automake package itself, I have been playing with the idea of
having two separate version strings.  One, the current VERSION, changes
only after a release, or, more precisely, reflects that we are after the
last tagged version.  The other shows the full per-commit version string.

I fully understand that this may be a weird solution to use in general;
it smells of the developer working around limitations of the tools
rather than the tools providing adequate functionality, and I must admit
to not having put further thought into the issue yet.  This would be
implementable without any changes to the autotools, AFAICS.

> Why AC_INIT(PACKAGE, VERSION) is bad
> 

I hear you.  When we go and revise newer interfaces, to find out they
are not actually better than their older counterparts, we may need to
ask ourselves if our strategy to move forward can be improved.  Such
moves are painful, and they should be minimized, which in practice may
mean that we should think harder when adding new interfaces.

> 2) The propagation/substitution mechanism through 'configure', 'config.status'
>etc. is not adequate for values that change before "make distclean". See
>above.

Good point.

> Temporary hack #1
> =
> 
> In gettext 0.17 I used this mechanism for a computed version number:
> - Use AC_INIT without arguments.
> - Then compute the version number.
> - Then use the two-argument form of AM_INIT_AUTOMAKE.
> 
> -
>   AC_INIT
>   AC_CONFIG_SRCDIR(...)
>   . $srcdir/../version.sh
>   AM_INIT_AUTOMAKE([gettext], [$VERSION_NUMBER])
> -
> 
> The drawback of this approach is that it's not possible to pass
> automake options to AM_INIT_AUTOMAKE. Ralf mentions in [4] that some
> automake options can only be passed this way, not through the Makefile.am.

Note that you can use the optional third argument to AM_INIT_AUTOMAKE,
[NO-DEFINE], to keep PACKAGE and VERSION from showing up in config.h.
(I'm mentioning it here for completeness only, as it of course still
causes autotools to be rerun, all it avoids is the lots of rebuilds due
to changed config.h.)

> It would be good to be able to invoke AM_INIT_AUTOMAKE with options, and
> be able to set the package and version names (with computed values)
> separately.

So that then there would not be any version nor package information in
files generated by automake and autoconf and config.status?

How should 'make dist' work in this case?

Or do you allow package and version information in config.status output,
i.e., letting configure be rerun upon version change, in automake lingo,
  CONFIG_STATUS_DEPENDENCIES = $(srcdir)/../version.sh

?

> I'm *not* asking for a specific way of handling the version in a separate
> file; simply to allow any way of computing the version (e.g. fetching it
> from a shell script, by calling git-version-gen, etc.) Rationale: There
> is more than one type of distributed VCS. Also, we need time to experiment
> with this version propagation in order to find out the best way of doing it.

Understood.

Thanks,
Ralf




Re: makes which break with `silent-rules'

2009-05-30 Thread Ralf Wildenhues
* Jan Engelhardt wrote on Wed, May 27, 2009 at 09:12:44PM CEST:
> On Sunday 2009-05-24 15:24, Thomas Dickey wrote:
> >> all :
> >>echo $(XY_V)
> >>
> >> XY_V = $(XY_$(V))
> >> XY_0 = silent
> >> XY_1 = verbose
> >> XY_ = unknown
> >>
> >> I think this is supported by POSIX. POSIX [1] says: "Macros can appear
> >> anywhere in the makefile.".
> >
> > POSIX says that; however different implementations of 'make' treat
> > forward-references differently.

> Well that's when you would put XY_V last, just to be sure:
> 
> XY_ = unknown
> XY_0 = silent
> XY_1 = verbose
> XY_V = $(XY_$(V))
> 
> then there is no forward reference.

I wouldn't know a construct where the order of variable assignments is
treated differently by different make implementations.  Can you give an
example, Thomas?  Thanks.

The portability issue I was speaking about was purely about variable
references $(V) within "$(" and ")".

Cheers,
Ralf




Re: running tests under in a tests/ directory

2009-05-30 Thread Ralf Wildenhues

Hello Alejandro, Martin,

* aaragon wrote on Fri, May 29, 2009 at 05:44:57AM CEST:
> 
> I found a Makefile.am file on the internet to run tests under a tests
> folder. The code looks like follows:

[ *snip*, looks an awful lot like
   ]

Martin, a while ago I went over your Automake page and noted some issues
with it, and proposed a newer version that uses Automake 1.11 features.
Now that 1.11 has been released, would you be so kind as to update your
web page to reflect these fixes?

A few nits inline below, and part of my recommendation follows at the
end.  It will also help you, Alejandro, avoid the pitfall of letting
code be compiled by GNU make's implicit rules (which use different
variable names and thus do not pick up the include paths).

> check:all

You can use check-local; and check will depend upon all-am anyway (the
part of 'all' that is updated in this directory), so maybe that is
sufficient for your needs.

>   @echo "Running tests"
>   @make test -s -C $(TSTDIR)

Non-GNU make doesn't support -C, and you should use $(MAKE) so that
parallel make and make -k and make -n work; you can use
  cd $(TSTDIR) && $(MAKE) $(AM_MAKEFLAGS) -s test

but if I were you, I'd just let the automake-provided check rule do its
work in the toplevel directory.

> TESTSOURCE = $(wildcard test*.cpp)

GNU make-specific.

> TESTOBJS=$(TESTSOURCE:%.cpp=%.o)

Likewise.  More instances below.

> # additional linker and compiler flags
> LDFLAGS = -L$(top_srcdir)/yafeq -lyafeq $(LDFLAGS)

For in-tree libtool libraries, please always specify them as relative
paths to the .la file; also, don't stick them in LDFLAGS; thus, use
  LDADD = $(top_builddir)/yafeq/libyafeq.la

(or even LIBS).

> aara...@~/Documents/workspace/yafeq/tests$mt
>  Compiling IntegrationRules.cpp   OK
> BD Software STL Message Decryptor v3.10 for gcc 2/3/4
> In file included from testIntegrationRules.cpp:1:
> ../yafeq/quadrature.hpp:14:36: error: cpputils/map_adaptor.hpp: No such file
> or
> directory
> 
> 
> Now, the file cpputils/map_adaptor.hpp does exist, and I actually have a
> test for detecting the cpputils library. I guess the CPPFLAGS that includes
> the correct path is not being passed to this piece of code written in the
> Makefile.am.


What follows is my suggestion that I wrote to Martin about the tests
thing.  I haven't tested it much, so if you have issues with them,
please speak up.

Cheers,
Ralf

For the tests example, I actually have a more radical suggestion:
replace it completely with a setup based on automake support for simple
tests using the TESTS variable.

Since I am lazy, and we have added several new features to the upcoming
Automake 1.11 release which can be of help here, I will further suggest
to require that next version, and use the 'parallel-tests' test driver
which has several new features.  In the following, I will also use some
more Automake 1.11 features like AM_DEFAULT_SOURCE_EXT.
(The stuff can more or less also be done with 1.10 features, but it will
be uglier, and, as already noted, I am lazy.  :-)

However, for portability, you will not be able to use wildcards.  So you
will need to mention each test once.

BTW, in your example there is a line of the form
| LDFLAGS = $(LDFLAGS)

If this really were in your code, then it would provoke an endless
recursion, and an error from 'make' if referenced anywhere.

| # additional linker and compiler flags
| CXXFLAGS = -I$(top_srcdir)/src $(CXXFLAGS)

You should not override CXXFLAGS: that variable is reserved for the
user.  Automake provides AM_CXXFLAGS for that case.  However, include
paths are for the preprocessor, so AM_CPPFLAGS should be used here.
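
In other words, a sketch of the replacement:

  AM_CPPFLAGS = -I$(top_srcdir)/src

leaving CXXFLAGS itself untouched for the user.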

With the tests/Makefile.am and the test driver files below, there is no
need to modify toplevel Makefile.am, and you can run the testsuite with
  make check

or even
  make -j3 check

if you so prefer.  :-)

A word of caution: this code is completely untested.

Another word of caution: you need to run
  make generate-verified-files

once before you can run "make dist".  If you do not like that kind of
setup, please speak up.  (The dependencies are not completely precise
yet.)

You can update individual .verified files with something like
  make generate-verified-files TESTS='test1 test2'

Please note that 'diff -w' is not portable; it is not in POSIX.
I simply used plain diff in the driver, but you may want to make that
configurable.

Cheers,
Ralf

--- tests/diff-driver ---
#! /bin/sh

# Compare output of test programs with expected output.
# Usage: diff-driver [--create] TEST-PROGRAM
#
# TEST-PROGRAM is the program to run.
# With --create, a .verified file is created.
# Without it, a .lastout file is created and compared
# to an existing .verified file; the diff is produced on stdout.
# $srcdir should contain the path to the source directory.
#
# .lastout files are created in 

Re: KDE linking problem with automake projects

2009-05-27 Thread Ralf Wildenhues
Hello Erwin,

* Erwin Brandenberger wrote on Wed, May 27, 2009 at 09:33:49AM CEST:
> 
> OS: Ubuntu 9.04
> IDE: KDE
> 
> Project HC is Program and not a Library. Why it uses libtool ? Any help ?
> 
> Making all in HC
> /bin/bash ../../libtool --tag=CXX --mode=link g++ -DRUN_UNIX -DRUN_LINUX -g
> -O2 -o HC HCApp.o HCDocument.o HCGridMDIChild.o HCImages.o HCMainWindow.o
> HCPpSimulationSheet.o HCPpSimulationSimulationPage.o
> HCPrefDirectoriesDialog.o ../../src/HCRw/libHCRw.a ../../src/HCDb/libHCDb.a
> ../../libtool: line 835: X--tag=CXX: command not found
> ../../libtool: line 868: libtool: ignoring unknown tag : command not found

The libtool script is generated from the ltmain.sh file and some
configure tests.  These tests come from the libtool.m4 file and
some others, and are integrated into configure by the autoconf program,
typically after being added to aclocal.m4 by the aclocal program.

The error you are seeing happens when the ltmain.sh and the macros do
not come from the same Libtool version.

I do not know whether you run the libtoolize, aclocal, autoconf
etc. programs yourself or whether you have some IDE do it for you.  In
the former case, you need to rerun them, after letting aclocal
point to the right macro files; in the latter case, you should
report this bug to the IDE maintainers.
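
If you do run the tools yourself, the usual regeneration sequence is
roughly (a sketch; add whatever -I flags your aclocal setup needs, and
autoheader only if you use a config header):

  libtoolize --force --copy
  aclocal
  autoconf
  autoheader
  automake --add-missing --copy

or simply "autoreconf -vfi", which runs these tools in the right order.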

Hope that helps.  It would be nice if you could post a solution
to this issue here (or on the libtool list) once you've found it:
users often come here to ask about this, but never report back
whether and how they got their problem solved.  And if there
is something special to be done with some IDE, then we'd like to know
about it.

Cheers,
Ralf




Re: Multiple Lexer Solution

2009-05-27 Thread Ralf Wildenhues
Hello Philip,

* Philip Herron wrote on Wed, May 27, 2009 at 01:51:36PM CEST:
> I think i see the portable solution to the multiple lexer problem we
> have been having, i think i will just put in a test case and update
> the documentation on whats the best way to use automake to handle it
> for you.

Great!

> The solution is correct on the automake documentation but its quite
> unclear because the %option prefix i was using in flex is probably a
> flex only option but what is does is exactly what happens in the
> current automake documentation but its awkward to setup with automake
> well its more just its unclear.

I'm not sure I understand this paragraph.  Are you saying that the bits
that are currently documented in the Automake manual are correct but not
clear or not sufficient?  And they basically have the same effect as the
%option prefix has, with the difference that the list of #defines
is also portable to non-flex lexers?  That would be good.

> If i start to update some documentation etc + a test case to automake
> master git, do i just send the patch on the mailing list?

Yes.  Ideally you would send a patch against the git master tree of
Automake; but sending one against 1.11 would be fine, too.

> And do i need gcc copyright approval or however is it called?

For nontrivial changes, the FSF needs a copyright assignment, yes.
Details will follow off-list.

Cheers,
Ralf




Re: invoke pkg-config with --static

2009-05-25 Thread Ralf Wildenhues
Hello Lorenzo,

* Lorenzo Bettini wrote on Sat, May 23, 2009 at 11:24:30PM CEST:
> Ralf Wildenhues wrote:
>> You can just use pkg-config --static all the time, unless
>>  ( disable_static && ( host_os == Linux || host_os == Solaris ) )
>
> OK, so going back to the fact that, e.g. for debian packages, it's  
> better to have .Private, could you please show a possible use of this  
> technique from configure.ac?

No, sorry.  I don't use pkg-config, so I cannot tell you how to use it.

Cheers,
Ralf




Re: using C# in automake

2009-05-25 Thread Ralf Wildenhues
* Bob Friesenhahn wrote on Mon, May 25, 2009 at 12:07:00AM CEST:
> I think that one of the points of C# is not to have "real .so files". C# 
> and .net's JIT linkage and compilation seem to go hand in hand.  C#  
> should produce something similar to .net assemblies.  These are rather  
> similar to "real .so files" except that they require a special VM to  
> load them.

Whatever the semantics: the autotools don't provide support for them out
of the box yet.  We're certainly open to adding it, and to patches if
somebody wants to work on it, but in any case we definitely need a
detailed description of the build semantics for these files, ideally
including examples and pointers to documentation.  So far, I hardly know
the C# language at all.

Thanks,
Ralf




Re: 'make check' failure on MinGW

2009-05-24 Thread Ralf Wildenhues
I've been able to reproduce this issue on GNU/Linux with GNU make 3.80.
Committing this fix (will push out soon).

An easy workaround for this issue is to ensure that the TESTS variable
never contains trailing white space, i.e., the last test should be used
unconditionally on every system.
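
For example, a sketch (the conditional name is made up; it would be
defined with AM_CONDITIONAL in configure.ac):

  if WANT_FRAGILE_TESTS
  fragile_tests = fragile1.test fragile2.test
  else
  fragile_tests =
  endif
  TESTS = $(fragile_tests) basic.test

keeps the unconditional test last, so the expanded list never ends in
white space.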

Thanks for the report!
Ralf

parallel-tests: avoid GNU make 3.80 substitution bug.

* lib/am/check.am [PARALLEL_TESTS] (check-TESTS): Remove any
`.log' entries from `$(TEST_LOGS)' even if the list is nonempty,
to work around GNU make 3.80 substitution reference issue with
trailing white space in the variable.
* tests/parallel-tests10.test: New test.
* tests/parallel-tests6.test: Update comment.
* tests/Makefile.am: Update.
* NEWS: Update.
Report by Bob Friesenhahn.

diff --git a/NEWS b/NEWS
index 48873d0..c3bb3ab 100644
--- a/NEWS
+++ b/NEWS
@@ -2,6 +2,11 @@ New in 1.11.0a:
 
 Bugs fixed in 1.11.0a:
 
+* Bugs introduced by 1.11:
+
+  - The `parallel-tests' test driver works around a GNU make 3.80 bug with
+trailing white space in the test list (`TESTS = foo $(EMPTY)').
+
 * Long standing bugs:
 
   - On Darwin 9, `pythondir' and `pyexecdir' pointed below `/Library/Python'
diff --git a/lib/am/check.am b/lib/am/check.am
index 8c085d0..b1d1aad 100644
--- a/lib/am/check.am
+++ b/lib/am/check.am
@@ -234,10 +234,11 @@ check-TESTS:
 ## cannot use `$?' to compute the set of lazily rerun tests, lest
 ## we rely on .PHONY to work portably.
@test -z "$(TEST_SUITE_LOG)" || rm -f $(TEST_SUITE_LOG)
-   @set_logs=; if test "X$(TEST_LOGS)" = X.log; then   \
- set_logs=TEST_LOGS=;  \
-   fi; \
-   $(MAKE) $(AM_MAKEFLAGS) $(TEST_SUITE_LOG) $$set_logs
+   @list='$(TEST_LOGS)';   \
+   list=`for f in $$list; do   \
+ test .log = $$f || echo $$f;  \
+   done | tr '\012\015' '  '`; \
+   $(MAKE) $(AM_MAKEFLAGS) $(TEST_SUITE_LOG) TEST_LOGS="$$list"
 
 AM_RECURSIVE_TARGETS += check
 
diff --git a/tests/Makefile.am b/tests/Makefile.am
index 5d5a290..7895816 100644
--- a/tests/Makefile.am
+++ b/tests/Makefile.am
@@ -502,6 +502,7 @@ parallel-tests6.test \
 parallel-tests7.test \
 parallel-tests8.test \
 parallel-tests9.test \
+parallel-tests10.test \
 parse.test \
 percent.test \
 percent2.test \
diff --git a/tests/parallel-tests10.test b/tests/parallel-tests10.test
new file mode 100755
index 000..2642c7a
--- /dev/null
+++ b/tests/parallel-tests10.test
@@ -0,0 +1,47 @@
+#! /bin/sh
+# Copyright (C) 2009  Free Software Foundation, Inc.
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation; either version 2, or (at your option)
+# any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program.  If not, see .
+
+# Check parallel-tests features:
+# - trailing whitespace in TESTS
+# GNU make 3.80 may expand trailing white space to `.log'.
+
+. ./defs-p || Exit 1
+set -e
+
+cat >> configure.in << 'END'
+AC_OUTPUT
+END
+
+cat > Makefile.am << 'END'
+TESTS = foo $(BAR)
+BAR =
+END
+
+cat >foo <<'END'
+#! /bin/sh
+exit 0
+END
+
+chmod +x ./foo
+
+$ACLOCAL
+$AUTOCONF
+$AUTOMAKE -a
+
+./configure
+$MAKE check
+
+Exit 0
diff --git a/tests/parallel-tests6.test b/tests/parallel-tests6.test
index 50f6c44..5dbb433 100755
--- a/tests/parallel-tests6.test
+++ b/tests/parallel-tests6.test
@@ -17,6 +17,7 @@
 # Check parallel-tests features:
 # - empty TESTS
 # BSD make will expand `$(TESTS:=.log)' to `.log' unless overridden.
+# See parallel-tests10.test for a similar issue.
 
 . ./defs-p || Exit 1
 set -e




Re: makes which break with `silent-rules'

2009-05-24 Thread Ralf Wildenhues
Hello Bruno, Bob,

* Bruno Haible wrote on Sun, May 24, 2009 at 01:12:02PM CEST:
> >  - The `silent-rules' option enables Linux kernel-style silent build output.
> >This option requires the widely supported but non-POSIX `make' feature
> >of recursive variable expansion,
> 
> We are talking about constructs like this:
> 
> == Makefile ==
> all :
>   echo $(XY_V)
> 
> XY_V = $(XY_$(V))
> XY_0 = silent
> XY_1 = verbose
> XY_ = unknown
> ==

Yes.

> I think this is supported by POSIX. POSIX [1] says: "Macros can appear 
> anywhere
> in the makefile.".

But a few lines down it also says:

| Macro expansions using the forms $( string1) or ${ string1} shall be
| replaced by string2, as follows:
[...]
| The parentheses or braces are optional if string1 is a single character.
| The macro $$ shall be replaced by the single character '$' . If string1
| in a macro expansion contains a macro expansion, the results are
| unspecified.

IRIX make barfs if the inner macro expansion is used without $() or ${}
even for one-character macros.  IOW, IRIX make barfs over $(foo$V) but
copes with $(foo$(V)).

Compare that with the other possibility we tried for implementing the
silencing feature: macro expansions on the left hand side of an
assignment:
  foo$(V) = ...

This is explicitly allowed by POSIX, but fails in practice on several
systems.  (I documented this in some message on one of the automake
lists, but am too lazy to dig it up now unless somebody asks for it.)

Cheers,
Ralf




Re: makes which break with `silent-rules'

2009-05-24 Thread Ralf Wildenhues
sorry, hit the send key too early.

* Ralf Wildenhues wrote on Sun, May 24, 2009 at 03:20:04PM CEST:
> IRIX make barfs if the inner macro expansion is used without $() or ${}
> even for one-character macros.  IOW, IRIX make barfs over $(foo$V) but
> copes with $(foo$(V)).

There are also some make implementations that barf over a macro
expansion recursion depth greater than 2, i.e., $(foo$(bar$(baz)))
is not portable either.  Dunno if there are situations in which that
functionality cannot be emulated through repeated depth-2 recursion.




Re: 'make check' failure on MinGW

2009-05-23 Thread Ralf Wildenhues
* Bob Friesenhahn wrote on Sat, May 23, 2009 at 11:52:22PM CEST:
> On Sat, 23 May 2009, Ralf Wildenhues wrote:
>> Can you post the output of
>>  make check SHELL="/bin/sh -x"
>>
>> on the failing system?
>
> Yes, it is attached.  Thank goodness it is so compressable.

Thanks.  Can you post the output of the following, done in an empty
directory:

tr X '\t' >Makefile <<\END
# get some trailing white space into the TESTS variable
TESTS = foo $(BAR)
BAR =
am__test_logs1 = $(TESTS:=.log)
am__test_logs2 = $(am__test_logs1:.exe.log=.log)
TEST_LOGS = $(am__test_logs2:.sh.log=.log)

all:
Xecho .$(TESTS).
Xecho .$(am__test_logs1).
Xecho .$(am__test_logs2).
Xecho .$(TEST_LOGS)
END
make

Thanks,
Ralf




'make check' failure on MinGW (was: The New parallel-tests Framework)

2009-05-23 Thread Ralf Wildenhues
* Bob Friesenhahn wrote on Sat, May 23, 2009 at 10:29:18PM CEST:
> On Sat, 23 May 2009, Ralf Wildenhues wrote:
>> * Bob Friesenhahn wrote on Sat, May 23, 2009 at 08:12:33PM CEST:
>>> I now have parallel tests working in GraphicsMagick (verified under
>>> Solaris, FreeBSD, and OS-X Leopard) except that under MinGW (TDM
>>> version) 'make check' ends with:
>>>
>>> make[3]: *** No rule to make target `.log', needed by `test-suite.log'.  
>>> Stop.
>>> make[3]: Leaving directory `/home/bfriesen/mingw/GM-16-static'
>>> make[2]: *** [check-TESTS] Error 2
>>
>> Can you post the Makefile of this build tree, compressed, or send it to
>> me off-list?
>
> A compressed copy of the Makefile is attached.

Thanks.  Hmm.

> 'make' identifies itself as GNU make 3.71.1.

Typo?  I guess you meant 3.79.1.

Can you post the output of
  make check SHELL="/bin/sh -x"

on the failing system?

Cheers,
Ralf




Re: The trouble with versions

2009-05-23 Thread Ralf Wildenhues
Hello Gerald,

* Gerald I. Evenden wrote on Sat, May 23, 2009 at 05:33:43PM CEST:
> Original library was setup and installed as:
> 
> libproject_la_LDFLAGS = -version-info 0:0:0 $(ALDFLAG) $(BLDFLAG)
> 
> I added a new procedure and modified several others (without changing their 
> interface).
> 
> Refering to 7.3 of libtool I change the above to:
> 
> libproject_la_LDFLAGS = -version-info 1:1:1 $(ALDFLAG) $(BLDFLAG)

Going from 0:0:0 straight to 1:1:1 is senseless.  If you increase
CURRENT, then in the same step you can reset REVISION to zero; see
  info Libtool "Updating version info"

Only at the next update does it make sense to bump REVISION again.
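
Applied to your case (a new interface added, no existing interface
changed or removed), the rules in that node lead to something like:

  libproject_la_LDFLAGS = -version-info 1:0:1 $(ALDFLAG) $(BLDFLAG)

that is, bump CURRENT and AGE, and reset REVISION to 0.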

> I uninstalled the original library, did a "autoreconf -vfi", "./configure", 
> and checked src/Makefile to see that 1:1:1 was still in the procedure and 
> then did a "make" followed by "sudo make install" with the following results 
> in /usr/local/lib:
> 
> libproject.a
> libproject.la
> libproject.so
> libproject.so.0
> libproject.so.0.1.1    <--- previously 0.0.0
> 
> What is going on??

Looks normal for GNU/Linux.  The point is that libtool takes the
-version-info triple, and from that computes some system-specific
version number or set of version numbers that try to match the
semantics defined in the Libtool manual, chapter "Libtool versioning".

The specific computation formula is a detail of libtool that you should
not need to care about.

Hope that helps.

Cheers,
Ralf




Re: The New parallel-tests Framework

2009-05-23 Thread Ralf Wildenhues
Hi Bob,

* Bob Friesenhahn wrote on Sat, May 23, 2009 at 08:12:33PM CEST:
> I now have parallel tests working in GraphicsMagick (verified under  
> Solaris, FreeBSD, and OS-X Leopard) except that under MinGW (TDM  
> version) 'make check' ends with:
>
> make[3]: *** No rule to make target `.log', needed by `test-suite.log'.  Stop.
> make[3]: Leaving directory `/home/bfriesen/mingw/GM-16-static'
> make[2]: *** [check-TESTS] Error 2

Can you post the Makefile of this build tree, compressed, or send it to
me off-list?

Thanks,
Ralf




Re: invoke pkg-config with --static

2009-05-22 Thread Ralf Wildenhues
* Lorenzo Bettini wrote on Fri, May 22, 2009 at 03:48:49PM CEST:
> No I meant: if additional libraries are shown when building with shared  
> library enable, this should do no harm, shouldn't it?
>
> In such case, one could simply not use .Private specification in .pc file.

D'oh.  Yes, I guess you're right, and I'm just too darn good at
misparsing your replies repeatedly.  Sorry about that.

Of course, as soon as you propose your software for packaging at
debian.org, they will count not using .Private as a bug ... ;-)

Cheers,
Ralf




Re: The New parallel-tests Framework (was: Various testsuites)

2009-05-21 Thread Ralf Wildenhues
Hi Bob,

* Bob Friesenhahn wrote on Fri, May 22, 2009 at 12:54:02AM CEST:
> On Thu, 21 May 2009, Ralf Wildenhues wrote:
>
>> 6. Dependencies between Tests
>> =
>>
>> If there are dependencies between your tests, e.g., foo.test needs to be
>> run before baz.chk can be run, then, with TEST_EXTENSIONS set as above,
>> you can now go ahead and specify them as dependency relations between
>> the corresponding log files:
>>
>>  baz.log: foo.log
>
> This is not working for me.  It seems like there is a missing dependency 
> or logical race-condition somewhere.  I added this to try to force the 
> utilities/tests/montage.sh tests to run after a set of other tests:
>
> utilities/tests/montage.sh.log : \
>   utilities/tests/addnoise.sh.log \
>   utilities/tests/affine.sh.log \
>   utilities/tests/annotate.sh.log \
>   ...

You probably need to do this instead:
  TEST_EXTENSIONS = .sh
  utilities/tests/montage.log : \
utilities/tests/addnoise.log \
utilities/tests/affine.log \
utilities/tests/annotate.log \
...

as test dependencies currently only work with extensions that have been
registered.

Cheers,
Ralf




The New parallel-tests Framework (was: Various testsuites)

2009-05-21 Thread Ralf Wildenhues
Hello once again,

allow me to expand upon this topic a bit more.  In this message, I will
not try to be fair towards the different test suite frameworks; instead,
I'll bluntly praise the new parallel-tests driver.  :-)

> Automake TESTS support, as of version 1.11, consists of two different
> test drivers that run tests from within a makefile.  The old one is
> pretty minimalistic in test suite functionality, the new one (selected
> with the `parallel-tests' option) is mostly backward-compatible, but
> also offers some more and slightly changed functionality.  It still
> operates as a set of rules within a makefile.

Here is a small tour through the features of the new framework.  I hope
you don't mind that I just post it here.



1. Getting Started
==

First off, assume you already have tests listed in a Makefile.am file:

  TESTS = foo.test bar baz.chk ...

then just go there right now, and add the line

  AUTOMAKE_OPTIONS = 1.11 parallel-tests color-tests

in that Makefile.am file.  That's it!  Now rebuild and try it out:

  make all check

On a terminal that provides colors, the test results should be nicely
colored now, making it easy to spot failures even among several
screen-full of successful tests.


2. Inspection of Results


If there are any test failures, you will notice that there is now a
test-suite.log file which contains a summary of the tests that didn't
go as expected, including the output of the tests.

Instead of looking at the test-suite.log file for the failure output,
you can also use

  make check VERBOSE=yes

to cause the output to be produced on the terminal at the end.

You will notice that there are more log files created in the build
directory: one per test, giving you an easy way to look at the output of
one particular test only.

If you happen to have rst2html installed, you can even generate a
clickable HTML log with

  make check-html


3. Running Tests in Parallel
============================

The parallel-tests option takes its name from the parallel execution
feature.  If you, the developer, ensure that the tests in your suite
are independent of each other and do not interfere while running
(for example, by writing files with identical names), then you are set
to run the test suite in parallel:

  make -j4 check

Parallel runs will typically work with GNU and BSD make versions.


3. Running only a Subset of Tests
=================================

Any well-maintained test suite quickly grows to contain lots of tests,
which can take a long time when run in full.  So it's desirable to run only
a subset of all tests.  The parallel-tests driver provides several ways
to do this:

  a. Setting the TESTS variable:

   make check TESTS="foo.test bar"

  b. Setting the TEST_LOGS variable:

   make check TEST_LOGS="bar.log"

  c. Re-running only tests that failed during the last check:

   make recheck

  d. Re-running tests lazily:

   make check RECHECK_LOGS=

The last point (d) deserves more explanation below.  Note that with
non-GNU make, the variable overrides may require writing something
like this instead:

  env TESTS="foo.test bar" make -e check


4. Lazy test execution
======================

Say, you are right in the middle of the edit-compile-test development
cycle.  The feedback loop should be as quick as possible, so rebuilds
should update only files that are out of date, and little else.

For an example, assume `bar' is a unit test program listed in TESTS,
then you would use:

  EXTRA_PROGRAMS = bar
  CLEANFILES = $(EXTRA_PROGRAMS)

to specify that `bar' is to be compiled from sources (bar.c by default).
The old test driver would recommend using `check_PROGRAMS', but that is
not quite as lazy: `check_PROGRAMS', `check_LTLIBRARIES' and
`check_SCRIPTS' are updated before any test starts.  With EXTRA_PROGRAMS
however, we now rely on `make' dependencies to update the program just
in time for the test, possibly concurrently while other tests are
already being run.

With this in place, while editing sources for `bar' you would now run

  make check RECHECK_LOGS=

in order to recompile and rerun only those tests for which there are
newer sources in the tree.

Of course, this requires that dependencies are tracked correctly.  For
example, it is crucial that you specify in-tree library dependencies as

  bar_LDADD = ../lib/libutil.la

so that automake can infer bar_DEPENDENCIES if you don't specify that
explicitly.  Had you instead written

  # Don't do this!
  bar_LDADD = -L../lib -lutil

then a rebuild of ../lib/libutil.la would not result in a test rerun.

Note that the semantics of the `check_*' file lists have not changed:
they are still updated before any test starts.  This allows you to
specify files that are not themselves listed in TESTS, but are still
needed before any tests can be run.
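
For example (a minimal sketch, with a made-up file name):

  check_SCRIPTS = testenv.sh
  TESTS = foo.test bar baz.chk

Here testenv.sh is not a test itself, but it is brought up to date
before the first test runs.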


5. Test Extensions
==================

The parallel-tests driver operates by creating rules for each test run.
In t

mailing list handling (was: My project can't use `silent-rules')

2009-05-21 Thread Ralf Wildenhues
Hi Bob, all,

* Bob Friesenhahn wrote on Sun, May 17, 2009 at 10:43:51PM CEST:
>
> P.S. I am pretty pissed that almost all recent discussion of automake  
> major features and direction has apparently taken place on the  
> automake-patches list rather than the automake list where it belongs.  
> Not everyone interested in Automake has time to read all the bugs and  
> detailed patches.

It seems not easy to please you.  :-)

A while ago you (IMHO rightfully) complained about too much email coming
your way, with duplicates due to cross-posting to more than one mailing
list increasing the pressure even more.  I can only assume that at that
time, you were subscribed to the -patches and/or bug- lists as well.
Now, we try to keep cross posts to a reasonable minimum; especially, I try to push
people to move to the -patches list when patches are involved, thinking
that the plain automake list also has many readers that aren't primarily
interested in the development of new features.

However, as you now noted that has a drawback too: some of the feature
discussion moves away from the automake list.  Ping-ponging back and
forth between both lists during the course of one message thread seems
awkward and error-prone, too.

I understand nothing is black and white here; also, the situation is a
bit remedied with cross posts, as the mailing list manager allows you to
receive only one mail per cross post.  I do not know of a perfect solution to
the general problem though.  Of course I can suggest that if you are
interested in details of new features, then it cannot hurt to also read
the -patches list.

But maybe now that your immediate problem is solved, you aren't all that
pissed any more anyway.  ;-)

Cheers,
Ralf




Re: invoke pkg-config with --static

2009-05-19 Thread Ralf Wildenhues
* Lorenzo Bettini wrote on Sun, May 17, 2009 at 09:01:18PM CEST:
> Ralf Wildenhues wrote:
>> * Lorenzo Bettini wrote on Sun, May 17, 2009 at 12:27:32PM CEST:
>>> Ralf Wildenhues wrote:
>>>> You can just use pkg-config --static all the time, unless
>>>>  ( disable_static && ( host_os == Linux || host_os == Solaris ) )
>>>>
>>>> Works wonders! :-)
>>> mhhh... so basically this means that it is better not to use   
>>> Libs.private in the pkg-config file,
>>
>> No, that is not what I said.  Doing that will continue to do the wrong
>> thing even when Libtool is fixed one day.
>
> No, I meant: not using Libs.private at all is however a safe choice,  
> isn't it?

No, it isn't.  At least that's how I understand it.  Why would you think
so?

It might be safe only on GNU/Linux and Solaris, and only if you don't
care about static linkage.
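
To illustrate, here is a made-up .pc fragment:

  Libs: -L${libdir} -lfoo
  Libs.private: -lz -lm

A plain `pkg-config --libs foo' omits the Libs.private entries; only
`pkg-config --static --libs foo' adds them, which is what linking
against a static libfoo typically needs.  Dropping Libs.private and
folding everything into Libs means every client links against -lz -lm
even when linking shared.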

Cheers,
Ralf




Re: Automake on Mac

2009-05-19 Thread Ralf Wildenhues
Hi Satyam,

* Satyam Satyanarayana wrote on Tue, May 19, 2009 at 01:34:55PM CEST:
> I am using AutoMake provided along with XCode IDE on Mac.
> My Mac OSX version is 10.5.6 and my automake version is 1.10

1.11 has just been released.  Try it out!  :-)

> I previously had an older version of Automake, version 1.6.3.
>
> With the older version, I was able to compile and build my
> project (www.ifolder.com) successfully.
> But after updating to the latest version, the dylib files are no
> longer being copied to the executable folder.
>
> Can somebody tell me where I can find a comparison of versions 1.6.3
> and 1.10, so that I can figure out what exactly the problem is?

You can look at the list of changes documented in the NEWS file.
But they are quite long; it may be easier to post the Makefile.am files
you are having problems with.  (We may also need to see parts of
configure.ac.)  Or is it that XCode creates them for you?

Cheers,
Ralf




use 'jot' if available (was: automake-1.11 test failure on darwin9 - instmany-python.test)

2009-05-18 Thread Ralf Wildenhues
Hi Peter,

thanks for the bug report; will address in a separate message.  For now:

* Peter O'Gorman wrote on Mon, May 18, 2009 at 03:43:00AM CEST:
> Also, I wonder if it is worth checking for BSD's 'jot' utility in these
> tests that use seq?

That sounds like a good idea to me.  I didn't know jot.
I'm pushing this.

Cheers,
Ralf

testsuite: also try `jot' as `seq' replacement.

* tests/instmany-mans.test: Try BSD `jot' before resorting to a
slow but portable shell loop.
* tests/instmany-python.test: Likewise.
* tests/instmany.test: Likewise.
Suggestion by Peter O'Gorman.

diff --git a/tests/instmany-mans.test b/tests/instmany-mans.test
index 3ba8334..5fafa67 100755
--- a/tests/instmany-mans.test
+++ b/tests/instmany-mans.test
@@ -1,5 +1,5 @@
 #! /bin/sh
-# Copyright (C) 2008  Free Software Foundation, Inc.
+# Copyright (C) 2008, 2009  Free Software Foundation, Inc.
 #
 # This program is free software; you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
@@ -31,8 +31,8 @@ limit=2500
 subdir=long_subdir_name_with_many_characters
 nfiles=81
 
-# Let's use `seq' if available, it's faster than the loop.
-list=`(seq 1 $nfiles) 2>/dev/null || {
+# Let's use `seq' or `jot' if available, they are faster than the loop.
+list=`(seq 1 $nfiles) 2>/dev/null || (jot $nfiles) 2>/dev/null || {
   i=1
   while test $i -le $nfiles; do
 echo $i
diff --git a/tests/instmany-python.test b/tests/instmany-python.test
index d55af0f..9407948 100755
--- a/tests/instmany-python.test
+++ b/tests/instmany-python.test
@@ -1,5 +1,5 @@
 #! /bin/sh
-# Copyright (C) 2008  Free Software Foundation, Inc.
+# Copyright (C) 2008, 2009  Free Software Foundation, Inc.
 #
 # This program is free software; you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
@@ -27,7 +27,7 @@ limit=2500
 subdir=long_subdir_name_with_many_characters
 nfiles=81
 
-list=`(seq 1 $nfiles) 2>/dev/null || {
+list=`(seq 1 $nfiles) 2>/dev/null || (jot $nfiles) 2>/dev/null || {
   i=1
   while test $i -le $nfiles; do
 echo $i
diff --git a/tests/instmany.test b/tests/instmany.test
index aabeb7b..e423710 100755
--- a/tests/instmany.test
+++ b/tests/instmany.test
@@ -1,5 +1,5 @@
 #! /bin/sh
-# Copyright (C) 2008  Free Software Foundation, Inc.
+# Copyright (C) 2008, 2009  Free Software Foundation, Inc.
 #
 # This program is free software; you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
@@ -39,8 +39,8 @@ limit=2500
 subdir=long_subdir_name_with_many_characters
 nfiles=81
 
-# Let's use `seq' if available, it's faster than the loop.
-list=`(seq 1 $nfiles) 2>/dev/null || {
+# Let's use `seq' or `jot' if available, they are faster than the loop.
+list=`(seq 1 $nfiles) 2>/dev/null || (jot $nfiles) 2>/dev/null || {
   i=1
   while test $i -le $nfiles; do
 echo $i




Re: Various testsuites

2009-05-18 Thread Ralf Wildenhues
Hello,

* NightStrike wrote on Mon, May 18, 2009 at 09:41:27PM CEST:
> What's the difference between using this:
> http://www.gnu.org/software/autoconf/manual/autoconf.html#Using-Autotest
> 
> and this:
> http://www.gnu.org/software/automake/manual/automake.html#Tests
> 
> ?

Autotest is a generated portable shell script, similar to configure,
that you can, ideally, distribute stand-alone, and built with a set of
m4 macros like configure.ac is (but a slightly different set).

Automake TESTS support, as of version 1.11, consists of two different
test drivers that run tests from within a makefile.  The old one is
pretty minimalistic in test suite functionality, the new one (selected
with the `parallel-tests' option) is mostly backward-compatible, but
also offers some more and slightly changed functionality.  It still
operates as a set of rules within a makefile.

DejaGNU (which you didn't mention) is a separate package that provides
yet another, quite elaborate, test suite framework.  It is quite
powerful, but may be overkill for simpler uses.  YMMV.

> Which is better to use?

Good question.  The general answer is "it depends", I would say.

> Which will be maintained,

It is intended that all of them be maintained.

> and more future-proof?

I cannot answer that well, really.  I do hope though that
future-proofness should not be a deciding point.

> Why is there duplication between two products which are very
> inter-related?

Because these test suite frameworks grew separately, and with slightly
different purposes, and slightly different strengths and weaknesses.

For example, DejaGNU works very well with complex setups where a program
(say, a compiler) needs to be run on many different files, as in the GCC
test suite.  Being Tcl, it is way more efficient than a bunch of shell
scripts with lots of forks.

OTOH, Autotest and Automake TESTS have less software requirements.

Autotest has quite some overhead because it aims to be written in portable shell.
The test structure is a bit more rigid when compared to Automake TESTS,
but also initialization and cleanup handling of test group directories
is all built-in and requires practically no developer intervention.
At the moment, there is experimental support for parallel execution on
some systems.

Automake TESTS has little built-in initialization and cleanup handling;
that typically needs to be open-coded by the developer.  The parallel-tests
driver supports efficient parallel test execution through `make' level
parallelism, and it supports test wrappers based on file name extension
and such.  It is not as shell-centered as Autotest.
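
For example, a rough sketch of the wrapper support (assuming configure
substitutes PERL, e.g. via AC_PATH_PROG; the names are illustrative):

  TEST_EXTENSIONS = .pl .sh
  PL_LOG_COMPILER = $(PERL)
  AM_PL_LOG_FLAGS = -w
  SH_LOG_COMPILER = $(SHELL)

With this, every *.pl test listed in TESTS is run through `$(PERL) -w',
and every *.sh test through $(SHELL).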

All frameworks have some level of summary reports.

It really depends upon the kind of testing that you would like to do,
and I would choose a different one of these frameworks based on the task at
hand.  I've also omitted lots of features in this message.

Cheers,
Ralf




Re: My project can't use `silent-rules'

2009-05-17 Thread Ralf Wildenhues
Hi Bob,

* Bob Friesenhahn wrote on Sun, May 17, 2009 at 10:43:51PM CEST:
> I see that the only way to request the new `silent-rules' feature is by 
> using the new form of AM_INIT_AUTOMAKE to pass the option.  Since my 
> package can not use the new form of AM_INIT_AUTOMAKE, then it can not 
> request `silent-rules'.

You can use
  AM_SILENT_RULES

instead.
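
For example, somewhere after AM_INIT_AUTOMAKE in configure.ac (a
minimal sketch; the optional [yes] argument makes the quieter output
the default):

  AM_SILENT_RULES([yes])

Users can still get verbose output with `make V=1'.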

Cheers,
Ralf




Re: automake 1.11 self test failure: vala5

2009-05-17 Thread Ralf Wildenhues
Hi Simon,

thanks for the bug report.

* Simon Josefsson wrote on Sun, May 17, 2009 at 09:47:50PM CEST:
> Hi!  I installed automake 1.11 without any problems except that one
> self-test failed, see snippet from tests/test-suite.log below.  Is a
> pkg.m4 missing?

Yes.  Rather, the testsuite test does not check hard enough: it is not
enough to have pkg-config available; aclocal also needs to be able to
find the installed pkg.m4 file.  This failure is harmless, as long as
you don't plan to use Vala sources with Automake.

Cheers,
Ralf




Re: makes which break with `silent-rules'

2009-05-17 Thread Ralf Wildenhues
Hi Bob,

* Bob Friesenhahn wrote on Sun, May 17, 2009 at 09:59:17PM CEST:
> Is anyone aware of specific vendor make programs which fail with  
> automake 1.11's new `silent-rules' option?

I don't know of any.

Cheers,
Ralf




Re: Testing for GSL

2009-05-17 Thread Ralf Wildenhues
Hello Gerald,

* Gerald I. Evenden wrote on Sun, May 17, 2009 at 08:40:46PM CEST:
> Has anyone successfully written a configure.ac that tests
> for the GNU Scientific Library (aka GSL)?

AFAIK the GSL package distributes a macro for checking for its library.
Have you tried it?
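
For example (untested sketches; adjust the version number), with the
gsl.m4 macro that GSL ships:

  AM_PATH_GSL([1.10],, [AC_MSG_ERROR([GSL not found])])

or, if GSL installs a gsl.pc file and pkg.m4 is available:

  PKG_CHECK_MODULES([GSL], [gsl >= 1.10])

Either way you can then use $(GSL_CFLAGS) and $(GSL_LIBS) in your
Makefile.am.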

Cheers,
Ralf




GNU Automake 1.11 released

2009-05-17 Thread Ralf Wildenhues
We're pleased to announce the release of Automake 1.11.

Automake is a tool for automatically generating `Makefile.in's
suitable for use with Autoconf, compliant with the GNU Makefile
standards, and portable to various make implementations.

This release contains a number of new features and bug fixes
over previous versions.  Highlights over 1.10, in no particular order:

Optional Linux kernel style less verbose compile rules, faster rules for
(un)install, support for parallel tests, colored tests output, improved
Fortran and initial Vala support, threaded automake execution, lzma/xz
compressed tarballs, man pages which are not transformed, non-".c"
default sources, better AM_MAINTAINER_MODE, man pages for aclocal and
automake.  See the NEWS excerpt below for more details, and a couple of
things to note when upgrading.

The important changes of 1.11 over the last test release, 1.10b, are
documented here:

Further, the test release was released using GPLv3+, but the 1.11
release uses GPLv2+.  Automake will again move to GPLv3+ once the
Exception has been rewritten to use the new GPL Exception language.

A big thanks to all people who have been testing pre-release, reporting
bugs, contributing code, suggesting enhancements, and answering user
questions on the mailing lists!

You can find the new release here:

ftp://ftp.gnu.org/gnu/automake/automake-1.11.tar.gz
ftp://ftp.gnu.org/gnu/automake/automake-1.11.tar.gz.sig
ftp://ftp.gnu.org/gnu/automake/automake-1.11.tar.bz2
ftp://ftp.gnu.org/gnu/automake/automake-1.11.tar.bz2.sig

Soon it will also appear on the GNU mirrors which are listed here:

http://www.gnu.org/order/ftp.html

Finally, here are the MD5 checksums:

4db4efe027e26b33930a7e151de19d0f  automake-1.11.tar.bz2
fab0bd2c3990a6679adaf9eeac0c6d2a  automake-1.11.tar.gz

Please report bugs by mail to .


New in 1.11:

* Version requirements:

  - Autoconf 2.62 or greater is required.

* Changes to aclocal:

  - The autoconf version check implemented by aclocal in aclocal.m4
(and new in Automake 1.10) is degraded to a warning.  This helps
in the common case where the Autoconf versions used are compatible.

* Changes to automake:

  - The automake program can run multiple threads for creating most
Makefile.in files concurrently, if at least Perl 5.7.2 is available
with interpreter-based threads enabled.  Set the environment variable
AUTOMAKE_JOBS to the maximum number of threads to use, in order to
enable this experimental feature.

* Changes to Libtool support:

  - Libtool generic flags are now passed to the install and uninstall
modes as well.

  - distcheck works with Libtool 2.x even when LT_OUTPUT is used, as
config.lt is removed correctly now.

* Languages changes:

  - subdir-objects mode works now with Fortran (F77, FC, preprocessed
Fortran, and Ratfor).

  - For files with extension .f90, .f95, .f03, or .f08, the flag
$(FCFLAGS_f[09]x) computed by AC_FC_SRCEXT is now used in compile rules.

  - Files with extension .sx are also treated as preprocessed assembler.

  - The default source file extension (.c) can be overridden with
AM_DEFAULT_SOURCE_EXT now.

  - Python 3.0 is supported now; Python releases prior to 2.0 are no
longer supported.

  - AM_PATH_PYTHON honors python's idea about the site directory.

  - There is initial support for the Vala programming language, when using
Vala 0.7.0 or later.

* Miscellaneous changes:

  - Automake development is done in a git repository on Savannah now, see

  http://git.sv.gnu.org/gitweb/?p=automake.git

A read-only CVS mirror is provided at

  cvs -d :pserver:anonym...@pserver.git.sv.gnu.org:/automake.git \
  checkout -d automake HEAD

  - "make dist" can now create xz-compressed tarballs,
as well as (deprecated?) lzma-compressed tarballs.

  - `automake --add-missing' will by default install the GPLv3 file as
COPYING if it is missing.  It will also warn that the license file
should be added to source control.  Note that Automake will never
overwrite an existing COPYING file, even when the `--force-missing'
option is used.

  - The manual is now distributed under the terms of the GNU FDL 1.3.

  - Automake ships and installs man pages for automake and aclocal now.

  - New shorthand `$(pkglibexecdir)' for `$(libexecdir)/@PACKAGE@'.

  - install-sh supports -C, which does not update the installed file
(and its time stamps) if the contents did not change.

  - The `gnupload' script has been revamped.

  - The `depcomp' and `compile' scripts now work with MSVC under MSYS.

  - The targets `install' and `uninstall' are more efficient now, in that
for example multiple files from one Automake variable such as
`bin_SCRIPTS' are copied in one `install' (or `libtool --mode=install')
invocation if they do not have to be r

Re: Problems with linking multiple lex/yacc parsers into one executable

2009-05-17 Thread Ralf Wildenhues
* Philip Herron wrote on Sun, May 17, 2009 at 02:00:45PM CEST:
> But I'll maybe play with this. If I find a more standard solution, what
> should I do?

Write a patch (preferably against git master of Automake) that fixes the
Automake manual to document this precisely, and add a test to the test
suite that ensures that this very example really works.

Please note that nontrivial additions to GNU software require copyright
assignments (more off-list if you're interested).

Cheers,
Ralf




Re: Problems with linking multiple lex/yacc parsers into one executable

2009-05-17 Thread Ralf Wildenhues
Hello,

* Philip Herron wrote on Sun, May 17, 2009 at 12:49:32PM CEST:
> 
> I solved this problem a while back!
> 
> http://lists.gnu.org/archive/html/automake/2009-04/msg00095.html
> 
> The trick is in each of your lexers(flex .l) add this:
> %option prefix="prefix" outfile="lex.yy.c"

I don't see %option documented in POSIX (looking at SUSv3).
Are you sure your solution is portable to non-flex lex?  If not, then
please reconsider praising it as a portable solution.

> Don't change the outfile; ylwrap needs it to be lex.yy.c!

If the example in the Automake manual needs fixing, then we should
fix it, so that it really works.  However, somebody needs to do the
work and find and test it.  It should be added to the Automake test
suite.

I can do testing on various systems but working on the example is
not high on my priority list because I do not know lex well.

Cheers,
Ralf




Re: invoke pkg-config with --static

2009-05-17 Thread Ralf Wildenhues
* Lorenzo Bettini wrote on Sun, May 17, 2009 at 12:27:32PM CEST:
> Ralf Wildenhues wrote:
>> * Lorenzo Bettini wrote on Sun, May 17, 2009 at 09:07:16AM CEST:
>>> Bob Friesenhahn wrote:
>>>> On Sat, 16 May 2009, Lorenzo Bettini wrote:
>>>>
>>>>> when ./configure is run with --disable-shared, is there a way to  
>>>>> invoke the pkg-config macro with --static (so that it does not 
>>>>> select private libraries in the .pc file)?
>>
>> You can just use pkg-config --static all the time, unless
>>  ( disable_static && ( host_os == Linux || host_os == Solaris ) )
>>
>> Works wonders! :-)
>
> mhhh... so basically this means that it is better not to use  
> Libs.private in the pkg-config file,

No, that is not what I said.  Doing that will continue to do the wrong
thing even when Libtool is fixed one day.

> and always have all the library
> dependencies right?


Cheers,
Ralf




Re: invoke pkg-config with --static

2009-05-17 Thread Ralf Wildenhues
Hi Lorenzo,

* Lorenzo Bettini wrote on Sun, May 17, 2009 at 09:07:16AM CEST:
> Bob Friesenhahn wrote:
>> On Sat, 16 May 2009, Lorenzo Bettini wrote:
>>
>>> when ./configure is run with --disable-shared, is there a way to  
>>> invoke the pkg-config macro with --static (so that it does not select 
>>> private libraries in the .pc file)?

You can just use pkg-config --static all the time, unless
 ( disable_static && ( host_os == Linux || host_os == Solaris ) )

Works wonders! :-)

Once Libtool has been fixed to allow specifying separate dependencies
for static and shared libs (it might have a special API for pkg-config
at that point), this can be revisited.
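
An untested configure.ac sketch of the first part (the FOO module name
is made up; assumes pkg.m4 and that LT_INIT has already set
$enable_static):

  PKG_PROG_PKG_CONFIG
  AS_IF([test "x$enable_static" != xno],
        [PKG_CONFIG="$PKG_CONFIG --static"])
  PKG_CHECK_MODULES([FOO], [foo >= 1.0])

i.e., append --static before any PKG_CHECK_MODULES calls whenever
static libraries are being built; the GNU/Linux and Solaris refinement
is left out of the sketch.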

>> It seems that once LT_INIT has been executed, the shell environment 
>> variables enable_shared and enable_static are set to 'yes' if shared 
>> and/or static libraries will be built.  You can test these environment 
>> variables to determine the parameter to pass to pkg-config.
>
> but this cannot be done using the pkg-config autoconf macro, can it?

This is a question you might need to ask on the pkg-config list.

Cheers,
Ralf




Re: improve INSTALL contents

2009-05-17 Thread Ralf Wildenhues
Alfred,

can you please also read, and follow
?
I'm sure you must have missed it because I failed to spam it to three
mailing lists.  But your repetitions are just as boring as those from
everyone else.  And get bug-coreutils and automake off the recipients
lists!

Thanks,
Ralf




Re: improve INSTALL contents

2009-05-14 Thread Ralf Wildenhues
* Alfred M. Szmidt wrote on Fri, May 15, 2009 at 04:45:21AM CEST:
>In addition, if you use an unusual directory layout you can give
>    options like @option{--bindir=@var{dir}} to specify different
>values for particular kinds of files.  Run @samp{configure --help}
>for a list of the directories you can set and what kinds of files
>go in them.  In general, the default for these options is expressed
>    in terms of @samp{$@{prefix@}}, so that specifying just
>@option{--prefix} will affect all of the other directory
>specifications.
>
>If you wish to install the package into a staging directory
>(e.g. for packaging or testing purposes) then you can set the
>    DESTDIR variable, @samp{make DESTDIR=@var{dir} install}.

What about packages that don't support arbitrary prefix override
(all those using current libtool), or packages or systems that don't
support DESTDIR installs?  This wording creates problems for them.

Cheers,
Ralf




Re: improve INSTALL contents (was: Core-utils 7.2; building only 'su')

2009-05-13 Thread Ralf Wildenhues
Hi Eric,

thanks for pursuing this.

* Eric Blake wrote on Wed, May 13, 2009 at 02:51:34PM CEST:
> @@ -156,7 +158,25 @@ Installation Names
>  In addition, if you use an unusual directory layout you can give options
>  like @option{--bindir=@var{dir}} to specify different values for
>  particular kinds of files.  Run @samp{configure --help} for a list of
> -the directories you can set and what kinds of files go in them.
> +the directories you can set and what kinds of files go in them.  In
> +general, the default for these options is expressed in terms of
> +@samp{$@{prefix@}}, so that specifying just @option{--prefix} will
> +affect all of the other directory specifications.
> +
> +Depending on the package, the default directory layout chosen during
> +@command{configure} can be altered during subsequent execution of
> +@command{make}.  For example, the command @samp{make
> +prefix=/path/to/alternate} will choose an alternate prefix, and
> +additionally impact all configuration variables that were expressed in
> +terms of @samp{$@{prefix@}}.

This has a couple of issues:

- it fails to take into account that many packages that provide INSTALL
  do not provide this flexibility (e.g., all of those that use libtool
  don't currently provide it in full flexibility).

- it sounds as if developers are expected to check whether "prefix" has
  been changed, and in that case, rebuild files that need its definition.
  However, the GCS states that upon "make install prefix=...", no files
  are to be changed to reflect the changed prefix.  The above fails to
  make this clear.  As a consequence, it is not very useful to use
  "make all prefix=..." after a plain "make all", because typically that
  would not change any files either.

>  However, some programs need to know at
> +compile time where files will be installed, so the user should ensure
> +that the same directory choice is made for both @samp{make all} and
> +@samp{make install}.  Some packages also support installation into a
> +staging area, via @samp{make DESTDIR=@var{dir} install}.  If your
> +@command{make} does not understand arguments containing variable
> +assignments, you can try @samp{env DESTDIR=@var{dir} make -e install},
> +although you must then be careful of any other environment variables
> +affecting @command{make} behavior.

The last sentence can be omitted; packages that support DESTDIR
installs typically do not set DESTDIR in their makefiles.  All make
implementations understand arguments containing variable assignments,
and they override the settings in a toplevel makefile too, it's only
that they don't transport them to sub-makes if variables are also set in
the sub makefile.  (Yes, this is complicated, and applies to non-GNU
make only; see autoconf.texi.)

> +Some packages offer the ability to configure how verbose the execution
> +of @command{make} will be.  For these packages, running
> +@samp{./configure --enable-silent-rules} sets the default to minimal
> +output, which can be overridden with @code{make V=1}; while running
> +@samp{./configure --disable-silent-rules} sets the default to verbose,
> +which can be overridden with @code{make V=0}.

Cheers,
Ralf




Re: rebuilding following a change in prefix?

2009-05-11 Thread Ralf Wildenhues
* Andreas Schwab wrote on Fri, May 08, 2009 at 10:47:20AM CEST:
> Jan Engelhardt  writes:
> 
> > Well, automake (unfortunately?) does not currently issue a recompile
> > when the compiler command changed.
> > It would be really cool to have that, though.
> 
> But that should be optional.  I don't want to do a complete rebuild just
> because I want to do a quick recompile of a single object file with
> different options.

I think the general idea this is hinting at is a dependency on the full
build rules and on the dependency tree itself.  IIRC Solaris make has a
related feature, and there have been proposals for this for GNU make
(maybe a SoC one?), but AFAIK no code yet.  Anyway such an approach
would sound like a more useful idea to me.

Cheers,
Ralf




Re: Statically compiled binaries not re-linked as expected.

2009-05-11 Thread Ralf Wildenhues
Hi Seb,

please keep the mailing list in Cc:, thanks.

* Seb James wrote on Mon, May 11, 2009 at 10:49:34PM CEST:
> > > However, if I reconfigure like this:
> > > 
> > > ./configure --enable-static --disable-shared
> > > 
> > > Then make clean && make will generate a mainprog with libproject
> > > statically linked into the executable. So far so good.
> > > 
> > > However, if I change libcodefiles.cpp and re-call make, libproject is
> > > recompiled, but the executable mainprog is not re-linked.

> The project is called "wmlcbr" with the root directory "wmlcbr"
> containing just the "main()" function of each program and most of the
> code in wmlcbr/wmlcbr/

> wmlcbrfilt_LDADD = -L./wmlcbr -lwmlcbr -lcups -lwmllicence -lwmlppfilt 
> -lwmlppcommon -lxml++-2.6 -lwmlcups -lcups -luuid -lmysqlpp -lmysqlclient 
> -L/usr/lib/mysql

Please drop the -L./wmlcbr and change all in-build-tree library
references to be like wmlcbr/libwmlcbr.la:

  wmlcbrfilt_LDADD = wmlcbr/libwmlcbr.la -lcups -lwmllicence -lwmlppfilt \
    -lwmlppcommon -lxml++-2.6 -lwmlcups -lcups -luuid -lmysqlpp \
    -lmysqlclient -L/usr/lib/mysql

Do the same for all other in-tree libraries that the line may reference, and
for all other programs.  BTW, omit the ./ before wmlcbr, so that the in-tree
file names can be reused in wmlcbrfilt_DEPENDENCIES by automake.

Cheers,
Ralf




Re: binary distribution build with automake

2009-05-10 Thread Ralf Wildenhues
Hi William,

* William Pursell wrote on Sun, May 10, 2009 at 09:09:23AM CEST:
> Ralf Wildenhues wrote:
> > You can just write one in your Makefile.am, as a plain 'make' rule.
> > For reusability, you can also put it in a fragment file and include it
> > in your Makefile.am.
> 
> I have a "best practice" question about this.  For awhile,
> I've had generic targets that I like to include in all of
> my projects, but rather than including the fragment in
> each Makefile.am, I use the following hack^h^h^h^htechnique:
> configure.ac invokes my initialization macro (which I
> install on my dev box in /usr/local/share/aclocal rather
> than putting in a subdir of each project), this creates
> the make rule file snippets in a new subdir of the builddir
> at configure time and AC_SUBST_FILES them so that the Makefile.am
> can contain:
> 
> @MY_DEFAULT_RULES@
> 
> This mostly works, but there are a few cases where I need to
> have the generated snippet file be just the rule, so I can do
> things like:
> 
> check-local:
>   @MY_LOCAL_RULES@
> 
> 
> The question is: is that a reliable technique, or is it a
> ridiculous hack that will someday come to haunt me?

Yes, it is reliable in the sense that I think it will always
substitute things into the Makefile as you desire.  However, the
technique completely hides the rules from automake, so it cannot
adjust its output to your additions.  For example, automake will
not let 'check' depend on or invoke the 'check-local' target when
it has not seen a 'check-local' rule.  Similarly with other stuff,
where automake does not output bits that it thinks are unneeded
in this Makefile.in.  Not doing this would probably blow up the
files by a few times.

Cheers,
Ralf




Re: binary distribution build with automake

2009-05-09 Thread Ralf Wildenhues
Hello Andreas,

* Andreas Otto wrote on Fri, May 08, 2009 at 01:28:16PM CEST:
>   I am missing one target in automake to create a binary distribution

> Reason:
> 
>   1. the default prefix is "/usr/local" -> that's ok
>   2. this prefix is required for the library "rpath" feature -> that's ok
>   3. to build and install software in "/usr/local" is a problem because it
>   belongs to "root" and not to the "user" and should not be
>   used in "normal" software development
> 
> what I want
> 
>   1. a new target "binary-dist"

You can just write one in your Makefile.am, as a plain 'make' rule.
For reusability, you can also put it in a fragment file and include it
in your Makefile.am.

>   2. to create a binary distribution:
>   "$(PACKAGE)-$(PACKAGE_VERSION)-$(build).tar.gz"
>   -> I choose "build" as hw/os identification string

I think for a binary distribution you want to use $(host) not $(build).

>   3. able to use "--prefix" as system-directory !without! 
>   installation requirement

You can use DESTDIR installs (see 'info Automake DESTDIR') to create a
temporary install tree that looks (almost) like the final thing.
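
A rough, untested sketch of such a rule (assumes AC_CANONICAL_HOST so
that $(host_triplet) is defined, and a tar that accepts -z):

  binary-dist:
  	rm -rf $(PACKAGE)-$(VERSION)-bin
  	$(MAKE) $(AM_MAKEFLAGS) DESTDIR=`pwd`/$(PACKAGE)-$(VERSION)-bin install
  	tar -czf $(PACKAGE)-$(VERSION)-$(host_triplet)-bin.tar.gz \
  	  $(PACKAGE)-$(VERSION)-bin
  	rm -rf $(PACKAGE)-$(VERSION)-bin

(The recipe lines must start with a tab, as usual.)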

I don't think Automake should provide something like this by default:
first, it's simple enough to be added by developers; second,
distributing binaries without further information on packaging,
dependencies etc. isn't something we need to further encourage,
and the business of distributing packages is very well done already
by the various distribution tools like rpm, apt, ...
You could just use one (or several) of those.

Hope that helps.

Cheers,
Ralf



