Re: make check(s) pre-release problems

2022-10-06 Thread Paul Smith
On Wed, 2022-10-05 at 15:24 -0600, Karl Berry wrote:
> What troubles me most is that there's no obvious way to debug any
> test failure involving parallelism, since they go away with serial
> execution.  Any ideas about how to determine what is going wrong in
> the parallel make?  Any way to make parallel failures more
> reproducible?

I don't have any great ideas myself.

In the new prerelease of GNU make there's a --shuffle option which will
randomize (or just reverse) the order in which prerequisites are built.
Often if you have a timing-dependent failure, forcing the prerequisites
to build in a different order can make the failure more obvious.
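
For example (a sketch; these spellings are taken from the prerelease's
NEWS file and could change before the final release):

    make --shuffle                 # random order, different on each run
    make --shuffle=random:12345    # random order with a fixed, reproducible seed
    make --shuffle=reverse         # simply reverse the prerequisite order
    make --shuffle=none            # turn shuffling back off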

In general, though, the best way to attack the issue is to try to
understand why the failure happens: what goes wrong that causes the
failure.  If that can be understood then often we can envision a way
that parallel or "out of order" builds might cause that problem.

Alternatively since you seem to have relatively well-defined "good" and
"bad" commits you could use git bisect to figure out which commit
actually causes the problem (obviously you need to be able to force the
failure, if not every time then at least often enough to detect a "bad"
commit).  Maybe that will shed some light.
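
Something along these lines (a sketch; the "good" tag is a placeholder,
and since the failure is intermittent you may need a wrapper script that
runs the tests several times before declaring a commit good):

    git bisect start
    git bisect bad HEAD          # a commit where the parallel check fails
    git bisect good v1.16.1      # a commit believed to be good
    git bisect run make -j8 check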

But I expect there's nothing here you haven't already thought of
yourself :( :).



Re: make check(s) pre-release problems

2022-10-05 Thread Paul Smith
On Wed, 2022-10-05 at 05:27 +0200, Jan Engelhardt wrote:
> > So what the heck? [...] These always worked before. But now, Jim
> > gets hundreds of failures with the first
> 
> Make was in the news recently, maybe that's the component to
> switch out for an earlier version?
> 
> 7ad2593b Support implementing the jobserver using named pipes

What version of GNU make are you using?  There has been no new release
of GNU make (yet).  If you're running a prerelease version of GNU make,
you might consider trying it with the last official release (4.3) to
see if it helps.

Certainly if something is failing with the GNU make prerelease that
would be good to know, however.



Re: portability of xargs

2022-02-15 Thread Paul Smith
On Tue, 2022-02-15 at 16:59 -0600, Bob Friesenhahn wrote:
> The people who tell me it is more portable are very interested in 
> targeting Microsoft Windows.

Unsurprising :).

Just to be clear, cmake can't/won't help you write programs that are
portable to Windows.  If you are targeting the W32 system API, you may
have to modify a lot of code to make it work and cmake will be of zero
assistance to creating this code "portably".  And if you are going to
use a POSIX layer on top of Windows (for example, cygwin) then you have
a POSIX environment already and can just run configure anyway.

If your project is basically pretty portable (doesn't use many system
APIs outside of C and C++ standard runtimes), so that the main problem
is the configure and build systems, then cmake can be a big benefit.

For example, you see a lot of cmake usage in library projects or simple
programs on Git forges, where the code is pretty stand-alone and people
want to make it easy to build on different systems.

> 
> The "Makefiles" that Cmake generates are self-referential in that 
> almost all rules invoke Cmake to do the work.

True; CMake includes a kind of "busybox" set of operations, to increase
portability (so it doesn't have to rely on "echo", "cp", etc. in its
rules, that wouldn't work everywhere).  But, the types of operations it
supports are limited.  It can echo, copy files/directories, make
directories, remove files/directories, rename things, touch files,
create tar files, and generate checksums with builtin operations (and a
few other things).

If you have cmake installed you can run "cmake -E --help" to see a
complete list.
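
For example (a sketch; the exact set of subcommands depends on your
CMake version):

    cmake -E make_directory build/obj
    cmake -E copy config.h.in build/config.h
    cmake -E tar cfz dist.tar.gz dist/
    cmake -E md5sum dist.tar.gz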



Re: portability of xargs

2022-02-15 Thread Paul Smith
On Tue, 2022-02-15 at 15:37 -0600, Bob Friesenhahn wrote:
> I have been told by several people (who have much more self-esteem 
> than me) that a build tool called 'cmake' is far more portable than 
> Autotools so maybe we should make support for 32 year old systems 
> cmake's responsibility?

That is not accurate.  Or at least, cmake uses a much different
definition of "portable" than autoconf / automake.  Certainly cmake
cannot possibly support 32-year old systems (if that's needed).

cmake is a tool that, given an input definition of build dependencies,
can generate build control files for multiple different types of build
tools.  It can generate makefiles, Ninja files, Xcode project files,
and Visual Studio project files.  Maybe others, I'm not sure.

So in that sense it's "more portable" because given a single input
definition you can build the project with both make-based systems and
Visual Studio-based systems (etc.) without extra maintenance work.

However, of course because the definition files must be able to
generate all these different outputs they are necessarily "least common
denominator" definitions; you can't just define a new make rule to do
something obscure like you can with automake. Cmake works great for
simple projects but you will absolutely struggle to encode more complex
build requirements.
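
For contrast, automake copies any rule it doesn't recognize straight
through into the generated Makefile, so in a Makefile.am you can write
something like (a hypothetical generated-source rule, just for
illustration):

    protocol.c: protocol.def gen-protocol.sh
	$(SHELL) $(srcdir)/gen-protocol.sh $(srcdir)/protocol.def > $@

In the CMake language there is no such escape hatch; you have to express
it through add_custom_command and friends instead.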

And cmake has not much comparable to autoconf.  It does provide some
very minimal support for choosing compiler options, and some facilities
sort of like pkg-config, but nothing even approaching the kinds of
checks and configuration that autoconf supports.

The main idea of autoconf is that it allows configuring the package for
systems that the author never even heard of, much less has access to
(as long as it has a POSIX interface).  cmake only tries to generate
output files for systems it has been developed for (for example, each
new version of Visual Studio often requires a new release of cmake to
support it).

Of course maybe autoconf is not necessary anymore: I don't know about
that.  I do know that even though GNU make itself is relatively simple,
people still build and use it on older, and non-GNU/Linux, systems, and
that they want to use cross-compilers (cmake can do this, kind of, but
it's much more difficult than autotools support) and other more
advanced operations.



Re: Constantly changing libtool

2021-04-15 Thread Paul Smith
On Thu, 2021-04-15 at 08:33 -0500, Laurence Marks wrote:
> In the same way as autoXZY sets up Makefiles in an OS independent
> fashion, there should be a way to autoupdate autoXYZ files for each
> system without user intervention. (I don't mean automake itself or
> similar, I do mean only the local files.) In an ideal world this
> would be a macro in configure.ac

The problem comes (IMO) from checking in generated files.  If you only
check in actual source files, not generated files, then you don't run
into these issues with different people having different versions of
build tools.

For GNU make I never check in any files generated by running autotools:
I check in configure.ac and Makefile.am and that's all.  I don't even
check in .po files (latest versions pulled automatically by the build
config) or gnulib-installed files (installed during config as well).

Of course, this means that everyone who wants to build the software
needs to have their own local copy of ALL the tools needed to do so,
including autoconf, automake, and libtool (actually we don't use
libtool for GNU make but...).  There's never a free lunch so you just
have to decide which cost is more dear.
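
With that layout, bootstrapping a fresh checkout is just (a sketch; the
gnulib and gettext steps vary by package):

    autoreconf --install   # runs aclocal, autoconf, automake, etc. as needed
    ./configure
    make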





Re: Future plans for Autotools

2021-01-26 Thread Paul Smith
On Tue, 2021-01-26 at 11:01 -0800, Andy Tai wrote:
> GNU Make integrates with guile.  Maybe such extension can be 
> done using guile for flexibility?

The problem is that hardly any standard distributions build GNU make
with Guile support enabled.  If automake relied on it, basically anyone
who wanted to use automake would first have to build their own version
of GNU make.

And, we'd have a catch-22 where Guile (if it used automake) would need
a GNU make built with Guile before it could be built...




Re: A new, novel, and faster Binary Search algorithm

2019-03-15 Thread Paul Smith
On 3/15/19 6:16 AM, Gregorius van den Hoven wrote:
> > My apologies. The algorithm is licensed under GPL 3 so it
> > seemed relevant to the GNU community to me. Do you have a
> > suggestion for a more appropriate mailing list?

I don't see any need for the somewhat harsh response you received
initially.  Any effort spent on base algorithms is appreciated; even
the benchmarking work is a valuable contribution.

Just to orient you, Gregorius, it could well be that your efforts are of
interest to the GNU community; however the automake project in
particular develops a tool which uses the shell and a macro language to
convert Makefile templates into standard Makefiles: this project will
not have much need for the work you've done (directly).


On Fri, 2019-03-15 at 08:44 -0500, Eric Blake wrote:
> 
> Perhaps GNU coreutils (coreut...@gnu.org), as the owner of the
> sort(1) utility, which would be the most likely program to pick up
> the use of this code?  Or to glibc, to see if it is worth improving
> qsort() by using the ideas in your algorithm?

Coreutils: https://www.gnu.org/software/coreutils/
GNU libc:  https://www.gnu.org/software/libc/libc.html

Those are good suggestions.  I might also suggest contacting the Gnulib
folks who maintain a suite of common library functions that many GNU
tools use: https://www.gnu.org/software/gnulib/

Although it's off-topic for this list, one thing you might consider in
your benchmarking before publishing further is being more clear about
the relative number of key checks per algorithm.  Binary searches on
integers are likely not the most common type for qsort() and other
general-purpose binary searches: searching on strings is likely much
more common.  There the cost of the key checks dwarfs the rest of the
costs so algorithms with the fewest key checks will perform better.

Thanks for your work and interest in the GNU project!




Re: distcheck does 'chmod a-w' - leads to mkdir "permission denied"

2017-03-03 Thread Paul Smith
On Fri, 2017-03-03 at 17:09 +, Paul Jakma wrote:
> Well, it's due to the distributed 'doc/quagga.info' being slightly
> older than the (built by automake) doc/defines.texi.

I don't understand this: if doc/defines.texi is needed by quagga.info
then why doesn't quagga.info depend on it?

And if it does depend on it, then how can doc/defines.texi be newer
without rebuilding quagga.info (when creating the dist file, not when
building the dist file)?



Re: distcheck does 'chmod a-w' - leads to mkdir "permission denied"

2017-03-03 Thread Paul Smith
On Fri, 2017-03-03 at 15:48 +, Paul Jakma wrote:
> make[4]: Entering directory 
> '/home/paul/code/quagga/quagga-1.2.0/_build/sub/doc'
>    MAKEINFO ../../../doc/quagga.info
> mkdir: cannot create directory ‘.am8211’: Permission denied
> could not open ../../../doc/quagga.texi: No such file or directory
> /bin/sh: line 16: cd: ../../../doc: No such file or directory
> ..
> $ cd doc # in the top-level repo
> $ make quagga.info
> make: 'quagga.info' is up to date.
> 
> Baffling?

Run make (in a remote directory freshly unpacked from your dist file)
with the "-d" flag (redirect it due to voluminous output) and it will
tell you exactly why it decided to run that recipe.
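
For example (a sketch):

    make -d > make-dbg.log 2>&1
    grep -n 'quagga.info' make-dbg.log

In the log, look for lines like "Prerequisite '...' is newer than target
'...'" to see which file triggered the rebuild.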

No doubt it depends on some other file which has an unexpected
timestamp.



Re: Missing separator error can be caused by too many files in alternate recipe format

2014-05-27 Thread Paul Smith
On Mon, 2014-05-26 at 20:46 -0400, PenguinDude24 wrote:
> What I think happened is that obviously there was some internal error
> with heuristics that GNU Make (and most likely others), and the
> heuristics engine could not figure out how to parse that long line
> (maybe was not expecting data to be so long, or programmer so stupid
> to do that) and threw the error.

GNU make supports an arbitrary number of targets and prerequisites, up
to the size of your computer's memory.  There are no (known) hardcoded
limits.

And, an error like "missing separator" is not what I would expect to
see if the problem were related to some out-of-bounds error.

Instead, I expect that either there's some issue with the way automake
generated the makefile, or else some issue with the filenames in the
target list, or a bug in GNU make related to heap management.

The only way to know is if you give us the details necessary to
understand the problem.  What version of automake are you using?  What
version of GNU make?  And, what is the (exact) contents of the line
which is causing make to throw an error (and a few lines before/after
it, for context)?




Re: Help with static linking

2013-06-02 Thread Paul Smith
I'm removing automake from this thread as I'm getting two copies of
every mail.  Hope no one minds.

On Sun, 2013-06-02 at 03:06 -0400, Mike Frysinger wrote:
> On Sunday 02 June 2013 01:10:36 Kip Warner wrote:
> > On Sat, 2013-06-01 at 23:14 -0400, Mike Frysinger wrote:
> > > be aware that what ever version of glibc & gcc you use to build, the end
> > > user cannot have a version older than that or it'll fail to start
> >
> > Do you mean in the case of dynamic linking? If so, that's awful. But
> > strange because I've seen many upstream projects release precompiled
> > binaries without that ever really being made into an issue. Or do you
> > mean in the case of static linking?
>
> i mean dynamic linking.  it's always been this way.

This is because GCC has some of its internal functionality implemented
in libraries, which are linked dynamically by default.  This is
especially true if your program is written in C++, because the STL is
provided with the compiler and is very compiler-specific.  However, even
some low-level C functionality is implemented as a shared library.

If the runtime system has an older version of the compiler with older
versions of these libraries, you can run into trouble (again, C++ is the
biggest culprit: the C helper libraries are pretty stable, release to
release, in general).

This is easily solved, though.  You just have to add -static-libgcc
and, if you use C++, -static-libstdc++ to the link line; then these
helper libraries are linked statically and it doesn't matter what
version of GCC is installed on the system.
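
For example (a sketch of a C++ link line):

    g++ -o myapp main.o util.o -static-libgcc -static-libstdc++

Afterward "ldd myapp" should no longer list libstdc++.so among the
dynamic dependencies.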

These libraries have a special exception to the GPL, which allows them
to be statically linked in straightforward situations without the result
being GPL-licensed.  See the license for details.

> people who do binary releases often times find an old distro that works and
> then upgrade packages as need be.  then they keep that image around forever.

This is a different issue than the compiler version.  The above solution
lets you use a newer _compiler_.  This problem relates to the
_distribution_ (that is, the version of libc, etc.)

As Mike says, GNU/Linux distros generally guarantee that if you build
against version X of a system it will run without problems on any
version >= X (note here I'm talking mostly about basic system libraries:
the higher up into userspace you go the less reliable such statements
become).  However, there is no guarantee about running on versions < X,
and in fact that very often does not work.  This is not really
surprising: you can't guarantee perfect compatibility, forward and
backward, for all time!

However, using the old image method is, IMO, not a good solution for
any larger-scale development.  It's slow, difficult to manage, and
generally painful.

My recommendation for this situation is to instead create a sysroot,
which is basically a directory structure containing the dev files for a
given distribution: header files and libraries (.so and .a).  You don't
need any bin, man, etc. files.  Pick a pretty old distribution (the
oldest that you want to support).  The current-minus-one Debian or Red
Hat distros are good choices, generally, because they usually have such
old versions of everything that it's unusual to find another mainstream
distro with older versions.

Alternatively, if you prefer to distribute different packages for
different systems, you can create multiple sysroots: one for each
system.  They're not that big and are easy to manage.

Then use GCC's --sysroot flag to point the compiler at the sysroot
directory structure rather than your system's headers and libraries.
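
For example (a sketch; the sysroot path is a placeholder):

    gcc --sysroot=/opt/sysroots/debian-old -o hello hello.c

Headers are then found under /opt/sysroots/debian-old/usr/include and
libraries under /opt/sysroots/debian-old/usr/lib, not in your build
machine's own / tree.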

Now your build environment is portable and completely divorced from the
version of the underlying build system and the result will run on any
distribution which has the same or newer versions of libraries as your
sysroot.

It's a bit of effort to set up but it's a LOT nicer than dealing with
virtual machines with older releases loaded, or whatever.


Regarding autoconf: using the above setup you have two choices I can
see.  First you can just go with it as-is, and hope the result works
well (it probably will).  Second you can try to build your packages as
if you were cross-compiling, which is a little safer since autoconf
won't try to use your build system to infer things about your host
system, if it detects that they're different.  However, not all packages
have perfect support for cross-compilation so it may be more work.
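
For example (a sketch; the flags and triplets are placeholders, and
whether this is sufficient depends on the package):

    ./configure CC='gcc --sysroot=/opt/sysroots/debian-old' \
                CXX='g++ --sysroot=/opt/sysroots/debian-old' \
                --build=x86_64-pc-linux-gnu --host=x86_64-oldlinux-gnu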




Re: Where Do All These Macros Come From?

2013-05-24 Thread Paul Smith
On Fri, 2013-05-24 at 07:25 -0700, Jordan H. wrote:
> Hello, folks!
>
> In learning automake I keep looking at example configure.ac files and in
> the tutorial someone says "oh, you can just use this here macro". I look
> in the automake manual, though
> (https://www.gnu.org/software/automake/manual/automake.html#Macro-Index)
> and find the macro to not be found. One example is AC_CHECK_LIB that's
> automatically generated by autoscan...I see very little documentation on
> this macro.
>
> Is there a special syntax for configure.ac? This is never really
> explained in any tutorial. All I see is someone pulling a variable out
> from who-knows-where and a different tutorial doing the exact same thing
> pulls a different macro from who-knows-where.
>
> Thanks. Hope my question is clear. I'm still sort of a newbie with automake.

Your problem is that you're confusing automake and autoconf.

Automake is the tool that manages makefiles: it turns a Makefile.am file
into a Makefile.in file.

Autoconf is the tool that manages configuration: it turns a configure.ac
file into the configure script.

When your user runs the configure script, one of the things it does is
convert the Makefile.in file to a normal Makefile.

As a general rule, the autoconf macros that begin with AM_ are
automake macros.  The macros that begin with AC_ are autoconf macros,
and you need to read the autoconf documentation to learn about those.
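
A minimal configure.ac shows both families side by side (a sketch):

    AC_INIT([hello], [1.0])         # autoconf: initialize the package
    AM_INIT_AUTOMAKE([foreign])     # automake: enable automake support
    AC_PROG_CC                      # autoconf: find a C compiler
    AC_CONFIG_FILES([Makefile])     # autoconf: files config.status creates
    AC_OUTPUT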




Re: Where Do All These Macros Come From?

2013-05-24 Thread Paul Smith
On Fri, 2013-05-24 at 08:26 -0700, Jordan H. wrote:
> Right. I understand that much. My question is about autoconf (sorry
> about saying auto*make*). I see a lot of macros that tutorial authors
> use for which I don't see any documentation.

Well it depends on the macro.  The one you mentioned by name in your
email is AC_CHECK_LIB, and that's defined in the autoconf manual (along
with all the other AC_ macros), which is why I suggested it:

http://www.gnu.org/software/autoconf/manual/autoconf-2.69/html_node/Existing-Tests.html

http://www.gnu.org/software/autoconf/manual/autoconf-2.69/html_node/Libraries.html#Libraries

Those are the ones autoconf provides by default.  Other macros might
have been created specifically for a given project; they will be
contained in files in that project's directory.  Still other macros
might be defined by third-party software you are trying to work with
(texinfo, various libraries, etc.); those will be defined by those
packages.
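
For example, a typical use of the macro you named (a sketch):

    AC_CHECK_LIB([m], [sqrt])    # if libm provides sqrt(), add -lm to LIBS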




Re: libvirt build failure w/GNU make and automake.git (automake regression?)

2012-09-12 Thread Paul Smith
On Wed, 2012-09-12 at 17:01 +0200, Jim Meyering wrote:
> That is because of this automake-generated rule:
>
>   undefine.log: undefine
>
> The trouble is that "undefine" is an operator in GNU make.
>
> The most pragmatic work-around is to rename the "undefine" test script.
> However, Stefano, as automake maintainer, I think you will want to
> fix automake not to prohibit the use of such test names.

This could be a legitimate bug in GNU make.  It's arguable that GNU make
should only consider "undefine" to be an operator in contexts where it
might be an operator.  Clearly a prerequisite (or target) named
"undefine" cannot be a make operator.
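
To illustrate the two contexts (a sketch):

    undefine FOO            # directive: forget the variable FOO (3.82 and up)

    check.log: undefine     # rule: here "undefine" is only a file name
	./undefine > $@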





Re: allowing users to add source files without rerunning the autotools?

2012-01-20 Thread Paul Smith
On Fri, 2012-01-20 at 20:35 +0100, Stefano Lattarini wrote:
> > > What automake does for source files it knows about is just include
> > > $(DEPDIR)/srcfile.Po (apparently "include" is considered portable
> > > make?).
> >
> > It's not considered portable make.
>
> Still, it's worth noting that it works with every make implementation
> I usually test automake with:

FYI, I believe that POSIX has approved adding an include command to
the make standard.  I can't remember for sure.

I know that POSIX standard != portable, necessarily, but it's a start.
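
For reference, the GNU make spellings look like this (a sketch; only
the second, "-include" form tolerates a missing file, and that form is
a GNU extension):

    include $(DEPDIR)/foo.Po     # error if the file can't be read

    -include $(DEPDIR)/foo.Po    # silently skipped if the file is missing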




Re: [gnu-prog-discuss] Could automake-generated Makefiles required GNU make?

2011-11-22 Thread Paul Smith
On Tue, 2011-11-22 at 19:50 -0700, Warren Young wrote:
> On 11/22/2011 7:46 PM, Bob Friesenhahn wrote:
> > P.S. I choose to be a 1 percenter.
>
> Do we let OpenVMS drive Autotools' direction, too?  OS/2?  BeOS?
> Ooops, nope.  GNU make probably shipped on those platforms, too. :)

As a matter of fact GNU make DOES still support those platforms.  The
OpenVMS port is actually quite active.

And Amiga, as well, if you can believe that.  Anyway the code is still
there and the last time I asked, people were still using it and didn't
want it deleted.

Cheers! :-)

-- 
---
 Paul D. Smith psm...@gnu.org  Find some GNU make tips at:
 http://www.gnu.org  http://make.mad-scientist.net
 Please remain calm...I may be mad, but I am a professional. --Mad Scientist




Re: [gnu-prog-discuss] Could automake-generated Makefiles required GNU make?

2011-11-22 Thread Paul Smith
On Tue, 2011-11-22 at 18:33 +0100, Ralf Corsepius wrote:
> But I recall there had been massively broken gmake
> releases and releases with major functional changes, which had broken
> a lot.

I don't believe that this is so.  There have been changes which broke
some makefiles here and there, and sometimes popular ones (for example
there was a small change which broke two rules in the Linux kernel
makefiles, which caused some annoyance) but "massively broken" is not
accurate.

Somewhere (lost it now) someone mentioned needing make to build GNU make
(bootstrapping).  In fact, GNU make has always provided a build.sh
shell script that can be used if you don't already have any version of
make.  This script doesn't attempt to do build avoidance, of course, but
for a program the size of GNU make this is not so important... and
anyway, once you've built it the first time you'll have a make you can
use for future attempts.

-- 
---
 Paul D. Smith psm...@gnu.org  Find some GNU make tips at:
 http://www.gnu.org  http://make.mad-scientist.net
 Please remain calm...I may be mad, but I am a professional. --Mad Scientist




Re: Could automake-generated Makefiles required GNU make? (was: Re: [gnu-prog-discuss] portability)

2011-11-21 Thread Paul Smith
On Mon, 2011-11-21 at 21:56 +0100, Stefano Lattarini wrote:
>  1. We start requiring GNU make in an experimental automake 2.0
>     development line (which might, and will, break whatever
>     backward-compatibility gets in its way).

If you go this route, then it's critical that the minimal set of
capabilities used by this new automake be clearly documented.  Even
something like "all features available as of GNU make 3.76" or whatever
would be very helpful, although I would prefer an explicit list of
capabilities.

There ARE multiple versions of GNU make out there and many distributions
still ship versions many years old... I regularly get questions on the
GNU make help lists from users referring to our web manual, which
obviously documents the latest released version of GNU make, and they
are confused because their version of GNU make (released in 2006 or
whatever) does not support some of the documented features.





Re: Variable list modification

2011-04-20 Thread Paul Smith
On Thu, 2011-04-21 at 01:16 +0200, Ralf Wildenhues wrote:
> Hello Justin,
>
> * Too, Justin A. wrote on Wed, Apr 20, 2011 at 11:33:26PM CEST:
> > FOO=1 2
> > $(FOO:=bar) = 1bar 2bar.
> >
> > Is there a way to prepend bar?
>
> This is a GNU make question, or maybe a Posix make question, depending
> on what you target.  In Posix make, the answer is no.  With GNU make,
> you can use the patsubst function, see 'info make --index patsubst'.

If you have GNU make you can just write:

$(FOO:%=bar%)

which is shorter/simpler than using patsubst.
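
Spelled out (a tiny demo):

    FOO = 1 2

    appended  := $(FOO:%=%bar)                  # 1bar 2bar
    prepended := $(FOO:%=bar%)                  # bar1 bar2
    both      := $(patsubst %,bar%bar,$(FOO))   # bar1bar bar2bar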


Cheers!

-- 
---
 Paul D. Smith psm...@gnu.org  Find some GNU make tips at:
 http://www.gnu.org  http://make.mad-scientist.net
 Please remain calm...I may be mad, but I am a professional. --Mad Scientist




Re: debbugs, and a FAQ, for Autotools

2011-02-22 Thread Paul Smith
On Tue, 2011-02-22 at 22:30 +0100, Ralf Hemmecke wrote:
> That Linux distributions usually come with a good set of autotools is
> irrelevant, since in my understanding all developers of *one* project
> should work with the *same* autotools versions. Of course, the project
> might also compile otherwise on different systems, but using the same
> version of autotools everywhere reduces (needless) incompatibility
> issues between developers and increases acceptance of autotools. No?

This is very important IF you check in the outputs of the autotools into
your source repository.  If you do that, and people use different
versions of autotools, then you're constantly getting a ping-pong
effect with these files as different developers check in different
versions.

My opinion has always been that those are object files, not source
files, and just as you'd not typically check in a .o into your source
repository, so you wouldn't check in the output of autotools.  Because
of that it doesn't matter to me which version of autotools other people
use.

The downside of the approach I use is that anyone who wants to build
from the source repository has to go get copies of the autotools and
install them first.  I prefer this option anyway, as I think developers
need those tools, but other teams may have other preferences.




Re: [CRAZY PROPOSAL] Automake should support only GNU make

2011-01-17 Thread Paul Smith
On Mon, 2011-01-17 at 17:28 +0100, Steffen Dettmer wrote:
> When having a source tree constructed of several (sub-)
> packages, how does a Beta-Build system looks like? Could there be
> tupfiles including sub-tup-files?
>
> What influence has the choice of a Beta-Build to the
> maintainability of such a sub package? Can they still be
> orthogonal to each other?

I don't offhand see any reason why a Beta build system couldn't use the
same makefile syntax as make, even GNU make.  It seems plausible that
the parser which creates the DAG, etc. could be kept basically as-is.
However, I think the "walk the DAG and build" decision parts of make
would have to be tossed and completely re-done.

You might have to add some new definitions to the makefile (for example
lists of files/directories that should be considered watched and ones
that should be omitted).

> How portable can a Beta-Build system be? Isn't it requiring
> specific extensions to watch the file system by some event bus or
> such?

That would be one way, and is certainly the most efficient.  Another
would be to simply stat all the files up-front; that could be done on
any system.  Obviously this is much more time-consuming.  On the other
hand you can avoid a lot of rule matching/searching/etc.

-- 
---
 Paul D. Smith psm...@gnu.org  Find some GNU make tips at:
 http://www.gnu.org  http://make.mad-scientist.net
 Please remain calm...I may be mad, but I am a professional. --Mad Scientist




Re: [CRAZY PROPOSAL] Automake should support only GNU make

2011-01-14 Thread Paul Smith
On Fri, 2011-01-14 at 19:57 +0100, Ralf Wildenhues wrote:
> * Bob Friesenhahn wrote on Fri, Jan 14, 2011 at 07:55:17PM CET:
> > Where do these terms 'alpha' and 'beta' build system originate from?
>
> I've read them in
> http://gittup.org/tup/build_system_rules_and_algorithms.pdf.
>
> No idea whether they are standardized somehow or somewhere.

I really recommend that, in addition to that paper, you might read the
thread that ensued on the help-make list after Mike posted a link to it
there:

http://www.mail-archive.com/help-make@gnu.org/msg08500.html

I think some of that discussion was quite illuminating.




Re: [CRAZY PROPOSAL] Automake should support only GNU make

2011-01-13 Thread Paul Smith
On Thu, 2011-01-13 at 12:50 -0600, Bob Friesenhahn wrote:
> FYI, ClearCase's clearmake is actually based on GNU make source code.
> At least it was back when I was using it.

Not that it matters but this is not _quite_ true... otherwise clearmake
would fall under the GPL.

What clearmake used to do was ship with a patched version of GNU make
that added a -E flag.  When that flag was given, GNU make would read
the makefiles but, instead of actually building anything, would output
an elaborated makefile with much of the extra functionality of GNU
make already expanded; for example, ifdef etc. would already be
processed.  Clearmake in "GNU make mode" would exec that patched
version of GNU make with the -E flag, then read the output to get a more
vanilla makefile.  This gave some limited support for GNU make syntax.

However, that model was ditched back in the 1990's sometime.  The last
time I used ClearCase (over 4 years ago now), clearmake simply
implemented some set of GNU make features natively, without using any
GNU make preprocessor.  However, the set of features implemented was
pretty weak; something like GNU make 3.76 or so, not much beyond that.
And even some 3.76 features were missing, like the auto re-exec
capability.




Re: [CRAZY PROPOSAL] Automake should support only GNU make

2011-01-13 Thread Paul Smith
On Thu, 2011-01-13 at 20:38 +0100, Ralf Wildenhues wrote:
> * Bob Friesenhahn wrote on Thu, Jan 13, 2011 at 08:32:01PM CET:
> > On Thu, 13 Jan 2011, Ralf Wildenhues wrote:
> > > You can implement hash-based dependencies inside GNU make if you like.
> > > Or another make.  Maybe someone has even done so already.  It has little
> > > to do with Automake.
> >
> > I have never heard of such a thing implemented in make.  It would
> > surely bloat the makefile by more megabytes or be excruciatingly
> > slow.
>
> That's not what I meant.  You can hack the source code of GNU make
> with the goal to (optionally) replace timestamp-based dependencies
> with file-hash-based dependencies.  No changes at all in any user
> makefiles.  It's not a large change even.  IIRC somebody has done
> this already, it just hasn't been merged upstream.

It's not QUITE so simple.  The advantage to using timestamp-based
dependencies is that the filesystem is your database of
"out-of-dateness", and the kernel ensures that the behavior of that
database is well-defined.

If you want to use anything more sophisticated to determine
out-of-dateness, then make has to keep its own database to remember
the previous state, so it can be compared.  Granted this is not rocket
science but there are some tricky bits to it: it needs to be pretty
portable to the various architectures make supports; it needs to be
well-behaved in the case of a missing database, and in the case of
multiple instances of make running in the same directory at the same
time; what happens if the database is corrupted or somehow contains the
wrong information; etc.

Making this robust enough to trust it with your builds requires some
thought and care.
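
To make the idea concrete, here is roughly how people approximate it at
the makefile level today, using checksum stamp files as that "database"
(a sketch, with all of the robustness caveats above):

    # foo.c.sum is rewritten only when foo.c's content actually changes,
    # so its timestamp stands in for "content changed"
    %.sum: FORCE
	@md5sum $* | cmp -s - $@ || md5sum $* > $@

    # depend on the stamp, not the source: touching foo.c without
    # changing it no longer causes a recompile
    foo.o: foo.c.sum
	$(CC) -c foo.c -o $@

    FORCE: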

> BTW, Nathanael's propositions in this thread go in the same direction:
> http://gcc.gnu.org/ml/gcc-patches/2010-11/msg01504.html

There was a set of work done to introduce user-defined out-of-date
functions into GNU make, a few years ago.  That didn't end up getting
merged because it seemed too complex to me, both from a make developer
and make user standpoint.

I agree with Nathanael, although I take serious issue with his statement
that the GNU make people were "completely uninterested, since they didn't
get why it was a good idea"--that is simply not true.

What is true is that Nathanael posted a 10,000ft-level description of
what he wanted, not completely thought-out, and when I replied to his
message with some discussion of the idea and potential problems with it,
he never followed up.




Re: Any way to get rid of -MP parameter to gcc for dependency creation?

2011-01-08 Thread Paul Smith
On Sat, 2011-01-08 at 21:57 +0100, Xan Lopez wrote:
> Any idea to bypass this whole mess?

You can create one or more libxxx.a files from subsets of your objects,
and link with those.
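
For example (a sketch; the object groupings are placeholders):

    libpart1.a: $(OBJS_GROUP1)
	$(AR) rcs $@ $(OBJS_GROUP1)

    libpart2.a: $(OBJS_GROUP2)
	$(AR) rcs $@ $(OBJS_GROUP2)

    prog: main.o libpart1.a libpart2.a
	$(CC) -o $@ main.o libpart1.a libpart2.a

Each individual command line stays short, and the linker pulls what it
needs out of the archives.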

-- 
---
 Paul D. Smith psm...@gnu.org  Find some GNU make tips at:
 http://www.gnu.org  http://make.mad-scientist.net
 Please remain calm...I may be mad, but I am a professional. --Mad Scientist




Re: execvp: /bin/sh: Argument list too long

2010-11-08 Thread Paul Smith
On Mon, 2010-11-08 at 21:01 +0100, Pippijn van Steenhoven wrote:
> I am root on my (Linux) system and I set the stack size to unlimited.
> The libtool macro reported a few billion (or something other really
> large) for maximum argument list length, bash also agreed (it easily
> executed the distdir target when copied into a bash script), but
> make doesn't.

What version of GNU make are you using?  There was an issue in GNU make
3.81 related to setting stack limits, that I believe is resolved in GNU
make 3.82.  I can't recall the details.

You might try that if you're not already using it.




Re: Using -MMD instead of -MD for dependency generation

2010-10-31 Thread Paul Smith
On Thu, 2010-10-28 at 12:18 -0400, Paul Smith wrote:
> > Is there anything I can help with? bisect GNU make 3.81 -> 3.82 to see
> > when it got slower? Or anything on our GNU make/automake usage? E.g. we
> > use a non recursive makefile
>
> I'll try to look into this more this afternoon/tonight.  Until then I
> wouldn't bother doing too much work on it.  Let me try to reproduce your
> results then we can see where we are.

I was able to install the proper packages and get a complete build of
WebKit.

After this I see that GNU make 3.82 is twice as slow as 3.81, but even
3.82 only takes about 30 seconds to run (a do-nothing build) on my
system.  Taking 5+ minutes is crazy.  There must be something odd about
your system or setup causing this.

As for why 3.82 is slower, unfortunately I'm having problems figuring it
out.  I compiled with gprof but the cumulative profiled code in GNU make
only took 6 seconds or so, so I suppose the other 24 seconds must be in
libc or someplace... but trying to install a profile-enabled version of
libc on my system did not succeed (or rather the package install worked
but linking with the profiled libc did not work).


All I can assume is there's some difference in heap management in 3.82
which is causing this, because that's the only difference I can think of
that would not show up in the GNU make code itself.  I'll try using
valgrind and see what I get.  If anyone has other suggestions for
profiling tools let me know.

Cheers!

-- 
---
 Paul D. Smith psm...@gnu.org  Find some GNU make tips at:
 http://www.gnu.org  http://make.mad-scientist.net
 Please remain calm...I may be mad, but I am a professional. --Mad Scientist




Re: Using -MMD instead of -MD for dependency generation

2010-10-28 Thread Paul Smith
On Wed, 2010-10-27 at 23:33 +0200, Holger Freyther wrote:
> On 10/27/2010 10:25 PM, Paul Smith wrote:
> > On Wed, 2010-10-27 at 20:51 +0200, Ralf Wildenhues wrote:
> >
> > But I do have libgstreamer dev packages installed:
> >
> > webkit-1.3.5$ locate gst.h
> > /usr/include/gstreamer-0.10/gst/gst.h
>
> hmm. Could you check the config.log if video was enabled? Maybe the configure
> decided that video was disabled but we still build that file...

I checked and I have a number of gstreamer pkg-config files:

/usr/lib/pkgconfig/gstreamer-0.10.pc
/usr/lib/pkgconfig/gstreamer-base-0.10.pc
/usr/lib/pkgconfig/gstreamer-check-0.10.pc
/usr/lib/pkgconfig/gstreamer-controller-0.10.pc
/usr/lib/pkgconfig/gstreamer-dataprotocol-0.10.pc
/usr/lib/pkgconfig/gstreamer-net-0.10.pc

but configure is looking for some I don't have:

configure:20784: $PKG_CONFIG --exists --print-errors "gstreamer-0.10 >=
$GSTREAMER_REQUIRED_VERSION
 gstreamer-app-0.10
 gstreamer-base-0.10
 gstreamer-interfaces-0.10
 gstreamer-pbutils-0.10
 gstreamer-plugins-base-0.10 >= $GSTREAMER_PLUGINS_BASE_REQUIRED_VERSION
 gstreamer-video-0.10"

As a result, GSTREAMER_CFLAGS etc. is empty.

The default in configure.ac is that video is enabled, and it doesn't
seem like it gets disabled if pkg-config cannot find all the right
packages:

# check if gstreamer is available
if test "$enable_video" = "yes"; then
   PKG_CHECK_MODULES([GSTREAMER],
 [gstreamer-0.10 >= $GSTREAMER_REQUIRED_VERSION
 gstreamer-app-0.10
 gstreamer-base-0.10
 gstreamer-interfaces-0.10
 gstreamer-pbutils-0.10
 gstreamer-plugins-base-0.10 >= $GSTREAMER_PLUGINS_BASE_REQUIRED_VERSION
 gstreamer-video-0.10],
 [have_gstreamer=yes],
 [have_gstreamer=no])

   AC_SUBST([GSTREAMER_CFLAGS])
   AC_SUBST([GSTREAMER_LIBS])
fi
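
If the intent is "disable video when the modules aren't there", the
check would need a fallback along these lines (a sketch):

    if test "$have_gstreamer" = "no"; then
       AC_MSG_WARN([gstreamer modules not found; disabling video])
       enable_video=no
    fi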

> > FWIW, I have an Intel Core2 6600 at 2.4GHz with 2G RAM, running a 64bit
> > 2.6.35+ kernel on an ext4 partition.
>
> My desktop is still in a container on some boat and I am stuck with my lower
> end laptop.
>
> Is there anything I can help with? bisect GNU make 3.81 -> 3.82 to see when it
> got slower? Or anything on our GNU make/automake usage? E.g. we use a non
> recursive makefile

I'll try to look into this more this afternoon/tonight.  Until then I
wouldn't bother doing too much work on it.  Let me try to reproduce your
results then we can see where we are.




Re: Using -MMD instead of -MD for dependency generation

2010-10-27 Thread Paul Smith
On Wed, 2010-10-27 at 20:51 +0200, Ralf Wildenhues wrote:
> > I am talking about executing make after the source has been built once. E.g.
> > there is only one file changed and I want to rebuild the library with this
> > single new file. My benchmark is to type 'time make' on unchanged source.
> > From my expectation it will parse the GNUmakefile and all the files (the
> > .Plo, .Po with the dependencies) included by it, then it needs to verify the
> > dependencies of all these rules for the 'all' target and then exits. Right
> > now this takes 5:42m usertime (+- 10 seconds).

I saw this come across my inbox and thought I'd give it a try; 5:42
seems like a very, very long time.

I downloaded the tarball you linked to onto my system, but wasn't able
to build it: it configured and built about half the code, then failed:

  CXX    WebKit/gtk/WebCoreSupport/libwebkitgtk_1_0_la-FullscreenVideoController.lo
WebKit/gtk/WebCoreSupport/FullscreenVideoController.cpp:32: fatal error: gst/gst.h: No such file or directory

But I do have libgstreamer dev packages installed:

webkit-1.3.5$ locate gst.h
/usr/include/gstreamer-0.10/gst/gst.h


Do I need to do something else?


Anyway, with half the tree built (I think):

webkit-1.3.5$ find -name \*.o | wc -l
1437

make 3.81 takes about 8 seconds on my system to get back to trying to
build the above file, and failing.  I do notice that make 3.82 takes a
lot longer (23s !!) so there's definitely something I need to check out
in that respect.  But 5m seems pretty crazy.


FWIW, I have an Intel Core2 6600 at 2.4GHz with 2G RAM, running a 64bit
2.6.35+ kernel on an ext4 partition.




Re: Makefile to Makefile.am

2010-08-15 Thread Paul Smith
On Sun, 2010-08-15 at 23:32 +0200, Ralf Wildenhues wrote:
> > > # old fashioned suffix rule, portable
> > > .xcf.bmp:
> > > 	$(GIMP) -i -b '(xcf-bmp $< $@)' -b '(gimp-quit 0)'
> >
> > Hey, maybe Automake hackers can riff off this thread in time for
> > the next release... (or, "Would you like me to submit a patch?").
>
> Yes, with a general example, please.

Note this is not enough: you also have to add these as prerequisites of
the .SUFFIXES target.
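
In a Makefile.am that looks something like this (a sketch; automake
emits the corresponding .SUFFIXES entries for anything listed in the
SUFFIXES variable):

    SUFFIXES = .xcf .bmp

    .xcf.bmp:
	$(GIMP) -i -b '(xcf-bmp $< $@)' -b '(gimp-quit 0)'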

-- 
---
 Paul D. Smith psm...@gnu.org  Find some GNU make tips at:
 http://www.gnu.org  http://make.mad-scientist.net
 Please remain calm...I may be mad, but I am a professional. --Mad Scientist




Re: Making output of make more beautiful

2009-06-10 Thread Paul Smith
On Wed, 2009-06-10 at 14:04 +0200, Vincent Torri wrote:
> Is it possible to add 2 features:
>
> 1) adding a percentage at the beginning of the line

People often ask for this, but it's not possible: a percentage of what?
The way make works, it has absolutely no idea how much work there is
left to do.  Make does not plan out all the operations it needs to run,
then start working on them: rather, it knows what it wants to build and
it works on it one step at a time until there are no more steps to take.
But when it's working on step 5 it has no idea if there is one step
left, or 100.  So computing a percent complete cannot be done.

Certainly there's no way automake can do this.

> 2) displaying the lines with a color according to what is done
> (compilation, execution or error)

Today, make does no processing on the output of the commands it invokes:
they just write directly to stdout, so colorizing error or warning
messages by make is not possible.  This feature could conceivably be
added (in fact, there are even more legitimate reasons for make to
process job output than colorization: when make invokes parallel builds
the output from the parallel jobs is today all jumbled together which
makes it difficult to read: there are enhancement requests to allow the
output to be buffered then displayed all at once when the command
completes).

The colorization of different types of rules (rather than output) is
most easily accomplished by automake, by adding that directly to the
output.

However, both of these could also be, and are perhaps most easily,
accomplished via an extra colorization tool, where the output of make
is piped into this tool which reads stdin, colorizes it, then displays
it.  This saves a lot of effort and change in make and automake and
provides a clean and much more easily modifiable colorization
environment.
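
For instance, a trivial filter (a sketch using ANSI escape sequences):

    make 2>&1 | awk '/error:/   { printf "\033[31m%s\033[0m\n", $0; next }
                     /warning:/ { printf "\033[33m%s\033[0m\n", $0; next }
                                { print }'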

-- 
---
 Paul D. Smith psm...@gnu.org  Find some GNU make tips at:
 http://www.gnu.org  http://make.mad-scientist.us
 Please remain calm...I may be mad, but I am a professional. --Mad Scientist




Re: Compilation order

2008-10-26 Thread Paul Smith
On Mon, 2008-09-29 at 22:06 +0200, Ralf Wildenhues wrote:
> BTW, while you're here, is there chance for a new GNU make release
> soonish with the bug fixed that prevented GCC from using dependency
> tracking?  (IIRC it dealt with order-only deps.)

Hi Ralf; sorry for the delay.

Is this bug already fixed, and just awaiting a release?  Or is it still
open?  Do you have a bug#?

I've just this weekend succeeded in getting the changes I've made to try
to control memory usage, back to a point where the entire GNU make test
suite passes.  I still have some cleanup left to do, but I'd like to
start running comparative test builds on various software packages to
see what differences in performance and heap usage I've created.

I'm especially interested in packages with very large numbers of targets
and/or variables; so any large-ish package implementing non-recursive
make for example would be useful for testing.

Any pointers to such packages would be appreciated.  I wanted to do
glibc which is where the memory usage situation was first reported, but
they don't seem to be creating buildable tarballs anymore and my trivial
attempt to run cvs co, ./configure, and make failed miserably :-/.

-- 
---
 Paul D. Smith [EMAIL PROTECTED]  Find some GNU make tips at:
 http://www.gnu.org  http://make.mad-scientist.us
 Please remain calm...I may be mad, but I am a professional. --Mad Scientist




Re: Compilation order

2008-10-26 Thread Paul Smith
On Sun, 2008-10-26 at 21:50 +0100, Ralf Wildenhues wrote:
> I'm not sure whether it's fixed.  I think it's first mentioned here,
> which also points to some bug numbers:
> http://thread.gmane.org/gmane.comp.gcc.patches/158072/focus=159249

Ah.  Yes, that's fixed.

> You can try GraphicsMagick for a larger nonrecursive build.
> OpenMPI is large all in all, has many variables, but only parts
> that are nonrecursive, so I have no idea whether it is a good
> test for you.

OK, I'll look at these; thanks.

> > I wanted to do
> > glibc which is where the memory usage situation was first reported, but
> > they don't seem to be creating buildable tarballs anymore and my trivial
> > attempt to run cvs co, ./configure, and make failed miserably :-/.
>
> Given that they advertise their CVS tree stability, I think that would
> qualify for a bug report then.  I guess for testing you could try
> checking out their last release tag though.

I did check out the 2.8 release branch version.  Hrm, I think I see it:
the configure found mawk instead of gawk.  That should be simple enough
to fix.

-- 
---
 Paul D. Smith [EMAIL PROTECTED]  Find some GNU make tips at:
 http://www.gnu.org  http://make.mad-scientist.us
 Please remain calm...I may be mad, but I am a professional. --Mad Scientist




Re: libtool link error

2008-10-18 Thread Paul Smith
On Sat, 2008-10-18 at 09:47 -0700, rrlangly wrote:
> Well, yeah. I am using linux, and there is an openal package that I've
> already installed and like I've shown

Hi Richard; actually what Bob means is that most distributions these
days are separating a single project (especially one that contains
libraries) into two different packages: a runtime package (e.g.,
openal) which contains the files needed for runtime, and a developer
package (e.g., openal-dev or something like that) which contains all
the extra files needed to develop with that package (like header files,
static libraries, pkg-config info, etc.)

So, he is suggesting you check the list of packages for your distro to
see if there are extra *-dev or similar packages for openal, and if so,
that you try installing them and see if it helps.


Cheers!

-- 
---
 Paul D. Smith [EMAIL PROTECTED]  Find some GNU make tips at:
 http://www.gnu.org  http://make.mad-scientist.us
 Please remain calm...I may be mad, but I am a professional. --Mad Scientist




Re: Compilation order

2008-09-29 Thread Paul Smith
On Mon, 2008-09-29 at 21:41 +0200, Ralf Wildenhues wrote:
> When run in parallel (-jN) mode, make doesn't guarantee any order of
> rule execution except that implied by the dependency relations.

To be really, really pedantic: even in parallel mode, make (currently)
always starts prerequisites in the order in which they are listed for a
given target.  However, since make is running them in parallel you can't
really tell which items might end up running together: it depends on how
the OS schedules the jobs and how long they take to complete, as well as
how large your N for -jN is.

There is an enhancement request for GNU make that hopes for a
randomization (to help find missing dependency relationships) of the
prerequisites.  However this is a large, difficult problem the way GNU
make is coded.

However your conclusion is absolutely correct: you shouldn't rely on [a
specific order of prerequisite builds].






Re: Problems with conditional sources

2008-08-25 Thread Paul Smith
On Mon, 2008-08-25 at 23:21 +0200, David Sveningsson wrote:
> Thanks, this solved my problems. It doesn't seem to work if I use tabs
> for indentation, but spaces work fine. Is this expected behavior?
> I've used tabs in many other places and it seems to work. I cannot
> find any section in the manual covering this.

It is expected behavior.  You should never, ever use TABs in makefiles
(which is what Makefile.am files are) unless you are actually trying to
write a recipe (command line) for a target.

Make tries to be smart (which, unfortunately, can cause more problems
than it solves IMO, but equally unfortunately there are a LOT of
makefiles out there that expect make to be smart in this way so we
cannot change it) and so it won't ALWAYS fail if you start a variable
definition with a TAB.

But, it's always wrong.  TABs are special to make.





Re: Problems with conditional sources

2008-08-25 Thread Paul Smith
On Mon, 2008-08-25 at 15:55 -0600, John Calcote wrote:
> Andreas Schwab wrote:
> > John Calcote [EMAIL PROTECTED] writes:
> >
> > > Make is a two-pass utility. The first pass completely assimilates all
> > > macro data specified in the Makefile. THEN, the second pass generates
> > > the rule dependency tree.
> >
> > This is not true.  Variable references in target and dependency lists are
> > expanded when they are read.  Any later redefinition will not change
> > them.
> >
> > Andreas.
>
> This is only true if you use ':=' assignment. In this case (only),
> variable *assignments* are expanded immediately. But regular rule or
> command references are only expanded when the rule or command is
> evaluated (as opposed to read).

No.  Andreas is correct.  Make uses a two-pass model, but during the
first pass all rules (targets and prerequisites) are expanded and the
dependency graph is created.  On the second pass, make actually builds
out of date targets.

For example, if you write:

FOO = foo

$(FOO): bar

FOO = baz

you will have defined a rule "foo: bar", NOT a rule "baz: bar", even
though you are using recursively expanded variables (try it!)

If you read the section of the GNU make manual "How make reads a
makefile", it describes which parts of a makefile are read with
immediate expansion and which are read with deferred expansion.  Target
and prerequisite lists are read with immediate expansion.





Re: substitution vs expansion in Makefile.am

2008-03-04 Thread Paul Smith
On Tue, 2008-03-04 at 15:34 -0700, John Calcote wrote:
> This question is somewhat related to the last one I asked. Where
> should I use @top_srcdir@, and where should I use $(top_srcdir)?

The answer is, you should ALWAYS use the make variable, and NEVER use
the automake variable.  There may be very obscure exceptions but I can't
think of any; certainly none in a straightforward automake file.
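
That is, in a Makefile.am write something like (a sketch):

    AM_CPPFLAGS = -I$(top_srcdir)/include    # yes: the make variable
    # not: -I@top_srcdir@/include            # the raw configure substitution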


My $0.02.

-- 
-
 Paul D. Smith [EMAIL PROTECTED] http://make.mad-scientist.us
 Please remain calm--I may be mad, but I am a professional.--Mad Scientist






Re: Conditional LDFLAGS not working

2008-03-04 Thread Paul Smith
On Tue, 2008-03-04 at 16:39 -0330, Sam Bromley wrote:
> But in the generated Makefile, the line
>
>   libTclParseArgv_la_LDFLAGS += -L/usr/lib -ltclstub8.4
>
> shows up at the *end*, after any other targets are
> defined.

And this is a problem because... why?  Does something actually not work?

-- 
---
 Paul D. Smith [EMAIL PROTECTED]  Find some GNU make tips at:
 http://www.gnu.org  http://make.mad-scientist.us
 Please remain calm...I may be mad, but I am a professional. --Mad Scientist




Re: Detecting FreeBSD's make

2007-11-06 Thread Paul Smith
On Tue, 2007-11-06 at 19:46 +, Harlan Stenn wrote:
> I'm getting bitten by the VPATH-handling differences between FreeBSD's
> make and gmake.

The automake manual says (somewhere) that the only version of make that
correctly supports out-of-the-tree builds is GNU make.  The VPATH
capability in every other version of make that I'm aware of, including
BSD make and SysV make, is just plain borked (IMNSHO).

My personal suggestion would be to follow that advice and simply
document that out-of-the-tree builds are not supported by your project
unless the user uses GNU make.

It's very difficult for autotools to detect and work around make issues,
because there's absolutely no guarantee that the make program that
autoconf/etc. finds and uses will be the one the user actually invokes
to build the project.  Users are not accustomed to saying "./configure
MAKE=gmake" or similar.

-- 
---
 Paul D. Smith [EMAIL PROTECTED]  Find some GNU make tips at:
 http://www.gnu.org  http://make.mad-scientist.us
 Please remain calm...I may be mad, but I am a professional. --Mad Scientist




Re: Detecting FreeBSD's make

2007-11-06 Thread Paul Smith
On Wed, 2007-11-07 at 01:00 +, Harlan Stenn wrote:
> IME, that position is excessive.  It was true, as far as I can recall,
> for SGI's make, but this is the first time this particular issue has
> bitten me (or any of the users of the package) in a *long* time.

Hm.  Maybe automake works around it better than it used to.

IIRC, the problem is that in SysV make (and perhaps in BSD make although
I confess I've not played with it much), a number of the normal
automatic variables are not defined for explicit rules; they're only
defined for suffix rules.  And if the automatic variables are not
defined, then you can't find out what the VPATH'd pathname is for those
files.

So, the upshot is VPATH just doesn't work with explicit rules, which is
not good.
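
A sketch of the failure mode:

    VPATH = ../src

    foo.o: foo.c
	$(CC) -c $< -o $@

GNU make sets $< to ../src/foo.c here; in SysV-style makes $< is only
defined for suffix rules, so this explicit rule has no way to name the
file that VPATH actually found.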

There may have been other problems with VPATH as well.


This is from my increasingly dodgy memory so I could be wrong on
particulars.

Cheers!

-- 
---
 Paul D. Smith [EMAIL PROTECTED]  Find some GNU make tips at:
 http://www.gnu.org  http://make.mad-scientist.us
 Please remain calm...I may be mad, but I am a professional. --Mad Scientist




Re: using same automake conditional twice doesn't work?

2005-06-07 Thread Paul Smith
%% Ed Hartnett [EMAIL PROTECTED] writes:

  eh SUBDIRS = man

  eh if BUILD_F77
  eh SUBDIRS += fortran
  eh endif

  eh SUBDIRS += libsrc nc_test ncgen ncdump nctest

  eh # If we're building the f77 API, test it too.
  eh if BUILD_F77
  eh SUBDIRS += nf_test
  eh endif

There's a much better way to do this.  Try:

  if BUILD_F77
  PRE_SDIRS = fortran
  POST_SDIRS = nf_test
  endif

  SUBDIRS = man $(PRE_SDIRS) nc_test ncgen ncdump nctest $(POST_SDIRS)


Cheers!

-- 
---
 Paul D. Smith [EMAIL PROTECTED]   HASMAT--HA Software Mthds & Tools
 Please remain calm...I may be mad, but I am a professional. --Mad Scientist
---
   These are my opinions---Nortel Networks takes no responsibility for them.




Re: I: adjust test suite for upcoming GNU Make 3.83

2005-03-10 Thread Paul Smith
%% Dmitry V. Levin [EMAIL PROTECTED] writes:

  dvl Upcoming version of GNU Make introduces new incompatibility.
  dvl The NEWS file says:
  dvl * WARNING: Backward-incompatibility!
  dvl   GNU make now implements a generic second expansion feature on the
  dvl   prerequisites of both explicit and implicit (pattern) targets.  After
  dvl   the rule has been parsed, the prerequisites are expanded a second
  dvl   time, this time with all the automatic variables in scope.  This means
  dvl   that in addition to using standard SysV $$@ in prerequisites lists,
  dvl   you can also use complex functions such as $$(patsubst f%r,b%o,$$@) 
etc.
  dvl   This behavior applies to implicit rules, as well, where the second
  dvl   expansion occurs after the rule is matched.
  dvl   However, this means that you need to double-quote any "$" in your
  dvl   filenames; instead of "foo: boo$bar" you must write "foo: boo$$bar"

  dvl GNU Make developers suggest to adjust dollar.test, to make it work with
  dvl both the old and new versions of GNU make, see
  dvl http://savannah.gnu.org/bugs/?func=detailitem&item_id=12260

Hi Dmitry;

I was going to post something here about this when I saw your message :-).


I'm interested in discussing the issue and possible solutions.  The one
I added to the bug report involves using :=, since simply-expanded
variables in GNU make are always only expanded one time, no matter how
many times they're referenced (unless within an eval function).  But,
obviously, using := does not yield a portable makefile.

I'm not actually sure whether there's any way to write a makefile that
doesn't rely on any GNU make features, that handles $ in filenames
properly.

-- 
---
 Paul D. Smith [EMAIL PROTECTED]  Find some GNU make tips at:
 http://www.gnu.org  http://make.paulandlesley.org
 Please remain calm...I may be mad, but I am a professional. --Mad Scientist




Re: Error because README is missing

2004-12-06 Thread Paul Smith
%% Akim Demaille [EMAIL PROTECTED] writes:

  ad Finally, note that you are allowed to ask config.status to perform
  ad substitutions on files it doesn't know[1].  In your case, I'm unsure
  ad binding the instantiation to configure instead of make is right.

Doh!

I thought this would be the perfect answer but I completely forgot: the
whole point behind build.sh is that you _DON'T HAVE A MAKE PROGRAM_!

This shell script lets you build the GNU make distribution on a system
with no native make installed.

So, I can't put the rules to build the build.sh file in the Makefile,
since when you need build.sh there will not be any make to run the
Makefile.


I suppose I could tell those folks to run the config.status step by
hand, but I'd really like to get this built through the configure
script.  Ideas?

-- 
---
 Paul D. Smith [EMAIL PROTECTED]   HASMAT--HA Software Mthds & Tools
 Please remain calm...I may be mad, but I am a professional. --Mad Scientist
---
   These are my opinions---Nortel Networks takes no responsibility for them.




Re: Error because README is missing

2004-12-06 Thread Paul Smith
%% Ralf Wildenhues [EMAIL PROTECTED] writes:

   I suppose I could tell those folks to run the config.status step by
   hand, but I'd really like to get this built through the configure
   script.  Ideas?

  rw dnl ...
  rw AC_CONFIG_FILES([Makefile])
  rw AC_OUTPUT
  rw touch foo.in
  rw ./config.status --file foo

  rw What am I missing?

I wasn't sure config.status would be available while (my part of)
configure was still running.  I guess if all AC_OUTPUT does is run
config.status --file then that would make sense, but it could also do
the translation inline.
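
For concreteness, a sketch of what that would look like for the
build.sh case (assuming a build.sh.in template; --file takes
config.status's FILE[:TEMPLATE] form):

  AC_CONFIG_FILES([Makefile])
  AC_OUTPUT
  # by this point config.status exists, and it can substitute files
  # it was never told about:
  ./config.status --file=build.sh:build.sh.in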

I'll give it a try; cheers!





Re: Error because README is missing

2004-12-02 Thread Paul Smith
%% Akim Demaille [EMAIL PROTECTED] writes:

   I need a way to have a file marked as a config file, but not have
   configure (or automake) fail if the .in input file doesn't exist.

  ad Hm...  What is the service you are expecting?  You say that
  ad configure shouldn't fail, but if it's a config file, configure is
  ad expected to fail.  Unless you do mean configure, and not
  ad config.status?

When I first check out my code from CVS, I don't have a build.sh.in
file.  That file is autogenerated as part of the build process, from a
template file.

It's not required for a maintainer-mode build: that file is basically a
bootstrap assist: a way to build make if you don't have any make
yet.  Maintainers are expected to have a full suite of tools at their
disposal including make.

By the time I actually make a distfile, that file (build.sh.in) will be
there and it will be included in the distfile so normal users will never
see this issue.


But, I need a way to run autoreconf in a clean workspace where that
file doesn't exist... and if I use normal AC_CONFIG_FILES then automake
complains and won't build my Makefile.in.





Re: Error because README is missing

2004-12-02 Thread Paul Smith
%% Akim Demaille [EMAIL PROTECTED] writes:

  ad If you have a bootstrap script, why wouldn't it create a fake
  ad file, run autoreconf, and them create the right file?

I don't have a bootstrap script for building out of CVS.  I could create
one, I guess.
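
Such a script could be tiny; a sketch (file names assumed from this
thread):

  #!/bin/sh
  # bootstrap: make autoreconf happy in a fresh CVS checkout, where
  # the generated build.sh.in template does not exist yet
  touch build.sh.in
  autoreconf --install
  # now generate the real build.sh.in, or leave the placeholder for
  # the build to overwrite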

  ad Finally, note that you are allowed to ask config.status to perform
  ad substitutions on files it doesn't know[1].  In your case, I'm unsure
  ad binding the instantiation to configure instead of make is right.

Ah!!

I think you're right; that's probably the right way to do it.

Thanks, I'll look into this.





Re: Error because README is missing

2004-11-30 Thread Paul Smith
That worked!  Thanks Andreas.





Error because README is missing

2004-10-23 Thread Paul Smith
Hi all;

In the GNU make package I don't have a README file when it is first
checked out from CVS.  There is a README.in which is translated into a
README, replacing the version info etc.


This used to work but now automake is failing:

  Makefile.am: required file `./README' not found

Any hints on how I can convince automake that the file will really be
there by the time the package is already built, so it doesn't give this
error?
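
(For reference, the translation described above amounts to a simple
substitution; a minimal sketch, with the %VERSION% token assumed:

  README: README.in
          sed 's/%VERSION%/$(VERSION)/' $(srcdir)/README.in > README

The question is only how to keep automake from demanding the result up
front.)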





Re: config.sub, config.guess, and the GNU FTP site

2003-09-04 Thread Paul Smith
%% Paul Eggert [EMAIL PROTECTED] writes:

  pe The GNU FTP server has been frozen since late July; nothing new
  pe has appeared on it.  I don't know what the holdup is.

It's not the entire server: the server was cracked (surely you heard
about this?) and so they took off all the content until it could be
verified to be correct.

The verification is complete for almost all the code, but Ben Elliston
(who, according to the FSF folks I talked to, owns these particular
files) has not responded to requests for help in verifying these files.

That's why I'm sending mail :).

  pe gnulib currently keeps copies of several files like that in one
  pe convenient place.  It slurps them from their canonical locations
  pe automatically.

That's what I want to do.  How does gnulib do this?  Does it get them
via CVS checkout somehow?  Or, some other way?

  pe It has copies of config.guess and config.sub, which I
  pe think were the latest versions before the freeze occurred.  See:
  pe http://savannah.gnu.org/cgi-bin/viewcvs/gnulib/gnulib/config/

  pe So you can sync from gnulib for now, using the following URLs:

  pe 
http://savannah.gnu.org/cgi-bin/viewcvs/*checkout*/gnulib/gnulib/config/config.guess?rev=HEAD
  pe 
http://savannah.gnu.org/cgi-bin/viewcvs/*checkout*/gnulib/gnulib/config/config.sub?rev=HEAD

If I use wget -O config.guess ... then this works for me.  But I want
to get the official versions, wherever they are.
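
(For the record, the gnulib copies quoted above do fetch fine the same
way:

  wget -O config.guess \
    'http://savannah.gnu.org/cgi-bin/viewcvs/*checkout*/gnulib/gnulib/config/config.guess?rev=HEAD'
  wget -O config.sub \
    'http://savannah.gnu.org/cgi-bin/viewcvs/*checkout*/gnulib/gnulib/config/config.sub?rev=HEAD'

but those are mirrors, not the canonical home.)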





Re: GNU ftp crack and config.{sub,guess}

2003-09-04 Thread Paul Smith
%% John W. Eaton [EMAIL PROTECTED] writes:

  jwe   Maybe a better place to get the current versions of these files is
  jwe   here:

  jwe http://savannah.gnu.org/projects/config/

  jwe   It seems to me that instead of copying the files to the GNU ftp
  jwe   server (even if it could be done automatically), it would make
  jwe   more sense to just put a note there saying where to find them
  jwe   on the savannah server.

But right now I can very easily wget them from the FTP server as part of
my build process, so I don't even have any local copies.  The latest
stuff is grabbed for me when I run "make update".

It's not so clear how to wget their contents from Savannah.





Re: Portability of preprocessor directives

2003-03-09 Thread Paul Smith
%% Thomas Dickey [EMAIL PROTECTED] writes:

  td judging by comments I've seen in other mailing lists, it's not
  td likely that GNU make will be worth bothering with, since it's
  td been subjected to incremental incompatibities.

Man, you're never happy about anything, are you?

Anyway, if you'd care to point out these other mailing lists and/or a
litany of the incremental incompatibilities GNU make has been
subjected to, I'd be very interested in seeing it... seriously.  At the
very least I'd like to make sure they're documented properly.  Thanks.





Re: Problems with ansi2knr

2002-10-22 Thread Paul Smith
%% Alexandre Duret-Lutz [EMAIL PROTECTED] writes:

  Paul noinst_PROGRAMS = loadavg
  Paul nodist_loadavg_SOURCES = loadavg.c
  Paul loadavg_CFLAGS = -DTEST
  Paul loadavg_LDADD = @GETLOADAVG_LIBS@

  Paul loadavg.c: $(srcdir)/getloadavg.c
  Paul         cp $(srcdir)/getloadavg.c loadavg.c

  adl (Incidentally, why do you copy the file instead of
  adl    nodist_loadavg_SOURCES = getloadavg.c

Good idea... but see below.

  adl The following patch should fix it.

OK, I tried automake 1.7.1 where this is fixed.  It does fix this
problem but unfortunately there are other problems lurking.

First, I tried it as you suggested above:

  noinst_PROGRAMS = loadavg
  nodist_loadavg_SOURCES = getloadavg.c
  loadavg_CPPFLAGS = -DTEST
  loadavg_LDADD = @GETLOADAVG_LIBS@

The fix you made did fix the problem I originally reported, but there
are still issues.  The above results in this in the output
Makefile/Makefile.in:

  loadavg-getloadavg$U.$(OBJEXT): getloadavg$U.c
...
  getloadavg_.c: getloadavg.c $(ANSI2KNR)
$(CPP) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) $(AM_CPPFLAGS) $(CPPFLAGS) 
... | $(ANSI2KNR) > getloadavg_.c

Note there that the getloadavg_.c file is constructed using
CPP | ANSI2KNR, then the loadavg-*.o file is compiled directly from the
_.c.

This won't work because when we run the C preprocessor to create the
getloadavg_.c file we don't add in the $(loadavg_CPPFLAGS), so that
-DTEST is not defined.

Although we do add -DTEST in the compilation of _.c into _.o, by then
it's too late because during the ansi2knr-ification we already ran it
through the C compiler.  IOW, this is what automake does:

  cpp getloadavg.c | ansi2knr > getloadavg_.c

  cc $(loadavg_CPPFLAGS) getloadavg_.c -o loadavg-getloadavg_.o

  cc -o loadavg loadavg-getloadavg_.o

But we really want this instead:

  cpp getloadavg.c | ansi2knr > getloadavg_.c

  cpp $(loadavg_CPPFLAGS) getloadavg.c | ansi2knr > loadavg-getloadavg_.c

  cc loadavg-getloadavg_.c -o loadavg-getloadavg_.o

  cc -o loadavg loadavg-getloadavg_.o

Note how the ansi2knr is done twice, once to create an _.c file without
the loadavg_CPPFLAGS and _again_ to create a _different_ _.c file _with_
the loadavg_CPPFLAGS.


So, next thing I tried to do was to go back to my original work,
thinking that I could solve the problem by copying the original .c file
over, then automake would ansi2knr it for me and build a .o out of
that.  But, I found a slightly different bug here.  So I had this:

 noinst_PROGRAMS = loadavg
 nodist_loadavg_SOURCES = loadavg.c
 loadavg_CPPFLAGS = -DTEST
 loadavg_LDADD = @GETLOADAVG_LIBS@

 loadavg.c: $(srcdir)/getloadavg.c
        cp $(srcdir)/getloadavg.c loadavg.c

This seemed like it should work, and it did create the right rules for
me to convert loadavg.c into loadavg_.c, but it turns out that even in
this case the loadavg_CPPFLAGS were not included in the ansi2knr
statement; I conclude that even without the weird things I'm doing this
(adding a binary-specific set of preprocess flags) would never work with
ansi2knr.  When I try the above I get these commands:

  cp ./getloadavg.c loadavg.c

  cpp loadavg.c | ansi2knr > loadavg_.c

  cc $(loadavg_CPPFLAGS) loadavg_.c -o loadavg-loadavg_.o

  cc -o loadavg loadavg-loadavg_.o

Note again how the $(loadavg_CPPFLAGS) is missing from the CPP | ANSI2KNR 
line.  What I expected to get was this:

  cp ./getloadavg.c loadavg.c

  cpp $(loadavg_CPPFLAGS) loadavg.c | ansi2knr > loadavg_.c

  cc loadavg_.c -o loadavg-loadavg_.o

  cc -o loadavg loadavg-loadavg_.o


Thanks...






Re: Difference between AM_CFLAGS, AM_CPPFLAGS, and AM_LDFLAGS

2002-09-09 Thread Paul Smith

%% Stephen Torri [EMAIL PROTECTED] writes:

  st On Mon, 2002-09-09 at 23:11, Steve M. Robbins wrote:
  st > On Mon, Sep 09, 2002 at 10:33:40PM -0500, Stephen Torri wrote:
  st >
  st > > AM_CFLAGS - compile time flags
  st > > AM_CPPFLAGS - preprocessor flags (e.g. -I, -D)
  st > > AM_LDFLAGS - linker flags (e.g. -L)
  st > >
  st > > I am working on a project that is updating its Makefile.am files. I see
  st > > a variety of flags in one variable like CFLAGS contains -I and -F flags
  st > > for example. I was wondering what is the advantage of doing this with
  st > > automake variables?
  st >
  st > Are you asking what is the advantage of separating them as automake
  st > does?  I guess the advantage is that automake uses, e.g. LDFLAGS for
  st > linking lines where generic CFLAGS (-O, -g) might upset the linker.
  st
  st Yes. What is the advantage of separating them?

Steve gave you the advantage: so you can run them individually.

There are many compilers that won't accept linker flags (libraries,
etc.) on a command line meant to compile a file (but not link it).

The reason for having a separate CFLAGS and CPPFLAGS is even stronger:
often you want to just run the C preprocessor (or a tool that takes the
same arguments as the C preprocessor) either for debugging, or running
tools like lint, or instrumenting your code with special debugging
capabilities, etc.  In this case you need to be able to run the
preprocessor and be sure you're getting all the same arguments as when
you compile the code into an object file.
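
A sketch of the kind of rule this separation enables (target names and
the lint invocation are illustrative):

  # preprocess only, with exactly the flags the real compile would use
  %.i: %.c
          $(CC) -E $(DEFS) $(AM_CPPFLAGS) $(CPPFLAGS) $< > $@

  # same idea for a lint pass
  lint-%: %.c
          lint $(DEFS) $(AM_CPPFLAGS) $(CPPFLAGS) $<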






Two issues with automake

2002-04-21 Thread Paul Smith

1) I had a line like this in my configure.in:

  test -f somefile.in && AC_OUTPUT(somefile)

   This works fine with configure, but automake really doesn't like it:

  configure.in:337: required file `./test.in' not found
  configure.in:337: required file `./-f.in' not found
  configure.in:337: required file `./.in' not found

   Maybe an overzealous RE match?


2) I have a situation where I want to optionally compile some different
   files if the user specified a --with-xxx flag to configure.

   In older versions of automake I just put a configure-substituted
   variable right into the xxx_SOURCES variable but that no longer
   works; I get the error "configure substitutions not allowed in
   _SOURCES variables".

   OK, that's fine, but how do I go about choosing whether to build
   those files or not now?
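
For reference, one supported replacement is an automake conditional;
the names here are illustrative:

  # configure.in:
  AM_CONDITIONAL([WITH_XXX], [test "$with_xxx" = yes])

  # Makefile.am:
  if WITH_XXX
  prog_SOURCES = main.c extra.c
  else
  prog_SOURCES = main.c
  endif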





Miss --generate-deps

2002-04-21 Thread Paul Smith

GNU make used to use the automake --generate-deps option, which is now
missing :(.

The thing is, GNU make runs on a number of systems where it's not
possible to run the configure script, etc.  For the use of these
systems, GNU make ships various pre-built makefiles, which are
constructed from templates at distribution creation time.

I used to use --generate-deps to create a file containing all the
_internal_ dependency data (no system dependencies, just dependencies
within the make package), then one of the customizations made for the
other makefiles was to append this dependency information to the bottom
so that at least they had more-or-less accurate internal dependency
data.

How can I do this now?  My only idea so far is to process all the .Po
files to extract only the local files; essentially what --generate-deps
used to do, except do it myself... anyone have a better idea?
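
A rough sketch of that .Po post-processing (the filter, keeping only
paths that don't start with /, is my assumption of what counts as
internal):

  # collect prerequisites from the generated .Po fragments (.deps is
  # automake's default $(DEPDIR)), strip targets and line
  # continuations, drop absolute i.e. system paths, de-duplicate:
  sed -e 's/\\$//' -e 's/^[^:]*://' .deps/*.Po \
    | tr ' ' '\n' | grep . | grep -v '^/' | sort -u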
