'make distcheck' reports .deps/foo.Plo files left after distclean - what to do?

2018-08-16 Thread Zack Weinberg
I just got an error from 'make distcheck' that I don't know how to resolve:

ERROR: files left in build directory after distclean:
./.deps/alg-sha512.Plo
./.deps/alg-md4.Plo
./.deps/alg-sha256.Plo
./.deps/randombytes.Plo
./.deps/alg-md5.Plo
./.deps/alg-sha1.Plo
./.deps/alg-des-tables.Plo
./.deps/alg-hmac-sha1.Plo
./.deps/alg-des.Plo

These are header-file dependency lists, autogenerated during the
build.  The source files involved are primarily used as part of a
libtool library, but their object files are also linked directly into
certain tests, because they define internal symbols that are not
exported from the fully-linked shared library (and the tests need to
exercise them directly).  This _used_ to work fine, but is apparently
now causing problems.  Possibly it is a regression in automake 1.16,
which Debian's unstable distribution just updated to.

The full Makefile.am is lengthy; you can see it at
https://github.com/besser82/libxcrypt/blob/develop/Makefile.am .
Please advise whether there is anything I should be doing differently
or, if this does turn out to be a bug in automake, whether there might
be a workaround.

Thanks,
zw



Installing something nonstandard in $(libdir)

2020-02-06 Thread Zack Weinberg
For reasons too complicated to get into here, I have been
experimenting with building shared libraries in an autoconf+automake
build *without* using libtool.  [Please do not try to talk me out of
this.]  I have something that works correctly on ELF-based operating
systems with GCC, *except* for installation, where automake is
refusing to do what I want.

Specifically, suppose I have custom rules like this:

if ENABLE_SHARED
noinst_LIBRARIES += libfoo-pic.a
libfoo_pic_a_SOURCES = $(libfoo_a_SOURCES)
libfoo_pic_a_CFLAGS = $(libfoo_a_CFLAGS) $(PICFLAGS)

libfoo_so = libfoo$(SOEXT).1

$(libfoo_so): libfoo-pic.a Makefile
  $(AM_V_CCLD)$(LINK) $(SO_LDFLAGS) $(SONAME)$(libfoo_so) \
  $(WHOLE_A) libfoo-pic.a $(END_WHOLE_A) $(SO_LIBS)

libfoo.so: Makefile
  -$(AM_V_at)rm -f libfoo.so
  $(AM_V_GEN)$(LN_S) $(libfoo_so) $@
endif

where $(PICFLAGS), $(SO_LDFLAGS), etc. are set by autoconf, then the
only missing piece is to install $(libfoo_so) and libfoo.so in
$(libdir), but I can't find any magic variable that will do that.
lib_LIBRARIES is the obvious choice but that throws errors like

Makefile.am:158: error: 'libfoo$(SOEXT).1' is not a standard library name
Makefile.am:158: did you mean 'libfoo$(SOEXT).a'?

and lib_DATA is the obvious alternative but that doesn't work either:

Makefile.am:145: error: 'libdir' is not a legitimate directory for 'DATA'

So, the question is, is there a lib_SOMETHING variable that I can use
to install to $(libdir) arbitrary stuff that automake doesn't
understand?  If not, is there some other option?

(Please cc: me on replies, I'm not subscribed.)
zw



Warning category skew between Autoconf and Automake - workaround in autoreconf?

2020-09-10 Thread Zack Weinberg
autoreconf assumes that it can pass its --warnings options through to
both the tools maintained in Autoconf (autoconf, autoheader, etc.) and
the tools maintained in Automake (automake, aclocal, etc.).
However, the set of warnings categories defined in autoconf’s
lib/Autom4te/ChannelDefs.pm has diverged from the set defined in
automake’s lib/Automake/ChannelDefs.pm.  (These were clearly once
the same file, but have not been in sync since before 2003!)
In particular, autoconf recognizes a ‘cross’ category but
automake does not, and automake recognizes ‘extra-portability’
and ‘portability-recursive’ categories but autoconf does not.

This means that if you do ‘autoreconf -Wcross’ you will get warnings
about an unknown warning category from automake and aclocal.
Conversely, ‘autoreconf -Wextra-portability’ will produce warnings
from autoreconf itself and from multiple instances of autom4te
(depending on how complicated your configure script is).  This is not just cosmetic,
because ‘-Werror,cross’ or ‘-Werror,extra-portability’ will promote
the message to an error and aclocal/autom4te will exit unsuccessfully.
(Order matters: ‘-Wcross,error’ will just issue a warning.)
At least one redistributor of Autoconf has felt a need to patch
autoreconf to avoid supplying -Wcross to automake (checking first
whether ‘cross’ shows up in automake’s ‘--help’ output).
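
For concreteness, that kind of probe only takes a few lines of Perl
inside autoreconf.  The following is a hypothetical sketch, not the
distributor's actual patch:

    # Hypothetical helper: does a tool understand a given --warnings
    # category?  Decide by scanning its --help output.
    sub tool_knows_warning_category {
        my ($tool, $category) = @_;
        my $help = `$tool --help 2>/dev/null`;
        return defined $help && $help =~ /\b\Q$category\E\b/;
    }

    # e.g. filter the category list handed to automake and aclocal:
    # @categories = grep { tool_knows_warning_category ('automake', $_) }
    #               @categories;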

Clearly ChannelDefs.pm should be brought back into sync between the
two projects, but that will only fix the problem after *both* Autoconf
and Automake do a release.  So I’m wondering whether it would make
sense to merge this distributor’s patch to avoid supplying -Wcross to
automake -- perhaps generalized to arbitrary warning categories.  What
do you think?

zw



Re: Warning category skew between Autoconf and Automake - workaround in autoreconf?

2020-09-12 Thread Zack Weinberg
On Thu, Sep 10, 2020 at 6:43 PM Karl Berry  wrote:
> Hi Zack - in addition to the other replies, how do you prefer to do the
> sync? (which it seems like we might as well do asap.) From am to ac, or
> from ac to am?

We already sync quite a few Automake/*.pm files from am to ac, so I
think it makes most sense to sync ChannelDefs.pm in that direction as
well.  There's a minor complication: automake's version of
ChannelDefs.pm uses Automake::Config, a generated file that autoconf's
build system has no way to generate, and I don't want to mess with
that during the release freeze.  But I have a workaround already.

zw



Re: Warning category skew between Autoconf and Automake - workaround in autoreconf?

2020-09-12 Thread Zack Weinberg
On Thu, Sep 10, 2020 at 3:39 PM Paul Eggert  wrote:
> On 9/10/20 11:48 AM, Zack Weinberg wrote:
> > I’m wondering whether it would make
> > sense to merge this distributor’s patch to avoid supplying -Wcross to
> > automake -- perhaps generalized to arbitrary warning categories.  What
> > do you think?
>
> Yes, we can't assume that both packages are of the same vintage or have
> the same warnings.
>
> The Autoconf Way would be to probe Automake to see what warnings it
> supports. :-)

Good point.  I have a plan to make this happen -- stay tuned.

zw



Re: AC_DIAGNOSE not obsolete, please

2020-10-06 Thread Zack Weinberg
On Mon, Oct 5, 2020 at 5:40 PM Karl Berry  wrote:
>
> Zack and all - Thien-Thi (cc'd) just reported two failures in automake
> make check with autoconf-2.69c (bugs.gnu.org/43819). The critical
> message appears to be:
>
> ./lib/autoconf/general.m4:2328: AC_DIAGNOSE is expanded from...
> /home/ttn/local/.b/automake-1.16.2/m4/init.m4:29: AM_INIT_AUTOMAKE is expanded from...
> configure.ac:2: the top level
> configure.ac:2: warning: The macro `AC_DIAGNOSE' is obsolete.
>
> Since AM_INIT_AUTOMAKE calls AC_DIAGNOSE, can you please not make
> it obsolete? Otherwise upgrading will be a nightmare for everyone.

Seems reasonable.  What do you think of the patches below?  The first
set, applicable to autoconf, makes it so autoupdate will still replace
AC_DIAGNOSE with m4_warn, but autoconf *won't* complain about
AC_DIAGNOSE being obsolete.  This is handled with a new argument to
AU_DEFUN and AU_ALIAS, which can be either empty, for the historical
behavior of warning and updating, or the word "silent", which still
updates but suppresses autoconf's warning.  Autoconf 2.69 and earlier
will ignore the new argument, so it is safe to use in third-party
macros if desired.

The second set, applicable to automake, is additional tweaks I needed
to make to the automake testsuite to get it to pass 100%.  It seems
that both autoconf and automake have become pickier about incomplete
configure scripts.

(m4_warn was already available in Autoconf 2.50, by the way.)

zw

--- autoconf patches ---
diff --git a/doc/autoconf.texi b/doc/autoconf.texi
index dea85e4a..07ab9763 100644
--- a/doc/autoconf.texi
+++ b/doc/autoconf.texi
@@ -14888,29 +14888,52 @@ Obsoleting Macros
 Autoconf provides a simple means to obsolete a macro.

 @anchor{AU_DEFUN}
-@defmac AU_DEFUN (@var{old-macro}, @var{implementation}, @ovar{message})
+@defmac AU_DEFUN (@var{old-macro}, @var{implementation}, @ovar{message}, @ovar{silent})
 @auindex{DEFUN}
-Define @var{old-macro} as @var{implementation}.  The only difference
-with @code{AC_DEFUN} is that the user is warned that
-@var{old-macro} is now obsolete.
+Define @var{old-macro} as @var{implementation}, just like
+@code{AC_DEFUN}, but also declare @var{old-macro} to be obsolete.
+When @command{autoupdate} is run, occurrences of @var{old-macro} will
+be replaced by the text of @var{implementation} in the updated
+@file{configure.ac} file.

-If she then uses @command{autoupdate}, the call to @var{old-macro} is
-replaced by the modern @var{implementation}.  @var{message} should
-include information on what to do after running @command{autoupdate};
-@command{autoupdate} prints it as a warning, and includes it
-in the updated @file{configure.ac} file.
+If a simple textual replacement is not enough to finish the job of
+updating a @file{configure.ac} to modern style, provide instructions for
+whatever additional manual work is required as @var{message}.  These
+instructions will be printed by @command{autoupdate}, and embedded in the
+updated @file{configure.ac} file, next to the text of @var{implementation}.

-The details of this macro are hairy: if @command{autoconf} encounters an
-@code{AU_DEFUN}ed macro, all macros inside its second argument are expanded
-as usual.  However, when @command{autoupdate} is run, only M4 and M4sugar
-macros are expanded here, while all other macros are disabled and
-appear literally in the updated @file{configure.ac}.
+Normally, @command{autoconf} will also issue a warning (in the
+``obsolete'' category) when it expands @code{old-macro}.  This warning
+does not include @var{message}; it only advises the maintainer to run
+@command{autoupdate}.  If it is inappropriate to issue this warning, set
+the @var{silent} argument to the word @code{silent}.  One might want to
+use a silent @code{AU_DEFUN} when @code{old-macro} is used in a
+widely-distributed third-party macro.  If that macro's maintainers are
+aware of the need to update their code, it's unnecessary to nag all
+of the transitive users of @code{old-macro} as well.  This capability
+was added to @code{AU_DEFUN} in Autoconf 2.70; older versions of
+autoconf will ignore the @var{silent} argument and issue the warning
+anyway.
+
+@strong{Caution:} If @var{implementation} contains M4 or M4sugar macros,
+they will be evaluated when @command{autoupdate} is run, not emitted
+verbatim like the rest of @var{implementation}.  If this is wrong, you
+will need to work around it---see for instance the definition
+of @code{AC_FOREACH} in @file{general.m4}.
 @end defmac

-@defmac AU_ALIAS (@var{old-name}, @var{new-name})
+@defmac AU_ALIAS (@var{old-name}, @var{new-name}, @ovar{silent})
 @auindex{ALIAS}
-Used if the @var{old-name} is to be replaced by a call to @var{new-macro}
-with the same parameters.  This happens for example if the macro was renamed.
+A shorthand version of @code{AU_DEFUN}, to be used when a macro has
+simply been renamed.  @command{autoupdate} will replace @var{old-name}
+with a call to @var{new-macro}, with the sa

Re: AC_DIAGNOSE not obsolete, please

2020-10-07 Thread Zack Weinberg
Both sets of patches have been pushed.  The changes to the Autoconf
manual are slightly different in the version I committed; I found some
small errors while proofreading the patch.

According to 
https://codesearch.debian.net/search?q=-pkg%3Aautoconf+%5CbAC_DIAGNOSE%5Cb&literal=0
this would have been a problem for macros distributed by gettext and
libtool as well as automake, and maybe more (I didn't read through all
853 pages :-)), so I'm glad you caught it.

zw



Re: Build and test failures with Autoconf 2.70

2020-12-29 Thread Zack Weinberg
Karl Berry wrote:
> Zack Weinberg wrote:
>> They all appear to be cases of autoconf and/or aclocal
>> getting run when the test suite does not expect them to be run.  I am
>>  stumped as to why
>
> In short: because the 2.70 autom4te decided not to update configure.
>
> I don't know why not, but I have a guess: maybe the new autom4te
> ignores everything after AC_OUTPUT. Is that correct? Because that is
> where some of these tests are adding a new AC_ macro.
[...]
> in the successful run with autoconf 2.69, it was the
> $AUTOCONF run that updated configure (as one would expect).
>
> In the failing run with autoconf 2.70, the $AUTOCONF run ends with
> the line: autom4te: 'configure' is unchanged
>
> The failure follows.

Thanks, that was the clue I needed.  The newer autoconf wasn't
ignoring code after AC_OUTPUT, but what it *was* doing was leaving the
timestamp on configure untouched when a change in configure.ac did not
change the generated code.  (As if it were using move-if-change to
update configure.)  AC_LIBSOURCES doesn't change the generated code,
because it does all its work via traces, so those tests all failed.
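
For anyone not familiar with the idiom: "move-if-change" style output
means writing to a temporary file and renaming it over the target only
when the contents differ, so an unchanged target keeps its old
timestamp.  An illustrative Perl sketch (not the actual autom4te code):

    use strict;
    use warnings;
    use File::Compare qw(compare);

    # Replace $target with $tmpfile only if the contents changed;
    # otherwise discard $tmpfile and leave the old timestamp alone.
    sub update_if_changed {
        my ($tmpfile, $target) = @_;
        if (-f $target && compare ($tmpfile, $target) == 0) {
            unlink $tmpfile;
            return 0;                  # unchanged, timestamp untouched
        }
        rename $tmpfile, $target
            or die "cannot rename $tmpfile to $target: $!";
        return 1;                      # target (and timestamp) updated
    }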

I've fixed this in Autoconf trunk now
(revision 07130c3e48d12ec155ac5d7630dc7477b6989940)
and will shortly merge it to the 2.70 release branch.
With this patch applied, I see no failures in the Automake
testsuite on a GNU/Linux system.

zw

p.s. Sorry for the delay, I forgot I wasn't subscribed to any of the
Automake mailing lists.  I am now.



Re: Getting srcdir in script executed using m4_esyscmd in AC_INIT

2021-01-05 Thread Zack Weinberg
On Tue, Jan 5, 2021 at 8:43 AM Bob Friesenhahn
 wrote:
> On Mon, 4 Jan 2021, Zack Weinberg wrote:
> >>
> >> Something which surprises me is that the distribution tarballs became
> >> noticeably larger.
> >
> > This is expected. The bulk of the increase is probably due to the test
> > programs used to probe for C and/or C++ 2011 support in the compilers.
> > There were also some fixes for compatibility problems that involved
> > emitting more code throughout the script. I can explain the
> > rationale for any piece of the diff between the old and new configure
> > scripts that you're curious about.
>
> To be clear, the size difference was intended to be due to the list of
> parameters passed to AC_INIT and AM_INIT_AUTOMAKE and re-autotooling.
> In fact this is still using Autoconf 2.69.

Oh, I misunderstood you.

> Something I found which surprised me is that Automake has added a GNU
> COPYING file to my non-GNU non-GPL package.  Once the file appeared in
> the source directory, it seems that it continued to be used and
> included in spite of the Automake options provided.
>
> While I see more detail sprinkled around the generated files, it seems
> that this COPYING file may be the biggest issue.

Hmm.  If you remove COPYING, make sure that you have `foreign` in
either `AUTOMAKE_OPTIONS` or the options list passed to
`AM_INIT_AUTOMAKE`, and then re-run automake --add-missing, does
COPYING come back?  My understanding is that automake --add-missing is
supposed to install a copy of the GPL *only if* it's in `gnu` mode and
can't find any other statement of licensing.

zw



Future plans for Autotools

2021-01-20 Thread Zack Weinberg
Now we've all had a while to recover from the long-awaited Autoconf
2.70 release, I'd like to start a conversation about where the
Autotools in general might be going in the future.  Clearly any future
development depends on finding people who will do the work, but before
we worry about that I think we might want to figure out what work we
_want_ to do.

As a starting point, I wrote up a "strengths, weaknesses,
opportunities, and threats" analysis for Autotools -- this is a
standard project management technique, if you're not familiar with it,
there's a nice writeup in the draft of the book my friend and
colleague Sumana Harihareswara is writing [
https://changeset.nyc/resources/getting-unstuck-sampler-offer.html ].
I'm going to paste the full text of this analysis below, please reply
inline.  You can also read it on my blog at
https://www.owlfolio.org/development/autoconf-swot/ .

zw


I’ve been a contributor to GNU projects for many years, notably both
GCC and GNU libc, and recently I led the effort to make the first
release of Autoconf since 2012 (release announcement for Autoconf
2.70). For background and context, see the LWN article my colleague
Sumana Harihareswara of Changeset Consulting wrote.

Autoconf not having made a release in eight years is a symptom of a
deeper problem. Many GNU projects, including all of the other
components of the Autotools (Automake, Libtool, Gnulib, etc.) and the
software they depend upon (GNU M4, GNU Make, etc.) have seen a steady
decline in both contributor enthusiasm and user base over the past
decade. I include myself in the group of declining enthusiasts; I
would not have done the work leading up to the Autoconf 2.70 release
if I had not been paid to do it. (I would like to say thank you to the
project funders: Bloomberg, Keith Bostic, and the GNU Toolchain Fund
of the FSF.)

The Autotools are in particularly bad shape due to the decline in
contributor enthusiasm. Preparation for the Autoconf 2.70 release took
almost twice as long as anticipated; I made five beta releases between
July and December 2020, and merged 157 patches, most of them bugfixes.
On more than one occasion I was asked why I was going to the
trouble—isn’t Autoconf (and the rest of the tools by implication)
thoroughly obsolete? Why doesn’t everyone switch to something newer,
like CMake or Meson? (See the comments on Sumana’s LWN article for
examples.)

I personally don’t think that the Autotools are obsolete, or even all
that much more difficult to work with than some of the alternatives,
but it is a fair question. Should development of the Autotools
continue? If they are to continue, we need to find people who have the
time and the inclination (and perhaps also the funding) to maintain
them steadily, rather than in six-month release sprints every eight
years. We also need a proper roadmap for where further development
should take these projects. As a starting point for the conversation
about whether the projects should continue, and what the roadmap
should be, I was inspired by Sumana’s book in progress on open source
project management (sample chapters are available from her website) to
write up a strengths, weaknesses, opportunities, and threats analysis
of Autotools.

This inventory can help us figure out how to build on new
opportunities, using the Autotools’ substantial strengths, and where
to invest to guard against threats and shore up current weaknesses.

Followup discussion should go to the Autoconf mailing list.

Strengths

In summary: as the category leader for decades, the Autotools benefit
from their architectural approach, interoperability, edge case
coverage, standards adherence, user trust, and existing install base.

- Autoconf’s feature-based approach to compiled-code portability scales
  better than lists of system quirks.
- The Autotools carry 30+ years’ worth of embedded knowledge about
  portability traps for C programs and shell-based build scripting on
  Unix (and to a lesser extent Windows and others), including variants
  of Unix that no other comparable configuration tool supports.
- Autoconf and Automake support cross-compilation better than competing
  build systems.
- Autoconf and Automake support software written in multiple languages
  better than some competing build systems (but see below).
- Autoconf is very extensible, and there are lots of third-party macros
  available.
- Tarball releases produced by Autotools have fewer build dependencies
  than tarball releases produced by competing tools.
- Tarball releases produced by Autotools have a predictable,
  standardized (literally; it’s a key aspect of the GNU Coding
  Standards) interface for setting build-time options, building them,
  testing them, and installing them.
- Automake tries very hard to generate Makefiles that will work with any
  Make implementation, not just GNU make, and not even just (GNU or BSD)
  make.
- The Autotools have excellent reference-level documentation (better
  than CMake and Meson’s).
- As they are GNU projects, users can hav

Re: Future plans for Autotools

2021-01-21 Thread Zack Weinberg
On Wed, Jan 20, 2021 at 5:15 PM Zack Weinberg  wrote:
> Now we've all had a while to recover from the long-awaited Autoconf
> 2.70 release, I'd like to start a conversation about where the
> Autotools in general might be going in the future.

Thanks for all the great thoughtful responses that came in overnight!
I see many people expressing similar sentiments, so rather than
respond to each of you individually I'm going to summarize and reply
to each strand in this one message.

> continuous integration for Autotools

This is one of my highest priorities for what volunteer time I have in
the next few months.  The biggest difficulty with CI, as I see it, is
that Autotools needs to be tested on a very wide variety of operating
systems, and ideally *versions* of operating systems, and unusual
compilers, and so on, in order to confirm that it's still fulfilling
its contract.  Testing just on the "big three" operating systems
(Linux, Windows, MacOS) typically supported by commercial CI providers
... would be better than nothing, but would not have helped all that
much with the 2.70 release, if I'm honest.  GNU's Hydra instance
(https://www.gnu.org/software/devel.en.html#Hydra -- I didn't
previously know about this, thank you Dan for pointing it out) is a
strong contender just because it has FreeBSD and Solaris as well.

> implementation languages - can't we do better than shell+perl+m4?

I deeply sympathize with frustration with M4 and portable shell as
implementation languages.  Perl isn't so bad IMO, but being restricted
to perl 5.6.0 with no CPAN modules is also a headache.

Having said that, switching to *anything else* would be a gigantic
task -- multiple full-time person-years of effort just for the core --
and would mean either porting or losing all of the third-party macro
libraries.  I don't think it makes sense to explore this possibility
any further unless both a lot of funding and a lot of developers
materialize.

A lesser possibility we could think about, would be dusting off GNU M4
2.0 and thinking about what we could change about the M4 language to
make it friendlier for Autoconf's purposes.  Again I would want
funding and developers lined up, but we'd not need quite so much of
either.

And we probably could bump up the minimum version requirement for Perl
quite some ways, since it's only needed at autoreconf time.  I'm not
sure how much Autoconf would gain from this, though, since it's only
incrementally changed since 5.6 and Autoconf doesn't do *that* much
with it. Automake uses it much more heavily, though, perhaps people
from that project would like to comment?

[Nate Bargmann]
> One strength of the Autotools that I think stands above the rest is
> the fact that a user of a distributed package does not need to
> install any of the Autotools packages.

I definitely agree with this, and I'm putting it right next to the
"implementation languages" note because this is a compelling reason to
stick to shell as the language for generated configure scripts.  The
only alternative that would come anywhere near as close in terms of
"just unpack and run" portability is Perl.  Which is a tempting
possibility -- it would wipe out dozens of internal headaches due to
shell's limitations and portability quirks -- but again, multiple
full-time person-years of effort, losing the third-party macro
libraries, and we *would* be making life harder for people starting
from scratch on OSes that don't bundle Perl.

> Autotools are too difficult to work on

Yes indeed.  On the assumption that we are stuck with shell, M4, and
Perl, however, what can we do about it?  I have a few ideas, maybe you
have more:

- A linter for configure.ac and third-party macro libraries.
  This wouldn't be any fun to write, because it would have to parse
  both M4 and shell, *accurately*, but it would be valuable to every
  user of autoconf.

- The parser for the linter could also be used as the basis for a
  reimplementation of autoupdate that *doesn't* rely on running
  configure.ac through M4 with most of the macros disabled.  That
  implementation choice is responsible for most of the bugs in
  autoupdate, and for most of its limitations (e.g. not being able to
  fix quotation errors).

- Audit existing configure.ac's looking for pain points: where are
  people having trouble doing the checks they need, where are people
  not realizing that they don't need to work so hard, where are people
  not realizing that they don't need to check for that at all...

- Audit the core for workarounds that are no longer necessary now we
  can rely on shell functions (I

Automake's file locking (was Re: Autoconf/Automake is not using version from AC_INIT)

2021-01-25 Thread Zack Weinberg
On Mon, Jan 25, 2021 at 9:52 AM Bob Friesenhahn
 wrote:
> At the moment it is a big deal for me because the locking prototol
> that Autoconf/Automake is using does not work with NFS mounts for
> Illumos-derived systems when the client is also an Illumos-derived
> system, because Illumos failed to support the legacy locking protocol
> used when the system locking daemon was re-implemented from scratch.
...
> It is likely that a small patch to Automake Perl-based locking code
> could solve this issue by using the same fall-back to using POSIX
> locking rather than legacy locking the same way that GNU/Linux does.
> It may also be that using POSIX locking in the first place is the
> solution.

Automake "just" calls Perl's 'flock' built-in (see 'sub lock' in
Automake/XFile.pm; this code is copied into Autoconf under the
Autom4te:: namespace).  It would be relatively straightforward to
teach it to try 'fcntl(F_SETLKW, ...)' if flock fails.  Do you know
whether that would be sufficient?  If not, we may be up a creek, since
depending on CPAN modules is a non-starter.
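
For context, a stripped-down flock-style lock looks something like the
sketch below.  This is illustrative only, not the actual XFile.pm code,
and the handling of filesystems without locking support is my
assumption about one reasonable way to degrade gracefully:

    use strict;
    use warnings;
    use Fcntl ':flock';              # LOCK_SH, LOCK_EX, LOCK_UN
    use Errno qw(ENOLCK EOPNOTSUPP);

    # Lock an already-open filehandle; report "no lock available"
    # instead of dying when the filesystem cannot do locking at all.
    sub lock_handle {
        my ($fh, $mode) = @_;
        $mode = LOCK_EX unless defined $mode;
        return 1 if flock ($fh, $mode);
        return 0 if $! == ENOLCK || $! == EOPNOTSUPP;
        die "flock failed: $!";
    }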

zw



Re: Automake's file locking (was Re: Autoconf/Automake is not using version from AC_INIT)

2021-01-28 Thread Zack Weinberg
On Mon, Jan 25, 2021 at 11:18 AM Bob Friesenhahn
 wrote:
> On Mon, 25 Jan 2021, Zack Weinberg wrote:
> > Automake "just" calls Perl's 'flock' built-in (see 'sub lock' in
> > Automake/XFile.pm) (this code is copied into Autoconf under the
> > Autom4te:: namespace).  It would be relatively straightforward to
> > teach it to try 'fcntl(F_SETLKW, ...)' if that fails.

I was wrong about this.  It is not practical to make a Perl program
directly call ‘fcntl(F_SETLKW)’ without use of CPAN modules.  This is
because, at the C level, ‘fcntl(F_SETLKW)’ takes a ‘struct flock’
argument, and we have no way of knowing what the layout of that
structure is.  (See https://metacpan.org/pod/File%3a%3aFcntlLock for
gory details.  We can’t adopt the implementation of that module,
because it relies on running a C compiler, and the locking is needed
at a point when we do not yet know whether a C compiler is available.)

> It may be that moving forward to 'fcntl(F_SETLKW, ...)' by default and
> then falling back to legacy 'flock' would be best.  Or perhaps
> discarding use of legacy 'flock' entirely.
>
> Most likely the decision as to what to do was based on what was the
> oldest primitive supported at the time.

I need to reemphasize that the decision here was made by the Perl
developers, not the Automake or Autoconf developers, and it was made a
very long time ago—judging by e.g. references to h2ph in ‘perldoc
Fcntl’, probably circa perl 5.000!  I am also going to guess that the
motivation for preferring flock was related to the motivation for this
grumpy aside in OpenBSD’s ‘fcntl(2)’ manpage:

| This interface follows the completely stupid semantics of System V
| and IEEE Std 1003.1-1988 (“POSIX.1”) that require that *all* locks
| associated with a file for a given process are removed when *any* file
| descriptor for that file is closed by that process
| ...
| The flock(2) interface has much more rational last close semantics

(Emphasis in original.)  As I recall, at the time, *neither* flock nor
fcntl locks were honored *at all* over NFS, so that wouldn’t have been
a consideration.



There is a potential way forward here.  The *only* place in all of
Autoconf and Automake where XFile::lock is used, is by autom4te, to
take an exclusive lock on the entire contents of autom4te.cache.
For this, open-file locks are overkill; we could instead use the
battle-tested technique used by Emacs: symlink sentinels.  (See
https://git.savannah.gnu.org/cgit/emacs.git/tree/src/filelock.c .)

The main reason I can think of, not to do this, is that it would make
the locking strategy incompatible with that used by older autom4te;
this could come up, for instance, if you’ve got your source directory
on NFS and you’re building on two different clients in two different
build directories.  On the other hand, this kind of version skew is
going to cause problems anyway when they fight over who gets to write
generated scripts to the source directory, so maybe it would be ok to
declare “don’t do that” and move on.  What do others think?
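
To make the idea concrete, a symlink-sentinel lock in Perl could look
roughly like this.  It is only a sketch: the owner-naming scheme and
the stale-lock handling are assumptions, not a worked-out design.

    use strict;
    use warnings;
    use Errno qw(EEXIST);
    use Sys::Hostname qw(hostname);

    # Atomically create a symlink whose target records who holds the
    # lock; symlink() fails with EEXIST if someone else got there first.
    sub try_lock {
        my ($sentinel) = @_;
        my $owner = hostname () . ':' . $$;
        return 1 if symlink ($owner, $sentinel);
        die "symlink $sentinel: $!" unless $! == EEXIST;
        # Held by someone else; steal it only if the holder is a dead
        # process on this same host.
        my $holder = readlink ($sentinel);
        if (defined $holder) {
            my ($host, $pid) = split /:/, $holder, 2;
            if (defined $pid && $pid =~ /^\d+$/
                && $host eq hostname () && !kill (0, $pid)) {
                unlink $sentinel;
                return 1 if symlink ($owner, $sentinel);
            }
        }
        return 0;    # caller decides whether to wait and retry
    }

    sub unlock { unlink $_[0]; }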

zw



Re: Automake's file locking (was Re: Autoconf/Automake is not using version from AC_INIT)

2021-01-28 Thread Zack Weinberg
On Thu, Jan 28, 2021 at 2:16 PM Bob Friesenhahn
 wrote:
> On Thu, 28 Jan 2021, Zack Weinberg wrote:
> >
> > The main reason I can think of, not to do this, is that it would make
> > the locking strategy incompatible with that used by older autom4te;
> > this could come up, for instance, if you’ve got your source directory
> > on NFS and you’re building on two different clients in two different
> > build directories.  On the other hand, this kind of version skew is
> > going to cause problems anyway when they fight over who gets to write
> > generated scripts to the source directory, so maybe it would be ok to
> > declare “don’t do that” and move on.  What do others think?
>
> This is exactly what I do.  I keep the source files on a file server
> so that I can build on several different types of clients.  This used
> to even include Microsoft Windows clients using CIFS.

Do you use different versions of autoconf and/or automake on the
different clients?

> The lock appears to be taken speculatively since it is taken before
> Autotools checks that there is something to do.
...
> The most common case is that there is nothing for Autotools to do
> since the user is most often doing a 'make' for some other purpose.

It looks to me like the lock is taken at exactly the point where
autom4te decides that it *does* have something to do. It might be
possible to change it to take a *read* lock first and only upgrade to
a write lock if new files are to be added to the cache, but Make
shouldn't be running the autotools at all if they have nothing to do
(which I suppose takes us over to the *other* thread about your
problems with automake and configure's dependencies :-)

zw



RFC: Bump minimum Perl to 5.18.0 for next major release of both Autoconf and Automake

2021-01-29 Thread Zack Weinberg
I propose that, as of the next major feature release of both Autoconf
and Automake, we raise the minimum version requirement for Perl to
5.18.0.  (Currently it is 5.6.0.)  On the Autoconf side, the version
number and timeframe for the first release with this change are still
TBD but it would *not* come from the post-2.70 release branch.  I
don't know what the Automake maintainers' current release schedule
looks like but I imagine this would be targeted for 1.17.0, whenever
that happens (ideally at about the same time as the Autoconf release
making the same change).

Increasing the minimum version requirement would have these benefits:

 - Access to useful library modules, notably Digest::SHA, List::Util,
   Storable, and Time::HiRes.
 - Comprehensive interpreter support for Unicode (specifically, 'use
   feature "unicode_strings"').  Neither autoconf nor automake makes much
   use of non-ASCII characters themselves, but having Unicode supported
   in the interpreter would make it easier for us to be sure that we
   don't mess up non-ASCII text in our inputs.
 - Any number of bug fixes; I have personally tripped over problems
   with File::Spec and File::Temp in 5.6.0.
 - Any number of performance enhancements.

Perl 5.18.0 was released in May 2013.  (Perl 5.6.0 was released in
March 2000.)  I selected it as the minimum because the Perl
maintainers recommend all users of older interpreters upgrade to at
least this version, due to security fixes (see
https://perldoc.perl.org/perlsec#Algorithmic-Complexity-Attacks --
inputs to Autoconf and Automake are trusted, but this rationale means
it's hard to find a CI platform that offers anything older than 5.18,
and we ought to be testing our minimum version requirement.)  I'm open
to arguments for setting it either lower or higher.

zw



Re: PLV, PSV in manual

2021-02-02 Thread Zack Weinberg
On Tue, Feb 2, 2021 at 1:19 PM DUDZIAK Krzysztof
 wrote:
> Isn't it that strange if for manual* following searches for a string result 
> in no matches?
> search for "PLV"
> search for "PSV"
> search for "Product List Variable"
> search for "Product Source Variable"

I've never heard these terms before and don't know what they mean. Can
you please explain what these terms mean and what you expected the
automake manual to have to say about them?

zw



Re: automake variable prefix 'check_'

2021-02-02 Thread Zack Weinberg
On Tue, Feb 2, 2021 at 1:24 PM DUDZIAK Krzysztof
 wrote:
> As one can't find string "distcheck" in GCS

By GCS do you mean the GNU Coding Standards?

> it looks like it wasn't GCS
> which constitutes support and usage of `distcheck' target.
> Maybe it is POSIX, or UNIX.

As far as I know, the distcheck target was invented by automake. It
probably should be added to the GNU Coding Standards, if it's not
there already, but I don't know if anyone is working on that document
anymore.

> I wonder if recipe to build `distcheck' target
> as compiled by automake gets same form as John Calcote
> describes it in chapter 3 his Practitioner's Guide (2nd ed.).

I have not read this book. Since it was published ten years ago,
anything it describes may well be out of date.  I don't think
distcheck has changed much in that time, but I could be wrong.

> For one Makefile generated by autotools in one of my last projects
> I see building `check' target is included in `distcheck' recipe.
> If yes, all products prefixed with `check_'  would be built also
> when `make distcheck' command is run.

Yes, this is intentional. I don't think I understand what your question is.

zw



Re: Automake's file locking

2021-02-03 Thread Zack Weinberg
On Thu, Jan 28, 2021 at 6:51 PM Bruno Haible  wrote:
> Zack Weinberg wrote:
> > There is a potential way forward here.  The *only* place in all of
> > Autoconf and Automake where XFile::lock is used, is by autom4te, to
> > take an exclusive lock on the entire contents of autom4te.cache.
> > For this, open-file locks are overkill; we could instead use the
> > battle-tested technique used by Emacs: symlink sentinels.  (See
> > https://git.savannah.gnu.org/cgit/emacs.git/tree/src/filelock.c .)
>
> I can confirm that, while flock() is the most basic/elementary locking
> facility [1], its emulation in gnulib [2] does not really work on
> Solaris. The unit test regularly fails on Solaris.
>
> Therefore I like the idea of merely relying on the atomicity of
> file creation / file rename operations.
>
> These files should reside inside the autom4te.cache directory. I would
> not like to change all my scripts and Makefiles that do
>   rm -rf autom4te.cache

Agreed.  The approach I'm currently considering is: with the
implementation of the new locking protocol, autom4te will create a
subdirectory of autom4te.cache named after its own version number, and
work only in that directory (thus preventing different versions of
autom4te from tripping over each other).  Each request will be somehow
reduced to a strong hash and given a directory named after the hash
value.  The existence of this directory signals that an autom4te
process is working on a request, and the presence of 'request',
'output', and 'traces' files in that directory signals that the cache
for that request is valid.  If the directory for a request exists but
the output files don't, autom4te will busy-wait for up to some
definite timeout before stealing the lock and starting to work on that
request itself.
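
In pseudo-Perl, the protocol sketch looks roughly like this; all of the
names and the exact timeout policy are assumptions, not settled design:

    use strict;
    use warnings;
    use Digest::SHA qw(sha256_hex);
    use Time::HiRes qw(sleep);

    my $TIMEOUT = 60;    # seconds to wait before stealing a stalled request

    # One directory per request, named by a strong hash of the request
    # text, under a per-version subdirectory of autom4te.cache (assumed
    # to exist already).
    sub request_dir {
        my ($cache, $version, $request) = @_;
        return "$cache/$version/" . sha256_hex ($request);
    }

    # Returns 'cached' if a valid result is already present, 'owner' if
    # we now own the request (possibly after stealing a stalled one).
    sub claim_request {
        my ($dir) = @_;
        my $waited = 0;
        while (1) {
            # mkdir is atomic: whoever creates the directory owns the request.
            return 'owner' if mkdir $dir;
            return 'cached'
                if -f "$dir/request" && -f "$dir/output" && -f "$dir/traces";
            return 'owner' if $waited >= $TIMEOUT;    # steal it
            sleep 0.5;
            $waited += 0.5;
        }
    }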

This would be substantially easier to implement with access to the
Storable, Digest::SHA, and Time::HiRes modules, and that's the
principal reason I suggested bumping our minimum Perl requirement to
5.18 in <https://lists.gnu.org/archive/html/autoconf/2021-01/msg00133.html>.

zw



Re: RFC: Bump minimum Perl to 5.18.0 for next major release of both Autoconf and Automake

2021-02-17 Thread Zack Weinberg
On Fri, Jan 29, 2021 at 5:54 PM Karl Berry  wrote:

I don't know why, but I only received this message today.

> raise the minimum version requirement for Perl to 5.18.0
>
> Sounds sensible to me, for the reasons you outlined.
>
> But, I think it would be wise to give users a way to override the
> requirement, of course with the caveat "don't blame us if it doesn't
> work", unless there are true requirements such that nothing at all would
> work without 5.18.0 -- which seems unlikely (and undesirable, IMHO).
> 2013 is not that long ago, in autotime.
>
> don't know what the Automake maintainers' current release schedule
>
> We (I, at least, and I'm pretty sure it's true for Jim too) have no schedule.
>
> looks like but I imagine this would be targeted for 1.17.0,
>
> I guess we would declare the version making this change as 1.17.0.
> I don't have any other plans that would involve a new minor version.
>
> that happens (ideally at about the same time as the Autoconf release
> making the same change).
>
> Sounds feasible, since it doesn't sound like it will be especially quick
> (I'm glad). --thanks, karl.
>



Re: RFC: Bump minimum Perl to 5.18.0 for next major release of both Autoconf and Automake

2021-02-17 Thread Zack Weinberg
On Fri, Jan 29, 2021 at 5:54 PM Karl Berry  wrote:

I don't know why, but I only received this message today.

> But, I think it would be wise to give users a way to override the
> requirement, of course with the caveat "don't blame us if it doesn't
> work", unless there are true requirements such that nothing at all would
> work without 5.18.0 -- which seems unlikely (and undesirable, IMHO).
> 2013 is not that long ago, in autotime.

This is a reasonable suggestion but Perl makes it difficult.  You
specify the interpreter version requirement for a Perl script with a
statement like

use 5.006;   # what we have now

at the top of the script.  The number has to be a constant, and it has
to appear at the top of every .pm file as well as the main script.  I
don't think it makes sense to use a config.status substitution
variable for this -- then we'd have to generate _all_ of the .pm files
through config.status!

What we could do is something like this instead:

   use 5.008;  # absolute minimum requirement
   use if $] >= 5.016, feature => ':5.16';
       # enable a number of desirable features from newer perls

+ documentation that we're only _testing_ with the newer perls.

I did some more research on perl's version history (notes at end) and
I think the right thresholds are 5.10 for absolute minimum and 5.16
for 'we aren't going to test with anything older than this'.  5.10 is
the oldest perl that shipped Digest::SHA, which I have a specific need
for in autom4te; it is also the oldest perl to support `state`
variables and the `//` operator, both of which could be quite useful.
The new features in 5.12, 5.14, and 5.16 mostly have to do with
Unicode, which we do not strictly _need_ but which, if we turn them
on, give us a better chance of Just Doing The Right Thing with
user-supplied Unicode text.  The top-of-each-file boilerplate would
look something like this:

use 5.010;
use strict;
use warnings FATAL => 'all';
use utf8;
use if $] >= 5.016, feature => ':5.16';
no  if $] >= 5.022, warnings => 'experimental::re_strict';
use if $] >= 5.022, re   => 'strict';

_Possibly_ we would also want `use open qw(:std :utf8)`, I'm not sure.
(That means "treat all files as encoded in UTF-8 unless overridden
below.")

Perl 5.10 shipped in 2007, so that's another six years of compatibility window.

zw


# perl 5.8

 - released 2002-07
 - support for `open FH, "-|", LIST`
 - Definitely useful modules: `Storable`, `Time::HiRes`, `if`, `open`,
   `threads` (already in use when available)
 - Possibly useful modules: `Digest::MD5`, `MIME::Base64`

# perl 5.10

 - released 2007-12
 - first version supporting `use feature`, `state`, the `//`
   operator, named capture groups, and a bunch of other regex
   enhancements
 - Possibly useful modules: `Digest::SHA`, `Compress::Zlib`

# perl 5.12

 - released 2010-04
 - first version for which `use 5.xx` implies `use strict`

# perl 5.14

 - released 2011-05
 - support for `s///r` (returns result of substitution, input variable
   is not modified)
 - first complete implementation of `use feature 'unicode_strings'`
 - Possibly useful module: `HTTP::Tiny`

# perl 5.16

 - released 2012-05
 - a whole bunch of unicode-related bug fixes
 - bugs relevant to autotools were fixed in `FindBin` and `IPC::Open3`

# perl 5.18

 - released 2013-05
 - first version with `experimental` warnings category
 - oldest version typically available on cloud platforms due to
   important security fixes (which are not directly relevant for
   use in autotools)
 - oldest version in which HTTP::Tiny implements HTTPS correctly



Re: config.sub/config.guess using nonportable $(...) substitutions

2021-03-09 Thread Zack Weinberg
On Tue, Mar 9, 2021 at 5:00 PM Tim Rice  wrote:
> On Tue, 9 Mar 2021, Warren Young wrote:
> > On Mar 9, 2021, at 1:26 PM, Paul Eggert  wrote:
> > Solaris 10 dates from early 2005.  We gave it 16 years of direct support, 
> > and now it’s on a sort of “extended” support if you point Autoconf 
> > configure scripts at a reasonable shell.
>
> The thing is, it is not even necessary to point Autoconf configure
> scripts at a reasonable shell. The configure script will find a capable
> shell and run itself with that shell.
>
> UnixWare has a /bin/sh as crufty as Solaris 10. I just pulled the
> latest config.guess and config.sub into my OpenSSH tree and configure
> was just as happy as before. Like Solaris /bin/sh, UniWare's does
> not handle $(...) either.

N.B. I changed Autoconf's "find us a better shell" logic in 2.70 to
include checking for $(...) _specifically because of_ the change being
debated here.  Configure scripts themselves still use `...`
nigh-exclusively.

I don't want to either tear down or defend Ben's change, but it's my
personal opinion that use of $(...) is a lot more defensible in a
configure script than in config.{sub,guess}, because it _has_ that
"find a better shell" step at the beginning.  Also, configure would
get more out of being able to use $(...) than config.{sub,guess} do,
just because its code is typically more complicated, and configure
scripts tend to pull in third-party code from all over the place that
may or may not ever have been tested on anything old.  For these
reasons I probably wouldn't revert the Autoconf change even if the
config.{sub,guess} change were reverted.

zw



`make dist` fails with current git

2021-10-13 Thread Zack Weinberg
$ make V=1 dist
make  dist-xz dist-gzip am__post_remove_distdir='@:'
make[1]: Entering directory '/home/zack/projects/gnu/autoconf/ci/b-automake'
make  distdir-am
make[2]: Entering directory '/home/zack/projects/gnu/autoconf/ci/b-automake'
set -e; set -u; \
if test -d ../s-automake/.git; then \
  rm -f ChangeLog-t \
&& /usr/bin/perl ../s-automake/lib/gitlog-to-changelog \
   --amend=../s-automake/.git-log-fix --since='2011-12-28 00:00:00' \
   --no-cluster --format '%s%n%n%b' >ChangeLog-t \
&& chmod a-w ChangeLog-t \
&& mv -f ChangeLog-t ChangeLog \
|| exit 1; \
elif test ! -f ../s-automake/ChangeLog; then \
  echo "Source tree is not a git checkout, and no pre-existent" \
   "ChangeLog file has been found there" >&2; \
  exit 1; \
fi
gitlog-to-changelog:../s-automake/.git-log-fix: unused entry: 3b369e6bbe0fb6d7359398935706c87dd9375cb6
gitlog-to-changelog:../s-automake/.git-log-fix: unused entry: 22729165f6bb902daeb8a4d8e7cb06982390f327
make[2]: *** [../s-automake/maintainer/maint.mk:48: ChangeLog] Error 1
make[2]: Leaving directory '/home/zack/projects/gnu/autoconf/ci/b-automake'
make[1]: *** [Makefile:3245: distdir] Error 2
make[1]: Leaving directory '/home/zack/projects/gnu/autoconf/ci/b-automake'
make: *** [Makefile:: dist] Error 2

Looks like some kind of problem with automatic ChangeLog generation?

The commits it's complaining about are

commit 22729165f6bb902daeb8a4d8e7cb06982390f327
Author: Peter Rosin 
Date:   Fri Feb 17 10:13:15 2012 +0100

fixup: .git-log-fix should not be executable

* .fix-git-log: Set mode 644.

commit 3b369e6bbe0fb6d7359398935706c87dd9375cb6
Author: Stefano Lattarini 
Date:   Thu Feb 16 22:29:32 2012 +0100

maint: use AC_PACKAGE_BUGREPORT to avoid duplication

* configure.ac: In the message reporting whether the user is about
to build an alpha or beta version, use the autoconf-provided
AC_PACKAGE_BUGREPORT macro instead of duplicating the bur reporting
address.




Re: `make dist` fails with current git

2021-10-13 Thread Zack Weinberg
On Wed, Oct 13, 2021, at 11:54 AM, Bob Friesenhahn wrote:
> On Wed, 13 Oct 2021, Zack Weinberg wrote:
>>
>> Looks like some kind of problem with automatic ChangeLog generation?
>
> To me this appears to be the result of skipping an important step in 
> what should be the process.  It seems that there should be a 
> requirement that 'make distcheck' succeed prior to a commit.  Then 
> this issue would not have happened.

Well, yes, but how do I _fix_ it? :-)

<.< I _may_ be investigating the possibility of setting up CI for automake so 
that problems like this are at least noticed in a timely fashion.  But if the 
tree is broken to begin with, it's troublesome...

zw



Re: `make dist` fails with current git

2021-10-13 Thread Zack Weinberg
On Wed, Oct 13, 2021, at 2:11 PM, Nick Bowler wrote:
> I think this happened because your CI system has done a shallow clone.
> So the changelog generation failed because the git log is incomplete.

I did a --single-branch clone, but not a shallow one.  Shouldn't the trunk be 
self-contained?

zw



Re: `make dist` fails with current git

2021-10-14 Thread Zack Weinberg
On Wed, Oct 13, 2021, at 2:32 PM, Nick Bowler wrote:
> On 2021-10-13, Zack Weinberg  wrote:
>> On Wed, Oct 13, 2021, at 2:11 PM, Nick Bowler wrote:
>>> I think this happened because your CI system has done a shallow clone.
>>> So the changelog generation failed because the git log is incomplete.
>>
>> I did a --single-branch clone, but not a shallow one.  Shouldn't the trunk
>> be self-contained?
>
> I think --single-branch should be fine: it seems to work for me...
>
> But with e.g., a "--depth 1" clone I get the exact same error you
> reported (because the commit IDs mentioned in .git-log-fix are not
> available).

After a little more investigation, a --single-branch clone indeed works fine.  
I was confused by two unrelated bugs, which happen regardless of what you clone.

1. `make dist` immediately after `configure` fails due to a missing dependency 
somewhere:

$ make dist
make  dist-xz dist-gzip am__post_remove_distdir='@:'
make[1]: Entering directory '/home/zack/projects/gnu/B-automake'
make  distdir-am
make[2]: Entering directory '/home/zack/projects/gnu/B-automake'
make[3]: Entering directory '/home/zack/projects/gnu/B-automake'
make[3]: Leaving directory '/home/zack/projects/gnu/B-automake'
Updating ../automake/doc/version.texi
  GEN  ../automake/doc/amhello-1.0.tar.gz
../automake/doc/amhello-1.0.tar.gz: recipe failed.
See file '/home/zack/projects/gnu/automake/doc/amhello/amhello-output.tmp' for details
make[2]: *** [Makefile:3759: ../automake/doc/amhello-1.0.tar.gz] Error 1
make[2]: Leaving directory '/home/zack/projects/gnu/B-automake'
make[1]: *** [Makefile:3245: distdir] Error 2
make[1]: Leaving directory '/home/zack/projects/gnu/B-automake'
make: *** [Makefile:: dist] Error 2

Running `make` first is a sufficient workaround.

2. `make dist` from a build directory that is not a subdirectory of the source 
directory fails during ChangeLog generation, with a different error than the 
one I reported earlier:

make[1]: Entering directory '/home/zack/projects/gnu/B-automake'
make  distdir-am
make[2]: Entering directory '/home/zack/projects/gnu/B-automake'
  GEN  ChangeLog
fatal: not a git repository (or any parent up to mount point /)
Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
gitlog-to-changelog: error closing pipe from git log --log-size '--pretty=format:%H:%ct  %an  <%ae>%n%n%s%n%n%b' '--since=2011-12-28 00:00:00'
make[2]: *** [../automake/maintainer/maint.mk:48: ChangeLog] Error 1
make[2]: Leaving directory '/home/zack/projects/gnu/B-automake'
make[1]: *** [Makefile:3245: distdir] Error 2
make[1]: Leaving directory '/home/zack/projects/gnu/B-automake'
make: *** [Makefile:: dist] Error 2

I will see if I can find time to patch these.

zw



Re: bug#19539: [1.15] AC_CONFIG_AUX_DIR should be called early

2022-02-09 Thread Zack Weinberg
On Wed, Feb 9, 2022, at 1:02 AM, Mike Frysinger wrote:
> On 08 Feb 2022 22:11, Glenn Morris wrote:
>>
>> I see autoconf uses savannah for bug reports.
>> So long as simply sending a mail to bug-autoconf doesn't open a savannah 
>> issue,
>> I can set the autoconf maintainer to bug-autoconf, so that mails go to
>> the right place. (It tends not to work well if two bug trackers talk to
>> each other.)
>
> Zach: do you know if there's an e-mail<->savannah gateway ?

I don't know either way, but I do know that bug-autoconf is just a regular 
mailing list, not connected to savannah issue tracking.  So Glenn's suggestion 
is probably the best for the short term.

This situation is exactly why I was thinking about switching autoconf over to 
debbugs: it's _useful_ for automake to be able to reassign bugs to autoconf and 
vice versa.  In fact, just last week I had to tell someone to re-file a bug 
against automake because we don't have the ability to reassign bugs in that 
direction.  (https://savannah.gnu.org/support/?110599)  However, I do not have 
time right now to wrangle the switch-over, and I also don't know how to get the 
existing set of open autoconf bugs migrated into debbugs.  (The closed bugs can 
stay where they are.)  Anyone reading this have time to tackle the work?

zw



Re: python debugging on trunk

2022-09-26 Thread Zack Weinberg
On Mon, Sep 26, 2022, at 12:23 PM, Karl Berry wrote:
> Is anyone up for debugging some Python-related test failures on
> RHEL-based systems?

I have access to a RHEL7 system, I know Python, and this sounds much
less unpleasant than everything else I'm supposed to be doing today.

> I have a suspicion the problem is that on RHEL systems, "python" invokes
> "python2" (because that's what the Python maintainers recommend, as I
> understand it), but the tests are assuming it invokes "python3".

Indeed:

$ lsb_release -d
Description: Red Hat Enterprise Linux Workstation release 7.9 (Maipo)
$ python --version
Python 2.7.5
$ python3 --version
Python 3.6.8

I have very strong opinions on this topic, having had to deal with a
lot of badly-written Python 2 code that, in the worst case, silently
corrupts data when run under Python 3.  IMNSHO, the bare command name
"python" and the absolute pathname "/usr/bin/python" MUST[rfc2119] be
*permanently reserved* for the Python 2 interpreter.  It's fine for
them not to exist at all, but if they exist and they invoke Python 3,
that is a *bug* in the distribution.

(Yes, there's a PEP that says something different -- IMO it is wrong,
its authors probably hadn't encountered the *really* troublesome
Python 2 code that I have.)

---
RHEL7, unsurprisingly, ships with Autoconf 2.69.  Before attempting to
build Automake from Git, I built Autoconf 2.71 from the tarball
release, installed it in a subdirectory of my homedir, and made that
directory come first on PATH.  There were no unexpected test failures
in Autoconf's testsuite. Other relevant tool versions include:

$ perl --version
This is perl 5, version 16, subversion 3 (v5.16.3) built for 
x86_64-linux-thread-multi
(with 44 registered patches, see perl -V for more detail)

$ m4 --version
m4 (GNU M4) 1.4.16

$ /bin/sh --version
GNU bash, version 4.2.46(2)-release (x86_64-redhat-linux-gnu)

$ make --version
GNU Make 3.82

$ sort --version
sort (GNU coreutils) 8.22

$ gcc --version
gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-44)

$ libtool --version
libtool (GNU libtool) 2.4.2

I see these Automake testsuite failures:

Testsuite summary for GNU Automake 1.16i

# TOTAL: 2988
# PASS:  2877
# SKIP:  54
# XFAIL: 39
# FAIL:  18
# XPASS: 0
# ERROR: 0

FAIL: t/canon7
FAIL: t/fort5
FAIL: t/instmany-python
FAIL: t/ltinit
FAIL: t/nobase-python
FAIL: t/pr401b
FAIL: t/py-compile-basedir
FAIL: t/py-compile-basic
FAIL: t/py-compile-destdir
FAIL: t/py-compile-option-terminate
FAIL: t/python-pr10995
FAIL: t/python-prefix
FAIL: t/python-vars
FAIL: t/python10
FAIL: t/python12
FAIL: t/python3
FAIL: t/subobj9
FAIL: t/suffix5

That's _most_ of the failures you got, plus some more:

+FAIL: t/canon7
+FAIL: t/fort5
+FAIL: t/ltinit
+FAIL: t/pr401b
+FAIL: t/python10
+FAIL: t/subobj9
+FAIL: t/suffix5
-FAIL: t/subobj-pr13928-more-langs

There are only two failure modes:

canon7, fort5, ltinit, pr401b, subobj9, suffix5

ERROR: files left in build directory after distclean:
./.__afs

I did the build in an AFS volume; these are the AFS equivalent of
the NFS "silly rename" hack (apparently it can happen with SMB as well).
"make distcleancheck" should ignore these.

instmany-python, nobase-python, py-compile-basic, py-compile-destdir,
py-compile-option-terminate, python-pr10995, python-prefix,
python-vars, python10, python12, python3

AttributeError: 'module' object has no attribute 'implementation'

This is coming from this code in lib/py-compile (there are three
copies of it):

if hasattr(sys.implementation, 'cache_tag'):
    py_compile.compile(filepath,
                       importlib.util.cache_from_source(filepath),
                       path)
else:
    py_compile.compile(filepath, filepath + 'c', path)

sys.implementation was added in Python 3.3 and
importlib.util.cache_from_source was added in 3.4.
Python 2.7's importlib _only_ has 'import_module'.

The appended patch should address both issues.  Note that I have only
tested it with Python 2.7 and 3.6.  It _may_ not be correct for Python
in the 3.0--3.3 (inclusive) range; I cannot conveniently test that.

zw

  * lib/am/distdir.am (distcleancheck_listfiles): Filter out "silly rename"
  files, unavoidably created by deleting files that are still open in some
  process on various network file systems.

  * lib/py-compile: Test directly for availability of
  importlib.util.cache_from_source.  Untangle logic for when to generate
  -O and -OO bytecode.  Reformat embedded Python fragments.

diff --git a/lib/am/distdir.am b/lib/am/distdir.am
index 0f591266d..b8d90e572 100644
--- a/lib/am/distdir.am
+++ b/lib/am/distdir.am
@@ -568,7 +568,9 @@ distuninstallcheck:
 ## Define distcleancheck_listfiles and distcleancheck separately
 ## from distcheck, so that they can be overridden by the user.
 .PHONY: distcleancheck
-distcleancheck_listfiles = find . -type f -print
+distcleancheck_listfiles = \
+  find . \( -type f -a \! \
+  \( -name .nfs* -o -name .smb* -o -name .__afs* \) \) -print
 distcleancheck: di

Re: make check(s) pre-release problems

2022-10-06 Thread Zack Weinberg

On 2022-10-04 6:58 PM, Karl Berry wrote:

With Zack's latest Python fixes, I was hoping to move towards an
Automake release, but I find myself stymied by apparently random and
unreproducible test failures. I haven't exhausted every conceivable
avenue yet, but I thought I would write in hopes that others (Zack, past
Automake developers, anyone else ...) could give it a try, and/or have
some insights.

For me, running a parallel make check (with or without parallelizing the
"internal" makes), or make distcheck, fails some tests, e.g., nodef,
nodef2, testsuite-summary-reference-log. The exact tests that fail
changes from run to run. Running the tests on their own succeeds. Ok, so
it's something in the parallelism. But why? And how to debug?


I can't reproduce this problem myself, but my first thought is that some
of the tests, when run concurrently, could be overwriting each other's
files somehow.  I can think of two ways to investigate that hypothesis:
look for tests that write files outside a directory dedicated to that
test, and, after a failed test run, look for files that are corrupted,
then try to figure out which tests would be stomping on those files.


> Perhaps easier to debug: there are two targets to be run before making a
> release, check-no-trailing-backslash-in-recipes and check-cc-no-c-o,
> to try to ensure no reversion wrt these features. A special shell and
> compiler are configured, respectively (shell scripts that check the
> behavior).


I'm running these targets now and will report what I get.

zw



Re: make check(s) pre-release problems

2022-10-06 Thread Zack Weinberg
On Thu, Oct 6, 2022, at 1:04 PM, Zack Weinberg wrote:
> On 2022-10-04 6:58 PM, Karl Berry wrote:
>> Perhaps easier to debug: there are two targets to be run before making a
>> release, check-no-trailing-backslash-in-recipes and check-cc-no-c-o,
>> to try to ensure no reversion wrt these features. A special shell and
>> compiler are configured, respectively (shell scripts that check the
>> behavior).
>
> I'm running these targets now and will report what I get.

No errors on RHEL7+autoconf2.71 (same environment I used for the Python fixes)
from a serial "make check-no-trailing-backslash-in-recipes".  The other one is
running now.

zw



Re: make check(s) pre-release problems

2022-10-07 Thread Zack Weinberg
On Thu, Oct 6, 2022, at 4:25 PM, Karl Berry wrote:
>> No errors on RHEL7+autoconf2.71
>
> Puzzling. Can you easily try RHEL8 or one of its derivatives?
> It surprises me that that is the culprit, but it seems possible.

Unfortunately, no.  CMU is mostly an Ubuntu shop these days.  It's only dumb 
luck that I happen to have access to a machine that hasn't had a major system 
upgrade since 2012 (and with my day-job hat on I'm actively trying to *get* it 
upgraded -- to Ubuntu.)

zw



_AM_FILESYSTEM_TIMESTAMP_RESOLUTION incorrect result (was Re: make check(s) pre-release problems)

2022-10-07 Thread Zack Weinberg
On Thu, Oct 6, 2022, at 4:19 PM, Zack Weinberg wrote:
> On Thu, Oct 6, 2022, at 1:04 PM, Zack Weinberg wrote:
>> On 2022-10-04 6:58 PM, Karl Berry wrote:
>>> Perhaps easier to debug: there are two targets to be run before making a
>>> release, check-no-trailing-backslash-in-recipes and check-cc-no-c-o,
>>> to try to ensure no reversion wrt these features. A special shell and
>>> compiler are configured, respectively (shell scripts that check the
>>> behavior).
>>
>> I'm running these targets now and will report what I get.
>
> No errors on RHEL7+autoconf2.71 (same environment I used for the Python fixes)
> from a serial "make check-no-trailing-backslash-in-recipes".  The other one is
> running now.

One failure from a serial "make check-cc-no-c-o":

FAIL: t/aclocal-autoconf-version-check
==

Running from installcheck: no
Test Protocol: none
...
configure: error: newly created file is older than distributed files!
Check your system clock
make: *** [config.status] Error 1

This doesn't appear to have anything to do with "make check-cc-no-c-o" mode, 
the problem is that the filesystem timestamp resolution was incorrectly 
detected:

configure:1965: checking whether sleep supports fractional seconds
configure:1979: result: true
configure:1982: checking the filesystem timestamp resolution
configure:2020: result: 0.1
configure:2024: checking whether build environment is sane
configure:2079: error: newly created file is older than distributed files!
Check your system clock

The filesystem I'm working in only records timestamps at second granularity. (I 
don't know why ls is printing .1 instead of .0 but it's always 
.1.)

$ touch A && sleep 0.1 && touch B && ls --full-time -t A B
-rw-r--r-- 1 zweinber users 0 2022-10-07 11:20:05.1 -0400 A
-rw-r--r-- 1 zweinber users 0 2022-10-07 11:20:05.1 -0400 B

I *think* this is a bug in _AM_FILESYSTEM_TIMESTAMP_RESOLUTION where, if it 
starts looping at *just the wrong time*, the first 0.1-second sleep will 
straddle a second boundary and we'll break out of the loop immediately and 
think we have 0.1-second timestamp resolution.
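
Concretely, the failure mode is something like this (a sketch of the
shape of the probe, not the literal macro text):

# Sketch only.  On a filesystem with 1-second timestamp granularity,
# a single trial like this will *appear* to demonstrate 0.1-second
# resolution whenever the sleep straddles a second boundary (and ls -t's
# ordering for equal timestamps is unspecified anyway).
touch conftest.ts1
sleep 0.1
touch conftest.ts2
newest=`ls -t conftest.ts1 conftest.ts2 | sed q`
if test "$newest" = conftest.ts2; then
  echo "looks like 0.1-second resolution -- but maybe we just got lucky"
fi
rm -f conftest.ts1 conftest.ts2

Repeating the trial a few times, and only believing a resolution that is
observed consistently, would avoid this false positive.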

zw



Re: make check(s) pre-release problems

2022-10-11 Thread Zack Weinberg
Please don't top-post on this mailing list.

On Tue, Oct 11, 2022, at 12:15 PM, Frederic Berat wrote:
> On Fri, Oct 7, 2022 at 6:11 PM Zack Weinberg  wrote:
>> On Thu, Oct 6, 2022, at 4:25 PM, Karl Berry wrote:
>>>> No errors on RHEL7+autoconf2.71
>>>
>>> Puzzling. Can you easily try RHEL8 or one of its derivatives?
>>> It surprises me that that is the culprit, but it seems possible.
>>
>> Unfortunately, no.
>
> I don't know if that will help, or if that is completely unrelated,
> but I'm currently stumbling into a weird issue while working on a
> new package release for autoconf on Fedora: about 200 tests are now
> failing, all related to aclocal checks.
>
> My current investigation shows that it would be related to a bash
> update from 5.1.x to 5.2x which seems to have changed the behavior
> of the "SHLVL" shell variable.

This is a known issue, fixed on Autoconf development trunk, see
https://git.savannah.gnu.org/cgit/autoconf.git/commit/?id=412166e185c00d6eacbe67dfcb0326f622ec4020

I intend to make a bugfix release of Autoconf in the near future.

> I actually wonder if your sudden "parallelism" failure could be
> somehow linked to an update of bash, similar to mine ?

It's certainly possible.

zw



Re: Question WHY is gnu make does not stop on error on rule

2022-12-08 Thread Zack Weinberg
On Thu, Dec 8, 2022, at 5:18 AM, aotto wrote:
> Hi,
> I use "automake" to setup a "gnu make" build-environment and I have the 
> following rule to create a special file:
> Problem: the "$(c_Meta)" failed with error but MAKE continue… why??
...
> $(csmkkernel_meta) $(csmqmsgque_meta) $(cslcconfig_meta):
>     @$(call green,$(c_Meta) -meta $@ -header $<)
>     @$(c_Meta) -meta $@ -header $< && touch $@
...
>    (file ".../NHI1/sbin/meta/c_Meta.tcl" line 1441)
> Reaping losing child 0x55901938a2b0 PID 26076
> Removing child 0x55901938a2b0 PID 26076 from chain.
> Successfully remade target file 
> '.../NHI1/theLink/csmsgque/.LibMqMsgque_cs.meta'.

Please: Get rid of those leading @ signs, run it again *without* -d, and post 
the *complete and unedited* output.

zw



Re: [bug#59991] [PATCH 0/2] Port tests to modern C

2022-12-13 Thread Zack Weinberg
On Mon, Dec 12, 2022, at 5:57 PM, Karl Berry wrote:
> Zack, would you like be co-maintainer or at least co-developer of
> Automake? There is, evidently, no one else in the world interested in
> being actively involved with Automake on the maintainer side.

I have to decline; I don't have anything like the time required and also I am 
almost completely unfamiliar with the innards of Automake (except for the 
handful of Perl modules shared with Autoconf).

zw



Re: Builds with 'configure' not at the top of srcdir

2023-01-13 Thread Zack Weinberg
On Thu, Jan 12, 2023, at 9:24 PM, Eduardo Hernández wrote:
> I've been trying to separate the build system and source directory
> completely. Part of that would be to have the 'configure' script in the
> 'build' directory, away from the 'src' directory

This goes against a basic design assumption in both Automake and Autoconf: you 
are expected to be making tarball releases that include the configure script 
and a bunch of other generated files (aclocal.m4, config.h.in, Makefile.in if 
Automake is used, etc.)  That assumption goes back to an era when most software 
distribution did _not_ involve direct access to the core developers' version 
control system (if there even _was_ one) and when asking people to install 
Automake and Autoconf (and all the stuff _they_ depend on, notably M4 and 
Perl), before they could install any _other_ GNU software, would've been a 
non-starter.

So, all the generated code assumes that the location of the configure script 
*defines* the top level of the source tree.  The `--srcdir` option is only 
intended for use as a manual override when the automatic probe for the location 
of the configure script doesn't work.

This design arguably could stand to be reconsidered, but it'll be a lot of work 
to get to a place where you can do the thing you would like to do.  Are you up 
for that work?

zw



Re: rhel8 test failure confirmation?

2023-04-04 Thread Zack Weinberg
On Tue, Apr 4, 2023, at 3:51 PM, Bogdan wrote:
>   Nice. The 0 and 1 may not be portable to each OS in the Universe 
> (see EXIT_SUCCESS and EXIT_FAILURE in exit(3)), but should be 
> good/portable enough for our goals. Or maybe some other simple solution.

ISO C guarantees that exit(0) has the same effect as exit(EXIT_SUCCESS).
[It does *not*, however, guarantee that EXIT_SUCCESS == 0.]

zw



Re: Issue with AM_PROG_LEX

2023-07-31 Thread Zack Weinberg
On Mon, Jul 31, 2023, at 7:37 AM, FX Coudert wrote:
> Hello,
>
> I have a configure.ac file that calls AM_PROG_LEX. This now generates 
> warnings:
...
> I am not sure I can actually fix those: AM_PROG_LEX does not seem to 
> accept an argument. Probably it should, and pass it down to AC_PROG_LEX 
> if provided?

Yes, this does ultimately need to be fixed in automake, but in the meantime you 
*may* be able to work around it by writing

AC_PROG_LEX([noyywrap])
AM_PROG_LEX

(Change "noyywrap" to "yywrap" if you actually need yywrap from libl.a.)

This will work iff AM_PROG_LEX uses AC_REQUIRE to invoke AC_PROG_LEX. I'm 
writing this on my phone so I can't easily check.

zw 



Re: Detect --disable-dependency-tracking in Makefile.am

2023-09-30 Thread Zack Weinberg
On Sat, Sep 30, 2023, at 1:07 AM, Jan Engelhardt wrote:
> On Saturday 2023-09-30 05:27, Dave Hart wrote:
>
>>I've added code to the ntp.org Makefile.am files to ensure the static
>>utility library libntp.a is up-to-date for each program that uses it, to
>>ensure the build is correct.  When building the project, this adds a bunch
>>of extra submake invocations which slows down the build.  I'd like to omit
>>those when --disable-dependency-tracking is used, to speed one-off builds
>>that are done by end-users and packagers.
>
> submake is pretty much independent of and from dependency tracking.
>
> The general direction contemporary projects take is to just not do submake
> anymore, because it's a slowdown whether or not deptracking is used.

To elaborate on that: it's been known since *1997* that recursive make is
always a bad idea. See Peter Miller's "Recursive Make Considered
Harmful" for the gory details.

It's too bad that Automake still, 26 years later, can only support non-recursive
builds in a fairly clunky way (manually adding subdirectory prefixes to every
single object file).  It shouldn't be *that* hard to make "subdirs = lib bin"
mush all the rules derived from {lib,bin}/Makefile.am into the top-level
generated Makefile.in, at least when there aren't any manually written rules
in the subdirectories.

zw



Re: rhel8 test failure confirmation?

2023-12-02 Thread Zack Weinberg
On Sat, Dec 2, 2023, at 6:58 AM, Mike Frysinger wrote:
> On 06 Apr 2023 21:29, Jacob Bachmeyer wrote:
>> Karl Berry wrote:
>> > jb> a more thorough test would locate the autom4te script and grep it
>> > for the perllibdir that was substituted when autoconf was
>> > configured.
>> >
>> > I guess that would work.
>> 
>> Challenge accepted.  Here's a refined version: 
>
> this doesn't work on systems that wrap `autom4te`.  Gentoo for example wraps
> all autoconf & automake scripts to support parallel installs of different
> versions.  this way we can easily have access to every autoconf version.  we
> got this idea from Mandrake, so we aren't the only ones ;).
...
> what is the first autoconf release that has the fix ?
> -mike

2.71 does not have the fix. Can you please test the 2.72d beta that I just put 
out
(see https://lists.gnu.org/archive/html/autoconf/2023-11/msg0.html)?

(yes, the version numbering for autoconf betas looks strange, it's a convention
that predates semver and I don't have time to argue with anyone about it)

zw



Re: rhel8 test failure confirmation?

2023-12-02 Thread Zack Weinberg
On Sat, Dec 2, 2023, at 6:37 PM, Karl Berry wrote:
> The best way to check if high-resolution 
> timestamps are available to autom4te is to have perl load 
> Autom4te::FileUtils and check if that also loaded Time::HiRes.
>
> The problem with that turned out to be that Time::HiRes got loaded from
> other system modules, resulting in the test thinking that autom4te used
> it when that wasn't actually the case. That's what happened in practice
> with your patch.

Would it help if we added a command line option to autom4te that made it report 
whether it thought it could use high resolution timestamps? Versions of 
autom4te that didn't recognize this option should be conservatively assumed not 
to support them.

(Of course there's the additional wrinkle that whether high resolution 
timestamps *work* depends on what filesystem autom4te.cache is stored in, but 
that's even harder to probe... one problem at a time?)

zw



Re: rhel8 test failure confirmation?

2023-12-02 Thread Zack Weinberg
On Sat, Dec 2, 2023, at 7:33 PM, Jacob Bachmeyer wrote:
> Zack Weinberg wrote:
>> Would it help if we added a command line option to autom4te that made
>> it report whether it thought it could use high resolution timestamps?
>> Versions of autom4te that didn't recognize this option should be
>> conservatively assumed not to support them.
>
> Why not just add that information to the --version message?  Add a
> "(HiRes)" tag somewhere if Time::HiRes is available?

Either way is no problem from my end, but it would be more work
for automake (parsing --version output, instead of just checking the
exit status of autom4te --assert-high-resolution-timestamp-support)
Karl, do you have a preference here?  I can make whatever you decide
on happen, in the next couple of days.

>> (Of course there's the additional wrinkle that whether high
>> resolution timestamps *work* depends on what filesystem
>> autom4te.cache is stored in
>
> Is this actually still a problem (other than for ensuring the cache is
> used in the testsuite)

I don't *think* so but I don't understand the problem 100% so I
could be missing something.

zw



Re: rhel8 test failure confirmation?

2023-12-02 Thread Zack Weinberg
On Sat, Dec 2, 2023, at 8:58 PM, Jacob Bachmeyer wrote:
> Zack Weinberg wrote:
>> Either way is no problem from my end, but it would be more work
>> for automake (parsing --version output, instead of just checking the
>> exit status of autom4te --assert-high-resolution-timestamp-support)
>> Karl, do you have a preference here?  I can make whatever you decide
>> on happen, in the next couple of days.
>
> There would not need to be much parsing, just "automake --version | grep 
> HiRes" in that case, or "case `automake --version` in *HiRes*) ...;; 
> esac" to avoid running grep if you want.

I specifically want to hear what Karl thinks.

zw



Re: rhel8 test failure confirmation?

2023-12-04 Thread Zack Weinberg
On Sun, Dec 3, 2023, at 4:49 PM, Karl Berry wrote:
>>> There would not need to be much parsing, just "automake --version | grep
>>> HiRes" in that case, or "case `automake --version` in *HiRes*) ...;;
>>> esac" to avoid running grep if you want.
>>
>> I specifically want to hear what Karl thinks.
>
> I lean towards Jacob's view that automake --version | grep HiRes will
> suffice. Not having a new option seems simpler/better in terms of later
> understanding, too. --thanks, karl.

Did I misunderstand which program's --version output we are talking about?
I thought we were talking about automake's testsuite probing the behavior
of *autom4te*, but all the quoted text seems to be imagining a probe of
*automake* instead.  Yinz can change automake yourselves, you don't need
me to jump in :-)

Assuming this is really about autom4te, how does this look?

$ ./tests/autom4te --version
autom4te (GNU Autoconf) 2.72d.6-49ab3-dirty (subsecond timestamps supported)
Copyright (C) 2023 Free Software Foundation, Inc.
[etc]

If subsecond timestamps are *not* supported, you'll get instead

autom4te (GNU Autoconf) 2.72d.6-49ab3-dirty
Copyright (C) 2023 Free Software Foundation, Inc.
[etc]

I'm not using the identifier "HiRes" because the use of Time::HiRes is an
implementation detail that could change.  For instance, if there's a third
party CPAN module that lets us get at nanosecond-resolution timestamps
*without* loss of precision due to conversion to an NV (aka IEEE double)
we could, in the future, look for that first.

The implementation is just

BEGIN
{
  our $subsecond_timestamps = 0;
  eval
    {
      require Time::HiRes;
      import Time::HiRes qw(stat);
      $subsecond_timestamps = 1;
    }
}

Jacob, can you confirm that's an accurate test, given all the things you
said earlier about ways that grepping the source code might get it wrong?

zw



[RFC PATCH]: autom4te: report subsecond timestamp support in --version

2023-12-04 Thread Zack Weinberg
The Automake test suite wants this in order to know if it’s safe to
reduce the length of various delays for the purpose of ensuring files
in autom4te.cache are newer than the corresponding source files.

* lib/Autom4te/FileUtils.pm: Provide (but do not export) a flag
  $subsecond_mtime, indicating whether the ‘mtime’ function reports
  modification time with precision greater than one second.
  Reorganize commentary and import logic for clarity.  Add
  configuration for emacs’ perl-mode to the bottom of the file.

* bin/autom4te.in ($version): If $Autom4te::FileUtils::subsecond_mtime
  is true, add the text “ (subsecond timestamps supported)” to the
  first line of --version output.
---
 bin/autom4te.in   |  6 ++--
 lib/Autom4te/FileUtils.pm | 71 ---
 2 files changed, 56 insertions(+), 21 deletions(-)

diff --git a/bin/autom4te.in b/bin/autom4te.in
index 38a61ac9..9a2e5f12 100644
--- a/bin/autom4te.in
+++ b/bin/autom4te.in
@@ -207,8 +207,10 @@ General help using GNU software: 
.
 
 # $VERSION
 # 
-$version = "autom4te (@PACKAGE_NAME@) @VERSION@
-Copyright (C) @RELEASE_YEAR@ Free Software Foundation, Inc.
+$version = "autom4te (@PACKAGE_NAME@) @VERSION@"
+  . ($Autom4te::FileUtils::subsecond_mtime
+ ? " (subsecond timestamps supported)\n" : "\n")
+  . "Copyright (C) @RELEASE_YEAR@ Free Software Foundation, Inc.
 License GPLv3+/Autoconf: GNU GPL version 3 or later
 , 
 This is free software: you are free to change and redistribute it.
diff --git a/lib/Autom4te/FileUtils.pm b/lib/Autom4te/FileUtils.pm
index c1e8e8c3..06f87c31 100644
--- a/lib/Autom4te/FileUtils.pm
+++ b/lib/Autom4te/FileUtils.pm
@@ -38,24 +38,45 @@ use 5.006;
 use strict;
 use warnings FATAL => 'all';
 
-use Exporter;
+BEGIN
+{
+  require Exporter;
+  our @ISA = qw (Exporter);
+  our @EXPORT = qw (&contents
+   &find_file &mtime
+   &update_file
+   &xsystem &xsystem_hint &xqx
+   &dir_has_case_matching_file &reset_dir_cache
+   &set_dir_cache_file);
+}
+
+# Use sub-second resolution file timestamps if available, carry on
+# with one-second resolution timestamps if Time::HiRes is not available.
+#
+# Unfortunately, even if Time::HiRes is available, we don't get
+# timestamps to the full precision recorded by the operating system,
+# because Time::HiRes converts timestamps to floating-point, and the
+# rounding error is hundreds of nanoseconds for circa-2023 timestamps
+# in IEEE double precision.  But this is the best we can do without
+# dropping down to C.
+#
+# $subsecond_mtime is not exported, but is intended for external
+# consumption, as $Autom4te::FileUtils::subsecond_mtime.
+BEGIN
+{
+  our $subsecond_mtime = 0;
+  eval
+{
+  require Time::HiRes;
+  import Time::HiRes qw(stat);
+  $subsecond_mtime = 1;
+}
+}
+
 use IO::File;
-
-# use sub-second resolution timestamps if available,
-# carry on with one-second resolution timestamps if that is all we have
-BEGIN { eval { require Time::HiRes; import Time::HiRes qw(stat) } }
-
 use Autom4te::Channels;
 use Autom4te::ChannelDefs;
 
-our @ISA = qw (Exporter);
-our @EXPORT = qw (&contents
- &find_file &mtime
- &update_file
- &xsystem &xsystem_hint &xqx
- &dir_has_case_matching_file &reset_dir_cache
- &set_dir_cache_file);
-
 =over 4
 
 =item C
@@ -122,11 +143,6 @@ sub mtime ($)
 $atime,$mtime,$ctime,$blksize,$blocks) = stat ($file)
 or fatal "cannot stat $file: $!";
 
-  # Unfortunately Time::HiRes converts timestamps to floating-point, and the
-  # rounding error can be hundreds of nanoseconds for circa-2023 timestamps.
-  # Perhaps some day Perl will support accurate file timestamps.
-  # For now, do the best we can without going outside Perl.
-
   return $mtime;
 }
 
@@ -394,3 +410,20 @@ sub set_dir_cache_file ($$)
 =cut
 
 1; # for require
+
+### Setup "GNU" style for perl-mode and cperl-mode.
+## Local Variables:
+## perl-indent-level: 2
+## perl-continued-statement-offset: 2
+## perl-continued-brace-offset: 0
+## perl-brace-offset: 0
+## perl-brace-imaginary-offset: 0
+## perl-label-offset: -2
+## cperl-indent-level: 2
+## cperl-brace-offset: 0
+## cperl-continued-brace-offset: 0
+## cperl-label-offset: -2
+## cperl-extra-newline-before-brace: t
+## cperl-merge-trailing-else: nil
+## cperl-continued-statement-offset: 2
+## End:
-- 
2.41.0




Re: [RFC PATCH]: autom4te: report subsecond timestamp support in --version

2023-12-05 Thread Zack Weinberg
On Mon, Dec 4, 2023, at 7:26 PM, Jacob Bachmeyer wrote:
> Now that I have seen the actual patch, yes, this test should be
> accurate.  The test in the main autom4te script will also work, even
> if there is a mismatch between the script and its library

Good.

> This appears to be misaligned with the GNU Coding Standards, which
> states:  "The first line is meant to be easy for a program to parse;
> the version number proper starts after the last space."
>
> Perhaps the best option would be to conditionally add a line "This
> autom4te supports subsecond timestamps." after the license notice?

I don't like putting anything after the license notice because it's
convenient to be able to pipe --version output to sed '/Copyright/,$d'
without losing anything relevant for troubleshooting.  So how about

$ autom4te --version
autom4te (GNU Autoconf) 2.71
Features: subsecond-timestamps

Copyright (C) 2021 Free Software Foundation, Inc.
License GPLv3+/Autoconf: GNU GPL version 3 or later
, 
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.

Written by Akim Demaille.

This preserves the effectiveness of sed '/Copyright/,$d' and also
leaves room for future additions to the "Features:" line.
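
On the consuming side, I'd imagine something like this (sketch only;
"subsecond" is a placeholder for whatever feature name we settle on):

# Hypothetical consumer: look at the proposed second line of --version.
case `autom4te --version 2>/dev/null | sed -n '2p'` in
  Features:*subsecond*) am_autom4te_subsecond=yes ;;
  *) am_autom4te_subsecond=no ;;
esac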

zw



Re: rhel8 test failure confirmation?

2023-12-05 Thread Zack Weinberg
On Mon, Dec 4, 2023, at 7:14 PM, Jacob Bachmeyer wrote:
> Zack Weinberg wrote:
[snip everything addressed in the other thread]
> Yes, there was a bit of confusion here; not only is the FileUtils
> module synchronized between autom4te and automake

Thanks for reminding me that I need to make sure all those files are
actually in sync before I cut the final 2.72 release.

>>   require Time::HiRes;
>>   import Time::HiRes qw(stat);
>
> I believe that the import is not actually necessary

The previous line is a "require", not a "use", so I believe it _is_
necessary.  Have I misunderstood?

> ... should do no harm as long as any use of stat in the code
> is prepared to handle floating-point timestamps.

There's only one use, in 'sub mtime', and that's the place
where we actively want the floating-point timestamps.

zw



[PATCH v2, committed] autom4te: report subsecond timestamp support in --version

2023-12-06 Thread Zack Weinberg
The Automake test suite wants this in order to know if it’s safe to
reduce the length of various delays for the purpose of ensuring files
in autom4te.cache are newer than the corresponding source files.  We
can also take advantage of this to speed up a couple of tests in our
own testsuite.

* lib/Autom4te/FileUtils.pm: Provide (but do not export) a flag
  $subsecond_mtime, indicating whether the ‘mtime’ function reports
  modification time with precision greater than one second.
  Reorganize commentary and import logic for clarity.  Add
  configuration for emacs’ perl-mode to the bottom of the file.
* bin/autom4te.in ($version): If $Autom4te::FileUtils::subsecond_mtime
  is true, print “Features: subsecond-mtime” as the second line
  of --version output.

* lib/autotest/general.m4: Move definitions of AS_MESSAGE_LOG_FD,
  AT_JOB_FIFO_IN_FD, and AT_JOB_FIFO_OUT_FD to top level and change
  the latter two to be defined using _AT_DEFINE_INIT.  This enables
  use of AS_MESSAGE_LOG_FD in AT_TESTS_PREPARE code.

* tests/local.at (AT_MTIME_DELAY): New utility that delays a test for
  an appropriate time to ensure all files created before its use are
  considered older than all files created afterward.  The delay will
  be as short as possible.
  (AT_TESTS_PREPARE): Calculate and log the delay used by AT_MTIME_DELAY.
  Only use a delay shorter than one second if the build filesystem,
  our autom4te, and the automake on the PATH all support this.
  (AT_CMP): En passant removal of unnecessary blank lines.

* tests/tools.at: Use AT_MTIME_DELAY, instead of sleeping for a
  hardcoded interval, everywhere the delay was to control file
  timestamps.
---
 bin/autom4te.in   |  6 ++-
 lib/Autom4te/FileUtils.pm | 71 +++-
 lib/autotest/general.m4   | 20 ++---
 tests/local.at| 86 +--
 tests/tools.at| 26 +---
 5 files changed, 164 insertions(+), 45 deletions(-)

diff --git a/bin/autom4te.in b/bin/autom4te.in
index 38a61ac9..8957a6c9 100644
--- a/bin/autom4te.in
+++ b/bin/autom4te.in
@@ -207,8 +207,10 @@ General help using GNU software: 
.
 
 # $VERSION
 # 
-$version = "autom4te (@PACKAGE_NAME@) @VERSION@
-Copyright (C) @RELEASE_YEAR@ Free Software Foundation, Inc.
+$version = "autom4te (@PACKAGE_NAME@) @VERSION@\n"
+  . ($Autom4te::FileUtils::subsecond_mtime
+ ? "Features: subsecond-mtime\n" : "")
+  . "\nCopyright (C) @RELEASE_YEAR@ Free Software Foundation, Inc.
 License GPLv3+/Autoconf: GNU GPL version 3 or later
 , 
 This is free software: you are free to change and redistribute it.
diff --git a/lib/Autom4te/FileUtils.pm b/lib/Autom4te/FileUtils.pm
index c1e8e8c3..06f87c31 100644
--- a/lib/Autom4te/FileUtils.pm
+++ b/lib/Autom4te/FileUtils.pm
@@ -38,24 +38,45 @@ use 5.006;
 use strict;
 use warnings FATAL => 'all';
 
-use Exporter;
+BEGIN
+{
+  require Exporter;
+  our @ISA = qw (Exporter);
+  our @EXPORT = qw (&contents
+   &find_file &mtime
+   &update_file
+   &xsystem &xsystem_hint &xqx
+   &dir_has_case_matching_file &reset_dir_cache
+   &set_dir_cache_file);
+}
+
+# Use sub-second resolution file timestamps if available, carry on
+# with one-second resolution timestamps if Time::HiRes is not available.
+#
+# Unfortunately, even if Time::HiRes is available, we don't get
+# timestamps to the full precision recorded by the operating system,
+# because Time::HiRes converts timestamps to floating-point, and the
+# rounding error is hundreds of nanoseconds for circa-2023 timestamps
+# in IEEE double precision.  But this is the best we can do without
+# dropping down to C.
+#
+# $subsecond_mtime is not exported, but is intended for external
+# consumption, as $Autom4te::FileUtils::subsecond_mtime.
+BEGIN
+{
+  our $subsecond_mtime = 0;
+  eval
+{
+  require Time::HiRes;
+  import Time::HiRes qw(stat);
+  $subsecond_mtime = 1;
+}
+}
+
 use IO::File;
-
-# use sub-second resolution timestamps if available,
-# carry on with one-second resolution timestamps if that is all we have
-BEGIN { eval { require Time::HiRes; import Time::HiRes qw(stat) } }
-
 use Autom4te::Channels;
 use Autom4te::ChannelDefs;
 
-our @ISA = qw (Exporter);
-our @EXPORT = qw (&contents
- &find_file &mtime
- &update_file
- &xsystem &xsystem_hint &xqx
- &dir_has_case_matching_file &reset_dir_cache
- &set_dir_cache_file);
-
 =over 4
 
 =item C
@@ -122,11 +143,6 @@ sub mtime ($)
 $atime,$mtime,$ctime,$blksize,$blocks) = stat ($file)
 or fatal "cannot stat $file: $!";
 
-  # Unfortunately Time::HiRes converts timestamps to floating-point, and the
-  # rounding error can be hundreds of nanoseconds for circa-2023 timestamps.
-  # Perhaps some day Perl will 

Re: [RFC PATCH]: autom4te: report subsecond timestamp support in --version

2023-12-06 Thread Zack Weinberg
On Tue, Dec 5, 2023, at 11:30 PM, Jacob Bachmeyer wrote:
> Zack Weinberg wrote:
>> $ autom4te --version
>> autom4te (GNU Autoconf) 2.71
>> Features: subsecond-timestamps
>>
>> Copyright (C) 2021 Free Software Foundation, Inc.
>> License GPLv3+/Autoconf: GNU GPL version 3 or later
>> <https://gnu.org/licenses/gpl.html>, 
>> <https://gnu.org/licenses/exceptions.html>
>> This is free software: you are free to change and redistribute it.
>> There is NO WARRANTY, to the extent permitted by law.
>>
>> Written by Akim Demaille.
>>
>> This preserves the effectiveness of sed '/Copyright/,$d' and also
>> leaves room for future additions to the "Features:" line.
>
> That looks like a good idea to me [...]

Karl also approved, so I have gone ahead and implemented this and
pushed it to Autoconf.  In the process I discovered that *both*
automake and autom4te need to advertise the capability for subsecond
timestamps; expect a patch for Automake shortly.

zw



What does (did?) it mean when aclocal exits with code 63?

2023-12-22 Thread Zack Weinberg
One of Autoconf's regression tests includes:

# If Autoconf is too old, or the user has turned caching off, skip:
AT_CHECK([aclocal || { ret=$?; test $ret -eq 63 && ret=77; exit $ret; }],
 [], [], [ignore])

The effect is to skip the rest of the test if aclocal's exit status is 63.
This code was written in 2006, I can't find any documentation on what
it means for aclocal to exit with code 63, and I also can't find any
code in the current automake git repo that looks like it would make
aclocal exit with code 63.  (*automake* can exit with code 63 under
some circumstances, but it really looks like aclocal never will.)

Does anyone remember what situation this might have been trying to detect?

Thanks,
zw



Unavailable due to hardware problems

2024-01-23 Thread Zack Weinberg
In case anyone is waiting for me to respond to questions or write a patch or 
anything, please be advised that my work computer has crashed hard and until 
replacement components arrive (early next week) I can't do much of anything.



Re: GNU Coding Standards, automake, and the recent xz-utils backdoor

2024-04-01 Thread Zack Weinberg
On Sun, Mar 31, 2024, at 3:17 AM, Jacob Bachmeyer wrote:
> Eric Gallager wrote:
>> Specifically, what caught my attention was how the release tarball
>> containing the backdoor didn't match the history of the project in its
>> git repository. That made me think about automake's `distcheck`
>> target, whose entire purpose is to make it easier to verify that a
>> distribution tarball can be rebuilt from itself and contains all the
>> things it ought to contain.
>
> The problem is that a release tarball is a freestanding object, with no 
> dependency on the repository from which it was produced.  In this case, 
> the attacker added a bogus "update" of build-to-host.m4 from gnulib to 
> the release tarball, but that file is not stored in the Git repository.  
> This would not have tripped "make distcheck" because the crocked tarball 
> can indeed be used to rebuild another crocked tarball.
>
> As Alexandre Oliva mentioned in his reply, there is not really any good 
> way to prevent this, since the attacker could also patch the generated 
> configure script more directly.

I have been thinking about this incident and this thread all weekend and
have seen a lot of people saying things like "this is more proof that tarballs
are a thing of the past and everyone should just build straight from git".
There are a bunch of reasons why one might disagree with this as a blanket
statement, but I do think there's a valid point here: the malicious xz
maintainer *might* have been caught earlier if they had committed the
build-to-host.m4 modification to xz's VCS.  (Or they might not have!
Witness the three (and counting) malicious patches that they barefacedly
submitted to *other* software and got accepted because the malice was
subtle enough to pass through code review.)

It might indeed be worth thinking about ways to minimize the difference
between the tarball "make dist" produces and the tarball "git archive"
produces, starting from the same clean git checkout, and also ways to
identify and audit those differences.
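
(Concretely, I'm imagining an audit step along these lines -- an untested
sketch, with made-up tag and package names:)

# Untested sketch; "v1.0" and "foo-1.0" are made-up names, and this
# assumes the checkout being built is also the release tag.
git archive --prefix=foo-1.0/ -o vcs-snapshot.tar v1.0
make dist                                # produces foo-1.0.tar.gz
rm -rf cmp && mkdir cmp && cd cmp
tar xf  ../vcs-snapshot.tar && mv foo-1.0 from-vcs
tar xzf ../foo-1.0.tar.gz   && mv foo-1.0 from-dist
diff -qr from-vcs from-dist              # every line here is something to audit
cd ..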

...
> Maybe the best revision to the GNU Coding Standards would be that 
> releases should, if at all possible, contain only text?  Any binary 
> files needed for testing can be generated during "make check" if 
> necessary

I don't think this is a good idea.  It's only a speed bump for someone
trying to smuggle malicious data into a package (think "base64 -d") and
it makes life substantially harder for honest authors of programs that
work with binary data, and authors of material whose "source code"
(as GPLv3 uses that term) *is* binary data.  Consider pngsuite, for
instance (http://www.schaik.com/pngsuite/) -- it would be a *ton* of
work to convert each of these test PNG files into GNU Poke scripts,
and probably the result would be *less* ergonomic for purposes of
improving the test suite.

I would like to suggest that a more useful policy would be "files
written to $prefix by 'make install' should not have any data
dependency on files labeled as part of the package's testsuite".
That doesn't constrain honest authors and it seems within the
scope of what the reproducible builds people could test for.
(Build the package, install to nonce prefix 1, unpack the tarball
again, delete the test suite, build again, install to prefix 2, compare.)
Of course a sufficiently determined malicious coder could detect
the reproducible-build test environment, but unlike "no binary data"
this is a substantial difficulty increment.
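
(In shell terms, the check I have in mind is roughly the following -- a
sketch only, with package-specific placeholders; deleting the test suite
outright may need per-package adjustment to keep the build running:)

tar xzf foo-1.0.tar.gz && cd foo-1.0
./configure --prefix=/tmp/with-tests && make && make install
cd .. && rm -rf foo-1.0

tar xzf foo-1.0.tar.gz && cd foo-1.0
rm -rf tests
./configure --prefix=/tmp/without-tests && make && make install
cd ..

# Any difference means an installed file depends on test-suite data.
diff -r /tmp/with-tests /tmp/without-tests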

zw



Re: GNU Coding Standards, automake, and the recent xz-utils backdoor

2024-04-01 Thread Zack Weinberg
On Mon, Apr 1, 2024, at 2:04 PM, Russ Allbery wrote:
> "Zack Weinberg"  writes:
>> It might indeed be worth thinking about ways to minimize the
>> difference between the tarball "make dist" produces and the tarball
>> "git archive" produces, starting from the same clean git checkout,
>> and also ways to identify and audit those differences.
>
> There is extensive ongoing discussion of this on debian-devel. There's
> no real consensus in that discussion, but I think one useful principle
> that's emerged that doesn't disrupt the world *too* much is that the
> release tarball should differ from the Git tag only in the form of
> added files. Any files that are present in both Git and in the release
> tarball should be byte-for-byte identical.

That dovetails nicely with something I was thinking about myself.
Obviously the result of "make dist" should be reproducible except for
signatures; to the extent it isn't already, those are bugs in automake.
But also, what if "make dist" produced *two* disjoint tarballs? One of
which is guaranteed to be byte-for-byte identical to an archive of the
VCS at the release tag (in some clearly documented fashion; AIUI, "git
archive" does *not* do what we want).  The other contains all the files
that "autoreconf -i" or "./bootstrap.sh" or whatever would create, but
nothing else.  Diffs could be provided for both tarballs, or only for
the VCS-archive tarball, whichever turns out to be more compact (I can
imagine the diff for the generated-files tarball turning out to be
comparable in size to the generated-files tarball itself).

This should make it much easier to find, and therefore audit, the pre-
generated files, and to validate that there's no overlap. It would add
an extra step for people who want to build from tarball, without having
to install autoconf (or whatever) first -- but an easier extra step
than, y'know, installing autoconf. :)  Conversely, people who want to
build from tarballs but *not* use the pre-generated configure, etc,
could now download the 'bare' tarball only.

("Couldn't those people just build from a git checkout?"  Not if they
don't have the tooling for it, not during early stages of a distribution
bootstrap, etc.  Also, the act of publishing a tarball that's a golden
copy of the VCS at the release tag is valuable for archival purposes.)

zw



Re: GCC reporting piped input as a security feature

2024-04-11 Thread Zack Weinberg
On Tue, Apr 9, 2024, at 11:35 PM, Jacob Bachmeyer wrote:
> Jan Engelhardt wrote:
>> On Tuesday 2024-04-09 05:37, Jacob Bachmeyer wrote:
>>
>>> In principle it could be possible to output something different to
>>> describe this strange situation explicitly.  For instance, output
>>> "via stdin" as a comment, or output `stdin/../filename' as the file
>>> name. (Programs that optimize the file name by deleting XXX/.../
>>> are likely not to check whether XXX is a real directory.)
...

How about `/dev/stdin/-` if no filename has been specified with #line or
whatever, and `/dev/stdin/[filename]` if one has, where [filename] is
the specified filename with all leading dots and slashes stripped,
falling back to `-` if empty? /dev/stdin can be relied on to either not
exist or not be a directory, so these shouldn't ever be openable.

zw



Re: Bug Resilience Program of German Sovereign Tech Fund

2024-06-06 Thread Zack Weinberg
On Thu, Jun 6, 2024, at 10:37 AM, Dan Kegel wrote:
> That's a really good idea.  Automake and Autotools in general underpin
> a fair amount of key open source software, but is taken for granted.
>
> Ideas for making the case for funding: identify...
> - how many commonly used Debian/Ubuntu/Alpine packages and/or GitHub
> projects rely on automake/Autotools
> - markers of risk (e.g. very out of date or buggy embedded Autotools files)
> - problems this has caused in the past
> - related government standards for software quality
> All of that might help make the case.

On the autoconf side of the fence, I do not have spare cycles to write a
proposal all by myself, but I would be interested in working with one or
more other people to write such a proposal, and in doing some of the
work.  In particular, I already have a mental list of things that
*ought* to be done in the near future for autoconf but I do not have the
capacity to do by myself on a volunteer basis.

I'm adding Sumana Harihareswara to the cc: list; she helped me secure the
funding for the release of autoconf 2.70 and might be interested in working
on this pitch as well.

zw



Re: 1.16.90 regression: configure now takes 7 seconds to start

2024-06-12 Thread Zack Weinberg
On Tue, Jun 11, 2024, at 7:00 PM, Karl Berry wrote:
> Maybe this is a silly question, but, is there a reason why this test
> needs to be performed in every single package that uses Automake?
>
> I was under the impression that the purpose of this test was
> merely to speed up running Automake's own test suite.
>
> There are other packages that have many tests that presumably also
> substantially benefit?

The timestamp resolution is actually used by core automake code, namely
the "ensuring generated files are newer than source" config.status step,
which is necessary to make tarball builds work reliably on systems that
don't have the `automake`, `autoconf`, `aclocal`, etc. tools already
installed.

I have an idea for how to streamline the whole mess, and will tinker
with it early next week.  Have to concentrate on the day job for the
remainder of this week.

zw



Re: 1.16.90 regression: configure now takes 7 seconds to start

2024-06-17 Thread Zack Weinberg
On Mon, Jun 17, 2024, at 10:30 PM, Jacob Bachmeyer wrote:
...

Don't have enough brain right now to comment on any of the rest of your 
suggestions, but:

>once conftest.file is newer than configure, surely 
> config.status, which is produced after all tests are run, /must/ also be 
> newer than configure?
>
> How is this last check/delay actually necessary?  Are there broken 
> systems out there that play games with causality?

I regret to say, yes, there are. For example, this can happen with NFS if there 
are multiple clients updating the same files and they don't all agree on the 
current time. Think build farm with several different configurations being 
built out of the same srcdir - separate build dirs, of course, but that doesn't 
actually help here since the issue is ensuring the Makefile doesn't think 
*configure* (not config.status) needs rebuilt.

zw



Re: 1.16.90 regression: configure now takes 7 seconds to start

2024-06-18 Thread Zack Weinberg
On Tue, Jun 18, 2024, at 12:02 AM, Jacob Bachmeyer wrote:
> Zack Weinberg wrote:
>> On Mon, Jun 17, 2024, at 10:30 PM, Jacob Bachmeyer wrote:
>>
>> I regret to say, yes, there are. For example, this can happen with
>> NFS if there are multiple clients updating the same files and they
>> don't all agree on the current time. Think build farm with several
>> different configurations being built out of the same srcdir -
>> separate build dirs, of course, but that doesn't actually help here
>> since the issue is ensuring the Makefile doesn't think *configure*
>> (not config.status) needs rebuilt.
>
> Wait... all of configure's non-system dependencies are in the release
> tarball and presumably (if "make dist" worked correctly) backdated
> older than configure when the tarball is unpacked.

In my experience, tarballs cannot be trusted to get this right, *and*
tar implementations cannot be trusted to unpack them accurately
(e.g. despite POSIX I have run into implementations that defaulted to
the equivalent of GNU tar's --touch mode).  Subsequent bounces through
downstream repackaging do not help.   Literally as I type this I am
watching gettext 0.22 run its ridiculous number of configure scripts a
second time from inside `make`.

> Does "make dist" need to touch configure to ensure that it is newer
> than its dependencies before rolling the tarball?

It ought to, but I don't think that will be more than a marginal
improvement, and touching the top-level configure won't be enough,
you'd need to do a full topological sort on the dependency graph
leading into every configure + every Makefile.in + every other
generated-but-shipped file and make sure that each tier of generated
files is newer than its inputs.

I wonder if a more effective approach would be to disable the rules to
regenerate configure, Makefile.in, etc. unless either --enable-maintainer-mode
or we detect that we are building out of a VCS checkout.

zw



Re: 1.16.90 regression: configure now takes 7 seconds to start

2024-06-18 Thread Zack Weinberg
On Tue, Jun 18, 2024, at 10:19 AM, Bruno Haible wrote:
> Zack Weinberg wrote:
>> Literally as I type this I am
>> watching gettext 0.22 run its ridiculous number of configure scripts a
>> second time from inside `make`.
>
> You can run into such problems:
>   - if you take a tarball of a git repository that contains the generated
> 'configure' file, or

Closely related scenario for me: building from a *checkout* of a git repository
that contains the generated configure files.  This is very common for
Linux distributions and I think we have to do what we can to facilitate it.

> That might be overkill. For simple packages with 1 configure script,
> the following is sufficient in my experience:
>
>   # Bring the time stamps into an order that will not require autoconf, 
> automake, etc. to run again.
>   sleep 1; touch aclocal.m4
>   sleep 1; touch configure
>   sleep 1; touch config.h.in
>   sleep 1; touch `find . -name Makefile.in -type f`

I think that if we are going to bother addressing this at all (and I currently
think we *should*, see above), we need to handle the general case.  Isn't that
the whole point of Automake, that it deals with this kind of annoying detail
*correctly*, so that all its users don't have to worry about it?

(Why *does* gettext need seven configure scripts, anyway?)

zw



RFC: Add public macros AC_LOG_CMD and AC_LOG_FILE.

2024-06-23 Thread Zack Weinberg
I'm thinking of making AC_RUN_LOG, which has existed forever but is
undocumented, an official documented macro under the new name
AC_LOG_CMD.  I'm renaming it because I also want to add AC_LOG_FILE, a
generalization of _AC_MSG_LOG_CONFTEST.

These are handy any time you want to record details of why some test got
the result it did in config.log; AC_COMPILE_IFELSE and friends have been
able to do this forever, but hand-written tests of the shell environment
or of interpreters can't without reaching into autoconf's guts.
Automake has been carrying around its own copy of AC_LOG_CMD (under the
name AM_LOG_CMD) forever as well...

I haven't *implemented* this yet, but here's the proposed documentation.
What do you think?

zw

+@node Logging
+@section Logging Details of Tests
+
+It is helpful to record details of @emph{why} each test got the result
+that it did, in @file{config.log}.  The macros that compile test
+programs (@code{AC_COMPILE_IFELSE} etc.; @pxref{Writing Tests}) do this
+automatically, but if you write a test that only involves M4sh and basic
+shell commands, you will need to do it yourself, using the following macros.
+
+@anchor{AC_LOG_CMD}
+@defmac AC_LOG_CMD (@var{shell-command})
+Execute @var{shell-command}.
+Record @var{shell-command} in @file{config.log}, along with any error
+messages it printed (specifically, everything it wrote to its standard
+error) and its exit code.
+
+This macro may be used within a command substitution, or as the test
+argument of @code{AS_IF} or a regular shell @code{if} statement.
+@end defmac
+
+@anchor{AC_LOG_FILE}
+@defmac AC_LOG_FILE (@var{file}, @var{label})
+Record the contents of @var{file} in @file{config.log}, labeled with
+@var{label}.
+@end defmac
+
+Here is an example of how to use these macros to test for a feature of
+the system `awk'.
+
+@smallexample
+AC_PROG_AWK
+AC_MSG_CHECKING([whether $AWK supports 'asort'])
+cat > conftest.awk <<\EOF
+[@{ lines[NR] = $0 @}
+END @{
+  ORS=" "
+  asort(lines)
+  for (i in lines) @{
+print lines[i]
+  @}
+@}]
+EOF
+
+AS_IF([result=`AC_LOG_CMD([printf 'z\ny\n' | $AWK -f conftest.awk])` &&
+   test x"$result" = x"y z"],
+  [AWK_HAS_ASORT=yes],
+  [AWK_HAS_ASORT=no
+   AC_LOG_FILE([conftest.awk], [test awk script])])
+AC_MSG_RESULT([$AWK_HAS_ASORT])
+AC_SUBST([AWK_HAS_ASORT])
+rm -f conftest.awk
+@end smallexample
 
 @c == Programming in M4.



Re: RFC: Add public macros AC_LOG_CMD and AC_LOG_FILE.

2024-06-24 Thread Zack Weinberg
On Mon, Jun 24, 2024, at 2:56 AM, Nick Bowler wrote:
> On 2024-06-23 22:23, Zack Weinberg wrote:
>> I'm thinking of making AC_RUN_LOG, which has existed forever but is
>> undocumented, an official documented macro ...
>
> Yes, please!
>
> I will note that Autoconf has a lot of "run and log a command" internal
> macros with various comments of the form "doesn't work well" suggesting
> that this is a hard feature to get right.

... Wow, this is a bigger mess than I thought last night.  Up to bad
quotation in third party macros, however, I *think* almost all of it
is obsolete and can be scrapped.  Stay tuned.

> I think at the same time it would be worth documenting the AS_LINENO
> functionality, which is the main internal functionality of these
> macros that (unless you just goes ahead and use it) Autoconf users
> can't really replicate in their own logging.

I believe what you mean is you want _AS_ECHO_LOG to be promoted to a
documented and supported macro, and for AS_LINENO_PUSH and AS_LINENO_POP
also to be documented for external use.  Is this correct?  Did I miss
any other internal macros that ought to be available for external
use?  I don't think we should tell people to be using $as_lineno directly,
is there some use case for it that isn't covered by existing macros?

> If you implement this, please explain in the manual what "labeled with
> /label/" really means, otherwise I'm left wondering why this macro
> exists when we can almost as easily write something like:
>
>   { echo label; cat file; } >&AS_MESSAGE_LOG_FD
>
> Including example logfile output together with the example program
> might be sufficient.

Will do.  The main point of the macro is that it does something a little
fancier than "cat file", so it's unambiguous where normal log output
resumes. Like the existing _AC_MSG_LOG_CONFTEST does:

configure: failed program was:
| /* confdefs.h */
| #define PACKAGE_NAME "lexlib-probe"
| #define PACKAGE_TARNAME "lexlib-probe"
| #define PACKAGE_VERSION "1"
| ... etc ...
configure: result: no

The "label" will go where _AC_MSG_LOG_CONFTEST prints "failed program was".

zw



Re: RFC: Add public macros AC_LOG_CMD and AC_LOG_FILE.

2024-06-26 Thread Zack Weinberg
On Sun, Jun 23, 2024, at 10:23 PM, Zack Weinberg wrote:
> I'm thinking of making AC_RUN_LOG, which has existed forever but is
> undocumented, an official documented macro under the new name
> AC_LOG_CMD.  I'm renaming it because I also want to add AC_LOG_FILE, a
> generalization of _AC_MSG_LOG_CONFTEST.

FYI, this seemed straightforward to implement but has exposed a whole
bunch of latent problems, so it'll be a week or two more before I have
patches ready to go, but there's no insurmountable obstacles.  You can
all follow along at
<https://git.savannah.gnu.org/cgit/autoconf.git/log/?h=zack/ac-log-cmd>.
(This branch will definitely be rebased at least once before things are
ready to merge.)

zw



Re: [platform-testers] automake-1.16.92 released

2024-07-01 Thread Zack Weinberg
On Sun, Jun 30, 2024, at 4:28 PM, Karl Berry wrote:
...
> introspection/%.h: introspection/%.c
>   $()
...
> As an aside, I'm curious as to why the $() is used. It seems
> a mysterious way to do nothing. Do you know?

I don't know why someone chose to do it this way, but I do know that
GCC's Makefiles do something similar with

# gen-foo generates both foo.h and foo.c at the same time
foo.c: gen-foo other files...
        $(builddir)gen-foo arguments...

# clue Make that gen-foo also updated foo.h whenever foo.c is new
foo.h: foo.c
        @:

If I had to guess, I would guess that someone thought Make would be more
likely to skip invoking a shell if the command was actually empty rather
than ":".  As it happens, GNU Make 4.4.1 appears to recognize ":" as a
no-op; using strace I see it issue the same number of forks for both
constructs.  But perhaps older versions of gnumake did not do this.
(This is clearly not a portable makefile to begin with, so questions of
what other implementations do are moot.)

zw



Re: Possible typo in config.sub

2024-07-06 Thread Zack Weinberg
On Sat, Jul 6, 2024, at 12:36 AM, Shiro Kawai wrote:
> Hi there, my first post to this list.
> I was browsing config.sub and stumbled on a possible typo.

You are correct.  I posted the same patch more than a month ago but
it has not been applied.

FYI, patches to config.sub and config.guess go to a separate mailing
list, config-patches@gnu.org.

zw



Re: \# within quotes

2024-07-18 Thread Zack Weinberg
On Thu, Jul 18, 2024, at 5:09 AM, Tijl Coosemans wrote:
> Automake 1.17 produces a warning for the use of \# here:
>
> https://github.com/ddclient/ddclient/blob/d88e6438efbc53e977546693f6835f7517072a06/Makefile.am#L22

For reference, the construct in question is

subst = sed \
-e 's|@PACKAGE_VERSION[@]|$(PACKAGE_VERSION)|g' \
-e '1 s|^\#\!.*perl$$|\#\!$(PERL)|g' \
-e 's|@localstatedir[@]|$(localstatedir)|g' \
[etc]

and the warning triggers on the '1 s|^\#\!...' line.

> Is use within quotes also not portable?

I could be wrong about this, but I don't think any implementation of Make
pays _any_ attention to shell quotation in commands.  Therefore, no, this
is not portable.  If there was a portable way to define a Make variable
that expanded to a single # then you could write

subst = sed \
-e 's|@PACKAGE_VERSION[@]|$(PACKAGE_VERSION)|g' \
-e '1 s|^$(HASH)!.*perl$$|$(HASH)! $(PERL)|g' \
-e 's|@localstatedir[@]|$(localstatedir)|g' \
[etc]

but I think the only portable way to define that variable is from
outside the Makefile, e.g. by typing

make 'HASH=#'

every single time, ick.

My default recommendation for this sort of thing - for *any* circumstance
where you find yourself embedding more than a couple lines of shell script in
a Makefile, frankly - is that you split out the _entire_ troublesome construct
into a standalone script, like, put this in build-aux/subst

---
#! /bin/sh

set -e -u

if [ $# -ne 1 ]; then
printf 'usage: %s out-file\n' "$0" >&2
printf 'PACKAGE_VERSION, PERL, CURL, srcdir, localstatedir,\n' >&2
printf 'runstatedir, and sysconfdir must be set in environment\n' >&2
exit 1
fi

out="$1"
tmp="${out}.tmp.$$"
trap 'rm -f $tmp' 0

if [ -f "${out}.in" ]; then
in="${out}.in"
elif [ -f "${srcdir}/${out}.in" ]; then
in="${srcdir}/${out}.in"
else
printf '%s: error: %s not found in . or %s\n' \
   "$0" "${out}.in" "$srcdir" >&2
exit 1
fi

sed < "$in" > "$tmp" \
-e 's|@PACKAGE_VERSION[@]|'"${PACKAGE_VERSION}"'|g' \
-e '1 s|^#!.*perl$|#! '"${PERL}"'|g' \
-e 's|@localstatedir[@]|'"${localstatedir}"'|g' \
-e 's|@runstatedir[@]|'"${runstatedir}"'|g' \
-e 's|@sysconfdir[@]|'"${sysconfdir}"'|g' \
-e 's|@CURL[@]|'"${CURL}"'|g'

if [ -x "$in" ]; then
chmod +x "$tmp"
fi
mv "$tmp" "$out"
trap '' 0
---

and then change your $(subst_files) rule to

---
$(subst_files): Makefile build-aux/subst
PACKAGE_VERSION="$(PACKAGE_VERSION)" \
PERL="$(PERL)" \
CURL="$(CURL)" \
srcdir="$(srcdir)" \
localstatedir="$(localstatedir)" \
runstatedir="$(runstatedir)" \
sysconfdir="$(sysconfdir)" \
$(SHELL) $(srcdir)/build-aux/subst $@
---

and then you don't have to worry about layering Makefile quotation
on top of shell quotation.

zw



Re: \# within quotes

2024-07-18 Thread Zack Weinberg
On Thu, Jul 18, 2024, at 1:50 PM, Jan Engelhardt wrote:
> On Thursday 2024-07-18 15:40, Zack Weinberg wrote:
>>On Thu, Jul 18, 2024, at 5:09 AM, Tijl Coosemans wrote:
>>> Automake 1.17 produces a warning for the use of \# here:
>>>
>>> https://github.com/ddclient/ddclient/blob/d88e6438efbc53e977546693f6835f7517072a06/Makefile.am#L22
>>
>>subst = sed \
>>  -e 's|@PACKAGE_VERSION[@]|$(PACKAGE_VERSION)|g' \
>>  -e '1 s|^\#\!.*perl$$|\#\!$(PERL)|g' \
>>  -e 's|@localstatedir[@]|$(localstatedir)|g' \
>>[etc]
>
> How much is EBCDIC a thing still, or can we assume ASCII and use \x23?

Many, possibly all, non-GNU implementations of sed do not support \x escapes
(nor any other C-style character escape sequences).

zw



Re: \# within quotes

2024-07-18 Thread Zack Weinberg
On Thu, Jul 18, 2024, at 2:21 PM, Zack Weinberg wrote:
> On Thu, Jul 18, 2024, at 1:50 PM, Jan Engelhardt wrote:
>> On Thursday 2024-07-18 15:40, Zack Weinberg wrote:
>>>On Thu, Jul 18, 2024, at 5:09 AM, Tijl Coosemans wrote:
>>>> Automake 1.17 produces a warning for the use of \# here:
>>>>
>>>> https://github.com/ddclient/ddclient/blob/d88e6438efbc53e977546693f6835f7517072a06/Makefile.am#L22
>>>
>>>subst = sed \
>>> -e 's|@PACKAGE_VERSION[@]|$(PACKAGE_VERSION)|g' \
>>> -e '1 s|^\#\!.*perl$$|\#\!$(PERL)|g' \
>>> -e 's|@localstatedir[@]|$(localstatedir)|g' \
>>>[etc]
>>
>> How much is EBCDIC a thing still, or can we assume ASCII and use \x23?
>
> Many, possibly all, non-GNU implementations of sed do not support \x escapes
> (nor any other C-style character escape sequences).

Slight correction: POSIX requires sed to support \n in regular expressions
(meaning to match a newline) but does not require support for any other
escape sequence that works in C strings.

I would not be in the least bit surprised to learn that there were sed
implementations in the 1990s that didn't support \n, and I would also
not be surprised to learn that Solaris or AIX or even MacOS still ships
a sed like that in its default $PATH, but you probably don't have to
worry about such implementations until you trip over one.

zw



Re: First draft of application to Sovereign Tech Fund

2024-07-31 Thread Zack Weinberg
On Wed, Jul 31, 2024, at 9:56 AM, Detlef Riekenberg wrote:
> For all contributions, they have to be checked, commented and later
> committed by the few busy people with commit rights.

Because of this ...

> We need a list of "todo projects", similar to what is available for
> projects, who apply for Google Summer of Code.

... and because neither Automake nor Autoconf currently has a list of
"good first bugs/projects to work on", and also because both are
abnormally difficult to write good patches for, I'd suggest that the
most useful way for a new contributor to help at this time is to analyze
the backlog of bug reports.  Classify them as recent regressions,
longstanding bugs, or desirable new features.  Estimate the importance
and urgency of fixing each.  Identify affected projects and/or systems.
Construct minimized test cases.  Identify root causes.  Try to estimate
how difficult it would be to write a patch for each ... but don't actually
write the patches, not at first, not until you've spent a couple months
on understanding the bugs.

zw



Re: module level flags

2002-09-28 Thread Zack Weinberg

On Sat, Sep 28, 2002 at 04:46:21PM -0700, Bruce Korb wrote:
> 
> For the time being, I'll test for GCC 3.[12] and disable
> optimization for that compiler, I guess.  :-(

If you've found an optimizer bug in 3.[12] not present in 3.0 and
prior, you could have the courtesy to *tell us what it is* so we can
*fix* it.

zw