Hi Eric,
My basic question was "are there any other benchmarks for frozen
files?". You provided a long answer to my post, but never answered my
basic question. :( I take it the answer is "no".
> It depends on how complex those macros were.
>
Admittedly, the macros I write are simple, not complex. That is by
design. I happen to disagree with the notion: "Once really addicted,
users pursue writing of sophisticated m4 applications even to solve
simple problems, devoting more time debugging their m4 scripts than
doing real work. Beware that m4 may be dangerous for the health of
compulsive programmers." I use m4 very differently from that scenario,
which sounds like a programming nightmare. Anyway, it would make sense
that complex macros MIGHT be sped up more by frozen files. But unless
you have a benchmark, it's more speculation than fact. I have nothing for
or against frozen files. I just see little evidence of benefit so far,
but do see a significant maintenance / complexity burden.
> Yay that you were able to measure a difference for your test case.
>
I don't know what that means. It sounds kind of flippant, like you don't
take my post very seriously. Of course there was a difference. But it
was disappointingly small.
> Yes. At one point, autoconf had some O(n^2) (and even O(n^3)) macro
> definitions, some of which were encountered in the raw file and thus
> caused some noticeable delays before even getting to the user's
> configure.ac file. The reason that GNU m4 introduced frozen files was
> because autoconf had SO MANY MACRO definitions, and that some of those
> definitions were computed definitions created by quadratic algorithms,
> where loading the frozen file really did make a difference (it's much
> faster to read in a macro's definition in frozen form than it is to
> parse one word at a time to recompute the same definition). This was
> before GNU m4 1.4 was released. Since then, two things have improved -
> m4's parsing speed has been improved (for example, it now uses
> getc_unlocked instead of getc, which itself had a noticeable speedup on
> processing raw files), and autoconf has moved away from O(n^2)
> algorithms (while it still defines a LOT of macros that still get loaded
> faster from frozen, it is no longer as complex to compute what those
> macros will contain). So I'm not sure if you will see better or worse
> numbers from autoconf, but I do know that autoconf still makes heavy use
> of loading frozen files so that it gets to parsing the user's
> configure.ac sooner.
>
I already figured there were historical reasons why frozen files were
introduced. My post said "Maybe frozen files were useful decades ago."
So all the history doesn't seem relevant.
Maybe the historical reasons were bad reasons. Maybe they don't apply
today. My guess is there were better ways to deal with those O(n^2) (and
even O(n^3)) macro definitions. You say they are gone now, so apparently
someone found a better way. I'm sure someone was trying to do their
best back then. But it's possible they messed up, and that frozen files
were a failed experiment. Programmers make bad design decisions all the
time, and those decisions can persist for many years. It's possible that
happened here.
As to "SO MANY MACRO definitions", I think my benchmark also had a lot
of definitions. Again, without a benchmark, it's just speculation to
suggest that frozen files are required to work with large numbers of
macro definitions. It's not THAT difficult to do benchmarks. It seems
pretty weak that they spent all that time writing all those complex
macros, but never ran any benchmarks.
What are the autoconf "quadratic algorithms" you are referring to? Are
they still around? If so, maybe there is a better approach. I would
suggest that if there is a composite macro that is more or less general,
widely used, and computation-intensive, it would be a good candidate for
implementation as a builtin, which could potentially be much faster,
much better, and much easier to use. But that would require an openness
to adding builtin macros.
You totally make my point when you say "I'm not sure if you will see
better or worse numbers from autoconf". If it's not faster, there is NO
point in using frozen files. Perhaps without intending to, you make my
point that frozen files may not be so hot.
BTW, I'm sure you would not see "worse numbers". I am NOT suggesting
that frozen files slow things down. :)
> They're not mandatory to use. But at this point, we can't rip it out of
> m4 - there are users that depend on it. The code is designed to not
> penalize people that aren't using it.
>
I never suggested frozen files were mandatory to use, so I don't get
your point. What I am suggesting is that they are mandatory to maintain. And my
guess is they add significant complexity to the software (you would be
best placed to comment on that). And since m4 development seems more or
less stuck, based on what I read, it might be a good idea to strategize
before adding some other "feature", and to figure out how to get m4
development unstuck. And again, sometimes "less is more".
And I never suggested "rip it out of m4". My exact words were "might be
deprecated at some point". I'm sure you understand what deprecated
means. So I totally don't understand why you say "rip it out". Software
projects deprecate things when called for. It's how you move forward.
As pointed out by another poster, I have tried to be pretty diplomatic.
It's really great you maintain m4, I appreciate your volunteer work. But
just my impression, you don't seem very open to the possibility of
figuring out a better way to do things. There are perhaps valid reasons
for that, or maybe I got the wrong impression. I write software. I love
it when someone finds a usability problem, especially a bug, or suggests
a possible way to improve the software. I understand some developers
don't like that kind of input. In any case, I'm just trying to help by
pointing out some possible ways to improve m4. Maybe it's more or less
an echo chamber. :( Who knows? Anyway, as a user, I am just noting some
ways m4 could probably serve the user better.
Thanks,
Daniel