Message written on 2007-01-24, at 04:32, by Andrew Pinski:
It's "too good" to be usable. The time required for a full test suite
run can be measured in days, not hours.
Days, only on slow machines. For our PS3 toolchain (which is really
two separate compilers), it takes 6 hours to run the testsuite, and
that is doing one target with -fPIC. So I don't see how you can say
it takes days.
Quantitatively:
gcc/testsuite dalecki$ find ./ -name "*.[ch]" | wc
    6644    6644  213514
ejh216:~/gcc-osx/gcc/testsuite dalecki$ find ./ -name "*.[ch]" -exec cat {} \; | wc
  254741 1072431 6891636
That's just about a quarter million lines of code to process, and you
think the infrastructure around it isn't crap on the order of 100?
Well... since one "can only beat a dead horse once", the whole
argument could actually stop here.
No, not really; it took me a day max to get a spu-elf cross compiler
building and running with newlib and all.
Building and running fine, but testing??? And I guess of course that
it wasn't a true cross, since the SPUs are actually integrated into
the same OS image as the main CPU for that particular target.
My favorite tactic to decrease the number of bugs is to set up a unit
test framework for your code base (so you can test changes to
individual functions without having to run the whole compiler), and
to strongly encourage patches to be accompanied by unit tests.
That's basically a pipe dream with the autoxxxx based build system.
Actually the issues here are entirely unrelated to auto* and unit
test frameworks.
So what do the words "full bootstrap/testing" mean, which you hear
when providing any kind of tiny fix? What about the involvement of
those utilities, through zillions of command line defines and
embedded shell scripting for code generation, in the ACTUAL code
which makes up the gcc executables? Coverage? Unit testing? How?!
Heck, even just a full, reliable symbol index for an editor isn't easy
to come by... Or are you just going to permute all possible configure
options?
The real reason why toplevel libgcc took years to come about is that
nobody cared enough about libgcc to do any kind of cleanup.
Because there are actually not that many people who love to delve
into the whole .m4/.guess machinery and so on... Actually, it's not
that seldom that people are incapable of reproducing the currently
present build setup.
The attitude toward all of these problems has changed recently (when
I say recently I mean the last 3-4 years), and in fact all major
issues with GCC's build and internals are changing for the better.
And now please compare this with the triviality of relocating source
files in:
1. The FreeBSD bsdmake structure. (Which is pretty portable, BTW.)
2. The Solaris source tree.
3. A Visual Studio project.
4. An Xcode project.
PS: auto* is not to blame for GCC's problems; GCC is older than auto*.
It sure isn't the biggest problem, by far. However, it's the upfront
one if you start to look seriously into GCC. Remember - I'm the guy
who compiles the whole of GCC with C++, so it should be clear where I
think the real issues are.