>> Is there any reason we don't set this to something like 64 or 128 in
>> the distributed program itself?

> It affects the size of many arrays, so making it arbitrarily large
> eats up memory.
So that raises the question: how much memory?

Anyway, the big picture is this: for the average user, being told to change a number in a #define and then recompile is a big turn-off. For many users this is an intimidating request, since they may never have compiled a program before. Indeed, lots of distros don't even include gcc in the base install anymore. If PCB is an application program, we can't expect users to recompile it every time they want to change a parameter.

Better to just set the max layer count to something larger than most people will ever encounter, and then tell the world that we support XXX layers, where XXX is large. That's good marketing.

Anyway, memory is cheap. If we lose users in order to save 1 K, or even 1 Meg, of RAM, that doesn't make sense.

Just MO,
Stuart

_______________________________________________
geda-user mailing list
geda-user@moria.seul.org
http://www.seul.org/cgi-bin/mailman/listinfo/geda-user