>So ripping out all this 'cruft' would save us about 100-160 kB, still
>leaving us with an executable well over 1 MB.  It's Perl itself
>that's big, not the thin glue to the system functions.

Moreover, if you rip out that 100-160 kB and move it into modules,
then I can guarantee you that pulling it in as a module will cost
significantly more memory than having it built in already.  There's
always some overhead involved in these things.  Notice how the
ByteLoader produces much bigger executables than the originals.  Some
work has been done to fix that, but as a minimum, you've got the
ByteLoader module itself.  And it gets worse...
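
You can check the bytecode bloat for yourself; here's a rough sketch,
assuming the stock B::Bytecode backend and its -H (prepend a "use
ByteLoader" header) and -o (output file) options, which vary between
Perl versions:

    # compile hello.pl to bytecode, then compare sizes; the .plc
    # typically comes out several times larger than the source
    perl -MO=Bytecode,-H,-ohello.plc hello.pl
    ls -l hello.pl hello.plc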

Disastrously, you will then also lose the shared text component,
which is what makes all this cheap when Perl loads.  Since the
modules will have to be pasted into the data segment of each process
that wants them, they aren't going to be in a shared region, except
perhaps for some of the non-Perl parts of them on certain architectures.
But certainly the Perl parts are *NEVER* shared.  That's why the
whole CGI.pm or IO::whatever.pm stuff hurts so badly: run with
10 copies of Perl on your system (as many people do, if not many
more), and you have to load them, from disk, into each process that
wants them, and the result of what you've loaded cannot be shared,
since you loaded and compiled source code into non-shared parse
trees.  This is completely abysmal.  Loading bytecode is no win:
it's not shared text.
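
You can watch this happen per process; a rough sketch, assuming
Linux's /proc/self/status and its VmData field (other systems will
want ps or pmap instead):

    use strict;

    sub vmdata {        # kB of private data segment for this process
        open my $fh, '<', '/proc/self/status' or die "no /proc: $!";
        while (<$fh>) { return $1 if /^VmData:\s+(\d+)\s+kB/ }
    }

    my $before = vmdata();
    require CGI;        # compiled into private, non-shared parse trees
    my $after  = vmdata();
    print "CGI.pm cost this process ", $after - $before,
          " kB of unshared data\n";

Run ten of those and every one of them pays that cost again, because
parse trees live in private memory, not in shared text.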

--tom
