> >I don't understand this desire to not want anything to change.
>
> You misread.

I sympathise.  There are definite goals and focuses that each language is
built around.  Change these too much and you have a different language,
while at the same time alienating the people who chose that language
because it met those goals.

Perl is fun, functional, and efficient.  I can artistically sculpt a program
in one line at the command prompt, do things that wield incredible power,
and feel an incredible sense of happiness.  At the same time, I can write a
massive program and not have to worry about many of the mundane details
that we learn in algorithms / data-structures.

> >This is an
> >opportunity to clean up the language, make it more useable, and more fun.
> >I would have a lot more fun if perl were a better performer and if it was
> >easy for me to expand it, contract it, reshape it, improve it, etc.

> You will *not* improve the performance of the inner interpreter
> loop by having fewer opcodes to switch on.  Whether the number is
> 20 or 200, it's the same thing--*think* about it.

Well, not strictly true.  Though the raw code will not switch any faster,
the bloat of the core will hurt you with poor locality, through cache misses
or, worse, memory paging.  A highly tuned 50K core interpreter will do much
better than a 500K pile of regularly used libraries.  Essentially, it should
be possible to profile suites of programs to find out what is really needed
most, and allow only that into the core.  The rest could be delegated to
slower dynamic linking or outer-edge statically linked code.  It's the
general idea of RISC, of course.  I have no real backing to say that this
would actually improve performance, nor that it would even be possible to
streamline the core of perl any further.  Hell, C++ is only going to make
the situation worse (performance lost to fun of development, which is
acceptable).  Typically, when the core does nothing more than handle basic
flow and memory management, you can have a rock-solid kernel.  If you can
build out upon this (all opcodes look identical, be they built in,
statically compiled in as extensions, or dynamically linked in), then you
have a much more solid development base.  This is the advantage of C,
python, java, etc.  Their extensions feel just like built-ins.  Granted,
perl has come a long way in giving keyword power to user-defined functions.
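As a rough Perl-level analogy (this says nothing about the C internals, and
Heavy::Ops is a made-up module name), the small-core idea can be faked today
with AUTOLOAD: hot ops are compiled up front, cold ops get faulted in on
first call and spliced into the symbol table so the caller never knows:

package TinyCore;
use strict;

# hot path: always compiled, always resident
sub add { $_[0] + $_[1] }

# cold path: mapped to modules that are loaded only on demand
our %lazy = ( fft => 'Heavy::Ops' );    # hypothetical module name

sub AUTOLOAD {
    our $AUTOLOAD;
    (my $name = $AUTOLOAD) =~ s/.*:://;
    return if $name eq 'DESTROY';
    my $module = $lazy{$name} or die "unknown op: $name";
    eval "require $module" or die $@;        # pay the load cost exactly once
    no strict 'refs';
    *$AUTOLOAD = \&{"${module}::${name}"};   # splice into the symbol table
    goto &$AUTOLOAD;                         # caller never sees the difference
}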

> Furthermore, it's
> been demonstrated that this supposed "gain", including in size, is
> negligible.  I have yet to see one iota of justification for denuding
> perl of its math functions.

I would agree that the math functions should not be compromised.  If
dynamically linking them seriously decreases performance, then we're going
to see a massive slow-down in operation, since some crazed programmers
actually like performing fast Fourier transforms.  A 20-30% slow-down is not
going to be well accepted (I'm guessing at the overhead of dynamic linking;
I just hear how bad it is).  As above, however, I don't see a problem with
keeping them statically linked, so long as they're not part of the core.
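For what it's worth, perl can already pull heavier math from statically
built extensions without the calls feeling foreign; a quick sketch using
the stock POSIX and Math::Trig modules:

use POSIX qw(floor ceil fmod);   # the C math library, built as a standard extension
use Math::Trig;                  # tan(), asin(), pi, ... layered over core sin/cos

printf "%d %g %g\n", floor(2.7), fmod(10, 3), tan(pi / 4);   # 2 1 1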

> If math functions should be removed,
> then so too string functions.  And that binmode silliness is in
> line before any of them, since it's absolutely useless because I
> never use it.

As in all things in life: moderation.  Profiling should decide what
compromises cleanliness of code for the benefit of optimization.  Many
string operations are close to memory management (array modification,
string concatenation, simple assignment, references), so I would advocate
that they be part of the memory management system, and thus part of the
core.

> Perl is *not* fun when it supplies nothing by default, the way C
> does(n't).

I hear ya.  Dynamic memory management (garbage collection, dynamic arrays,
etc.), powerful string operations, powerful datatypes: these are what
attract me to perl the most, not so much readily available socket
functions.  Sure, having java.util.* built into java would make it easier
to use out of the box, but that's more to learn about the internals,
especially since they may change over time.  I can more easily walk the
lib-tree and read the pod files of each library to learn what I can do.
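By walking the lib-tree I mean something as dumb as this, using the stock
File::Find (then perldoc on any name the walk turns up):

use strict;
use File::Find;

# every .pm in the tree is a candidate for `perldoc Module::Name`
find(sub { print "$File::Find::name\n" if /\.pm$/ }, grep { -d } @INC);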

>
> If you want a language that can do nothing by itself, fine, but don't
> call it Perl.  Given these:
>
>     * Scalar manipulation
>     * Regular expressions and pattern matching

At the very least, built-in regexes allow the compiler to make incredible
optimizations which it simply couldn't make if they were function calls:

$str =~ s/${var} (xxx) \s xxx/ abs( $1 ) /xegso;

would have to be:

$reg_ex = gen_regex_replace( expr      => '${var} (xxx) \s xxx',
                             subs      => 'abs( $1 )',
                             modifiers => 'xegso' );
apply_reg_ex( regex => $reg_ex, str => $str );

It would be non-trivial to get a compiler to handle this efficiently.
Thus, I don't support any removal of regex expressions, and I definitely
don't support making one function for each possible modifier.
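There is already a middle ground that keeps the operator syntax while
giving you a first-class compiled pattern: qr// (5.005 and up).  A small
self-contained sketch (the capture group and test strings are mine):

my $var = 'foo';
my $str = 'foo xxx xxx -17';
my $re  = qr/$var \s+ xxx \s+ xxx \s+ (-?\d+)/x;   # compiled once, a first-class value
$str =~ s/$re/abs($1)/e;                           # whole match replaced: $str is now "17"
print "$str\n";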

>     * Numeric functions
>     * Array processing
>     * List processing
>     * Hash processing

These things are what make perl so great.  They are also very common, so
profiling would most likely keep them in the core.

>     * Input and output

Since quick-and-dirty scripts make such heavy use of I/O, the worst I'd
advocate is auto-loading it.  It simply needs to be globally accessible.

>     * Fixed-length data and records
>     * Filehandles, files, and directories
>     * Flow of program control
>     * Scoping
>     * Miscellaneous
>     * Processes and process groups

As was pointed out earlier, this can be a pain for migration.  True, perl
has always been the UNIX tool box, a sh on steroids.  I don't really know
how I feel about this.  The reduction of cryptic globals can only be a good
thing, IMHO, yet unless we delegated access to this information the way
java / python do (where a system or os module handles the specific
details), I can't imagine how it could happen.  My guess is that things
will stay the way they are, and NT will continue to look more and more
like UNIX under perl.  There's just too much nostalgia (and general geek
pride).  I personally believe a separate module would be the better
solution, pride aside.
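Perl half-has this already; the stock POSIX module is the closest thing we
have to python's os module, and the system-specific lookups could live
behind something like it:

use POSIX qw(uname);

my ($sysname, $nodename, $release, $version, $machine) = uname();
print "running on $sysname $release ($machine)\n";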

>     * Library modules
>     * Classes and objects
>     * Low-level socket access
>     * System V interprocess communication

See, again, I can't really see the advantage of IPC at the low level.  I
can't imagine writing much useful code at the command prompt that does IPC;
anything you might want to do, you could achieve with ipcs or other UNIX
tools.  And command-line tooling is the main reason I advocate keeping
things global.  Does socket() run noticeably slower if it's dynamically
linked?  Are you really going to run it in a tight loop?  Doubtful, so I
would advocate autoloading here.
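Case in point: socket work through a module already feels fine.  A sketch
with the stock IO::Socket::INET, assuming something is listening on
localhost:80; the module load is a one-time cost, nowhere near a tight
loop:

use IO::Socket::INET;

# one connect, one request: the load cost is noise next to the network I/O
my $sock = IO::Socket::INET->new(
    PeerAddr => 'localhost',   # assumption: a web server is running here
    PeerPort => 80,
    Proto    => 'tcp',
) or die "connect: $!";
print $sock "HEAD / HTTP/1.0\r\n\r\n";
print while <$sock>;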

>     * Fetching user and group information
>     * Fetching network information

Again, following along the lines of system-specific IPC.  Profiling should
tell the tale.

>     * Time

Laugh, but I'm biased: I like this one.  It has always frustrated me in C
when I couldn't quickly get the time.  It's really IPC, though (asking the
system for the time), and it should also be trivial (minus the OS call).
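A sketch of the split I'd expect: the raw time() stays a trivial builtin
(one OS call), and the pretty-printing lives out in POSIX:

use POSIX qw(strftime);

my $now = time;   # builtin: seconds since the epoch, one OS call
print strftime("%Y-%m-%d %H:%M:%S", localtime $now), "\n";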

> Your brain-damaged Perl would likely end up having nothing in it but
> these:
>
>     * Flow of program control
>     * Scoping
>
> Anything else would be fobbed off into back-of-the-bus modules.
>
> But I tell you this: your whole language will get fobbed off as
> a pain in the royal ass.

So long as auto-loading (for legacy functions) makes it transparent, the
usefulness will not decrease one bit.  The only things affected are
performance (which depends on how the old-opcode auto-loading is achieved)
and what happens when you ship the perl executable without any libraries
(which I've had to do on occasion).
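Concretely, the stock AutoLoader / AutoSplit pair already does exactly this
for module functions; a sketch, assuming the .al files were generated by
AutoSplit at install time (h2xs sets this up for you):

package Legacy;
use AutoLoader 'AUTOLOAD';    # stock module: faults subs in from auto/Legacy/*.al
1;
__END__

# Everything below __END__ is split out by AutoSplit at install time and
# compiled only when first called; the caller never notices.
sub rarely_used {
    my ($x) = @_;
    return $x * 2;
}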

>
> Since day 1, perl has been useful because it's had so much in it.

Read: perl has so much power through desirable and accessible features.
So long as those features are "easily" accessible, perl will not have
changed.

> You don't want a language with a whole bunch of the commonly needed
> functions *already* in it, fine -- but it's not going to be perl,
> and it's not going to be useful.

Once again, and in summary, perl is about making hard problems easy and
impossible ones possible.  Having functionality immediately available is
desirable insofar as it removes the need to perform wasteful setup.  The
ultimate example of perl's power is command-line scripting.  If a task is
going to involve too much to fit on one line, either perl should be adapted
to handle the problem more efficiently, or a more verbose solution is going
to be necessary (which will involve some degree of setup).  At some point
in every problem, however, the solution is going to require scaling, and
the language needs to be able to scale with the problem.  This often means
facilitating good programming practices, so a project can scale to multiple
developers.  At the one end, perl has little or no setup (I can complete an
entire perl program before the java interpreter even comes up, never mind
the savings in coding time).  At the other end, I can do OO or modular
programming, which allows easy integration of multiple developers (with
defined and protected interfaces).  Anything that enhances perl on either
the simplification / lack-of-setup side or on the large-scale side is a
Good Thing(tm).
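To make the no-setup end concrete, the canonical throwaway edit, using only
stock flags, done before the JVM would even have finished starting:

perl -pi -e 's/foo/bar/g' *.c    # in-place global replace across files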

-Michael

