On Tue, Dec 16, 2003 at 06:55:56PM -0500, Gordon Henriksen wrote:
: Michael Lazzaro wrote:
: 
: > I don't think so; we're just talking about whether you can extend a 
: > class at _runtime_, not _compiletime_.  Whether or not Perl can have 
: > some degree of confidence that, once a program is compiled, it won't 
: > have to assume the worst-case possibility of runtime alteration of 
: > every class, upon every single method call, just in case 
: > you've screwed with something.
: 
: That's a cute way of glossing over the problem.
: 
: How do you truly know when runtime is in the first place? Imagine an
: application server which parses and loads code from files on-demand.
: This shouldn't be difficult. Imagine that that code references a
: system of modules.
: 
: Imagine if Perl "finalizes" classes after "primary compilation"
: (after parsing, say, an ApacheHandler file), and proceeds to behave
: quite differently indeed afterwards.
: 
: Imagine that a perfectly innocent coder finds that his class
: library doesn't run the same (doesn't run at all) under the
: application server as it does when driven from command line scripts:
: His method overrides don't take effect (or, worse, Perl tells him he
: can't even compile them because the class is already "finalized"! And
: he thought Perl was a dynamic language!).
: 
: What's his recourse? Nothing friendly. Tell Perl that he's going
: to subclass the classes he subclasses? Why? He already subclasses
: them! Isn't that "tell" enough? And how? Some obscure configuration
: file of the application server, no doubt. And now the app server needs
: to be restarted if that list changes. His uptime just went down. And
: now he can't have confidence that his system will continue to behave
: consistently over time; "apachectl restart" becomes a part of his
: development troubleshooting lexicon.

Any such application server would probably just

    use DYNAMIC_EVERYTHING;

(or whatever we call it) and have done with it.
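For concreteness, here is the sort of runtime class surgery (shown in today's Perl 5, via symbol-table assignment) that finalization would rule out and that such a pragma would have to preserve; the `Foo` package and its method are just an illustration, not anything from the proposal:

```perl
# A minimal sketch of runtime class alteration in Perl 5.
# Finalizing Foo after "primary compilation" would make the
# override below either fail or silently not take effect.
package Foo;
sub greet { "hello" }

package main;
no strict 'refs';

print Foo->greet, "\n";            # prints "hello"

# Long after compile time, install a replacement method by
# assigning a code ref into Foo's symbol table:
*{"Foo::greet"} = sub { "howdy" };

print Foo->greet, "\n";            # prints "howdy"
```

An optimizer that inlined or devirtualized `Foo::greet` at CHECK time would have to notice this assignment and back the optimization out, which is exactly the deoptimization machinery under discussion.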

: Java doesn't make him do that; HotSpot can make this optimization at
: runtime and back it out if necessary. Maybe he'll just write a JSP
: instead.

If Parrot turns out to be able to make this optimization, then the
individual declarations of dynamism merely become hints that it's
not worth trying to optimize a particular class because it'll get
overridden anyway.  It's still useful information on an individual
class basis.  The only thing that is bogus in that case is the global
DYNAMIC_EVERYTHING declaration in the application server.  So I could
be argued into making that the default.  A program that wants a static
analysis at CHECK time for speed would then need to declare that.
The downside of making that the default is that then people won't
declare which classes need to remain extensible under such a regime.
That's another reason such a declaration does not belong with the
class itself, but with the users of the class.  If necessary, the main
program can pick out all the classes it thinks need to remain dynamic:

    module Main;
    use STATIC_CLASS_CHECK;
    use class Foo is dynamic;
    use class Bar is dynamic;

or whatever the new C<use> syntax will be in A11...

: C# and VB.NET do likewise. ASP.NET isn't looking so bad, either. The
: .NET Frameworks are sure a lot less annoying than the Java class
: library, after all.

On the other hand, those guys are also doing a lot more mandatory
static typing to get their speed, and that's also annoying.
(Admittedly, they're working on supporting dynamic languages better.)

: Point of fact, for a large set of important usage cases, Perl simply
: can't presume that classes will EVER cease being introduced into the
: program. That means it can NEVER make these sorts of optimizations
: unless it is prepared to back them out. Even in conventional programs,
: dynamic class loading is increasingly unavoidable. Forcing virtuous
: programmers to declare "virtual" (lest their program misbehave or
: their perfectly valid bytecode fail to load, or their perfectly valid
: source code fail to compile) is far worse than allowing foolish
: programmers to declare "final."

The relative merit depends on who declares the "final", methinks.  But
if we can avoid both problems, I think we should.

: Making semantic distinctions of this scale between "compile time"
: and "runtime" will be a significant blow to Perl, which has always been
: strengthened by its dynamism. Its competitors do not include such
: artifacts; they perform class finalization optimizations on the fly,
: and, despite the complexity of the task, are prepared to back out these
: optimizations at runtime--while the optimized routines are executing,
: if necessary. Yes, this requires synchronization points, notifications
: (or inline checks), and limits code motion. Better than the
: alternative, I say. It is very simply a huge step backwards to
: create a semantic wall between primary compilation and program
: execution.
: 
: So write the complicated code to make it work right.
: - or -
: Take the performance hit and go home.
: 
: Dynamism has a price. Perl has always paid it in the past. What's
: changed?

Nothing, except I'd like to at least have the *option* of not
paying the price.  But you've argued me into keeping the default
the same, because it works better if and when Parrot *does* support
deoptimization/reoptimization, and even before then, because the
optional dynamic hints can be used to work around spots where the
developing optimizer is broken.  And in the limiting case where
the optimizer is completely broken because it's not implemented
yet, we get to work around that too.  Optionally...

Larry
