I thought this was a very germane quote of the day; I got a chuckle out of
it, anyway...
"There are two types of languages: ones everybody complains about and ones
nobody uses."


On Thu, May 2, 2013 at 1:57 AM, Levi Pearson <[email protected]> wrote:

> On Wed, May 1, 2013 at 10:25 PM, Stuart Jansen <[email protected]>
> wrote:
>
> > I don't follow compilers either, but I do remember being convinced of
> > the value of runtime optimization when the following came out in 1999:
> > http://www.hpl.hp.com/techreports/1999/HPL-1999-78.html
>
> Dynamo was cool stuff.  Basically gives you the benefits of
> whole-program optimization along with the benefits of separate
> compilation and run-time linking. And without the cost of optimizing
> *everything* instead of just hot paths.
>
> > I also know that these days, even sophisticated JavaScript engines
> > perform multiple levels of optimization and deoptimization based on
> > observed hot spots.
>
> Many of the sophisticated JIT techniques were developed in the context
> of Smalltalk, which is a highly dynamic language that requires a lot
> of optimization to perform well. Everything in Smalltalk is a method
> call (even simple arithmetic and conditional expressions) and there
> are no type declarations, so a naive implementation would be
> punishingly slow.
>
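Python happens to share this everything-is-a-message model, so the cost being
described can be made concrete there (an analogy to Smalltalk, not Smalltalk
itself):

```python
# In Python, as in Smalltalk, even simple arithmetic is method dispatch
# under the hood; a naive interpreter pays a lookup on every operation.
assert (1).__add__(2) == 3            # 1 + 2 is really a method call
assert "ab".__add__("cd") == "abcd"   # same syntax, different method
assert (2.5).__lt__(3.0) is True      # comparisons dispatch too
```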
> One of the biggest wins comes from a technique called polymorphic
> inline caching.  When a method call site is first dispatched, it can
> be rewritten into a direct call to the method resolved for the
> receiver's type, guarded by a quick check that the type hasn't
> changed.  As long as every call at that site operates on the same
> type, the method dispatch overhead is essentially erased.  If the
> check fails because the site is polymorphic, the cache falls back to
> standard method lookup and may rebind the inline call to the new
> implementation.
>
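A minimal sketch of that cache-and-guard scheme, as a toy Python model (real
VMs patch machine code at the call site; the class and method names here are
invented for illustration):

```python
class InlineCache:
    """Toy inline cache: remembers the method resolved for the receiver
    type seen at this call site, guarded by a quick type check."""

    def __init__(self, name):
        self.name = name
        self.cached_type = None
        self.cached_method = None

    def call(self, receiver, *args):
        # Fast path: the quick guard described above.
        if type(receiver) is self.cached_type:
            return self.cached_method(receiver, *args)
        # Slow path: full method lookup, then rebind the cache.
        self.cached_type = type(receiver)
        self.cached_method = getattr(type(receiver), self.name)
        return self.cached_method(receiver, *args)

class Circle:
    def __init__(self, r): self.r = r
    def area(self): return 3.14159 * self.r * self.r

class Square:
    def __init__(self, s): self.s = s
    def area(self): return self.s * self.s

site = InlineCache("area")
site.call(Circle(1))   # slow path: lookup, cache Circle.area
site.call(Circle(2))   # fast path: guard passes, cached dispatch
site.call(Square(3))   # guard fails: rebind cache to Square.area
```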
> With polymorphic inline caches and JIT compiling, a highly abstract
> program can be compiled to run efficiently on a job of one data type,
> then reconfigure itself to run efficiently on the next one.  In a
> language without dynamic optimizations, you'd have to
> anticipate the data types to be used and instruct the compiler to
> generate efficient code for all of them.  Ahead-of-time compilers are
> typically able to apply more sophisticated optimization techniques,
> but only if enough information is available at compile time.  The more
> dynamic a program's behavior is, the less information there will be at
> compile time for an optimizer to take advantage of, and the more
> benefit you'll get from dynamic optimizations.
>
> As I mentioned in the context of Dynamo, run-time compilation also
> lets you optimize across boundaries that are opaque to ahead-of-time
> compilers in languages like C and C++, such as compilation unit
> boundaries and library boundaries.  Calling a
> library function in a tight loop, or even calling a function
> implemented in a separate compilation unit of the same program, could
> create significant overhead due to the inability to inline or
> otherwise optimize the calls.
>
> Modern JavaScript optimizers use a technique similar to Dynamo called
> tracing just-in-time compilation, in which the compilation unit is not
> the method/function, but a repeated code path that may involve
> multiple methods or just a small part of a method.  By "tracing"
> execution through these paths as they are taken, the compiler can
> apply optimizations that would not have been visible to a static
> ahead-of-time compiler, eliminating a lot of language-induced
> overhead for commonly executed code paths.
>
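A toy model of the tracing idea, assuming nothing about how real engines work:
treat a loop body as a recorded "trace" of statements, interpret it while the
loop is cold, and once it is hot, compile the whole recorded path as a single
unit (the class, threshold, and variable names are invented for illustration):

```python
HOT_THRESHOLD = 3  # invented: iterations before the trace is compiled

class TracingLoop:
    """Interprets its body statement by statement while cold; once hot,
    compiles the recorded trace into one code object and reuses it."""

    def __init__(self, body_lines):
        self.body_lines = body_lines  # the recorded "trace"
        self.runs = 0
        self.compiled = None

    def run_once(self, env):
        self.runs += 1
        if self.compiled is None:
            # Cold: interpret one statement at a time.
            for line in self.body_lines:
                exec(line, {}, env)
            if self.runs >= HOT_THRESHOLD:
                # Hot: compile the whole path as a single unit.
                self.compiled = compile("\n".join(self.body_lines),
                                        "<trace>", "exec")
        else:
            exec(self.compiled, {}, env)

env = {"x": 0, "y": 1}
loop = TracingLoop(["x = x + y", "y = y + 1"])
for _ in range(10):
    loop.run_once(env)
# env["x"] is now 55 either way; the last 7 iterations ran the
# compiled trace instead of the statement-at-a-time interpreter.
```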
> C++ gives opportunities to specialize code via templates, but that can
> cause both compilation time issues and severe code bloat, as static
> code must be generated at compile time for all types used in
> templates.  JIT compiling with dynamic or generic types allows for the
> same generality in code and efficiency in implementation, but without
> needing to generate code for all types in the program, just the ones
> that are actually being used at the time.
>
> > Digging for the old HP article turned up the following, which seems like
> > a good starting point for someone interested in learning more:
> > http://en.wikipedia.org/wiki/Dynamic_recompilation
>
> That's all good stuff, and there's a lot more out there as well, both
> for static and dynamic optimizing compilers.  C and C++ are actually
> rather poor examples of languages that compilers can optimize well.
> They can, of course, both be compiled to very efficient
> code, but the programmer must be very involved in the process at a low
> level.  Languages like Haskell give the compiler a *lot* more
> information about the program that can be used for optimization in
> ahead-of-time compilation, and JIT compilers and dynamic compilation
> techniques can deliver the optimizations you need at just the places
> you need them without bloating the code or compilation time with
> unnecessary "optimization".
>
> /*
> PLUG: http://plug.org, #utah on irc.freenode.net
> Unsubscribe: http://plug.org/mailman/options/plug
> Don't fear the penguin.
> */
>
