I just randomly came across this Ajaxian podcast episode from a while back
that answers the very question I posed.

http://ajaxian.com/archives/audible-ajax-episode-20-project-tamarin

Interestingly, Adobe open-sourced their JIT implementation of JavaScript
after a complete three-year rewrite, and that's what Mozilla, Adobe, and
others are collaborating on for the Tamarin project.  Some say it can
increase speeds up to 10x.  While Tamarin won't be in Firefox 3, it will be
a later addition, likely Firefox 4 sometime in late '08.  In typical
Microsoft fashion, they are currently rewriting their own JavaScript engine
(JScript) and intend to match or exceed Tamarin's speed.  Why doesn't MS
just use Tamarin as well, since it is open source?  Here's one instance
where I'm glad MS is sticking to their own sandbox, because it just means
more competition.  If they can't beat Tamarin when they can look at
Tamarin's code as much as they want, that's pretty sad.

So, to fully answer my question, it looks as if the interpreter is the
current bottleneck, though one interviewee said that for most AJAX apps the
bottleneck will soon be network speed instead.
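
(Side note on Rob's point below about Rhino compiling scripts to bytecode:
here's a minimal sketch of roughly what that looks like, assuming Rhino's
org.mozilla.javascript Context/Script API; the class name and the toy
script are just illustrative, not anything from the thread.)

import org.mozilla.javascript.Context;
import org.mozilla.javascript.Script;
import org.mozilla.javascript.Scriptable;

public class RhinoCompileDemo {
    public static void main(String[] args) {
        Context cx = Context.enter();
        try {
            // Optimization levels 1-9 tell Rhino to compile the script to
            // JVM bytecode (generated classes); -1 forces interpretation.
            cx.setOptimizationLevel(9);
            Scriptable scope = cx.initStandardObjects();

            // Compile once and reuse the Script object, e.g. across
            // requests in a servlet container.
            Script compiled = cx.compileString(
                "var sum = 0; for (var i = 0; i < 1e6; i++) sum += i; sum;",
                "demo.js", 1, null);

            Object result = compiled.exec(cx, scope);
            System.out.println(Context.toString(result)); // prints the sum
        } finally {
            Context.exit();
        }
    }
}

Presumably that compile-once, exec-many pattern is what makes the 'real
world' webapp case Rob mentions so much faster than re-interpreting the
source on every request.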

On 10/2/07, Robert Koberg <[EMAIL PROTECTED]> wrote:
>
>
> On Tue, 2007-10-02 at 11:26 -0500, Derek Gathright wrote:
> > Thanks for the link, interesting stuff.
> >
> > After looking through info on Rhino, I was left with the question...
> > why build the JS core engine in Java and not a non-interpreted
> > language?
>
> You can compile them to bytecode and create classes (I am guessing that
> is why Rhino performs much better in the 'real world' test cases). I use
> them for a webapp in a servlet container.
>
> best,
> -Rob
>
>
