I'm currently building a language for writing games that compiles
directly to JavaScript. As part of this I wrap JS arrays inside my
own Array object, and inside its 'set' method I run 'parseInt' on the
given key to ensure the index is always an integer.
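For context, a rough sketch of what the wrapper looks like (the names
here are illustrative, not my actual code):

    function WrappedArray() {
        this.items = [];
    }

    // 'set' coerces the given key to an integer index before storing.
    WrappedArray.prototype.set = function( key, value ) {
        // This is the line that changed:
        //     before: parseInt( key )
        //     after:  parseInt( key, 10 )
        this.items[ parseInt( key, 10 ) ] = value;
    };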

Due to warnings from the Closure JavaScript optimizer I use, today I
changed 'parseInt( key )' to 'parseInt( key, 10 )' (the optimizer
warns you if you omit the radix). However, after adding the radix I
saw a major performance drop. I'm using Chrome 11.0.672.2.

With some of the array-intensive examples (namely this one:
http://playmycode.com/play/game/Sandbox/Blobs) the loss in framerate
was almost 60% (from around 35fps on my machine down to 15fps)!
That's surprising since the example does lots of drawing as well
(although most of the time is spent on number crunching). Simply
removing the radix from parseInt solved the issue and brought
performance back up.

In my own primitive benchmarks (running parseInt thousands of times)
I see similar results, though with a smaller performance drop. I'm
just really stunned that simply supplying the radix can cause such a
big drop in performance. Could this be fixed in the future?
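For reference, my benchmark is essentially the following sketch (the
iteration count and timing method are just what I happened to use):

    var ITERATIONS = 1000000,
        sum = 0,
        i, start;

    // Accumulate into 'sum' so the calls can't be optimized away.
    start = Date.now();
    for ( i = 0; i < ITERATIONS; i++ ) {
        sum += parseInt( '1234' );
    }
    console.log( 'no radix: ' + ( Date.now() - start ) + 'ms' );

    start = Date.now();
    for ( i = 0; i < ITERATIONS; i++ ) {
        sum += parseInt( '1234', 10 );
    }
    console.log( 'radix 10: ' + ( Date.now() - start ) + 'ms, ' + sum );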

-- 
v8-dev mailing list
v8-dev@googlegroups.com
http://groups.google.com/group/v8-dev
