This is a bit of a repeat of this thread:
https://groups.google.com/group/v8-users/browse_thread/thread/66fef74f0ba02c73/cadffb948e0132e3?lnk=gst&q=ascii&pli=1#cadffb948e0132e3
But my question is a bit different:
When creating a string from JS code, does it default to ASCII? Under
what circumstances does it become UCS-2?
I am creating a string from a single ASCII character, concatenating it
repeatedly to obtain a large string, and it seems to end up as UCS-2.
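
To be concrete, something like the sketch below is what I mean. It is
written against the 2.x-era embedding API (HandleScope, Context::New,
Script::Compile, and the String::Utf8Value / AsciiValue / Value
wrappers); names and signatures differ in other V8 releases, so treat
it as an illustration rather than exact code:

#include <v8.h>
#include <stdio.h>

using namespace v8;

int main() {
  HandleScope handle_scope;
  Persistent<Context> context = Context::New();
  Context::Scope context_scope(context);

  // Build a ~1MB pure-ASCII string in JS by repeated concatenation.
  const char* src =
      "var s = 'a';"
      "for (var i = 0; i < 20; i++) s += s;"  // 2^20 chars, about 1MB
      "s;";
  Handle<Value> result = Script::Compile(String::New(src))->Run();

  // Three ways to pull the string out on the C++ side
  // (the utf8/ascii/wchar conversions from the message quoted below):
  String::Utf8Value  utf8(result);   // char*,     UTF-8
  String::AsciiValue ascii(result);  // char*,     ASCII
  String::Value      wide(result);   // uint16_t*, UTF-16/UCS-2

  printf("length: %d\n", utf8.length());

  context.Dispose();
  return 0;
}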

On Oct 14, 9:53 am, ajg <[email protected]> wrote:
> I am looking into potentially switching my project from SpiderMonkey
> to V8.
> V8 is considerably faster on most of my tests but is 10x slower on a
> few.
> Those involve large strings (about 1MB) being converted from JS to a
> C++ C string.
> The strings are pure ASCII, and I was hoping they would be ASCII by
> default in JS.
> But when converting, it looks like the string is stored as wchar:
>
> conversion to utf8: 2724ms
> conversion to ascii: 2622ms
> conversion to wchar: 10ms
>
> I know that when creating a string from C++ I can force it to be ASCII
> in JS.
> But is there a way to get strings created in JS to be ASCII by
> default?
> As it stands, the conversion is way too slow and the string also takes
> 2x the memory.
> Thanks for any pointers.
> AG
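
On the "force it to be ascii from C++" point above: one way to do that
(and what I assume is meant) is to hand V8 an external ASCII resource
so it references the buffer instead of copying it into a two-byte
string. A rough sketch against the 2.x-era API, where the class is
String::ExternalAsciiStringResource (it has since been renamed), and
where the buffer has to outlive the string:

#include <v8.h>
#include <string.h>

using namespace v8;

// Minimal external ASCII resource: V8 keeps a pointer to the buffer
// instead of copying it, so the string stays one byte per character.
// The buffer must outlive the string; V8 owns the resource object and
// deletes it once the string is garbage collected.
class AsciiResource : public String::ExternalAsciiStringResource {
 public:
  AsciiResource(const char* data, size_t length)
      : data_(data), length_(length) {}
  const char* data() const { return data_; }
  size_t length() const { return length_; }
 private:
  const char* data_;
  size_t length_;
};

Handle<String> MakeAsciiString(const char* cstr) {
  // Must be called with a HandleScope and an entered Context.
  return String::NewExternal(new AsciiResource(cstr, strlen(cstr)));
}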
