Yeah, this is just the way JavaScript is for now. There is an issue open for
it:
http://code.google.com/p/v8/issues/detail?id=761

You don't run into the problem with HTML because that goes straight through
WebKit; it never hits V8.
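For background on what "32-bit Unicode characters" look like inside a
JavaScript string: code points outside the BMP are stored as two UTF-16 code
units (a surrogate pair). A minimal sketch of that encoding, where
toSurrogatePair is a hypothetical helper (not part of Node's API), just to
illustrate the mechanics:

```javascript
// Encode a supplementary-plane code point (> U+FFFF) as the UTF-16
// surrogate pair that JavaScript strings actually store.
// toSurrogatePair is a hypothetical helper, not a built-in.
function toSurrogatePair(codePoint) {
  if (codePoint < 0x10000) {
    return String.fromCharCode(codePoint); // BMP: a single code unit
  }
  var offset = codePoint - 0x10000;
  var high = 0xD800 + (offset >> 10);   // high (lead) surrogate
  var low  = 0xDC00 + (offset & 0x3FF); // low (trail) surrogate
  return String.fromCharCode(high, low);
}

var clef = toSurrogatePair(0x1D11E);          // U+1D11E MUSICAL SYMBOL G CLEF
console.log(clef.length);                     // 2 -- two UTF-16 code units
console.log(clef.charCodeAt(0).toString(16)); // d834
console.log(clef.charCodeAt(1).toString(16)); // dd1e
```

Whether such a pair survives the trip to the terminal then depends on how the
runtime converts those code units to UTF-8 on output, which is exactly what
the issue above is about.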

On Tue, Aug 9, 2011 at 1:31 PM, ~flow <[email protected]> wrote:

> Hi, I would like to cross-post a question:
>
>
> http://stackoverflow.com/questions/6985851/how-to-render-32bit-unicode-characters-in-google-v8-and-nodejs
>
> The whole issue boils down to this: when I use JavaScript to write
> strings with 32-bit Unicode characters into an HTML page displayed
> inside Google Chrome, I take it that the string must pass through that
> browser's JavaScript engine, so it's passed through V8---correct? I
> observe that such characters are rendered correctly.
>
> However, when I do the equivalent using Node.js 0.4.10 and the
> console.log() method, all I get is those annoying ����. My sources are
> in UTF-8 and my terminal is correctly configured (I have been doing
> this stuff for years with Python, so I can be sure about that).
>
> My understanding is that Unicode support in JavaScript is deeply
> flawed and that the V8 team is committed to sticking to the standard
> as closely as possible, for understandable reasons. But somehow the
> people who built the Chrome browser must have found a way to minimize
> the impact of this difficult ECMA legacy---what does document.write()
> do that console.log() can't?
>
> I just can't believe that a platform so finely crafted will go on
> mangling every single character outside the Unicode BMP...
>
> --
> v8-users mailing list
> [email protected]
> http://groups.google.com/group/v8-users
>
