Seems like this conversation stalled. I'm still curious to learn why
the ASCII encoding arbitrarily converts 0 to 32 when it encounters one,
and, if that's such a good idea, why not do the same for UTF-8?

In the meantime, I'm zeroing the most significant bit and using the
UTF-8 decoder.
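
Roughly, the workaround looks like this (a minimal C++ sketch of what
I mean, not actual V8 API; the buffer type and function name here are
just for illustration):

    #include <cstdint>
    #include <string>
    #include <vector>

    // Clear the most significant bit of every byte so the buffer is
    // guaranteed 7-bit ASCII, then hand it to whatever UTF-8 decoder
    // you already have. 7-bit ASCII is valid UTF-8, so it decodes fine.
    std::string MaskToSevenBit(const std::vector<uint8_t>& bytes) {
      std::string out;
      out.reserve(bytes.size());
      for (size_t i = 0; i < bytes.size(); ++i) {
        out.push_back(static_cast<char>(bytes[i] & 0x7F));  // zero the MSB
      }
      return out;
    }

The point is just that once the high bit is cleared, every byte is in
the 0-127 range, so the UTF-8 decoder accepts it without complaint.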

Alan Gutierrez

On Wed, Sep 22, 2010 at 11:30 AM, Alan Gutierrez <[email protected]> wrote:
> So, what does '\0' map to in ASCII? Its character code value is 0.
>
> Are you correcting semantics or are you adding something to the discussion?
>
> Alan Gutierrez - [email protected] - http://twitter.com/bigeasy
>
> On Wed, Sep 22, 2010 at 6:57 AM, Camilo Aguilar <[email protected]> wrote:
>> No, it isn't. The valid ascii character is NUL or 000 or 0 in Char, Oct, Dec
>> and Hex respectively.

-- 
v8-users mailing list
[email protected]
http://groups.google.com/group/v8-users