I wrote a new script that generates a regular expression matching valid 
JavaScript identifiers as per ECMAScript 5.1 / Unicode v6.2.0: 
http://mathiasbynens.be/demo/javascript-identifier-regex
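
For reference, here's a rough way to check a name against the engine 
itself, independently of the generated regex (a sketch only; the helper 
name is mine, and it's a loose check, since a "name" containing e.g. `=1` 
would slip through):

    // Does the current engine parse `name` as a complete identifier?
    // `Function` throws a SyntaxError if the declaration is invalid.
    function isValidIdentifier(name) {
      try {
        Function('var ' + name + ';');
        return true;
      } catch (exception) {
        return false;
      }
    }
    console.log(isValidIdentifier('foo'));  // true
    console.log(isValidIdentifier('1foo')); // false (can't start with a digit)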

Then I made the script do the same for the latest ECMAScript 6 draft, 
which refers to Unicode Standard Annex #31: Unicode Identifier and Pattern 
Syntax (http://www.unicode.org/reports/tr31/).

After comparing the output, I noticed that the two regular expressions are 
identical except for one difference: ECMAScript 5 allows U+2E2F VERTICAL 
TILDE in `IdentifierStart` and `IdentifierPart`, but ECMAScript 6 / 
Unicode TR31 doesn’t.
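
For what it’s worth, the cause seems to be that U+2E2F is a Modifier 
Letter (General_Category Lm), so ES5’s UnicodeLetter production admits it, 
while it also has Pattern_Syntax=Yes, which UAX #31 subtracts when 
deriving ID_Start and ID_Continue; as far as I can tell, it’s the only 
letter-category character with that combination. In an engine with Unicode 
property escapes (a feature that postdates this draft, but it makes the 
difference easy to see):

    // \p{L} mirrors ES5's UnicodeLetter; \p{ID_Start} is the UAX #31 set.
    var unicodeLetter = /^\p{L}$/u;
    var idStart = /^\p{ID_Start}$/u;
    console.log(unicodeLetter.test('\u2E2F')); // true, since U+2E2F is gc=Lm
    console.log(idStart.test('\u2E2F'));       // false, Pattern_Syntax excludes it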

Was this potentially breaking change intentional? I’m fine with disallowing 
U+2E2F, but only if we’re sure it doesn’t break any existing code.

Mathias
http://mathiasbynens.be/