> On 16 January 2012 19:35, Brendan Eich <bren...@mozilla.org> wrote:
>>> new ES6 features to classic mode, as has been proposed by several
>>> people, clearly works against (3),
>> I object. 3 is misstated to assume "switching" means all-or-nothing.
>> If ES6 has new features and they are worth using, developers will
>> want to use them piecewise and conveniently. Assuming they will do
>> so at the price of a leading "use version 6" or equivalent pragma is
>> debatable. We shouldn't just assume this conclusion.
> OK. But that sounds like a clear departure from the "ES.next/Harmony
> is based on strict mode" axiom that everybody seemed to have agreed
> on long ago. Do we have consensus on abandoning that goal? (Thus my
> description as "backporting".)
It's not quite the departure you might think. E.g. destructuring formal
parameters piggy-backs on strict mode logic to forbid duplicates
anywhere in the parameter list. More below.
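To make the piggy-backing concrete, here is a sketch in code (my
illustration, with made-up function names; no pragma anywhere):

  // Classic mode tolerates duplicate formal parameters:
  function dup(x, x) { return x; }        // legal non-strict ES5

  // But a destructuring pattern anywhere in the list triggers the
  // strict-style duplicate check for the whole parameter list:
  function dup2(x, x, {y}) { return y; }  // early SyntaxError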
>>> and consequently, also against (5/6).
>> This does not follow. Users transitioning to new features may be
>> helped, not hindered, by us spec and implementation editors working
>> harder. In particular, allowing, e.g., destructuring in non-strict
>> code may be a boon to developers, whereas requiring them to maintain
>> explicit opt-in pragmas before doing so, simply for the convenience
>> of spec editors and implementors, may be a global loss.
> Adding new features to both modes is almost guaranteed to add new
> items to the list of non-strict vs. strict mode differences. That
> implies new refactoring risks. I don't see a way around that.
I do. We have strict mode changes, both early errors and runtime changes
in meaning. For the early errors, we piggy-back to make new syntax in
non-strict code trigger the same error, as with the destructuring
parameters example above. For runtime shifts in meaning, we can still
piggy-back if the change is "enclosed" by the new syntax.
What do I mean by enclosed? Consider destructuring parameters. Should
they trigger strict arguments object semantics, a runtime shift? No, the
destructuring pattern in one formal parameter does not enclose the
whole parameter list, or body uses of arguments. So what should
arguments semantics be? Non-strict.
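A sketch of the behavior being argued for, in otherwise classic-mode
code (function name is mine, for illustration):

  function move({x, y}) {
    // arguments stays available and reflects the actuals:
    console.log(arguments.length);  // 1
    console.log(arguments[0].x);    // 1 -- the whole actual argument
    return x + y;
  }
  move({x: 1, y: 2});               // logs 1 and 1, returns 3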
Why isn't this a problem? Because destructuring parameters have no named
formal for the whole object passed as the actual and destructured by
the pattern. There's no need to alias, and we do not want deep-aliasing,
ever (we = users, implementors).
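Contrast the aliasing that a plain named formal gets in non-strict
code: the pattern binds x, not the incoming object itself, so there is
nothing for arguments[0] to alias. Again a sketch, not spec text:

  function plain(a) {
    arguments[0] = 99;
    return a;                      // 99: classic non-strict aliasing
  }
  function patterned({x}) {
    arguments[0] = {x: 99};
    return x;                      // still 1: no (deep) aliasing
  }
  plain(1);                        // 99
  patterned({x: 1});               // 1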
Is this split between enabling early strict-like errors for duplicates,
but not triggering strict arguments, a refactoring risk? I don't know. I
think we should look at it. It could be a risk, certainly -- but it may
be tiny.
>>> Similarly, opting in at smaller scope, as has also been discussed,
>>> is a blatant violation of (6) and (7), and works against (3), too.
>> Let's agree, based on your earlier arguments and also on Dave's
>> point that lexical scope (free variable error analysis) won't work
>> if opt-in is too local.
> OK, I'm really glad. :)
See, toldya es-discuss is not scary ;-).
>> Meanwhile, JS1 grew via ES1-5 and various implementations
>> (SpiderMonkey and Rhino) without any such opt-in, *except* for two
>> keywords we couldn't reserve (and some incompatible change attempts
>> that failed, vs. goal 8 -- one example: I tried to make == and !=
>> strict in JS1.2 to get ES1 to switch, but that broke the web,
>> wherefore === and !==).
> Doesn't that kind of make my point? Breaking changes, even small
> ones, need opt-in.
No, JS1.2 with its version (via the old language="JavaScript1.2"
attribute, remember that?) failed. ES1 changed JS de-facto standards
without opt-in. Same for ES2 and 3. We supported versioning via
language= and later type= but it was hardly used outside of
Mozilla-specific JS.
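For anyone who doesn't remember those forms, roughly (these are the
historical Netscape and Mozilla-specific spellings):

  <!-- the old versioned attribute, JS1.2 era: -->
  <script language="JavaScript1.2">
    // ran only in engines claiming 1.2 support
  </script>

  <!-- the later Mozilla-specific opt-in, rarely seen on the web: -->
  <script type="application/javascript;version=1.8">
    // Firefox-only access to newer SpiderMonkey features
  </script>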
Developers do not want version opt-in, certainly not as a requirement.
One JS.
Have to stop here, out of time. Please feel free to reply to anything;
I hope I didn't miss something later in your message that invalidates
what I wrote here!
/be