On Nov 21, 2011, at 8:49 AM, Allen Wirfs-Brock wrote:

> On Nov 20, 2011, at 8:30 PM, Brendan Eich wrote:
>
>> On Nov 20, 2011, at 1:18 PM, David Herman wrote:
>>
>>>> I would not add more implicit magic to JS. E4X had junk like this in it,
>>>> which only ever concealed bugs.
>>>
>>> I'm of two minds about this. In the abstract, I agree with Brendan;
>>> fail-soft conceals bugs. But in reality, our destructuring logic is
>>> incredibly fail-soft. Hardly anything in destructuring is treated as an
>>> error. And the syntax really *wants* to match the common pattern. So I'm
>>> torn.
>>
>> 1. Failing to write that means a destructuring parameter with default values
>> within the pattern cannot be observed via arguments[i] as undefined (or
>> null?). If missing, the undefined will be replaced by a fresh object. This
>> isn't consistent with any other combination of destructuring parameters and
>> parameter default values.
>
> Actually, I've specified parameter default value initialization such that the
> arguments object is an array of the actual argument values. It contains no
> default value substitutions. Even if function f({a,b}) is interpreted as
> function f({a,b}=undefined), the value of arguments[0] for a call of the form
> f(undefined) would still be undefined.
>
> I'm not particularly in favor of treating undefined/null as { }, but I don't
> think the arguments object is particularly relevant to the issue.
If you had not preserved the actual undefined, I would argue differently. We all hate arguments, but chipping away at consistency involving observations made using it has a bad smell.

What about points 2 and 3?

/be
_______________________________________________
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss
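[Editorial sketch of the semantics Allen describes, as it ended up in ES6: the arguments object reflects the actual argument values, with no default-value substitutions. The explicit `= {a: 1, b: 2}` parameter default below is an illustrative addition, not from the thread, so the call with `undefined` is runnable rather than throwing on destructuring.]

```javascript
"use strict";

// The parameter default is applied when the caller passes undefined,
// but `arguments` still holds exactly what the caller passed.
function f({a, b} = {a: 1, b: 2}) {
  return { a, b, actualArg: arguments[0], argCount: arguments.length };
}

const r = f(undefined);
console.log(r.a, r.b);     // 1 2       (default object used for destructuring)
console.log(r.actualArg);  // undefined (arguments[0] is the actual value)
console.log(r.argCount);   // 1
```

So even though `a` and `b` were filled in from the default, `arguments[0]` observes the caller's `undefined`, which is the consistency Brendan's reply is defending.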