On 14/02/2013 18:11, Andreas Rossberg wrote:
On 13 February 2013 13:39, David Bruant <bruan...@gmail.com> wrote:
Warning: In this post, I'll be diverging a bit from the main topic.
On 12/02/2013 14:29, Brendan Eich wrote:
Loss of identity, extra allocations, and forwarding overhead remain
problems.
I'm doubtful loss of identity matters often enough to be a valid argument
here. I'd be interested in being proved wrong, though.
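(For readers following along, here is a minimal illustration of what "loss of identity" means in practice; the variable names are mine:)

var target = { a: 32 };
var wrapped = new Proxy(target, {});
wrapped === target; // false: the proxy has its own identity
var seen = new Set([target]);
seen.has(wrapped); // false: identity-based lookups miss the proxy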
I understand the point about extra allocation. I'll talk about that below.
The forwarding overhead can be made nonexistent in the very case I
described: since the traps you care about are absent from the handler,
engines are free to optimize [[Get]] and friends as operations applied
directly to the target.
You're being vastly over-optimistic about the performance and the
amount of optimisation that can realistically be expected for proxies.
Proxies are inherently unstructured, higher-order, and effectful,
which defeats any sufficiently simple static analysis. A compiler has
to work much, much harder to get useful results. Don't expect anything
anytime soon.
var handler = { set: function () { throw new TypeError(); } };
var p = new Proxy({ a: 32 }, handler);
p.a; // no "get" trap on the handler, so this read forwards to the target and yields 32
It's possible *at runtime* to notice that the handler of p doesn't have
a get trap, optimize p.[[Get]] as target.[[Get]] and guard this
optimization on handler modifications. Obviously, do that only if the
code is hot.
I feel it's not much more work than what JS engines do currently, and the
useful result is effectively getting rid of the forwarding overhead.
Is this vastly over-optimistic?
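To make that concrete, here is a rough JS-level sketch of what the guarded fast path amounts to (purely illustrative; a real engine would do this on its internal representations, and the function name is mine):

// Illustrative sketch, not real engine code.
function proxyGet(target, handler, key) {
  var trap = handler.get;
  if (trap === undefined) {
    // Fast path: no "get" trap. An engine can cache this outcome,
    // compile p.[[Get]] down to target.[[Get]], and invalidate the
    // compiled code if a "get" property ever appears on the handler.
    return target[key];
  }
  // Slow path: call the trap, as the spec requires.
  return trap.call(handler, target, key, target);
}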
I've seen this in previous work on a Chrome extension, where someone
sealed an object as a form of documentation, to express "I need these
properties to stay in the object". It looked like:
function C() {
  // play with |this|
  return Object.seal(this);
}
My point here is that people do want to protect their objects' integrity
against "untrusted parties", which in that case just meant "people who'll
contribute to this code in the future".
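(To show what that protection buys in strict mode, a minimal example; the property x is a made-up stand-in for the real initialization:)

"use strict";
function C() {
  this.x = 1; // stand-in for the real initialization
  return Object.seal(this);
}
var c = new C();
c.x = 2;    // fine: sealed objects keep existing properties writable
c.y = 3;    // TypeError: can't add a property to a sealed object
delete c.x; // TypeError (if reached): can't delete a non-configurable property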
Anecdotally, the person removed the Object.seal call before the return for
performance reasons, based on a JSPerf test [3].
Interestingly, a JSPerf test with a proxy-based solution [4] might have
convinced them to use proxies instead of Object.seal.
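For concreteness, one proxy-based take on seal-like protection could look as follows (my sketch; I don't know exactly what the test in [4] used, and the helper name is mine):

function sealedView(target) {
  return new Proxy(target, {
    set: function (t, key, value) {
      if (!(key in t)) { throw new TypeError("can't add property " + key); }
      t[key] = value; // existing properties stay writable
      return true;
    },
    deleteProperty: function (t, key) {
      throw new TypeError("can't delete property " + key);
    },
    defineProperty: function (t, key, desc) {
      if (!(key in t)) { throw new TypeError("can't add property " + key); }
      Object.defineProperty(t, key, desc);
      return true;
    }
    // No "get" trap: reads forward directly to the target, which is
    // exactly the case where an engine could skip the proxy machinery.
  });
}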
Take all these JSPerf micro-benchmark games with two grains of salt;
... that's exactly what I said right after :-/
"But that's a JSPerf test and it doesn't really measure the GC overhead
of extra objects."
"JSPerf only measures one part of perf the story and its nice conclusion
graph should be taken with a pinch of salt."
lots of them focus on premature optimisation.
I'm quite aware. I fear the Sphinx [1]. I wrote "might have convinced
them to use proxies instead of Object.seal". I didn't say I agreed, and I
actually don't.
Also, seal and freeze
are far more likely to see decent treatment than proxies.
Why so?
But more importantly, I think you get too hung up on proxies as the
proverbial hammer. Proxies are very much an expert feature. Using them
for random micro abstractions is like shooting birds with a nuke. A
language that makes that necessary would be a terrible language. All
programmers messing with home-brewed proxies on a daily basis is a
very scary vision, if you ask me.
hmm... maybe.
David
[1] https://twitter.com/ubench_sphinx