Warning: In this post, I'll be diverging a bit from the main topic.
On 12/02/2013 14:29, Brendan Eich wrote:
Loss of identity, extra allocations, and forwarding overhead remain
problems.
I'm doubtful loss of identity matters often enough to be a valid
argument here. I'd be interested in being proved wrong, though.
I understand the point about extra allocation. I'll talk about that below.
The forwarding overhead can be made nonexistent in the very case I
described: the traps you care about are absent from the handler, so
engines are free to optimize [[Get]] & friends as operations applied
directly to the target.
A handler-wise write barrier can deoptimize this, but in most practical
cases the deoptimization won't happen, because in most practical cases
handlers don't change.
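To make that concrete, here is a minimal sketch (mine, not from the thread): only a mutation trap is defined, so reads have nothing to go through.

// Illustrative sketch: the handler defines only "set", so [[Get]] and
// [[Has]] have no trap and can be applied directly to the target.
var target = { x: 1 };
var handler = {
  set: function(){ throw new Error('nope'); }
};
var p = new Proxy(target, handler);

p.x;       // no "get" trap in the handler: the engine can read target.x directly
'x' in p;  // same for "has"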
It seems to me that you are focusing too much on "share ... to
untrusted parties."
Your very own recent words [1]:
"In a programming-in-the-large setting, a writable data property is
inviting Murphy's Law. I'm not talking about security in a mixed-trust
environment specifically. Large programs become "mixed trust", even when
it's just me, myself, and I (over time) hacking the large amount of code."
...which I agree with (obviously?)
And "Be a better language for writing complex applications" is in the
first goals [2]
Maybe I should use a term other than "untrusted parties". What I mean is
"any code that manipulates something without necessarily caring to learn
what that something expects as preconditions and what its own invariants are".
This includes security issues of course, but also buggy code (which, in
big applications, is often related to a mismatch between a
precondition/expectation and how something is actually used).
I've seen this first-hand while working on a Chrome extension, where
someone sealed an object as a form of documentation to express "I need
these properties to stay in this object". It looked like:
function C(){
  // play with |this|
  return Object.seal(this);
}
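For illustration (my reconstruction of the intent, not the actual extension code), here is what the seal buys in that situation:

'use strict';
// Once the instance is sealed, future contributors can only update the
// properties set up in the constructor; they can't add or remove any.
var c = new C();
c.someNewProp = 1;   // TypeError in strict mode: can't add a property to a sealed object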
My point here is that people do want to protect the integrity of their
objects against "untrusted parties", which in that case just meant
"people who'll contribute to this code in the future".
Anecdotally, the person removed the Object.seal before the return for
performance reasons, based on a JSPerf test [3].
Interestingly, a JSPerf test with a proxy-based solution [4] might have
convinced them to use proxies instead of Object.seal.
But that's a JSPerf test, and it doesn't really measure the GC overhead
of the extra objects. Are there data on this? Are there methodologies to
measure this overhead? I understand the overhead conceptually, but I find
myself unable to pull up numbers on this topic, or convincing arguments
that JSPerf only measures one part of the performance story and that its
nice conclusion graph should be taken with a pinch of salt.
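To be concrete about what I'd like to measure, here's a rough sketch, assuming a V8-based shell such as Node.js started with --expose-gc (the methodology is very much up for debate; it only approximates retained heap per object, not GC pauses or deoptimizations):

// Rough sketch (assumes Node.js started with --expose-gc so global.gc exists).
function thrower(){ throw new Error('nope'); }
var handler = { set: thrower, defineProperty: thrower, deleteProperty: thrower };

function measure(make, n){
  global.gc();
  var before = process.memoryUsage().heapUsed;
  var keep = [];
  for (var i = 0; i < n; i++) keep.push(make());
  global.gc();
  var after = process.memoryUsage().heapUsed;
  return (after - before) / keep.length;  // keep is read here, so the objects stay alive across the gc
}

var N = 100000;
console.log('bytes/object, sealed :', measure(function(){ return Object.seal({a: 1, b: 2}); }, N));
console.log('bytes/object, proxied:', measure(function(){ return new Proxy({a: 1, b: 2}, handler); }, N));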
It's true you want either a membrane or an already-frozen object in
such a setting.
Not a membrane, just a proxy that protects its target. Objects linked
from the proxy likely came from somewhere else; they're in charge of
deciding their own "integrity policy".
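To illustrate the difference (my sketch, using the makeFrozen wrapper from my earlier message, quoted below):

// A single protecting proxy, unlike a membrane, doesn't wrap what's
// reachable from its target.
var inner = { count: 0 };
var outer = makeFrozen({ child: inner });

outer.child === inner;   // true: the child comes back unwrapped
outer.child.count = 1;   // allowed: the child enforces its own integrity policy (or doesn't)
try { outer.child = {}; } catch (e) { /* Error: nope, from the proxy's set trap */ }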
And outside of untrusted parties, frozen objects have their uses --
arguably more over time with safe parallelism in JS.
Arguably indeed. I would love to see this happen.
Still, even if (deeply) frozen "POJSOs" could one day be shared among
contexts, I think we can agree that this wouldn't apply to frozen proxies
for a long time (ever?).
I went a bit too far in suggesting frozen objects could de facto disappear
with proxies.
I'm still unclear on the need for specific seal/freeze/isSealed/isFrozen
traps.
David
[1] https://mail.mozilla.org/pipermail/es-discuss/2013-February/028724.html
[2] http://wiki.ecmascript.org/doku.php?id=harmony:harmony#goals
[3] http://jsperf.com/object-seal-freeze/
[4] http://jsperf.com/object-seal-freeze/2
/be
David Bruant wrote:
Hi,
The main use case (correct me if I'm wrong) for freezing/sealing an
object is sharing it with untrusted parties while preserving its
integrity. There is also the tamper-proofing of objects everyone has
access to (Object.prototype in the browser).
In a world with proxies, it's easy to build new objects with high
integrity without Object.freeze: build your object, share only a
wrapped version with untrusted parties, and let the handler take care
of the integrity.
// Shared trap that rejects any attempted mutation.
function thrower(){
  throw new Error('nope');
}

var frozenHandler = {
  set: thrower,
  defineProperty: thrower,
  deleteProperty: thrower
};

function makeFrozen(o){
  return new Proxy(o, frozenHandler);
}
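For example (the names here are only for illustration):

// The owner keeps a direct, mutable reference to the target and only
// hands out the wrapped view.
var secret = { token: 'abc' };
var shared = makeFrozen(secret);

shared.token;                               // 'abc' (no get trap: forwarded to the target)
try { shared.token = 'xyz'; } catch (e) {}  // Error('nope') from the set trap
try { delete shared.token; } catch (e) {}   // Error('nope') from the deleteProperty trap
secret.token = 'def';                       // the owner can still update the unwrapped target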
This is true to the point that I wonder why anyone would call
Object.freeze on script-created objects any longer... By design and
for good reasons, proxies are a subset of "script-created objects",
so my previous sentence includes: "I wonder why anyone would call
Object.freeze on proxies..."
There were concerns about Object.freeze/seal being costly on proxies
if defined as preventExtensions + enumerate + nbProps * defineProperty.
Assuming Object.freeze becomes de facto deprecated in favor of
proxy-wrapping for high-integrity use cases, maybe that cost is not
that big of a deal.
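For reference, here's roughly what that expansion looks like when expressed with existing operations (my approximation, not spec text; getOwnPropertyNames stands in for the enumeration step):

// One preventExtensions, one listing of own properties, and one
// defineProperty per property.
function freezeViaTraps(obj){
  Object.preventExtensions(obj);                             // preventExtensions trap
  Object.getOwnPropertyNames(obj).forEach(function(name){    // getOwnPropertyNames trap
    var desc = Object.getOwnPropertyDescriptor(obj, name);   // getOwnPropertyDescriptor trap
    var update = { configurable: false };
    if ('writable' in desc) update.writable = false;         // data properties also become non-writable
    Object.defineProperty(obj, name, update);                // defineProperty trap, nbProps times
  });
  return obj;
}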
David
_______________________________________________
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss