Re: Deep cloning objects defined by JSON.

2012-01-29 Thread Peter van der Zee
On Sun, Jan 29, 2012 at 10:09 PM, Xavier MONTILLET wrote:
> I think it should keep them. Douglas did something to allow cyclic
> references ( https://github.com/douglascrockford/JSON-js/blob/master/cycle.js
> ) and he probably wouldn't have done that if it had no use. Plus,
> you're talking about cloning data structures, and a graph is one.
>
> And if you do not allow cyclic references, you still have to do
> something about them. Silently ignoring such properties will probably
> make devs wonder why they weren't cloned, and throwing an error is,
> IMHO, not the kind of behavior you want.

Ah, I'm sorry, you're right. Cyclic refs do have to be taken care of,
otherwise you'll end up in an endless loop. I have no opinion on the
matter and would suggest tackling it the way JSON.stringify does now,
whatever way that might be. IMHO, the cloned object (for JSON.parse)
should not have any refs, though.
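
For reference, the way JSON.stringify handles it today is simply to
throw, so the round-trip never loops forever; it just refuses cyclic
input. A quick illustration (the exact error message is engine-specific):

var o = {};
o.self = o;          // create a cycle
JSON.stringify(o);   // throws TypeError, e.g. "Converting circular structure to JSON" in V8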

- peter


Re: Deep cloning objects defined by JSON.

2012-01-29 Thread Xavier MONTILLET
I think it should keep them. Douglas did something to allow cyclic
references ( https://github.com/douglascrockford/JSON-js/blob/master/cycle.js
) and he probably wouldn't have done that if it had no use. Plus,
you're talking about cloning data structures, and a graph is one.

And if you do not allow cyclic references, you still have to do
something about them. Silently ignoring such properties will probably
make devs wonder why they weren't cloned, and throwing an error is,
IMHO, not the kind of behavior you want.
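
For reference, my understanding of how that cycle.js file is used (it adds
JSON.decycle and JSON.retrocycle to the JSON object):

var a = [];
a.push(a);                                     // cyclic structure
var text = JSON.stringify(JSON.decycle(a));    // cycles become {"$ref": <path>} markers
var copy = JSON.retrocycle(JSON.parse(text));  // markers are resolved back into references
copy[0] === copy;                              // true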

On Sun, Jan 29, 2012 at 9:56 PM, Peter van der Zee  wrote:
> On Sun, Jan 29, 2012 at 7:50 PM, Xavier MONTILLET wrote:
>> With your last two implementations, you don't keep cyclic references.
>
> I did not intend to. In fact, my intention was to have a "clean"
> object with just structure (objects and arrays) and primitives.
> Nothing else, especially nothing invisible (like references, object
> instances or attributes). You can save that fancy stuff for
> Object.clone :)
>
> - peter


Re: Deep cloning objects defined by JSON.

2012-01-29 Thread Peter van der Zee
On Sun, Jan 29, 2012 at 7:50 PM, Xavier MONTILLET wrote:
> With your last two implementations, you don't keep cyclic references.

I did not intend to. In fact, my intention was to have a "clean"
object with just structure (objects and arrays) and primitives.
Nothing else, especially nothing invisible (like references, object
instances or attributes). You can save that fancy stuff for
Object.clone :)

- peter


Re: Deep cloning objects defined by JSON.

2012-01-29 Thread Xavier MONTILLET
With your last two implementations, you don't keep cyclic references.

On Sun, Jan 29, 2012 at 7:39 PM, Peter van der Zee  wrote:
> On Sun, Jan 29, 2012 at 7:23 PM, David Bruant  wrote:
>> Based on your description, it seems that the definition would be:
>>
>> JSON.clone = function(o){
>>  return JSON.parse(JSON.stringify(o));
>> }
>
> Yes. You can debate whether it should remove properties it can't
> serialize completely, or define them as null. You can also debate
> whether you want to treat objects that don't directly inherit from
> Object as alien (right now that doesn't seem to be the case, in
> Chrome at least).
>
> JSON.clone({foo:5}) -> {foo:5}
>
> These are the cases I could (easily) see being debatable...
>
> JSON.clone({foo:function(){}}) -> {foo:null} or {}
> function F(){}
> JSON.clone({foo:new F}) -> {foo:{}} or {foo:null} or {}
> var f = new F;
> f.x = 5;
> JSON.clone({foo:f}) -> {foo:{x:5}} or {foo:null} or {}
> F.prototype.x = 5;
> JSON.clone({foo:new F}) -> {foo:{x:5}} or {foo:{}} or {foo:null} or {}
>
> But I guess keeping the same behavior as JSON.stringify for a
> JSON.clone method might be best to keep things consistent. So
> something like Mark's last suggestion, to make the whole thing
> customizable.
>
> - peter


Re: Deep cloning objects defined by JSON.

2012-01-29 Thread Peter van der Zee
On Sun, Jan 29, 2012 at 7:23 PM, David Bruant  wrote:
> Based on your description, it seems that the definition would be:
>
> JSON.clone = function(o){
>  return JSON.parse(JSON.stringify(o));
> }

Yes. You can debate whether it should remove properties it can't
serialize completely, or define them as null. You can also debate
whether you want to treat objects that don't directly inherit from
Object as alien (right now that doesn't seem to be the case, in
Chrome at least).

JSON.clone({foo:5}) -> {foo:5}

These are the cases I could (easily) see being debatable...

JSON.clone({foo:function(){}}) -> {foo:null} or {}
function F(){}
JSON.clone({foo:new F}) -> {foo:{}} or {foo:null} or {}
var f = new F;
f.x = 5;
JSON.clone({foo:f}) -> {foo:{x:5}} or {foo:null} or {}
F.prototype.x = 5;
JSON.clone({foo:new F}) -> {foo:{x:5}} or {foo:{}} or {foo:null} or {}

But I guess keeping the same behavior as JSON.stringify for a
JSON.clone method might be best to keep things consistent. So
something like Mark's last suggestion, to make the whole thing
customizable.
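
For comparison, the plain JSON.parse(JSON.stringify(...)) round-trip (the
JSON.clone defined above) already answers these cases today: function-valued
properties are dropped entirely rather than nulled, and only own enumerable
properties are serialized, so inherited properties never make it across:

JSON.clone({foo:function(){}});   // -> {} (property dropped, not nulled)
function F(){}
JSON.clone({foo:new F});          // -> {foo:{}}
var f = new F;
f.x = 5;
JSON.clone({foo:f});              // -> {foo:{x:5}}
F.prototype.x = 5;
JSON.clone({foo:new F});          // -> {foo:{}} (inherited x is not serialized)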

- peter


Re: Deep cloning objects defined by JSON.

2012-01-29 Thread Mark S. Miller
On Sun, Jan 29, 2012 at 1:23 PM, David Bruant  wrote:

> Based on your description, it seems that the definition would be:
>
> JSON.clone = function(o){
>  return JSON.parse(JSON.stringify(o));
> }


Or possibly:

  JSON.clone = function(o, opt_reviver, opt_replacer) {
return JSON.parse(JSON.stringify(o, opt_replacer), opt_reviver);
  };
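
A quick sketch of how those extra arguments could be used with that shape;
the property names here are made up for illustration. The replacer drops
"private" keys on the way out, and the reviver turns the serialized date
string back into a Date on the way in:

  var original = { when: new Date(), _cache: {}, n: 1 };

  var copy = JSON.clone(original,
    function (key, value) {                    // opt_reviver
      return key === "when" ? new Date(value) : value;
    },
    function (key, value) {                    // opt_replacer
      return key.charAt(0) === "_" ? undefined : value;
    });

  // copy.when is a Date again, copy._cache is gone, copy.n === 1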


-- 
Cheers,
--MarkM


Re: Deep cloning objects defined by JSON.

2012-01-29 Thread David Bruant
On 29/01/2012 19:05, Peter van der Zee wrote:
> Why can't we define a JSON.clone() or .deepClone() that would only
> clone properties that are primitives, objects or arrays? If a value is
> (instanceof) Array, copy the index properties and the length value and
> create a new array with that information. If it is an object, create a
> new object and copy all properties with the same restrictions as before.
>
> I suppose another discussion is whether you'd want/need to copy
> property attributes as well. For me, at least for JSON.clone, I would
> be happy with just a clone of the primitive value of a property.
>
> In other words, I think a JSON.clone would work as if the structure
> was first serialized to a string. The serialization would drop any
> value or property that's not a primitive, object or array. Objects and
> arrays are serialized to {} and [] notation. For arrays, only index
> properties are copied (so not even length or other properties). The
> resulting string would then be deserialized by JSON.parse. Of course
> the serialization doesn't need to happen internally, but I hope that
> makes it clear what I mean (it drops all the weird stuff from
> structures, like getters, setters and attributes).
Based on your description, it seems that the definition would be:

JSON.clone = function(o){
  return JSON.parse(JSON.stringify(o));
}
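
For what it's worth, that round-trip already does the "attribute flattening"
Peter describes: getters are evaluated, non-enumerable properties are
skipped, and everything comes back as plain writable/enumerable/configurable
data properties. A small illustration using the JSON.clone defined above:

var src = { get answer() { return 42; } };
Object.defineProperty(src, "hidden", { value: "secret", enumerable: false });

var copy = JSON.clone(src);
copy.answer;                                      // 42, now a plain data property
copy.hidden;                                      // undefined: non-enumerable properties are dropped
Object.getOwnPropertyDescriptor(copy, "answer");
// { value: 42, writable: true, enumerable: true, configurable: true }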

David


Re: Deep cloning objects defined by JSON.

2012-01-29 Thread Mark S. Miller
Since all these candidates can be provided by libraries, this seems like an
area where libraries can explore and compete. If/once a consensus emerges
from such a competition, then we can revisit whether there's really any
reason to standardize.

If we were talking about this in time to prevent or displace html5's (IMO
horrible) "structured clone" operation, I would be much more positive about
rushing something sane through the standardization process. But we are
already too late to achieve that happy ending :(, and I see no other
adequate reason to rush this.

FWIW, I like the idea that any clone operation should be defined as
equivalent to JSON.parse(JSON.stringify(obj, ..), ..), and so I like your
suggestion of putting this on the JSON object. As for a more general
Object.clone, I agree with what I think is Allen's position: There is no
useful clone semantics/contract that applies in a uniform way across
different abstractions, and so it would be inappropriate to add such a
method to Object or Object.prototype. (Allen, apologies if I have
mischaracterized your position.)


On Sun, Jan 29, 2012 at 1:05 PM, Peter van der Zee  wrote:

> Why can't we define a JSON.clone() or .deepClone() that would only
> clone properties that are primitives, objects or arrays? If a value is
> (instanceof) Array, copy the index properties and the length value and
> create a new array with that information. If it is an object, create a
> new object and copy all properties with the same restrictions as before.
>
> I suppose another discussion is whether you'd want/need to copy
> property attributes as well. For me, at least for JSON.clone, I would
> be happy with just a clone of the primitive value of a property.
>
> In other words, I think a JSON.clone would work as if the structure
> was first serialized to a string. The serialization would drop any
> value or property that's not a primitive, object or array. Objects and
> arrays are serialized to {} and [] notation. For arrays, only index
> properties are copied (so not even length or other properties). The
> resulting string would then be deserialized by JSON.parse. Of course
> the serialization doesn't need to happen internally, but I hope that
> makes it clear what I mean (it drops all the weird stuff from
> structures, like getters, setters and attributes).
>
> By putting such a method on JSON, you leave the way open for whatever
> clone should be on Object and still have an intuitive feeling for what
> JSON.clone would probably do (as opposed to Object.clone).
>
> Cloning functions is a dangerous sport anyway due to closures, but I
> don't think anyone would expect JSON.clone to clone functions too.
>
> - peter
>
> On Tue, Jan 24, 2012 at 7:46 PM, Rick Waldron wrote:
> > non-recursive, deep clone by John-David Dalton:
> >
> >
> https://github.com/bestiejs/benchmark.js/blob/master/benchmark.js#L1001-1161
> >
> >
> > Rick
> >
> >



-- 
Cheers,
--MarkM


Re: Deep cloning objects defined by JSON.

2012-01-29 Thread Peter van der Zee
Why can't we define a JSON.clone() or .deepClone() that would only
clone properties that are primitives, objects or arrays? If a value is
(instanceof) Array, copy the index properties and the length value and
create a new array with that information. If it is an object, create a
new object and copy all properties with the same restrictions as before.

I suppose another discussion is whether you'd want/need to copy
property attributes as well. For me, at least for JSON.clone, I would
be happy with just a clone of the primitive value of a property.

In other words, I think a JSON.clone would work as if the structure
was first serialized to a string. The serialization would drop any
value or property that's not a primitive, object or array. Objects and
arrays are serialized to {} and [] notation. For arrays, only index
properties are copied (so not even length or other properties). The
resulting string would then be deserialized by JSON.parse. Of course
the serialization doesn't need to happen internally, but I hope that
makes it clear what I mean (it drops all the weird stuff from
structures, like getters, setters and attributes).

By putting such a method on JSON, you leave the way open for whatever
clone should be on Object and still have an intuitive feeling for what
JSON.clone would probably do (as opposed to Object.clone).

Cloning functions is a dangerous sport anyway due to closures, but I
don't think anyone would expect JSON.clone to clone functions too.

- peter

On Tue, Jan 24, 2012 at 7:46 PM, Rick Waldron  wrote:
> non-recursive, deep clone by John-David Dalton:
>
> https://github.com/bestiejs/benchmark.js/blob/master/benchmark.js#L1001-1161
>
>
> Rick
>
>


Re: Deep cloning objects defined by JSON.

2012-01-24 Thread Rick Waldron
non-recursive, deep clone by John-David Dalton:

https://github.com/bestiejs/benchmark.js/blob/master/benchmark.js#L1001-1161


Rick


Re: Deep cloning objects defined by JSON.

2012-01-24 Thread Xavier MONTILLET
Since, when you clone, you don't want things to interact but there
might still be some shared parts, how about something like this:
http://jsfiddle.net/xavierm02/5JmeU/

Instead of just cloning, you give it an object and it gives you the
number of objects you asked for that "deeply inherit" from the object
you gave. My code does that only for plain objects (i.e. not arrays),
but if made native, it could probably be extended to arrays and you
would use less memory.
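
Roughly, the idea reduces to something like this sketch (the name
deepInherit is made up here, and it handles plain objects only, as
described):

function deepInherit(source) {
  var child = Object.create(source);   // unchanged parts stay shared via the prototype
  Object.keys(source).forEach(function (key) {
    var value = source[key];
    if (value !== null && typeof value === "object" && !Array.isArray(value)) {
      child[key] = deepInherit(value);
    }
  });
  return child;
}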

And about not being able to clone functions, you can do it. All you
need is to keep:
- its name
- its action
- its scope

So if you define another function that simply calls the one you wanted
to clone, you have a cloned function because you keep the scope and
the action. And if you really want the name, you can use new Function
to keep it.
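
A rough sketch of that wrapping idea: the "clone" is a new function that
delegates to the original, so the closed-over scope and the behaviour are
preserved (the name is not, unless you go through new Function as described):

function cloneFunction(fn) {
  var clone = function () {
    return fn.apply(this, arguments);  // same behaviour, same scope
  };
  for (var key in fn) {                // copy expando properties, if any
    if (fn.hasOwnProperty(key)) {
      clone[key] = fn[key];
    }
  }
  return clone;
}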

And for a clone that would clone anything, it could be something like
this: http://jsfiddle.net/xavierm02/pjwvV/

But I'm not sure there would be use cases for it.

On Tue, Jan 24, 2012 at 2:33 AM, Russell Leggett wrote:
>>
>> As part of the HTML5 specification, there is the structured clone
>> algorithm
>>
>>
>> Which is also incapable of copying a function.
>>
>
> True, but the original suggestion was limited only to the JSON subset.
> Structured cloning is strictly more powerful than that, and implemented in
> multiple browsers, so I thought it was relevant to this conversation.
>
> - Russ
>


Re: Deep cloning objects defined by JSON.

2012-01-23 Thread Russell Leggett
>
>
> As part of the HTML5 specification, there is the structured clone
> algorithm
>
>
>
> Which is also incapable of copying a function.
>
>
True, but the original suggestion was limited only to the JSON subset.
Structured cloning is strictly more powerful than that, and implemented in
multiple browsers, so I thought it was relevant to this conversation.

- Russ


Re: Deep cloning objects defined by JSON.

2012-01-23 Thread Rick Waldron


On Jan 23, 2012, at 4:58 PM, Russell Leggett  wrote:

> On Mon, Jan 23, 2012 at 5:37 AM, Herby Vojčík  wrote:
> Allen Wirfs-Brock wrote:
> The following is just speculation... One possible such bottleneck
> might be whole object allocation. A JS clone function probably would
> have to allocate an empty object and then dynamically populate it by
> adding properties one at a time. A native implementation is more
> likely to have the ability to examine a complete object and create, in
> a single primitive operation, a new object with all of the same
> properties as the original object. In other words, a native
> implementation of deep clone is likely to use some sort of shallow
> clone operation that is not available to pure JS code. This suggests
> that a better way to get faster deep cloning functions is to make a
> native shallow clone function available to JS code.
> 
> +1, nice.
> Well, to see if this is the bottleneck, one needs to benchmark, first.
> 
> But I feel the need for a shallow clone in the language. Such an API
> should be there, and if needed, with a native implementation as well.
> 
>  
> As part of the HTML5 specification, there is the structured clone algorithm  

Which is also incapable of copying a function.


> whose primary use case is web workers. It has already been implemented in 
> Chrome and Firefox, I believe. Not sure if that might be a good place to 
> start in possibly exposing that as part of JavaScript.
> 
> If I were to make a feature request here, instead of a clone that might not 
> be terribly useful, I would love to see some better support for Clojure style 
> persistent data structures.
> 
> I know that's a bit of a long shot, but for all the uses of deep cloning I 
> can think of, I feel like persistent data structures are much nicer, and 
> harder to implement in JS.
> 
> - Russ


Re: Deep cloning objects defined by JSON.

2012-01-23 Thread Russell Leggett
On Mon, Jan 23, 2012 at 5:37 AM, Herby Vojčík  wrote:

> Allen Wirfs-Brock wrote:
>
>> The following is just speculation... One possible such bottleneck
>> might be whole object allocation. A JS clone function probably would
>> have to allocate an empty object and then dynamically populate it by
>> adding properties one at a time. A native implementation is more
>> likely to have the ability to examine a complete object and create, in
>> a single primitive operation, a new object with all of the same
>> properties as the original object. In other words, a native
>> implementation of deep clone is likely to use some sort of shallow
>> clone operation that is not available to pure JS code. This suggests
>> that a better way to get faster deep cloning functions is to make a
>> native shallow clone function available to JS code.
>>
>
> +1, nice.
> Well, to see if this is the bottleneck, one needs to benchmark, first.
>
> But I feel the need for a shallow clone in the language. Such an API
> should be there, and if needed, with a native implementation as well.
>
>
As part of the HTML5 specification, there is the structured clone
algorithm, whose primary use case is web workers. It has already been
implemented in Chrome and Firefox, I believe. Not sure if that might be a
good place to start in possibly exposing that as part of JavaScript.

If I were to make a feature request here, instead of a clone that might not
be terribly useful, I would love to see some better support for
Clojure-style persistent data structures.

I know that's a bit of a long shot, but for all the uses of deep cloning I
can think of, I feel like persistent data structures are much nicer, and
harder to implement in JS.

- Russ


Re: Deep cloning objects defined by JSON.

2012-01-23 Thread Herby Vojčík

Allen Wirfs-Brock wrote:

The following is just speculation... One possible such bottleneck
might be whole object allocation. A JS clone function probably would
have to allocate an empty object and then dynamically populate it by
adding properties one at a time. A native implementation is more
likely to have the ability to examine a complete object and create, in
a single primitive operation, a new object with all of the same
properties as the original object. In other words, a native
implementation of deep clone is likely to use some sort of shallow
clone operation that is not available to pure JS code. This suggests
that a better way to get faster deep cloning functions is to make a
native shallow clone function available to JS code.


+1, nice.
Well, to see if this is the bottleneck, one needs to benchmark first.

But I feel the need for a shallow clone in the language. Such an API
should be there, and if needed, with a native implementation as well.
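
For comparison, the closest a library can get today in ES5: copy every own
property, descriptor and all, one at a time. A native primitive could
presumably do the same thing in a single allocation:

function shallowClone(obj) {
  var clone = Object.create(Object.getPrototypeOf(obj));
  Object.getOwnPropertyNames(obj).forEach(function (name) {
    Object.defineProperty(clone, name,
                          Object.getOwnPropertyDescriptor(obj, name));
  });
  return clone;
}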



Allen


Herby


Re: Deep cloning objects defined by JSON.

2012-01-22 Thread Rick Waldron
On Sun, Jan 22, 2012 at 8:13 PM, Wes Garland  wrote:

> On 22 January 2012 16:05, Jake Verbaten  wrote:
>
>> The idea here is that methods do not belong in data structures (clone
>> should be to efficiently clone data).
>>
>
> Method vs. Property is a false dichotomy in functional languages, IMO.  A
> method is merely a property whose value is a function instead of some other
> type.
>

Right, as it is defined in 4.3.27 (see: http://es5.github.com/#x4.3.27) and
they shouldn't be lost as a result of a copy/cloning process.




>
> --
> Wesley W. Garland
> Director, Product Development
> PageMail, Inc.
> +1 613 542 2787 x 102
>


Re: Deep cloning objects defined by JSON.

2012-01-22 Thread Wes Garland
On 22 January 2012 16:05, Jake Verbaten  wrote:

> The idea here is that methods do not belong in data structures (clone
> should be to efficiently clone data).
>

Method vs. Property is a false dichotomy in functional languages, IMO.  A
method is merely a property whose value is a function instead of some other
type.

-- 
Wesley W. Garland
Director, Product Development
PageMail, Inc.
+1 613 542 2787 x 102


Re: Deep cloning objects defined by JSON.

2012-01-22 Thread Allen Wirfs-Brock

On Jan 22, 2012, at 11:58 AM, Jake Verbaten wrote:

> Most of the time the purpose of deep cloning objects is deep cloning data 
> structures. 
> 
> It's been discussed that generically deep cloning proxies, privates and 
> functions is a non-trivial problem. 
> 
> However, it would be of value to have a mechanism to deep clone anything that
> would be valid JSON (limiting to JSON is arbitrary, but it's a well-defined subset
> and none of the subset would involve difficult points to resolve).

So why should we expect a "native" deep clone function to be significantly
faster than a JavaScript version of the same function running on a modern
high-performance JS engine? After all, a clone basically just does function
calls, property lookups, object creation and property creation. These really
are the foundation operations of most data-structure-intensive applications.
If a pure JS deep clone is too slow, then many other data-structure-driven
functions are also going to be too slow. If, in fact, a pure JS implementation
of deep clone on an optimizing engine is still significantly slower than a
native code implementation on the same engine, then perhaps we would be better
served by focusing on eliminating the bottlenecks that slow down the JS version
of deep clone instead of putting the effort into creating a native version of
that particular function.

The following is just speculation... One possible such bottleneck might be
whole object allocation. A JS clone function probably would have to allocate an
empty object and then dynamically populate it by adding properties one at a
time. A native implementation is more likely to have the ability to examine a
complete object and create, in a single primitive operation, a new object with
all of the same properties as the original object. In other words, a native
implementation of deep clone is likely to use some sort of shallow clone
operation that is not available to pure JS code. This suggests that a better
way to get faster deep cloning functions is to make a native shallow clone
function available to JS code.
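
Sketching that last point: given a hypothetical native shallow clone
primitive (called Object.shallowClone below; it does not exist today), a
deep clone in JS reduces to applying that primitive recursively:

function deepClone(value) {
  if (value === null || typeof value !== "object") return value;
  var copy = Object.shallowClone(value);     // hypothetical single-operation copy
  Object.keys(copy).forEach(function (key) {
    copy[key] = deepClone(copy[key]);
  });
  return copy;
}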

Allen








Re: Deep cloning objects defined by JSON.

2012-01-22 Thread gaz Heyes
I was pondering this on Twitter. At first I thought of using cyclic
variables to resolve references to objects within the JSON object, but
actually we just need "this" to work within object literals and be allowed
in the specification. For example, this works currently:
({a:function(){
  return this.b;
},b:123}).a()

But it would be nicer to resolve "this" inside an object literal property
to be the literal itself rather than window or undefined:
({a:this,b:123}).a.b

This would make JSON much smaller and allow circular references without
losing data.
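
For comparison, the status quo: the self-reference has to be patched in
after the literal is evaluated, which is the extra step the idea above
tries to avoid:

var obj = { b: 123 };
obj.a = obj;      // circular reference added explicitly
obj.a.b;          // 123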


Re: Deep cloning objects defined by JSON.

2012-01-22 Thread Jake Verbaten
>
> The idea here is that methods do not belong in data structures (clone
>> should be to efficiently clone data).
>>
>
> This is already too much of an "unfortunate" restriction.
>
> What about calculated "get" properties:
>
> > var o = {
> ...   get foo() {
> ... return "foo";
> ...   }
> ... },
> ... clone = JSON.parse(JSON.stringify(o));
>
> > clone
> { foo: 'foo' }
>

I don't know what the sensible choice here is. Could be either way.

You're right, restrictions are annoying.


Re: Deep cloning objects defined by JSON.

2012-01-22 Thread Rick Waldron
On Sun, Jan 22, 2012 at 4:05 PM, Jake Verbaten  wrote:

> On Sun, Jan 22, 2012 at 8:29 PM, Rick Waldron wrote:
>
>> Potential issues
>>>
>>>  - subset of JSON is too restricted to be useful
>>>
>>
>> This alone seems like a deal-breaker/non-starter. How would you copy
>> methods? Forgetting about cyclic reference exceptions for a moment:
>>
>
> The idea here is that methods do not belong in data structures (clone
> should be to efficiently clone data).
>

This is already too much of an "unfortunate" restriction.

What about calculated "get" properties:

> var o = {
...   get foo() {
... return "foo";
...   }
... },
... clone = JSON.parse(JSON.stringify(o));

> clone
{ foo: 'foo' }




> A possible solution would be to allow you to set the [[Prototype]] of the
> returned clone through the API somehow and then store methods on prototypes.
>
> It does gain the benefit of not having to document the edge-case behaviour
> for cloning methods. It would presumably also be an API that can
> efficiently clone the new binary data types. The main purpose is efficient
> in-memory copies of data and not generic cloning of things.
>
> If we add a clone, we probably want to add support for cloning binary data
> types to the list as well.
>


Re: Deep cloning objects defined by JSON.

2012-01-22 Thread Jake Verbaten
On Sun, Jan 22, 2012 at 8:29 PM, Rick Waldron wrote:

> Potential issues
>>
>>  - subset of JSON is too restricted to be useful
>>
>
> This alone seems like a deal-breaker/non-starter. How would you copy
> methods? Forgetting about cyclic reference exceptions for a moment:
>

The idea here is that methods do not belong in data structures (clone
should be to efficiently clone data). A possible solution would be to allow
you to set the [[Prototype]] of the returned clone through the API somehow
and then store methods on prototypes.
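
A rough sketch of that shape (the jsonCloneWithProto helper below is
hypothetical, just illustrating "set the [[Prototype]] through the API and
keep methods on prototypes"):

function jsonCloneWithProto(data, proto) {
  var plain = JSON.parse(JSON.stringify(data));       // data-only copy
  var clone = Object.create(proto || Object.prototype);
  Object.keys(plain).forEach(function (key) {
    clone[key] = plain[key];
  });
  return clone;
}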

It does gain the benefit of not having to document the edge-case behaviour
for cloning methods. It would presumably also be an API that can
efficiently clone the new binary data types. The main purpose is efficient
in-memory copies of data and not generic cloning of things.

If we add a clone, we probably want to add support for cloning binary data
types to the list as well.


Re: Deep cloning objects defined by JSON.

2012-01-22 Thread Rick Waldron
>
> Potential issues
>
>  - subset of JSON is too restricted to be useful
>

This alone seems like a deal-breaker/non-starter. How would you copy
methods? Forgetting about cyclic reference exceptions for a moment:


var o = {
  s: "string",
  n: 1,
  a: [ 1, 2, 3, 4 ],
  o: {
method: function( prop ) {
  return "stuff";
},
n: null,
u: undefined
  },
  bt: true,
  bf: false
},
clone = JSON.parse(JSON.stringify(o));

> clone

{
  s: 'string',
  n: 1,
  a: [ 1, 2, 3, 4 ],
  o: {
n: null
  },
  bt: true,
  bf: false
}


While it has the benefit of losing all of its references to the original
object, it also loses any methods and any initialized-but-unassigned
(undefined) properties. Security concerns trump the inclusion of methods in
valid JSON.

Rick





>  - Proxies/private state may cause issues (the same issues would apply to
> JSON.stringify ?)
>  - What's the value of the [[Prototype]] of the clone? (JSON.parse uses
> the standard [[Prototype]] for the stringified object)
>  - Do we expect anything sensible to happen with host objects? (JSON.parse
> returns objects with few or no properties for host objects)
>  - Do we solve cyclic references? (JSON.parse fails on them)
>


Deep cloning objects defined by JSON.

2012-01-22 Thread Jake Verbaten
Most of the time the purpose of deep cloning objects is deep cloning data
structures.

It's been discussed that generically deep cloning proxies, privates and
functions is a non-trivial problem.

However, it would be of value to have a mechanism to deep clone anything
that would be valid JSON (limiting to JSON is arbitrary, but it's a
well-defined subset and none of the subset would involve difficult points
to resolve).

This gives us:

 - An efficient deep clone implementation in the JS engine.
 - A way around the difficulties of deep cloning, by disallowing difficult
objects from being deep cloned.
 - A way to get rid of the clone function that every library ships.

I presume this would literally be a highly optimised version of
JSON.parse(JSON.stringify(o)).
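
Roughly, a library-level version of that restricted clone might look like
the sketch below (no cycle handling, and it sidesteps the open questions
listed under "Potential issues"):

function jsonClone(value) {
  if (value === null || typeof value !== "object") {
    return value;                        // primitives pass through
  }
  if (Array.isArray(value)) {
    return value.map(jsonClone);         // index properties only
  }
  var copy = {};
  Object.keys(value).forEach(function (key) {
    var v = value[key];
    if (typeof v !== "function" && v !== undefined) {
      copy[key] = jsonClone(v);          // drop anything outside the JSON subset
    }
  });
  return copy;
}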

Potential issues

 - subset of JSON is too restricted to be useful
 - Proxies/private state may cause issues (the same issues would apply to
JSON.stringify ?)
 - What's the value of the [[Prototype]] of the clone? (JSON.parse uses the
standard [[Prototype]] for the stringified object)
 - Do we expect anything sensible to happen with host objects? (JSON.parse
returns objects with few or no properties for host objects)
 - Do we solve cyclic references? (JSON.parse fails on them)