On May 28, 2010, at 1:40 PM, Sam Ruby wrote:

On 05/28/2010 01:20 PM, Brendan Eich wrote:

That said, I'm not sure I understand why this should gate anything in
this thread. Value types should be frozen (shallowly immutable)
regardless, or other things break, e.g., they could no longer be
transparently passed by copy. C# got this wrong, and paid a semantic
complexity price we must avoid. Non-frozen structs should not be value
types. Frozen structs could be value types, or could be wrapped in
value types or something.

Agreed. Sam?

There are so many undefined terms in that paragraph that I don't know what I would be agreeing to.

Ok -- sorry about that. Mark and I are definitely on the same page here, and used to the lingo.


For example, I don't know why the word "shallowly" was inserted there. Was that just reflex, or is there an actual requirement to allow object references inside a struct?

Mark was referring to value types, not the recent struct idea. Object.freeze (ES5) is shallow; it does not try to walk a spanning tree of the object graph, freezing whatever it can reach (the Ice-9 disaster!). Henry Baker's egal does shallow bit comparison. That's the key idea. So the freezing need not be deep.
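
A minimal ES5 sketch of that shallowness (the names here are illustrative):

  var inner = { n: 1 };
  var obj = Object.freeze({ p: inner });

  obj.p = null;   // fails (silently, or TypeError in strict mode): obj is frozen
  obj.p.n = 2;    // succeeds: the object reached through obj.p was never frozen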


Looking at the syntax that Brendan put out for discussion purposes, it isn't clear to me how one would do that.

const TA =
  Array.newTypedArray(fixed_length,
    Object.newStructType({x:"u32", y:"u32", z:"u32",
                          r:"u8", g:"u8", b:"u8", a:"u8"}));
let a = new TA(...);

Mark mentions passed by copy. What happens if I pass a[1] as a parameter in a method call? Does something semantically different happen if the struct is frozen vs. non-frozen? Is that complexity worth it?

In the sketchy proposal for structs so far, a[i] reifies an object reflecting the i'th element of a, not a new struct type. The array still can be used to read and write packed, machine-typed, GPU-friendly data, specifically via reads and writes of a[i].x, a[j].y, etc.
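
Under that sketch, element access might look like this (hypothetical API, continuing the TA example above):

  a[0].x = 0xffffffff;  // write a packed u32 member in place
  a[0].r = 255;         // write a packed u8 member in place
  var p = a[0];         // reifies an object reflecting element 0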

But if you extract a[i] and pass it around, you are definitely allocating an object to represent the struct element, and however fast it is to allocate a new object, you're not on the ultra-high-performance path that WebGL wants, and uses today via typed arrays. This is not a loss compared to the typed-arrays-as-views-of-one-shared-byte-buffer idea because there's no way with typed arrays to view or extract the whole element as a struct instance.
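
For comparison, the typed-array approach in use today aliases one shared byte buffer with several views; the 16-byte stride and offsets below are illustrative, following the {x, y, z, r, g, b, a} layout sketched earlier:

  var fixed_length = 1024;                        // assumed element count
  var buf   = new ArrayBuffer(16 * fixed_length); // 16 bytes per element
  var words = new Uint32Array(buf);               // view for x, y, z
  var bytes = new Uint8Array(buf);                // view for r, g, b, a

  function getX(i) { return words[i * 4]; }       // x at word offset 0
  function getR(i) { return bytes[i * 16 + 12]; } // r at byte offset 12

You can load and store members this way, but there is no value you can pull out that *is* the whole element.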

So far, I've been talking about the "structs defined by runtime type descriptors" idea we've entertained as an alternative for the WebGL use-case satisfied by typed arrays.

If you don't insist that structs are value types, then there's no issue here. These structs are mutable, but not extensible; you can write as well as read their members.

Your question about frozen vs. non-frozen: freezing the structs breaks the WebGL use-case in order to make structs be value types. That's no good for the WebGL use-cases, but I'll answer your question anyway: *if* the struct were a value type, then it would be frozen, and loading a[i] to pass as a parameter to a function call would be free to copy the struct data rather than pass an object reference, if doing so were more efficient.

This freedom to copy is not just an optimization win (sometimes, depending on size and other considerations). It's a different type model for programmers to count on. More below.


Putting that aside for the moment, my more specific questions is: under what conditions would it ever be possible for a[1]===a[2] to be true?

BTW, === and other operators are not wanted for WebGL structs. Again you can't even reference the aggregate element (struct) type via typed arrays -- you can just load and store primitive types in *members* of the struct elements via aliasing typed array "views" of the underlying bytes.


There is much wrapped in that simple question. I'm inferring a lot from the discussion: "typed arrays" have a fixed number of elements, each of which has a fixed length. It should be possible for implementations to store entire arrays in contiguous storage.

Right.


As such, a[1] and a[2] in a typed array can never be the same object. Which means that they can never be ===, much less egal. By contrast, they could conceivably be the same object in a "classic" Array, depending on how they were constructed.

That's right, an array of references to objects in the heap can have two elements referring to the same object. Object identity is by address or equivalent "safe pointer".

With anything like struct or value types, the array elements are not safe pointers into the object heap -- they are bits stored in bytes or larger units stored adjacent to one another for a[i] and a[i+1]. Identity is by bit-string value.
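
To make the reference-identity side concrete with today's semantics:

  var o = { x: 1 };
  var arr = [o, o];
  arr[0] === arr[1];   // true: both slots hold a reference to the same object

With the sketched struct arrays, a[1] and a[2] name distinct, adjacent storage, so any equality would have to compare their bits, not their addresses.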


To facilitate discussion, I toss out the following:

 var a = new Array();
 a[0] = "a";
 a[1] = "ab";
 a[0] += "b";

 if (a[0] === a[1]) { ... }

To the casual developer, I will assert that the fact that these strings are treated as being equal is an indication that they have the same value (i.e., sequence of bits) and not an indication that they occupy the same storage location (which could conceivably be true, but that's not generally something the implementation directly exposes).

Notice how it is critical that primitive string instances cannot be mutated. That is, suppose you could assign to an element of a string via a[1][1] = "c", and thereby cause a[1] to be "ac".

Well, no big deal, right? a[1] *was* === "ab", and it is *now* === "ac". Except there could be other references to a[1], passed to a function a second before we observed "ab", or shared via copying references to other vars or properties. Are those references expecting "ac", or will they be surprised not to have received what looks like an immutable string with chars "ab"?

You can make both kinds of strings work, and various programming languages have chosen one way or the other. JS has immutable-seeming strings, however. We are not looking to make an incompatible change to string semantics. But more crucially, we believe the ability to mutate any shared *value* (not a referenced object) is likely to result in more bugs than otherwise.
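
Concretely, in today's JS:

  var s = "ab";
  s[1] = "c";          // no effect (TypeError in ES5 strict mode)
  s;                   // still "ab"
  "ab" === "a" + "b";  // true: equality is by character sequence, not location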

The obvious cases of numeric value types seem to be excluded, but they prove the rule. There's no exception for JS strings. Why should there be one for structs?

Object mutation is the name of the game in JS if you program imperatively by mutating the store, and local variables are not enough. But mutable objects are a source of bugs. Mutation is a bitch. Yet at least with objects, which have "reference type" semantics, it is a known and accepted part of the deal.

String, number, boolean, and possibly decimal, rational, etc. are "value types". You can't change someone's value received via an argument "behind their back." No one can call foo(x = 42) and, for function foo(a) { setTimeout(alert, 0, a); }, have anything other than 42 be alerted, even if the caller of foo(x = 42) immediately does x.mutatingAdd(1). There's no way to implement mutatingAdd on a primitive number. Any such method on a Number object instance that's automatically wrapped around 42 at the point of the method call won't be able to change the primitive value either.
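
A sketch of why not, using the hypothetical mutatingAdd from above:

  Number.prototype.mutatingAdd = function (n) {
    // `this` is a throwaway Number wrapper, not the caller's primitive,
    // so nothing here can change what the caller's variable holds.
    this.hidden = this.valueOf() + n;
  };

  function foo(a) { setTimeout(alert, 0, a); }

  var x;
  foo(x = 42);        // 42 is alerted
  x.mutatingAdd(1);   // touches only a temporary wrapper; the 42 is unchanged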

But structs as sketched for WebGL array-of-struct use cases must be writable. Therefore they can't be value types.


The real question here: is what is currently being called a struct more like a bit string with convenient methods of accessing slices, or is it more like an object, where no matter how close the sequences of bits are, two objects in different locations are never the same?

WebGL structs are the latter, and operators are not wanted anyway.

Value types are the former, which is why I keep urging you not to mix up the very recent "schematic structs for WebGL (instead of typed array views)" idea with the value types proposal at http://wiki.ecmascript.org/doku.php?id=strawman:value_types -- the way to prototype Decimal or any other new numeric type is not as a "struct" of the proposed WebGL-array-of-structs kind, but as a "value type".

Sure, value types might *use* structs-as-proposed (assuming they make a future spec) to store the bits being compared by ===. But that would be up to the value type implementation, and the frozen restriction would apply, overriding any writable-by-default status that structs as hypothesized have.


It might very well be that the requirements are such that the final conclusion will reluctantly be that it isn't worth trying to make === have a sane definition for these structs. That just isn't something I would expect as a starting position.

It is a "starting point" given the value-types-are- (shallowly-)immutable premise recorded in the wiki'd discussion from last fall. But that may not be shared premise. We need to agree that objects are the only reference types, and therefore value types may be copied (not passed by reference) safely without fear of someone changing the received value's bits "behind your back".

/be
_______________________________________________
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss
