On 11.07.2011 16:49, David Bruant wrote:
On 11/07/2011 14:29, Dmitry A. Soshnikov wrote:
On 11.07.2011 2:42, David Bruant wrote:
On 10/07/2011 22:46, Dmitry A. Soshnikov wrote:
Here I put some extensions for arrays standard library (separated from this thread: https://mail.mozilla.org/pipermail/es-discuss/2011-July/015856.html where Array.of and Array.from were considered).

We can consider also the following (as a first step):

*- Array.prototype.remove(value, all)*

[1, 2, 3, 2].remove(2); // [1, 3, 2]
[1, 2, 3, 2].remove(2, true); // [1, 3]

(this function seems to be needed even more than Array.of; at least, I have seen it implemented in all frameworks and have used it myself).

*- Array.prototype.subtract(array)*

[1, 2, 3, 4].subtract([2, 4]); // [1, 3]
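A minimal ES5 sketch of what subtract could look like (the hole-skipping behavior and the non-mutating result are assumptions, mirroring how map/forEach treat holes):

```javascript
// Sketch: return a new array with every value from `other` removed.
Object.defineProperty(Array.prototype, "subtract", {
    value: function (other) {
        var result = [];
        for (var k = 0; k < this.length; k++) {
            if (!(k in this)) continue; // skip holes in sparse arrays
            if (other.indexOf(this[k]) === -1) {
                result.push(this[k]);
            }
        }
        return result;
    },
    configurable: true,
    writable: true
});

console.log([1, 2, 3, 4].subtract([2, 4])); // [1, 3]
```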

*- Array.seq(from, to)* // or Array.range(from, to)

Array.seq(1, 5); // [1, 2, 3, 4, 5]
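A sketch of seq as an inclusive integer range (inclusiveness of `to` is an assumption, taken from the example above):

```javascript
// Sketch: build an inclusive integer range [from..to].
Object.defineProperty(Array, "seq", {
    value: function (from, to) {
        var result = [];
        for (var k = from; k <= to; k++) {
            result.push(k);
        }
        return result;
    },
    configurable: true,
    writable: true
});

console.log(Array.seq(1, 5)); // [1, 2, 3, 4, 5]
```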

*- Array.build(n, fn)*

Array.build(5, function (index) { return index + 1; }); // [1, 2, 3, 4, 5]
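A possible build implementation (passing only the index to the callback is an assumption based on the example above):

```javascript
// Sketch: build an n-element array by calling fn(index) for each slot.
Object.defineProperty(Array, "build", {
    value: function (n, fn) {
        var result = new Array(n);
        for (var k = 0; k < n; k++) {
            result[k] = fn(k);
        }
        return result;
    },
    configurable: true,
    writable: true
});

console.log(Array.build(5, function (index) { return index + 1; })); // [1, 2, 3, 4, 5]
```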

*- Array.min(array), Array.max(array)* (can be implemented with Math.max/min and apply though)

Array.min = (array) -> Math.min.apply(Math, array)
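In plain ES5 syntax, with one caveat: holes in a sparse array arrive as undefined through apply and would turn the result into NaN, so this sketch densifies the array first (whether that is the intended semantics is an assumption):

```javascript
// Sketch: min/max over the *present* elements of an array.
// array.filter(function () { return true; }) drops holes, since
// filter skips them, leaving a dense array for apply.
Object.defineProperty(Array, "min", {
    value: function (array) {
        return Math.min.apply(Math, array.filter(function () { return true; }));
    },
    configurable: true,
    writable: true
});

Object.defineProperty(Array, "max", {
    value: function (array) {
        return Math.max.apply(Math, array.filter(function () { return true; }));
    },
    configurable: true,
    writable: true
});

console.log(Array.min([3, 1, 2])); // 1
console.log(Array.max([3, 1, 2])); // 3
```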

*- Array.prototype.split(n)*

["a", "b", "c", "d", "e"].split(3) // [["a", "b", "c"], ["d", "e", "f"]]

Perhaps even a way to build objects from lists of keys and values (a function along these lines is usually called `zip`):

*- Object.fromArrays(["a", "b", "c"], [1, 2, 3]);* // {a: 1, b: 2, c: 3}
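A sketch of fromArrays (pairing keys with values positionally; behavior when the arrays differ in length is an assumption -- extra keys simply get undefined here):

```javascript
// Sketch: pair keys[i] with values[i] into a plain object.
Object.defineProperty(Object, "fromArrays", {
    value: function (keys, values) {
        var result = {};
        for (var k = 0; k < keys.length; k++) {
            result[keys[k]] = values[k];
        }
        return result;
    },
    configurable: true,
    writable: true
});

console.log(Object.fromArrays(["a", "b", "c"], [1, 2, 3])); // {a: 1, b: 2, c: 3}
```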

*- Array.prototype.unique*

[1, 3, 2, 5, 5, 3].unique(); // [1, 3, 2, 5]
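A sketch of unique that keeps the first occurrence of each value and preserves order, skipping holes (both choices are assumptions based on the example above):

```javascript
// Sketch: return a new array with duplicates removed,
// keeping the first occurrence of each value.
Object.defineProperty(Array.prototype, "unique", {
    value: function () {
        var result = [];
        for (var k = 0; k < this.length; k++) {
            if (!(k in this)) continue; // skip holes in sparse arrays
            if (result.indexOf(this[k]) === -1) {
                result.push(this[k]);
            }
        }
        return result;
    },
    configurable: true,
    writable: true
});

console.log([1, 3, 2, 5, 5, 3].unique()); // [1, 3, 2, 5]
```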

All the method names are, of course, open for discussion.
I like all of these ideas a lot, but I can't help thinking that they do not seem to be aligned with the initial ECMAScript array design, which is that arrays are ECMAScript objects (very different from what we'd understand by "array" in C or "lists" in Erlang, as you cite them). The question I ask for each of your Array.prototype ideas is "how does it apply to non-dense arrays?".


Yes, there is a difference in implementation, but ideologically all these concepts (C arrays, JS arrays, lists, etc.) are close to each other. E.g. Python's lists are similar to JS arrays. Abstract operations are completely OK regardless of the implementation and terminology.

About the question on sparse (non-dense) arrays -- the answer is: the same as the existing array methods (e.g. map, forEach, etc.) currently do when they meet holes -- just skip them.

E.g. for "remove" method:

Object.defineProperty(Array.prototype, "remove", {
    value: function (item, all) {
        for (var k = 0; k < this.length; k++) {
            if (!(k in this)) continue; // support sparse arrays
            if (this[k] === item) {
                this.splice(k, 1);
                if (!all) break;
                k--; // splice shifted elements left; re-check this index
            }
        }
        return this;
    },
    configurable: true,
    writable: true
});

console.log([1, 2, 3, 4, 2].remove(2)); // [1, 3, 4, 2]
console.log([1, 2, 3, 4, 2].remove(2, true)); // [1, 3, 4]

// sparse array
var data = Array(5);
data[3] = 2;
data[0] = 1;
console.log(data); // [1, , , 2, ]
console.log(data.remove(2)); // [1, , , ,]
So .length is kept from the initial array?
If so, then:
console.log([1, 2, 3, 4, 2].remove(2)); // [1, 3, 4, 2, ] // added empty element


No, the length is modified in this particular case by removing the items. I made a typo in the output of the first case: it should be [1, , , 2, ,] (length 5), and then [1, , , ,] (length 4). I.e. just the 2 is removed and the length is decremented; all other positions are kept.



Creating a List or a DenseArray (or both?) type sounds like it would better capture your intentions (especially since you provided a link to Erlang "list" methods).

I still don't see a big issue with handling sparse arrays as shown above. It's in the JS array's nature to be either sparse or dense (regardless of implementation details, of course -- for example, SpiderMonkey may even back [1, 2, 3] with a real low-level C array rather than a generic JS object: it's dense and contains only numbers, so why not allocate a C array -- SM does exactly that).

It could inherit everything from Array.prototype for free.
Actually, this could be implemented with proxies :-)


Sure, but we need a use case for this. Actually, we already have ideas in this direction -- typed arrays, which are effectively dense.
The idea of creating a non-sparse array type is indeed somewhere between the addition of high-level abstractions (maps and sets) and what is done in typed arrays, which are dense by nature.



Since we're suggesting array additions, I would be interested in trying to address one issue of forEach, map, every, some and filter. They all have a well-defined algorithm. Consequently, if the callback function has side effects, these are deterministic. This, however, prevents efficient (parallelized, for instance) implementations. This is unfortunate, since in a lot of cases people don't use side effects and would certainly trade the side-effect determinism guarantee for performance. Could it be considered to add non-deterministic versions of these functions? They would be defined like Array.prototype.sort is, in terms of guarantees (like "the callback function will be called at most once on each array element" for 'every' and 'some', for instance) rather than with an algorithm. I have no strong opinion on how to name them. Maybe adding an N (for "Non-deterministic") at the end of the equivalent method (Array.prototype.forEachN, Array.prototype.mapN, etc.)?


We can think about it, though of course real use cases are required for it. Currently I can say that the algorithms of map, forEach, etc. are quite logical and are how they should be. Do you propose not to check for e.g. holes, and to handle them too?
Sorry for the confusion, my point was not about holes at all. I'm not concerned by holes but by the fact that the current forEach, map, etc. specs are in terms of algorithms. When I write [1, 2, 3, 4].map(function(e){return e*e;}), I do not care about the order in which elements of the new array are created (the relative index order of function calls). However, if engines implemented a non-deterministic .map method, I could notice it through the side effects if my callback function has some, which would make programming unpredictable, etc. As a consequence, the map spec had to be a deterministic algorithm. My point is that the map spec is a deterministic algorithm because side effects would be noticeable otherwise. However, this prevents implementations where function calls would be done in parallel, for instance (for better performance). In some cases (like the one I showed), the exact order in which the function calls are performed does not matter, but I have no way to tell the JS engine "I don't need the execution order guarantee", allowing it to do the calls in parallel. The addition of the functions I suggested would be the way to say it.


I see, so this is for sort-of parallel "map-reduce" requests, to be able to handle an array in parallel? Yes, perhaps, though it's a different, separate topic to discuss. These array-method additions are something simpler than parallel processing with non-deterministic behavior ;)

Dmitry.


Or to handle elements which were added/removed during the enumeration?
Non-deterministic too.

Or how the non-determinism looks like here?
faster :-)
Of course, that's a trade-off between determinism and (potential) performance. The person who'd use forEachN would need to acknowledge the risk of non-determinism.
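To make the contract concrete, here is a toy sketch of a mapN along the lines suggested above. The name mapN is David's suggestion; the only guarantees are that the callback is called at most once per present element and that each result lands at its original index -- the invocation order is deliberately unspecified (reversed here, purely to illustrate that callers must not rely on it). This is an illustration of the contract, not a proposed implementation:

```javascript
// Toy sketch: mapN guarantees fn is called once per present element and
// results keep their original indices, but the *order* of calls is
// unspecified. Iterating right-to-left here stands in for any order an
// engine (e.g. a parallel one) might choose.
Object.defineProperty(Array.prototype, "mapN", {
    value: function (fn) {
        var result = new Array(this.length);
        for (var k = this.length - 1; k >= 0; k--) {
            if (k in this) { // skip holes, as map does
                result[k] = fn(this[k], k, this);
            }
        }
        return result;
    },
    configurable: true,
    writable: true
});

console.log([1, 2, 3, 4].mapN(function (e) { return e * e; })); // [1, 4, 9, 16]
```

A side-effect-free callback like the one above cannot observe the call order, which is exactly the class of callers this variant targets.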

David

_______________________________________________
es-discuss mailing list
es-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es-discuss