On Nov 11, 2007 6:01 PM, Garrett Smith <[EMAIL PROTECTED]> wrote:
> Function findDuplicate is more like "mark duplicates". The side effect
> is that it adds a __marker property to each object. As it stands, this
> function can not be called more than once. The second call might be
> passed a different array, but containing one of those objects that had
> the __marker left over. Calling delete in a second loop would make the
> algorithm 2n (best time). But this is not entirely safe; calling
> delete on a Host object or window property will throw errors in
> JScript env; and the property will still be there. So you can set
> __marker = false, but then you've got a 2N algorithm that leaks a
> "__marker" to objects in the program.
The standard trick to avoid having to reset the mark is to treat it not
as a bit, but to set it to a distinguished value every time (a serial
number). Just to complete the example for ES4:

    function findDuplicate(array) {
        let m = new Map.<*,null>
        for ( let i=0, limit=array.length ; i < limit ; i++ ) {
            let v = array[i]
            if (v is Object) {
                if (m.has(v))
                    alert("Found duplicate")
                else
                    m.put(v, null)
            }
        }
    }

Obviously that's reentrant and all that. By default, Map uses
intrinsic::hashcode and intrinsic::=== to handle objects, so the above
code handles object identity without any string conversions, and unless
intrinsic::hashcode is out to lunch, each lookup is roughly constant
time, giving an O(n) algorithm for duplicate finding. Rehashing behind
the scenes may in practice make it more expensive, of course. (We've
not seen fit to add a method for giving the Map a hint about how large
it may need to be.)

> Consider a case where we have an array-like collection of objects and
> we want to exclude duplicates. Not uncommon at all.

Map provides a collection of key-value mappings without duplicates; you
can always use null for the value, as I did above. It does not have the
Array methods, nor do the generic Array methods work on it, because you
can't talk about elements in terms of their indices.

On the other hand, "array-like" and "exclude duplicates" don't combine
all that well, maybe. I suspect what you mean is an object that behaves
like an Array so that the Array generic methods can be used on it, but
to which elements are added by special functionality, or excluded later
by calls to unique(). It's pretty simple in ES4 to create objects that
behave like Arrays in their Get/Put/HasProperty/Delete behavior.
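For readers following along today: a minimal sketch of the same idea in
modern JavaScript (ES2015+), which grew a built-in Set with exactly the
identity-based, no-string-conversion semantics described above. The
function name findDuplicates is my own; the `v is Object` guard from the
ES4 version is unnecessary here, since Set handles primitives by value
as well (SameValueZero equality).

```javascript
// Identity-based duplicate detection in O(n), analogous to the
// ES4 Map.<*,null> sketch above. Set.has/Set.add are roughly
// constant time per operation.
function findDuplicates(array) {
  const seen = new Set();
  const dups = [];
  for (const v of array) {
    if (seen.has(v)) {
      dups.push(v);   // second (or later) occurrence: record it
    } else {
      seen.add(v);    // first occurrence: mark it seen
    }
  }
  return dups;
}

const a = {}, b = {};
console.log(findDuplicates([a, b, a]).length); // 1
```

Note that the two distinct empty objects do not compare as duplicates:
Set compares objects by identity, not by structure or string form.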
For example, here's a sketch of an implementation of UniqueArray.<T>
that does not store duplicates:

    class UniqueArray.<T> {
        private var m = new Map.<uint,T>
        private var s = new Map.<T,null>
        private var _length = 0

        // meta methods intercept all requests for unknown properties
        meta function get(n) {
            if (m.has(n))
                return m.get(n)
            return undefined
        }
        meta function set(n,v) {
            // do not store duplicates
            if (!s.has(v)) {
                m.put(n,v)
                s.put(v,null)   // remember v so later writes of v are rejected
            }
            if (n >= _length)
                _length = n+1
        }
        meta function has(n) m.has(n)
        meta function delete(n) m.remove(n)

        function get length() _length
        function set length(n) {
            if (n < _length) {
                for ( let i=n ; i < _length ; i++ )
                    m.remove(i)
                _length = n
            }
        }
    }

> What's missing from ES4:
> Set, SortedSet

type Set.<T> = Map.<T,null>, though that obviously does not have
intersect or union -- clearly a weakness. Good enough in simple cases.

--lars

_______________________________________________
Es4-discuss mailing list
Es4-discuss@mozilla.org
https://mail.mozilla.org/listinfo/es4-discuss