OK, the complete silence leads me to believe that not many people reduce
with set/intersection.
However, I thought of a couple of other functions that don't work with the
new framework: max and min.
user=> (require '[clojure.core.reducers :as r])
user=> (reduce min [1 2])
1
user=> (r/reduce min [1 2])
ArityException Wrong number of args (0) passed to: core$min
clojure.lang.AFn.throwArity (AFn.java:437)
Min, like intersection, has a noncomputable "identity" value: positive
infinity (for max, it's negative infinity).
And max and min are used in many more algorithms than intersection, so
maybe this will start some discussion.
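
In the meantime, a workaround that fits the current framework is to wrap
min in a combining fn that does supply an identity when called with no
args. A minimal sketch (min* is just an illustrative name, and
Double/POSITIVE_INFINITY is only a stand-in identity for numeric min):

(require '[clojure.core.reducers :as r])

;; min*, unlike min, returns an identity value when called with no args,
;; which is what r/reduce expects of its reducing fn.
(defn min*
  ([] Double/POSITIVE_INFINITY)
  ([x y] (min x y)))

(r/reduce min* [1 2]) ;=> 1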
I really think that single elements are the place to bottom out the
recursion.
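
Concretely, I mean something like the following sketch (reduce-1 is a
hypothetical helper, not a proposed API): seed the reduction with the
first element, the way core/reduce already does when given no init, and
only fall back to (f) for empty collections:

(require 'clojure.set)
(require '[clojure.core.reducers :as r])

;; Bottom out at a single element instead of calling (f) for the seed.
(defn reduce-1
  [f coll]
  (if-let [s (seq coll)]
    (r/reduce f (first s) (rest s))
    (f))) ; an empty coll still needs (f), as with core/reduce

(reduce-1 min [1 2])                                ;=> 1
(reduce-1 clojure.set/intersection [#{1 2} #{2 3}]) ;=> #{2}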
On Sunday, May 20, 2012 3:24:57 PM UTC-4, Leif wrote:
>
> From the article: "The combining fn must supply an identity value when
> called with no arguments." This means that you can't use combining
> functions whose identity value can't be computed but still satisfies the
> proper rules. E.g. set intersection:
>
> (intersection) == e == the set of all possible elements of a set
>
> Of course this arity doesn't exist, but the arity-1 case could be viewed
> as shorthand for:
> (intersection s) == s
> ; == (intersection s e) == (intersection e s) ; if e could actually be
> computed
>
> So, the new lib behaves slightly differently from core/reduce here:
> (use 'clojure.set)
> (require '(clojure.core [reducers :as r]))
> (reduce intersection [#{1 2} #{2 3}]) ;==> #{2}
> (r/reduce intersection [#{1 2} #{2 3}]) ;==> throws ArityException
> ; for completeness
> (reduce intersection []) ;==> throws ArityException
> (r/reduce intersection []) ;==> throws ArityException
>
> It might fix things to special-case empty collections and make the
> "leaves" of the recursion single elements, but maybe functions with these
> weird non-computable identity elements, like set intersection, are too rare
> to bother. I can't think of another one off the top of my head.
>
> --Leif
>
> On Tuesday, May 8, 2012 11:20:37 AM UTC-4, Rich Hickey wrote:
>>
>> I'm happy to have pushed [1] today the beginnings of a new Clojure
>> library for higher-order manipulation of collections, based upon *reduce*
>> and *fold*. Of course, Clojure already has Lisp's *reduce*, which
>> corresponds to the traditional *foldl* of functional programming. *reduce*
>> is based upon sequences, as are many of the core functions of Clojure, like
>> *map*, *filter* etc. So, what could be better? It's a long story, so I'll
>> give you the ending first:
>>
>> * There is a new namespace: clojure.core.reducers
>> * It contains new versions of *map*, *filter* etc. based upon
>> transforming reducing functions - reducers
>> * It contains a new function, *fold*, which is a parallel
>> reduce+combine
>> * *fold* uses *fork/join* when working with (the existing!) Clojure
>> vectors and maps
>> * Your new parallel code has exactly the same shape as your existing
>> seq-based code
>> * The reducers are composable
>> * Reducer implementations are primarily functional - no iterators
>> * The model uses regular data structures, not 'parallel collections' or
>> other OO malarkey
>> * It's fast, and can become faster still
>> * This is work-in-progress
>>
>> I've described the library in more detail here:
>>
>>
>> http://clojure.com/blog/2012/05/08/reducers-a-library-and-model-for-collection-processing.html
>>
>>
>> Rich
>>
>> [1]
>> https://github.com/clojure/clojure/commit/89e5dce0fdfec4bc09fa956512af08d8b14004f6
>>
>>
>>