There is also a ready-to-use recommender server built on Spark MLlib:
http://docs.prediction.io/recommendation/quickstart/

The source code is here:
https://github.com/PredictionIO/PredictionIO/tree/develop/templates/scala-parallel-recommendation


Simon

On Sun, Nov 30, 2014 at 12:17 PM, Pat Ferrel <p...@occamsmachete.com> wrote:

> Actually the spark-itemsimilarity job and related code in the Spark module
> of Mahout create all-pairs similarity too. It’s designed to be used with a
> search engine, which provides the query part of the recommender. Integrate
> the two and you have a near-realtime, scalable item-based/cooccurrence
> collaborative filtering recommender.
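To make the cooccurrence idea above concrete, here is a toy, pure-Python sketch (not the Mahout code): count how often pairs of items appear together across users, then recommend items that co-occur with what a user already has. The data and the `recommend` helper are hypothetical; Mahout's spark-itemsimilarity additionally weights pairs with a log-likelihood ratio test rather than using raw counts.

```python
from collections import defaultdict
from itertools import combinations

# Toy interaction data (hypothetical): user -> set of items interacted with.
interactions = {
    "u1": {"A", "B", "C"},
    "u2": {"A", "B"},
    "u3": {"B", "C"},
    "u4": {"A", "C"},
}

# Count how often each unordered pair of items is seen together across users.
cooccur = defaultdict(int)
for items in interactions.values():
    for a, b in combinations(sorted(items), 2):
        cooccur[(a, b)] += 1

def recommend(user, k=2):
    """Score unseen items by total cooccurrence with the user's items."""
    owned = interactions[user]
    scores = defaultdict(int)
    for (a, b), n in cooccur.items():
        if a in owned and b not in owned:
            scores[b] += n
        elif b in owned and a not in owned:
            scores[a] += n
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

In the search-engine integration Pat describes, the cooccurrence rows become indexed documents and the user's recent history becomes the query, which is what makes the recommendation step near-realtime.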
>
>
> On Nov 30, 2014, at 12:09 PM, Sean Owen <so...@cloudera.com> wrote:
>
> There is an implementation of all-pairs similarity. Have a look at the
> DIMSUM implementation in RowMatrix. It is an element of what you would
> need for such a recommender, but not the whole thing.
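What `RowMatrix.columnSimilarities` computes is all-pairs cosine similarity between the columns of a rows-are-users, columns-are-items matrix; DIMSUM's contribution is a sampling scheme that makes this tractable at scale. A tiny exact (non-sampled) pure-Python version of the same computation, on hypothetical data:

```python
import math

# Toy matrix (hypothetical): rows = users, columns = items.
rows = [
    [1.0, 0.0, 2.0],
    [0.0, 1.0, 1.0],
    [3.0, 1.0, 0.0],
]

def column_cosine_similarities(rows):
    """Exact cosine similarity for every pair of columns (i < j)."""
    n_cols = len(rows[0])
    norms = [math.sqrt(sum(r[j] ** 2 for r in rows)) for j in range(n_cols)]
    sims = {}
    for i in range(n_cols):
        for j in range(i + 1, n_cols):
            dot = sum(r[i] * r[j] for r in rows)
            sims[(i, j)] = dot / (norms[i] * norms[j])
    return sims

sims = column_cosine_similarities(rows)
```

As Sean notes, this similarity matrix is only one ingredient; a full item-based recommender still needs a scoring/query step on top of it.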
>
> You can also do the model-building part of an ALS-based recommender
> with ALS in MLlib.
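For the ALS model-building step, here is a toy standard-library-only sketch of alternating least squares (MLlib's ALS is the distributed, production version of this idea). The ratings, rank, and regularization value below are made up for illustration; the rank is fixed at 2 so each normal-equations solve is a closed-form 2x2 system.

```python
import random

# Toy ratings (hypothetical): (user, item, rating) triples.
ratings = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0),
           (1, 2, 1.0), (2, 1, 4.0), (2, 2, 5.0)]
n_users, n_items, lam = 3, 3, 0.1  # lam = L2 regularization

random.seed(0)
U = [[random.random() for _ in range(2)] for _ in range(n_users)]  # user factors
V = [[random.random() for _ in range(2)] for _ in range(n_items)]  # item factors

def solve2(A, b):
    """Solve the 2x2 linear system A x = b by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

def update(fixed, ratings_by, n, lam):
    """Re-solve one side's factors with the other side held fixed."""
    out = []
    for idx in range(n):
        A = [[lam if i == j else 0.0 for j in range(2)] for i in range(2)]
        b = [0.0, 0.0]
        for other, r in ratings_by.get(idx, []):
            f = fixed[other]
            for i in range(2):
                b[i] += r * f[i]
                for j in range(2):
                    A[i][j] += f[i] * f[j]
        out.append(solve2(A, b))
    return out

by_user, by_item = {}, {}
for u, i, r in ratings:
    by_user.setdefault(u, []).append((i, r))
    by_item.setdefault(i, []).append((u, r))

# Alternate: fix items, solve users; fix users, solve items.
for _ in range(20):
    U = update(V, by_user, n_users, lam)
    V = update(U, by_item, n_items, lam)

def predict(u, i):
    return sum(a * b for a, b in zip(U[u], V[i]))
```

Each half-step is a regularized least-squares solve, so the overall objective decreases monotonically; that alternation is exactly what MLlib parallelizes across the cluster.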
>
> So, no, not directly, but there are related pieces.
>
> On Sun, Nov 30, 2014 at 5:36 PM, shahab <shahab.mok...@gmail.com> wrote:
> > Hi,
> >
> > I just wonder if there is any implementation for Item-based Collaborative
> > Filtering in Spark?
> >
> > best,
> > /Shahab
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
>
