There is also a ready-to-use server built on Spark MLlib:
http://docs.prediction.io/recommendation/quickstart/
The source code is here:
https://github.com/PredictionIO/PredictionIO/tree/develop/templates/scala-parallel-recommendation
Simon
On Sun, Nov 30, 2014 at 12:17 PM, Pat Ferrel wrote:
Actually the spark-itemsimilarity job and related code in the Spark module of
Mahout create all-pairs similarity too. It's designed to be used with a search
engine, which provides the query part of the recommender. Integrate the two and
you have a near-realtime, scalable item-based/cooccurrence collaborative
filtering recommender.
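The cooccurrence approach described here can be sketched in miniature. This toy Python version (function names are mine, not Mahout's) counts how often item pairs appear in the same users' histories, then scores unseen items by their total cooccurrence with what a user has already touched; in the full design, the search engine serves this same kind of query at scale:

```python
from collections import defaultdict

def cooccurrence(interactions):
    """Count, for every item pair, how many users interacted with both."""
    counts = defaultdict(lambda: defaultdict(int))
    for items in interactions.values():
        for a in items:
            for b in items:
                if a != b:
                    counts[a][b] += 1
    return counts

def recommend(user_items, counts, k=2):
    """Score unseen items by total cooccurrence with the user's history."""
    scores = defaultdict(int)
    for seen in user_items:
        for other, c in counts[seen].items():
            if other not in user_items:
                scores[other] += c
    return sorted(scores, key=scores.get, reverse=True)[:k]

interactions = {
    "u1": {"a", "b", "c"},
    "u2": {"a", "b"},
    "u3": {"b", "c", "d"},
}
cc = cooccurrence(interactions)
print(recommend({"a"}, cc))  # -> ['b', 'c']: "b" cooccurs with "a" twice, "c" once
```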
There is an implementation of all-pairs similarity. Have a look at the
DIMSUM implementation in RowMatrix. It is an element of what you would
need for such a recommender, but not the whole thing.
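For intuition, the quantity RowMatrix's DIMSUM-based columnSimilarities computes at scale is all-pairs cosine similarity between item columns of the user-by-item matrix. A plain-Python sketch of the exact (unsampled) version, where rows are users and columns are items:

```python
import math

def column_similarities(rows):
    """All-pairs cosine similarity between the columns of a dense matrix.
    This is what DIMSUM approximates via sampling on large sparse data."""
    ncols = len(rows[0])
    norms = [math.sqrt(sum(r[j] ** 2 for r in rows)) for j in range(ncols)]
    sims = {}
    for i in range(ncols):
        for j in range(i + 1, ncols):
            dot = sum(r[i] * r[j] for r in rows)
            sims[(i, j)] = dot / (norms[i] * norms[j]) if norms[i] and norms[j] else 0.0
    return sims

# Rows are users, columns are items; items 0 and 2 get similar ratings.
ratings = [
    [4.0, 0.0, 5.0],
    [5.0, 1.0, 4.0],
    [0.0, 5.0, 0.0],
]
sims = column_similarities(ratings)
```

Items 0 and 2 come out highly similar (rated alike by the same users), while item 1 is dissimilar to both, which is the item-item signal an item-based recommender would use.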
You can also do the model-building part of an ALS-based recommender
with ALS in MLlib.
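As a sketch of the idea behind ALS (not the MLlib API, which adds sparsity handling, regularization, and distributed execution), here is a toy rank-1 alternating least squares in plain Python: fix the item factors, solve the user factors in closed form, then swap.

```python
def als_rank1(R, iters=20):
    """Toy rank-1 ALS on a fully observed matrix: alternately solve the
    least-squares updates for user factors u and item factors v so that
    R[i][j] is approximated by u[i] * v[j]."""
    n, m = len(R), len(R[0])
    u = [1.0] * n
    v = [1.0] * m
    for _ in range(iters):
        # Fix v, solve each u[i] in closed form.
        denom = sum(x * x for x in v)
        u = [sum(R[i][j] * v[j] for j in range(m)) / denom for i in range(n)]
        # Fix u, solve each v[j] in closed form.
        denom = sum(x * x for x in u)
        v = [sum(R[i][j] * u[i] for i in range(n)) / denom for j in range(m)]
    return u, v

R = [[2.0, 4.0], [3.0, 6.0]]  # exactly rank 1, so ALS can recover it
u, v = als_rank1(R)
# The reconstruction u[i] * v[j] matches R closely after a few iterations.
```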
So, no, not directly.
The latest version of MLlib has it built in, no?
J
Sent from my iPhone
> On Nov 30, 2014, at 9:36 AM, shahab wrote:
>
> Hi,
>
> I just wonder if there is any implementation of item-based collaborative
> filtering in Spark?
>
> best,
> /Shahab
Hi,
I just wonder if there is any implementation of item-based collaborative
filtering in Spark?
best,
/Shahab