[ https://issues.apache.org/jira/browse/MAHOUT-1346?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13820487#comment-13820487 ]
Dmitriy Lyubimov commented on MAHOUT-1346:
------------------------------------------

PPS. The spark module has a specific CDH-2.0 profile to build against CDH 2.0 releases (it could be plain Hadoop, but that is what I happen to be using at the moment), which is also a common build target for Spark 0.8. You are welcome to add more 2.0 profiles.

> Spark Bindings (DRM)
> --------------------
>
>                 Key: MAHOUT-1346
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1346
>             Project: Mahout
>          Issue Type: Improvement
>    Affects Versions: 0.8
>            Reporter: Dmitriy Lyubimov
>            Assignee: Dmitriy Lyubimov
>             Fix For: Backlog
>
>
> Spark bindings for Mahout DRM.
> DRM DSL.
> Disclaimer: this will all be experimental at this point.
> The idea is to wrap a DRM in a Spark RDD with support for some basic functionality, and perhaps a humble beginning of a cost-based optimizer:
> (0) Spark serialization support for Vector, Matrix
> (1) Bagel transposition
> (2) slim X'X
> (2a) not-so-slim X'X
> (3) blockify() (compose an RDD containing vertical blocks of the original input)
> (4) read/write Mahout DRM off HDFS
> (5) A'B
> ...

--
This message was sent by Atlassian JIRA
(v6.1#6144)
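As an aside on tasks (2) and (3): the "slim X'X" case relies on the identity X'X = sum over row blocks B of B'B, so each vertical block produced by blockify() contributes only a small k-by-k partial product that can be summed cheaply. This is not the Mahout code itself, just a minimal NumPy sketch of the identity the Spark-side map-and-reduce would exploit (all names here are illustrative):

```python
import numpy as np

def slim_xtx(row_blocks):
    """Compute X'X as the sum of per-block B'B contributions.

    Each element of row_blocks is a horizontal slice of rows of X (a
    "vertical block" in blockify() terms). For an n x k matrix X with
    small k, every contribution is only k x k, so the partials can be
    summed on a single node -- the "slim X'X" case.
    """
    k = row_blocks[0].shape[1]
    acc = np.zeros((k, k))
    for b in row_blocks:  # in Spark this would be a map over the RDD followed by a reduce
        acc += b.T @ b
    return acc

# Sanity check: summing the block contributions equals the full product.
x = np.arange(12.0).reshape(6, 2)
blocks = [x[:3], x[3:]]
assert np.allclose(slim_xtx(blocks), x.T @ x)
```

A'B (task 5) admits the same treatment when A and B are co-partitioned by row: A'B = sum over blocks of A_k'B_k; the "not-so-slim" case is where k is too large for the accumulator to fit on one node.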