[ https://issues.apache.org/jira/browse/MAHOUT-1544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13990043#comment-13990043 ]

Dmitriy Lyubimov commented on MAHOUT-1544:
------------------------------------------

Did you try to run the simple.mscala examples in _standalone_ Spark mode? 
A "local" master will not do. 
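For reference, a sketch of what testing against a standalone master (rather than "local") might look like. The host, port, and the `MASTER` environment variable here are assumptions about the setup, not part of this patch:

```shell
# Launch a standalone Spark master (the sbin scripts ship with Spark),
# then point the Mahout shell at it instead of the default local master.
# "master-host:7077" is a placeholder for your own cluster address.
$SPARK_HOME/sbin/start-master.sh
export MASTER=spark://master-host:7077
bin/mahout spark-shell
```

With a real remote worker attached, task deserialization actually crosses JVM boundaries, which is what the "local" master never exercises.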

I think you will find that the workers have trouble deserializing the tasks. 
I've read the REPL code enough to be fairly confident it is not that simple: most 
of the complexity stems from the need to expose user-defined data types and 
closures to remote classloaders. I am not sure your code takes care of all that.
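A minimal sketch of the failure mode in question (the `Point` class is a hypothetical user-defined type; `sc` is the SparkContext the shell provides):

```scala
// A type defined interactively in the REPL: its bytecode lives only
// in the driver's REPL-generated classloader, not on any worker's
// classpath.
case class Point(x: Double, y: Double)

val rdd = sc.parallelize(Seq(Point(1, 2), Point(3, 4)))

// The closure below captures Point. Under a "local" master this works,
// because driver and "worker" share one JVM and one classloader.
// On a standalone cluster, the worker deserializing the task would
// typically fail with a ClassNotFoundException unless the REPL ships
// its generated classes to the executors, as Spark's own REPL does by
// serving them over HTTP (spark.repl.class.uri).
rdd.map(p => p.x + p.y).collect()
```

This is the machinery a dynamically-loaded Spark backend would still have to wire up for REPL-defined classes and closures.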

> make Mahout DSL shell depend dynamically on Spark
> -------------------------------------------------
>
>                 Key: MAHOUT-1544
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1544
>             Project: Mahout
>          Issue Type: Improvement
>            Reporter: Anand Avati
>             Fix For: 1.0
>
>         Attachments: 0001-spark-shell-rename-to-shell.patch, 
> 0002-shell-make-dependency-on-Spark-optional-and-dynamic.patch, 
> 0002-shell-make-dependency-on-Spark-optional-and-dynamic.patch, 
> 0002-shell-make-dependency-on-Spark-optional-and-dynamic.patch
>
>
> Today Mahout's Scala shell depends on Spark.
> Create a cleaner separation between the shell and Spark. For example, the in-core 
> scalabindings and operators do not need Spark. So make Spark a runtime 
> "addon" to the shell. Similarly, in the future, new distributed backend engines 
> can transparently (dynamically) be made available through the DSL shell.
> The new shell works, looks, and feels exactly like the shell before, but has a 
> cleaner modular architecture.



--
This message was sent by Atlassian JIRA
(v6.2#6252)
