What is the recommended Spark setup? 

I imagine most of us will have HDFS configured (either running locally or on
an actual cluster).

Since most of Mahout is recommended to run on Hadoop 1.x, should we use Mesos
(https://github.com/mesos/hadoop) to schedule both?

This would mean we’d need at least Hadoop 1.2.1 (which is what both
mesos/hadoop and the current Mahout pom expect). We’d use Mesos to manage the
Hadoop and Spark jobs, but HDFS would still be run separately by Hadoop
itself.
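
For concreteness, here’s roughly what I’d expect the Spark side to look like
under that setup. This is just a sketch of my understanding, not a tested
config: the Mesos master and namenode hosts below are made-up placeholders,
and I’m assuming a plain Mesos master URL passed to SparkContext.

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch of a Spark job under this setup: Mesos schedules the job,
    // while HDFS (run by Hadoop itself) just serves data by URI.
    // Host names below are placeholders, not real endpoints.
    object MesosHdfsSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("mahout-on-spark")
          .setMaster("mesos://mesos-master.example.com:5050") // Mesos, not YARN
        val sc = new SparkContext(conf)

        // HDFS stays under Hadoop's own control; Spark only needs the URI.
        val lines = sc.textFile("hdfs://namenode.example.com:8020/user/mahout/input")
        println(s"line count: ${lines.count()}")

        sc.stop()
      }
    }

If I’ve understood correctly, the only coupling between Spark and the Hadoop
side is the hdfs:// URI (plus compatible Hadoop client jars on the
classpath); everything else goes through Mesos.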

Is this about right? Is there a setup doc I missed?
