Currently I'm running HDFS from hadoop-0.20.2-cdh3u6 with Spark 0.9.1. I want
to upgrade to Spark 1.0.0 soon and would like to upgrade my HDFS version as
well.

What's the recommended HDFS version to use with Spark 1.0.0? I don't know
much about YARN, but I'd just like to use the Spark standalone cluster mode
(a rough sketch of my usage is below).
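
For context, my jobs are basically shaped like the following sketch: an
application submitted to the standalone master that reads its input straight
from HDFS. The master URL, namenode address, and path are placeholders for my
setup, not anything from a real deployment.

import org.apache.spark.{SparkConf, SparkContext}
// Needed on 0.9.x/1.0.x for the pair-RDD implicits (reduceByKey below).
import org.apache.spark.SparkContext._

// Sketch of a standalone-mode job reading directly from HDFS.
// "spark://master-host:7077" and the hdfs:// URL are placeholders.
object HdfsWordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("HdfsWordCount")
      .setMaster("spark://master-host:7077") // standalone cluster master

    val sc = new SparkContext(conf)

    // Read a text file from HDFS; namenode host/port are placeholders.
    val lines = sc.textFile("hdfs://namenode-host:8020/data/input.txt")
    val counts = lines.flatMap(_.split("\\s+"))
                      .map(word => (word, 1))
                      .reduceByKey(_ + _)

    counts.take(10).foreach(println)
    sc.stop()
  }
}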

Thanks
-Soumya
