Hi, I wanted to try the 1.6.0 version of Spark, but when I run it on my local machine, it throws this exception:
java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable.

The thing is, this same problem happened to me in the 1.5.1 version, and some people even hit it in 1.5.0: http://stackoverflow.com/questions/33234311/spark-1-5-1-spark-shell-throws-runtimeexception

Thanks for any help.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-shell-throws-java-lang-RuntimeException-tp25903.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
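For context, the exception message itself points at the permissions on /tmp/hive, so the workaround I've seen suggested (this is an assumption based on the error text, not an official fix) is to make that scratch directory writable before launching spark-shell:

```shell
# Commonly suggested workaround (assumption, not an official fix):
# ensure the Hive scratch directory exists and is world-writable
# before starting spark-shell on a local machine.
mkdir -p /tmp/hive
chmod 777 /tmp/hive

# On Windows, the equivalent reportedly uses winutils.exe:
#   winutils.exe chmod 777 \tmp\hive
```

After that, restarting spark-shell should let Hive initialize its scratch space without the RuntimeException.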