[ https://issues.apache.org/jira/browse/SPARK-10528?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14738118#comment-14738118 ]
Aliaksei Belablotski commented on SPARK-10528:
----------------------------------------------

Hi Marcelo, thanks for the quick response.

I'm running Spark in local mode, starting the Spark shell as:

    spark-shell --master local[2]

I have no such problem with 1.4.1: it works on my Windows 7 laptop after setting HADOOP_HOME to the folder containing bin/winutils.exe, per https://issues.apache.org/jira/browse/SPARK-2356

Initially 1.5.0 also has a problem locating winutils.exe, and setting HADOOP_HOME to the folder containing bin/winutils.exe helps there as well. But after that, 1.5.0 throws:

    java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-

Sorry, I don't have an hdfs-site.xml in my installation. Please find the execution log here: https://docs.google.com/document/d/1L4ieIY0CKbQhZAzsoq98WkmwsHZnXk4U75BIifzEW8Y/edit?usp=sharing

> spark-shell throws java.lang.RuntimeException: The root scratch dir:
> /tmp/hive on HDFS should be writable.
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-10528
>                 URL: https://issues.apache.org/jira/browse/SPARK-10528
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.5.0
>        Environment: Windows 7 x64
>           Reporter: Aliaksei Belablotski
>
> Starting spark-shell throws
> java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir:
> /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
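
[Editor's note] A commonly reported workaround for this error on Windows is to use winutils.exe itself to relax the POSIX-style permissions on the Hive scratch directory before starting spark-shell. This is a sketch, not part of the original comment: the HADOOP_HOME path C:\hadoop is an assumption, and it assumes Hive resolves /tmp/hive to \tmp\hive on the drive spark-shell runs from.

    REM Assumption: winutils.exe is installed under C:\hadoop\bin
    set HADOOP_HOME=C:\hadoop
    set PATH=%PATH%;%HADOOP_HOME%\bin

    REM Make the scratch dir that Hive checks at startup writable
    winutils.exe chmod -R 777 \tmp\hive

    REM Verify the listing now shows drwxrwxrwx
    winutils.exe ls \tmp\hive

The error arises because Hive checks the directory's POSIX permission bits at session startup, and on Windows the directory is created without the executable bits, so the check sees rw-rw-rw- and fails.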