[ https://issues.apache.org/jira/browse/SPARK-10528?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14739096#comment-14739096 ]
Marcelo Vanzin commented on SPARK-10528:
----------------------------------------

Ok, if you don't have hdfs-site.xml you're probably using the local filesystem.

- Do you have a local "/tmp/hive" file on the same drive where you're starting the shell?
- If you do, is it a file or a directory? If it's a directory, what are its permissions?
- If you don't, can you try to create a world-writable "/tmp/hive" directory on that drive?

I don't think there's currently a way to override that directory's location (it's the "hive.exec.scratchdir" Hive conf, but there's no way to set it for HiveContext currently).

> spark-shell throws java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable.
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-10528
>                 URL: https://issues.apache.org/jira/browse/SPARK-10528
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.5.0
>         Environment: Windows 7 x64
>            Reporter: Aliaksei Belablotski
>            Priority: Minor
>
> Starting spark-shell throws
> java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
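The checks and workaround suggested in the comment can be sketched as shell commands. This is a sketch assuming a POSIX-style shell (e.g. Git Bash or Cygwin on the reporter's Windows 7 box), run from the drive where spark-shell is started:

```shell
# Inspect /tmp/hive: if it exists as a plain file rather than a
# directory, it must be removed before Hive can use it.
ls -ld /tmp/hive 2>/dev/null

# Create the scratch directory (if missing) and make it world-writable,
# matching the rwxrwxrwx permissions the error message asks for.
mkdir -p /tmp/hive
chmod 777 /tmp/hive

# Verify: the listing should now start with "drwxrwxrwx".
ls -ld /tmp/hive
```

From a plain cmd.exe, the commonly reported equivalent is Hadoop's winutils.exe (e.g. `winutils.exe chmod -R 777 \tmp\hive`), though that tool is not part of Spark itself.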