[ https://issues.apache.org/jira/browse/SPARK-10528?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15121792#comment-15121792 ]
Michel Lemay commented on SPARK-10528:
--------------------------------------

Your issue is about deleting temp files at shutdown ("Exception while deleting Spark temp dir:"). I'm not aware of a JIRA for it, but there should be one somewhere.

Here, the problem is that spark-shell fails to initialize Hive on Windows, probably because the Windows OS deals with permission sets on files and folders differently. This failure makes major Spark features such as sqlContext and DataFrames unusable.

> spark-shell throws java.lang.RuntimeException: The root scratch dir:
> /tmp/hive on HDFS should be writable.
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-10528
>                 URL: https://issues.apache.org/jira/browse/SPARK-10528
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.5.0
>        Environment: Windows 7 x64
>            Reporter: Aliaksei Belablotski
>            Priority: Minor
>
> Starting spark-shell throws
> java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir:
> /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
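For readers hitting the same error, a commonly reported workaround (not stated in this thread; it assumes Hadoop's winutils.exe is installed and that HADOOP_HOME points at that installation) is to grant the scratch directory the POSIX-style permissions Hive checks for, using winutils from a Windows command prompt:

```
:: A sketch of the common workaround, run from cmd.exe on the Windows box.
:: Assumes %HADOOP_HOME%\bin\winutils.exe exists (an assumption, not from this thread).

:: Make the scratch dir that Hive checks writable for everyone:
%HADOOP_HOME%\bin\winutils.exe chmod -R 777 \tmp\hive

:: Verify the permissions Hive will see (should now report rwxrwxrwx):
%HADOOP_HOME%\bin\winutils.exe ls \tmp\hive
```

The `\tmp\hive` path is resolved on the drive spark-shell is started from; if the error persists, the scratch dir may live on a different drive or be overridden by `hive.exec.scratchdir`.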