[ https://issues.apache.org/jira/browse/SPARK-12435?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15066442#comment-15066442 ]

Maciej Bryński commented on SPARK-12435:
----------------------------------------

I have the same problem, and the workaround didn't work.
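For context, the workaround usually suggested for this error is to fix the permissions on the Hive scratch directory with the `winutils.exe` binary from a Windows build of Hadoop. A minimal sketch (not from this thread; the `C:\hadoop` path is an assumption, adjust `HADOOP_HOME` to wherever your winutils build lives):

```shell
:: Assumed location of a Hadoop 2.6 Windows build containing winutils.exe.
set HADOOP_HOME=C:\hadoop

:: Grant full permissions on the scratch dir that SessionState.start() checks.
%HADOOP_HOME%\bin\winutils.exe chmod -R 777 \tmp\hive

:: Inspect the result; the directory should now show drwxrwxrwx
:: instead of the rw-rw-rw- reported in the error.
%HADOOP_HOME%\bin\winutils.exe ls \tmp\hive
```

Note that `\tmp\hive` is resolved on the drive Spark is launched from, so the `chmod` must be run against that same drive.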


> Installing Spark
> ----------------
>
>                 Key: SPARK-12435
>                 URL: https://issues.apache.org/jira/browse/SPARK-12435
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 1.5.2
>         Environment: Windows 7 - 64 bit
>            Reporter: Amit
>            Priority: Blocker
>
> Hello There,
> I am attempting to install Spark on Windows 7. I am able to get Spark 
> 1.2.2 for Hadoop 2.3 running without a problem.
> However, installing Spark 1.5.2 for Hadoop 2.6, I get the following 
> error:
> java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
>         at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
>         at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
> Please suggest.
> Thanks
> Amit



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
