[ 
https://issues.apache.org/jira/browse/SPARK-10528?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15101742#comment-15101742
 ] 

Krzysztof Gawryś commented on SPARK-10528:
------------------------------------------

I have the same problem using Spark 1.5.2 on Windows 7 x64, and none of the 
above fixes helped. The permissions on the /tmp/hive folder are fine, but they 
are changed during code execution.

In my case the problem is in hadoop-common:2.6.0:jar, in the 
loadPermissionInfo() method of the org.apache.hadoop.fs.RawLocalFileSystem 
class. It tries to execute the command F:\spark\bin\winutils.exe ls -F 
D:\tmp\hive in a shell, and this command returns "Incorrect command line 
arguments." That results in an exception, which is caught inside 
loadPermissionInfo(), and because of the catch block the permissions are reset 
to defaults (line 609 of RawLocalFileSystem.java):
{code}
        if (ioe.getExitCode() != 1) {
          e = ioe;
        } else {
          // exit code 1: the failure is swallowed and the permission info
          // is cleared, so defaults get applied later
          setPermission(null);
          setOwner(null);
          setGroup(null);
        }
{code}
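The effect of that catch block can be sketched in isolation. The class and 
method names below are illustrative only (this is not Hadoop's actual code): 
an exit code of 1 from the winutils call silently wipes the permission info, 
while any other non-zero exit code surfaces the error.

{code}
// Standalone sketch of the fallback logic quoted above.
public class PermissionFallbackSketch {

    // Stands in for the file's cached permission string.
    public static String permission = "rwxrwxrwx";

    // Mirrors the catch block: exit code 1 means "ls failed",
    // so the permission info is cleared and defaults apply later;
    // any other code is treated as a real error.
    public static void handleShellFailure(int exitCode) throws Exception {
        if (exitCode != 1) {
            throw new Exception("winutils failed, exit=" + exitCode);
        } else {
            permission = null; // like setPermission(null)
        }
    }

    public static void main(String[] args) throws Exception {
        // The "Incorrect command line arguments." case exits with code 1,
        // so the bad permissions are masked instead of reported.
        handleShellFailure(1);
        System.out.println(permission); // prints "null"
    }
}
{code}

So on Windows the malformed winutils invocation never raises a visible error; 
it just quietly replaces the real permissions of /tmp/hive with defaults.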

> spark-shell throws java.lang.RuntimeException: The root scratch dir: 
> /tmp/hive on HDFS should be writable.
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-10528
>                 URL: https://issues.apache.org/jira/browse/SPARK-10528
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.5.0
>         Environment: Windows 7 x64
>            Reporter: Aliaksei Belablotski
>            Priority: Minor
>
> Starting spark-shell throws
> java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: 
> /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
