[ https://issues.apache.org/jira/browse/SPARK-10528?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14996667#comment-14996667 ]

Michel Lemay commented on SPARK-10528:
--------------------------------------

Looks like the problem is still there with the precompiled 1.5.1 binaries.

I created the directory and chmod'ed it using winutils.exe as explained earlier. From /tmp:
```
winutils.exe ls hive
drwxrwxrwx
```

Furthermore, Hadoop's LocalFileSystem does not seem to be able to change
permissions, as shown here:

```scala
import org.apache.hadoop.fs._

val path = new Path("file:/tmp/hive")
val lfs = FileSystem.get(path.toUri(), sc.hadoopConfiguration)
lfs.getFileStatus(path).getPermission()
```

Shows: res0: org.apache.hadoop.fs.permission.FsPermission = rw-rw-rw-

```scala
lfs.setPermission(path, new org.apache.hadoop.fs.permission.FsPermission(0777.toShort))
lfs.getFileStatus(new Path("file:/tmp/hive")).getPermission()
```

It still shows rw-rw-rw-.
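A side note on the call above (my own sketch, not part of the original report): `0777` is an octal literal, which Scala 2.10 deprecates and later Scala versions reject, so parsing the mode in base 8 avoids ambiguity. The behaviour reported is also consistent with the common explanation that, when Hadoop's native Windows bindings are not loaded, the local filesystem falls back to the plain `java.io.File` permission API, which cannot faithfully round-trip POSIX-style bits. A self-contained cross-check of what that API reports for a directory:

```scala
import java.io.File
import java.nio.file.Files

// Build the 0777 mode without an octal literal (deprecated in Scala 2.10,
// removed later): "777" parsed in base 8 is 511 decimal.
val mode: Short = Integer.parseInt("777", 8).toShort

// Render permissions the way java.io.File sees them. This is the level of
// detail the fallback path has to work with -- just three coarse bits,
// with no owner/group/other distinction.
def javaPerms(f: File): String = {
  def bit(b: Boolean, c: Char) = if (b) c else '-'
  "" + bit(f.canRead, 'r') + bit(f.canWrite, 'w') + bit(f.canExecute, 'x')
}

val dir = Files.createTempDirectory("hive-perm-check").toFile
println(s"mode=$mode perms=${javaPerms(dir)}")
dir.delete()
```

On a POSIX machine a fresh temp directory typically prints `rwx`; on Windows the execute bit is not meaningful to this API, which would leave a directory reading back without it regardless of what winutils.exe set.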




> spark-shell throws java.lang.RuntimeException: The root scratch dir: 
> /tmp/hive on HDFS should be writable.
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-10528
>                 URL: https://issues.apache.org/jira/browse/SPARK-10528
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.5.0
>         Environment: Windows 7 x64
>            Reporter: Aliaksei Belablotski
>            Priority: Minor
>
> Starting spark-shell throws
> java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: 
> /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
