[ https://issues.apache.org/jira/browse/SPARK-10528?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14739292#comment-14739292 ]

Aliaksei Belablotski commented on SPARK-10528:
----------------------------------------------

Yes, the Spark shell is working - I'm experimenting with RDD transformations.
Sure, the result is:

scala> import org.apache.hadoop.fs._
import org.apache.hadoop.fs._

scala> val path = new Path("file:/tmp/hive")
path: org.apache.hadoop.fs.Path = file:/tmp/hive

scala> val lfs = FileSystem.get(path.toUri(), sc.hadoopConfiguration)
lfs: org.apache.hadoop.fs.FileSystem = 
org.apache.hadoop.fs.LocalFileSystem@701489ca

scala> lfs.getFileStatus(path).getPermission()
res45: org.apache.hadoop.fs.permission.FsPermission = rw-rw-rw-
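
Since the error complains that /tmp/hive is not writable, the same FileSystem handle can be used to widen the permissions from inside the shell. A sketch only, continuing the session above (it reuses the lfs and path values already in scope; note that on Windows, Hadoop's LocalFileSystem delegates chmod to winutils.exe, so that binary must be available for this to take effect):

scala> import org.apache.hadoop.fs.permission.{FsAction, FsPermission}
import org.apache.hadoop.fs.permission.{FsAction, FsPermission}

scala> // rwxrwxrwx - the in-shell equivalent of "winutils.exe chmod 777 \tmp\hive"
scala> lfs.setPermission(path, new FsPermission(FsAction.ALL, FsAction.ALL, FsAction.ALL))

scala> lfs.getFileStatus(path).getPermission()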

> spark-shell throws java.lang.RuntimeException: The root scratch dir: 
> /tmp/hive on HDFS should be writable.
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-10528
>                 URL: https://issues.apache.org/jira/browse/SPARK-10528
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.5.0
>         Environment: Windows 7 x64
>            Reporter: Aliaksei Belablotski
>            Priority: Minor
>
> Starting spark-shell throws
> java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: 
> /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
