[ https://issues.apache.org/jira/browse/SPARK-10066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15137201#comment-15137201 ]

Sangeet Chourey commented on SPARK-10066:
-----------------------------------------

RESOLVED: Downloaded the correct winutils version and the issue was resolved.
Ideally winutils should be compiled locally, but if you download a pre-compiled
binary, make sure it matches your architecture (32-bit vs. 64-bit).
I tried this on Windows 7 64-bit with Spark 1.6, downloaded winutils.exe from
https://www.barik.net/archive/2015/01/19/172716/, and it worked.
Complete steps are at:
http://letstalkspark.blogspot.com/2016/02/getting-started-with-spark-on-window-64.html
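For reference, the usual winutils fix is to point HADOOP_HOME at the directory
containing winutils.exe and make the Hive scratch directory writable (the error
above showed it as rwx------). A sketch of the Windows commands, assuming
winutils.exe was placed under C:\hadoop\bin (the path is an example, adjust to
your setup):

```
:: Example path only -- wherever you put the matching 32/64-bit winutils.exe
set HADOOP_HOME=C:\hadoop
set PATH=%PATH%;%HADOOP_HOME%\bin

:: Make the Hive scratch dir writable; the error reported rwx------
winutils.exe chmod -R 777 \tmp\hive

:: Verify the new permissions before relaunching spark-shell
winutils.exe ls \tmp\hive
```

These commands only address the Windows case from this comment; on Linux the
same permission check applies to /tmp/hive on HDFS.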

> Can't create HiveContext with spark-shell or spark-sql on snapshot
> ------------------------------------------------------------------
>
>                 Key: SPARK-10066
>                 URL: https://issues.apache.org/jira/browse/SPARK-10066
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell, SQL
>    Affects Versions: 1.5.0
>         Environment: Centos 6.6
>            Reporter: Robert Beauchemin
>            Priority: Minor
>
> Built the 1.5.0-preview-20150812 with the following:
> ./make-distribution.sh -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive 
> -Phive-thriftserver -Psparkr -DskipTests
> Starting spark-shell or spark-sql returns the following error: 
> java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: 
> /tmp/hive on HDFS should be writable. Current permissions are: rwx------
>         at 
> org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
>        ....      [elided]
> at 
> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)   
>                     
> It's trying to create a new HiveContext. Running pySpark or sparkR works and 
> creates a HiveContext successfully. SqlContext can be created successfully 
> with any shell.
> I've tried changing permissions on that HDFS directory (even as far as making 
> it world-writable) without success. Tried changing SPARK_USER and also 
> running spark-shell as different users without success.
> This works successfully on the same machine on 1.4.1 and on earlier 
> pre-release versions of Spark 1.5.0 (same make-distribution parameters). 
> Just trying the snapshot... 



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
