[ 
https://issues.apache.org/jira/browse/SPARK-9701?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15030645#comment-15030645
 ] 

Xiu(Joe) Guo commented on SPARK-9701:
-------------------------------------

[~yhuai][~lian cheng] Would you mind reviewing my PR for SPARK-11562 and giving 
me some feedback?

Thanks!

> allow not automatically using HiveContext with spark-shell when hive support 
> built in
> -------------------------------------------------------------------------------------
>
>                 Key: SPARK-9701
>                 URL: https://issues.apache.org/jira/browse/SPARK-9701
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.1
>            Reporter: Thomas Graves
>
> I built the Spark jar with Hive support, since most of our grids have Hive.  We 
> were bringing up a new YARN cluster that didn't have Hive installed on it yet, 
> which caused spark-shell to fail to launch:
> java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate 
> org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
>         at 
> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:374)
>         at 
> org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:116)
>         at 
> org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:163)
>         at 
> org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
>         at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:168)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
> Method)
>         at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
>         at 
> org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
> It would be nice to have a config or something to tell it not to instantiate 
> a HiveContext.
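A hedged sketch of the kind of fallback the reporter is asking for (this is not the actual patch from SPARK-11562): probe for the HiveContext class via reflection and degrade to a plain SQLContext when Hive classes are missing from the classpath. The two class names are real Spark 1.x classes; `ContextChooser`, `chooseContext`, and the flag name mentioned in the comments are hypothetical illustrations.

```scala
// Sketch only: decide which SQL context class the shell should
// instantiate, falling back to the basic SQLContext when Hive support
// is unavailable (e.g. Hive classes not on the classpath).
object ContextChooser {
  // Hypothetical helper: returns the fully qualified name of the
  // context class the shell would create.
  def chooseContext(hiveEnabled: Boolean): String = {
    if (!hiveEnabled) {
      // A config flag (hypothetical name, e.g. spark.repl.useHiveContext=false)
      // could force the plain context even when Hive classes are present.
      "org.apache.spark.sql.SQLContext"
    } else {
      try {
        // Probe for Hive support via reflection, without hard-linking
        // against the Hive module at compile time.
        Class.forName("org.apache.spark.sql.hive.HiveContext")
        "org.apache.spark.sql.hive.HiveContext"
      } catch {
        case _: ClassNotFoundException | _: LinkageError =>
          // Hive classes missing or broken: degrade gracefully instead
          // of letting spark-shell fail to launch.
          "org.apache.spark.sql.SQLContext"
      }
    }
  }

  def main(args: Array[String]): Unit = {
    println(chooseContext(hiveEnabled = true))
    println(chooseContext(hiveEnabled = false))
  }
}
```

Note this only illustrates the fallback pattern; a real fix would also have to catch the metastore-initialization failure shown in the stack trace above, which happens after the class loads.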



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
