[ https://issues.apache.org/jira/browse/SPARK-11472?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14986985#comment-14986985 ]

Pierre Beauvois commented on SPARK-11472:
-----------------------------------------

Hi Sean, thanks for your quick reply.

Sorry to say it, but your feedback is losing me.

I thought there were several ways to initialize a SparkContext:

* at shell startup (example below)

{noformat}
spark-shell -v \
  --jars /opt/application/Spark/current/elastic/jar/elasticsearch-hadoop-2.1.1.jar \
  --name elasticsearch-hadoop --master yarn-client \
  --conf spark.es.net.ssl=true \
  --conf spark.es.net.http.auth.user=asterix \
  --conf spark.es.net.http.auth.pass=obelix \
  --conf spark.es.nodes=potion.magique \
  --conf spark.es.port=9200 \
  --conf spark.es.field.read.empty.as.null=true
{noformat}

You can do something similar with spark-submit (see the sketch after this list).

==> working with Spark 1.5.1 and with or without the hive-site.xml

* interactively from the shell's command line (example below)

{noformat}
scala> import org.apache.spark.SparkConf
scala> import org.apache.spark.SparkContext
scala> import org.apache.spark.SparkContext._
scala> sc.stop()
scala> val conf = new SparkConf().setAppName("elasticsearch-hadoop").setMaster("yarn-client")
scala> conf.set("es.net.ssl", "true")
scala> conf.set("es.net.http.auth.user", "asterix")
scala> conf.set("es.net.http.auth.pass", "obelix")
scala> conf.set("es.nodes", "potion.magique")
scala> conf.set("es.port", "9200")
scala> val sc = new SparkContext(conf)
{noformat}

This process is described in the Spark documentation: 
[https://spark.apache.org/docs/latest/programming-guide.html#initializing-spark]

This is also explained here: 
[https://www.elastic.co/guide/en/elasticsearch/hadoop/current/spark.html#spark-native-cfg]
{color:red}
==> working with Spark 1.5.1 only when the hive-site.xml is absent
{color}

* from an external file

{noformat}
spark-shell -v \
  --jars /opt/application/Spark/current/elastic/jar/elasticsearch-hadoop-2.1.1.jar \
  -i elastic-hadoop.scala
{noformat}

The .scala file simply contains the commands used in the second point (see the sketch after this list).

==> working with Spark 1.5.1 and with or without the hive-site.xml
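
For reference, here is roughly what the equivalent spark-submit call from the first point could look like. This is only a sketch: the application class and jar (com.example.ElasticJob, elastic-job.jar) are placeholders, and the flags simply mirror the spark-shell example above.

{noformat}
spark-submit -v \
  --jars /opt/application/Spark/current/elastic/jar/elasticsearch-hadoop-2.1.1.jar \
  --name elasticsearch-hadoop --master yarn-client \
  --conf spark.es.net.ssl=true \
  --conf spark.es.net.http.auth.user=asterix \
  --conf spark.es.net.http.auth.pass=obelix \
  --conf spark.es.nodes=potion.magique \
  --conf spark.es.port=9200 \
  --conf spark.es.field.read.empty.as.null=true \
  --class com.example.ElasticJob elastic-job.jar
{noformat}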
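
And for completeness, elastic-hadoop.scala from the third point would simply look like this (the same commands as in the second point, without the REPL prompts):

{noformat}
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

sc.stop()

val conf = new SparkConf().setAppName("elasticsearch-hadoop").setMaster("yarn-client")
conf.set("es.net.ssl", "true")
conf.set("es.net.http.auth.user", "asterix")
conf.set("es.net.http.auth.pass", "obelix")
conf.set("es.nodes", "potion.magique")
conf.set("es.port", "9200")

val sc = new SparkContext(conf)
{noformat}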

If I'm not supposed to stop the context or create a new one, why is the option 
still available? Moreover, why does the Spark documentation explain how to 
stop/create a SparkContext? I'm lost...


> SparkContext creation error after sc.stop() when Spark is compiled for Hive
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-11472
>                 URL: https://issues.apache.org/jira/browse/SPARK-11472
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.5.1
>         Environment: Red Hat ES 6.7 x86_64
> Spark 1.5.1, Scala 2.10.4, Java 1.7.0_85, Hive 1.2.1
> Authentication done through Kerberos
>            Reporter: Pierre Beauvois
>
> Spark 1.5.1 has been compiled with the following command:
> {noformat}
> mvn -Pyarn -Phive -Phive-thriftserver -PsparkR -DskipTests -X clean package
> {noformat}
> After its installation, the file "hive-site.xml" has been added to the conf 
> directory (not as a hard copy, but as a symbolic link). 
> When the spark-shell is started, the SparkContext and the sqlContext are 
> properly created. Nevertheless, when I stop the SparkContext and then try to 
> create a new one, an error appears. The error output is the following:
> {code:title=SparkContextCreationError.scala|borderStyle=solid}
> // imports
> scala> import org.apache.spark.SparkConf
> import org.apache.spark.SparkConf
> scala> import org.apache.spark.SparkContext
> import org.apache.spark.SparkContext
> scala> import org.apache.spark.SparkContext._
> import org.apache.spark.SparkContext._
> // simple SparkContext creation
> scala> val sc = new SparkContext(new SparkConf())
> // output error stack
> 15/11/03 09:10:05 ERROR Hive: MetaException(message:Delegation Token can be 
> issued only with kerberos authentication. Current AuthenticationMethod: TOKEN)
>         at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_delegation_token_result$get_delegation_token_resultStandardScheme.read(ThriftHiveMetastore.java)
>         at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_delegation_token_result$get_delegation_token_resultStandardScheme.read(ThriftHiveMetastore.java)
>         at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_delegation_token_result.read(ThriftHiveMetastore.java)
>         at 
> org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
>         at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_delegation_token(ThriftHiveMetastore.java:3715)
>         at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_delegation_token(ThriftHiveMetastore.java:3701)
>         at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDelegationToken(HiveMetaStoreClient.java:1796)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
>         at com.sun.proxy.$Proxy19.getDelegationToken(Unknown Source)
>         at 
> org.apache.hadoop.hive.ql.metadata.Hive.getDelegationToken(Hive.java:3150)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at 
> org.apache.spark.deploy.yarn.Client$.org$apache$spark$deploy$yarn$Client$$obtainTokenForHiveMetastore(Client.scala:1260)
>         at 
> org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:271)
>         at 
> org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:629)
>         at 
> org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:119)
>         at 
> org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
>         at 
> org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:523)
>         at 
> $line23.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
>         at 
> $line23.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
>         at 
> $line23.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
>         at 
> $line23.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
>         at $line23.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
>         at $line23.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:37)
>         at $line23.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:39)
>         at $line23.$read$$iwC$$iwC$$iwC.<init>(<console>:41)
>         at $line23.$read$$iwC$$iwC.<init>(<console>:43)
>         at $line23.$read$$iwC.<init>(<console>:45)
>         at $line23.$read.<init>(<console>:47)
>         at $line23.$read$.<init>(<console>:51)
>         at $line23.$read$.<clinit>(<console>)
>         at $line23.$eval$.<init>(<console>:7)
>         at $line23.$eval$.<clinit>(<console>)
>         at $line23.$eval.$print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at 
> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at 
> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
>         at 
> org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at 
> org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>         at 
> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>         at 
> org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
>         at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
>         at 
> org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
>         at 
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
>         at 
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at 
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at 
> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at 
> org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
>         at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 15/11/03 09:10:05 ERROR Client: Unexpected Exception 
> java.lang.reflect.InvocationTargetException
> 15/11/03 09:10:05 ERROR SparkContext: Error initializing SparkContext.
> java.lang.RuntimeException: Unexpected exception
>         at 
> org.apache.spark.deploy.yarn.Client$.org$apache$spark$deploy$yarn$Client$$obtainTokenForHiveMetastore(Client.scala:1280)
>         at 
> org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:271)
>         at 
> org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:629)
>         at 
> org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:119)
>         at 
> org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
>         at 
> org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:523)
>         at 
> $line23.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
>         at 
> $line23.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
>         at 
> $line23.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
>         at 
> $line23.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
>         at $line23.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
>         at $line23.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:37)
>         at $line23.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:39)
>         at $line23.$read$$iwC$$iwC$$iwC.<init>(<console>:41)
>         at $line23.$read$$iwC$$iwC.<init>(<console>:43)
>         at $line23.$read$$iwC.<init>(<console>:45)
>         at $line23.$read.<init>(<console>:47)
>         at $line23.$read$.<init>(<console>:51)
>         at $line23.$read$.<clinit>(<console>)
>         at $line23.$eval$.<init>(<console>:7)
>         at $line23.$eval$.<clinit>(<console>)
>         at $line23.$eval.$print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at 
> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at 
> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
>         at 
> org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at 
> org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>         at 
> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>         at 
> org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
>         at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
>         at 
> org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
>         at 
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
>         at 
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at 
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at 
> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at 
> org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
>         at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.reflect.InvocationTargetException
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at 
> org.apache.spark.deploy.yarn.Client$.org$apache$spark$deploy$yarn$Client$$obtainTokenForHiveMetastore(Client.scala:1260)
>         ... 54 more
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: 
> MetaException(message:Delegation Token can be issued only with kerberos 
> authentication. Current AuthenticationMethod: TOKEN)
>         at 
> org.apache.hadoop.hive.ql.metadata.Hive.getDelegationToken(Hive.java:3153)
>         ... 59 more
> Caused by: MetaException(message:Delegation Token can be issued only with 
> kerberos authentication. Current AuthenticationMethod: TOKEN)
>         at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_delegation_token_result$get_delegation_token_resultStandardScheme.read(ThriftHiveMetastore.java)
>         at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_delegation_token_result$get_delegation_token_resultStandardScheme.read(ThriftHiveMetastore.java)
>         at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_delegation_token_result.read(ThriftHiveMetastore.java)
>         at 
> org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
>         at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_delegation_token(ThriftHiveMetastore.java:3715)
>         at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_delegation_token(ThriftHiveMetastore.java:3701)
>         at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDelegationToken(HiveMetaStoreClient.java:1796)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
>         at com.sun.proxy.$Proxy19.getDelegationToken(Unknown Source)
>         at 
> org.apache.hadoop.hive.ql.metadata.Hive.getDelegationToken(Hive.java:3150)
>         ... 59 more
> 15/11/03 09:10:05 ERROR Utils: Uncaught exception in thread main
> java.lang.NullPointerException
>         at 
> org.apache.spark.network.netty.NettyBlockTransferService.close(NettyBlockTransferService.scala:152)
>         at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1228)
>         at org.apache.spark.SparkEnv.stop(SparkEnv.scala:100)
>         at 
> org.apache.spark.SparkContext$$anonfun$stop$12.apply$mcV$sp(SparkContext.scala:1749)
>         at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1185)
>         at org.apache.spark.SparkContext.stop(SparkContext.scala:1748)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:593)
>         at 
> $line23.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
>         at 
> $line23.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
>         at 
> $line23.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
>         at 
> $line23.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
>         at $line23.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
>         at $line23.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:37)
>         at $line23.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:39)
>         at $line23.$read$$iwC$$iwC$$iwC.<init>(<console>:41)
>         at $line23.$read$$iwC$$iwC.<init>(<console>:43)
>         at $line23.$read$$iwC.<init>(<console>:45)
>         at $line23.$read.<init>(<console>:47)
>         at $line23.$read$.<init>(<console>:51)
>         at $line23.$read$.<clinit>(<console>)
>         at $line23.$eval$.<init>(<console>:7)
>         at $line23.$eval$.<clinit>(<console>)
>         at $line23.$eval.$print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at 
> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at 
> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
>         at 
> org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at 
> org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>         at 
> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>         at 
> org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
>         at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
>         at 
> org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
>         at 
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
>         at 
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at 
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at 
> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at 
> org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
>         at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> java.lang.RuntimeException: Unexpected exception
>         at 
> org.apache.spark.deploy.yarn.Client$.org$apache$spark$deploy$yarn$Client$$obtainTokenForHiveMetastore(Client.scala:1280)
>         at 
> org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:271)
>         at 
> org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:629)
>         at 
> org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:119)
>         at 
> org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
>         at 
> org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:523)
>         at 
> $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
>         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
>         at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:37)
>         at $iwC$$iwC$$iwC$$iwC.<init>(<console>:39)
>         at $iwC$$iwC$$iwC.<init>(<console>:41)
>         at $iwC$$iwC.<init>(<console>:43)
>         at $iwC.<init>(<console>:45)
>         at <init>(<console>:47)
>         at .<init>(<console>:51)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at 
> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at 
> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
>         at 
> org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at 
> org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>         at 
> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>         at 
> org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
>         at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
>         at 
> org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
>         at 
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
>         at 
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at 
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at 
> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at 
> org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>         at org.apache.spark.repl.Main.main(Main.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at 
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
>         at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.reflect.InvocationTargetException
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at 
> org.apache.spark.deploy.yarn.Client$.org$apache$spark$deploy$yarn$Client$$obtainTokenForHiveMetastore(Client.scala:1260)
>         ... 54 more
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: 
> MetaException(message:Delegation Token can be issued only with kerberos 
> authentication. Current AuthenticationMethod: TOKEN)
>         at 
> org.apache.hadoop.hive.ql.metadata.Hive.getDelegationToken(Hive.java:3153)
>         ... 59 more
> Caused by: MetaException(message:Delegation Token can be issued only with 
> kerberos authentication. Current AuthenticationMethod: TOKEN)
>         at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_delegation_token_result$get_delegation_token_resultStandardScheme.read(ThriftHiveMetastore.java)
>         at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_delegation_token_result$get_delegation_token_resultStandardScheme.read(ThriftHiveMetastore.java)
>         at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_delegation_token_result.read(ThriftHiveMetastore.java)
>         at 
> org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
>         at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_delegation_token(ThriftHiveMetastore.java:3715)
>         at 
> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_delegation_token(ThriftHiveMetastore.java:3701)
>         at 
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDelegationToken(HiveMetaStoreClient.java:1796)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at 
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
>         at com.sun.proxy.$Proxy19.getDelegationToken(Unknown Source)
>         at 
> org.apache.hadoop.hive.ql.metadata.Hive.getDelegationToken(Hive.java:3150)
>         ... 59 more
> {code}
> If I set the master to "local", I don't get the error. The error appears 
> only with the yarn-client and yarn-cluster masters.
> In addition, I tried the same test after deleting the hive-site.xml 
> link in the conf directory. I can then successfully create a SparkContext 
> with the master set to local as well as to yarn-client/cluster:
> {noformat:title=hive-site.xml deletion}
> lrwxrwxrwx 1 root root   48 Nov  2 18:45 hive-site.xml -> 
> /opt/application/Hive/current/conf/hive-site.xml
> -rw-r--r-- 1 root root 1213 Nov  2 14:21 log4j.properties
> -rw-r--r-- 1 root root 6593 Nov  2 14:21 metrics.properties
> -rw-r--r-- 1 root root 1438 Nov  2 14:21 spark-defaults.conf
> -rw-r--r-- 1 root root 3747 Nov  2 14:21 spark-env.sh
> [root@uabigspark01 conf]# rm -f hive-site.xml
> {noformat}
> {code:title=SparkContextCreationSuccessful.scala|borderStyle=solid}
> scala> import org.apache.spark.SparkConf
> import org.apache.spark.SparkConf
> scala> import org.apache.spark.SparkContext
> import org.apache.spark.SparkContext
> scala> import org.apache.spark.SparkContext._
> import org.apache.spark.SparkContext._
> scala> val sc = new SparkContext(new SparkConf())
> sc: org.apache.spark.SparkContext = org.apache.spark.SparkContext@1c33d2bb
> {code}
> Finally, I can add that I tested Spark 1.5.1 + Hive 1.1.0 and Spark 
> 1.5.1 + Hive 1.2.0; the result is the same.
> Note: this was working well with Spark 1.4.1 and Hive 1.1.0.


