Hi James,

Thanks for verifying it.

James Srinivasan <james.sriniva...@gmail.com> wrote on Fri, Apr 5, 2019 at 4:52 AM:

> I seem to have fixed this by upgrading Zeppelin to use thrift 0.9.3. I
> took the v0.8.1 tag and then cherry picked this:
>
>
> https://github.com/apache/zeppelin/commit/4b436ca22091baab8170fa80713e86918db5b987
>
> I note that the HDP Zeppelin distro already has this commit included.
>
> James
>
> On Thu, 21 Mar 2019 at 20:17, James Srinivasan
> <james.sriniva...@gmail.com> wrote:
> >
> > I'm using Zeppelin 0.8.1 with a Kerberized HDP 3 cluster. Without user
> > impersonation, it seems fine. With user impersonation, I get the
> > following stack trace when trying to run println(sc.version). Google
> > suggests libthrift incompatibilities, but the one jar I could find is
> > identical in 0.8.0 and 0.8.1 (and I didn't get this error with user
> > impersonation on 0.8.0).
> >
> > Any help much appreciated,
> >
> > java.lang.NoSuchMethodError: com.facebook.fb303.FacebookService$Client.sendBaseOneway(Ljava/lang/String;Lorg/apache/thrift/TBase;)V
> >     at com.facebook.fb303.FacebookService$Client.send_shutdown(FacebookService.java:436)
> >     at com.facebook.fb303.FacebookService$Client.shutdown(FacebookService.java:430)
> >     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:498)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >     at java.lang.reflect.Method.invoke(Method.java:498)
> >     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
> >     at com.sun.proxy.$Proxy20.close(Unknown Source)
> >     at org.apache.hadoop.hive.ql.metadata.Hive.close(Hive.java:291)
> >     at org.apache.hadoop.hive.ql.metadata.Hive.access$000(Hive.java:137)
> >     at org.apache.hadoop.hive.ql.metadata.Hive$1.remove(Hive.java:157)
> >     at org.apache.hadoop.hive.ql.metadata.Hive.closeCurrent(Hive.java:261)
> >     at org.apache.spark.deploy.security.HiveDelegationTokenProvider$$anonfun$obtainDelegationTokens$2.apply$mcV$sp(HiveDelegationTokenProvider.scala:114)
> >     at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1361)
> >     at org.apache.spark.deploy.security.HiveDelegationTokenProvider.obtainDelegationTokens(HiveDelegationTokenProvider.scala:113)
> >     at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$obtainDelegationTokens$2.apply(HadoopDelegationTokenManager.scala:132)
> >     at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$obtainDelegationTokens$2.apply(HadoopDelegationTokenManager.scala:130)
> >     at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
> >     at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
> >     at scala.collection.Iterator$class.foreach(Iterator.scala:893)
> >     at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
> >     at scala.collection.MapLike$DefaultValuesIterable.foreach(MapLike.scala:206)
> >     at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
> >     at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
> >     at org.apache.spark.deploy.security.HadoopDelegationTokenManager.obtainDelegationTokens(HadoopDelegationTokenManager.scala:130)
> >     at org.apache.spark.deploy.yarn.security.YARNHadoopDelegationTokenManager.obtainDelegationTokens(YARNHadoopDelegationTokenManager.scala:56)
> >     at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:388)
> >     at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:921)
> >     at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:169)
> >     at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
> >     at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164)
> >     at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
> >     at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2493)
> >     at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:934)
> >     at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:925)
> >     at scala.Option.getOrElse(Option.scala:121)
> >     at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:925)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >     at java.lang.reflect.Method.invoke(Method.java:498)
> >     at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.spark2CreateContext(BaseSparkScalaInterpreter.scala:246)
> >     at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.createSparkContext(BaseSparkScalaInterpreter.scala:178)
> >     at org.apache.zeppelin.spark.SparkScala211Interpreter.open(SparkScala211Interpreter.scala:89)
> >     at org.apache.zeppelin.spark.NewSparkInterpreter.open(NewSparkInterpreter.java:102)
> >     at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:62)
> >     at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
> >     at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:616)
> >     at org.apache.zeppelin.scheduler.Job.run(Job.java:188)
> >     at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:140)
> >     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> >     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> >     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
> >     at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
> >     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> >     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> >     at java.lang.Thread.run(Thread.java:748)
>
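For anyone else hitting this, the fix James describes above (build 0.8.1 with the thrift-bump commit cherry-picked on top) can be sketched roughly as follows. The branch name and build flags are illustrative, not from the thread; adjust the Maven profiles for your Hadoop/Spark versions.

```shell
# Assumes a local clone of apache/zeppelin with the v0.8.1 tag fetched.
# Branch name and build flags are illustrative, not prescribed by the thread.
git checkout -b thrift-0.9.3-fix v0.8.1
git cherry-pick 4b436ca22091baab8170fa80713e86918db5b987
mvn clean package -DskipTests
```
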

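For others debugging the same NoSuchMethodError, it typically means the fb303 classes were compiled against a newer libthrift than the one actually loaded, so comparing the libthrift jars visible to Zeppelin and to the Spark client is a quick first check. The install paths below are assumptions (HDP-style defaults); adjust them for your cluster.

```shell
# List every libthrift jar visible to Zeppelin and to the Spark client.
# Two different versions across these lists is the classic cause of
# FacebookService$Client.sendBaseOneway not being found at runtime.
# Paths are hypothetical defaults; override via ZEPPELIN_HOME / SPARK_HOME.
for dir in "${ZEPPELIN_HOME:-/usr/hdp/current/zeppelin-server}" \
           "${SPARK_HOME:-/usr/hdp/current/spark2-client}"; do
  echo "libthrift jars under $dir:"
  find "$dir" -name 'libthrift-*.jar' 2>/dev/null || true
done
```
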

-- 
Best Regards

Jeff Zhang
