[ https://issues.apache.org/jira/browse/SPARK-31145?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17059884#comment-17059884 ]

L. C. Hsieh commented on SPARK-31145:
-------------------------------------

Please take a look at SPARK-26432. It is a known issue that Spark 2.4.5 cannot 
connect to HBase 2.1.

This is fixed in Spark 3.0.0.

Alternatively, HBase also has a fix (HBASE-23175) in 2.1.8+ that restores the 
support.
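For context: as I understand SPARK-26432, HBase 2.x dropped the `TokenUtil.obtainToken` overload that Spark 2.4's `HBaseDelegationTokenProvider` invokes reflectively, which is why the reflective call above dies with `InvocationTargetException`. The Spark 3.0 fix probes for the newer overload and falls back to the legacy one. The sketch below illustrates that probe-and-fallback reflection pattern only; the `NewStyleTokenUtil`/`OldStyleTokenUtil` classes and their string-returning methods are hypothetical stand-ins, not the real HBase API (which needs a live cluster):

```java
import java.lang.reflect.Method;

public class ObtainTokenCompatDemo {
    // Hypothetical stand-in for an HBase 2.x-style TokenUtil that only
    // exposes the newer, connection-based overload.
    static class NewStyleTokenUtil {
        public static String obtainToken(String connection) {
            return "token-via-connection";
        }
    }

    // Hypothetical stand-in for an HBase 1.x-style TokenUtil that only
    // exposes the legacy, configuration-based entry point.
    static class OldStyleTokenUtil {
        public static String obtainTokenLegacy(String conf) {
            return "token-via-configuration";
        }
    }

    // Probe for the newer method first and fall back to the legacy one.
    // This is the general shape of the Spark 3.0 approach: resolve the
    // token API via reflection so one build works against both HBase lines.
    static String obtainTokenCompat(Class<?> tokenUtil, String arg) throws Exception {
        Method m;
        try {
            m = tokenUtil.getMethod("obtainToken", String.class);
        } catch (NoSuchMethodException e) {
            m = tokenUtil.getMethod("obtainTokenLegacy", String.class);
        }
        // Both stand-in methods are static, so the receiver is null.
        return (String) m.invoke(null, arg);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(obtainTokenCompat(NewStyleTokenUtil.class, "conn"));
        System.out.println(obtainTokenCompat(OldStyleTokenUtil.class, "conf"));
    }
}
```

On Spark 2.4.x without the HBase-side fix, only upgrading (Spark 3.0.0, or HBase 2.1.8+) resolves this; the sketch just shows why the newer code tolerates both APIs.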

> spark2.4.0 read hbase2.1.0 Error
> --------------------------------
>
>                 Key: SPARK-31145
>                 URL: https://issues.apache.org/jira/browse/SPARK-31145
>             Project: Spark
>          Issue Type: Bug
>          Components: Security
>    Affects Versions: 2.4.0
>         Environment: kerberos 
>            Reporter: HarSenZhao
>            Priority: Blocker
>
> On CDH 6.3.0, Spark in local mode can read HBase, but YARN mode reports 
> errors.
>  The error information is as follows:
> {code:java}
> java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.spark.deploy.security.HBaseDelegationTokenProvider.obtainDelegationTokens(HBaseDelegationTokenProvider.scala:49)
>     at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$obtainDelegationTokens$2.apply(HadoopDelegationTokenManager.scala:144)
>     at org.apache.spark.deploy.security.HadoopDelegationTokenManager$$anonfun$obtainDelegationTokens$2.apply(HadoopDelegationTokenManager.scala:142)
>     at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
>     at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
>     at scala.collection.Iterator$class.foreach(Iterator.scala:891)
>     at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
>     at scala.collection.MapLike$DefaultValuesIterable.foreach(MapLike.scala:206)
>     at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
>     at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
>     at org.apache.spark.deploy.security.HadoopDelegationTokenManager.obtainDelegationTokens(HadoopDelegationTokenManager.scala:142)
>     at org.apache.spark.deploy.yarn.security.YARNHadoopDelegationTokenManager.obtainDelegationTokens(YARNHadoopDelegationTokenManager.scala:59)
>     at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend$$anonfun$start$1.apply(CoarseGrainedSchedulerBackend.scala:407)
>     at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend$$anonfun$start$1.apply(CoarseGrainedSchedulerBackend.scala:401)
>     at scala.Option.foreach(Option.scala:257)
>     at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.start(CoarseGrainedSchedulerBackend.scala:401)
>     at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:46)
>     at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:186)
>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:511)
>     at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2549)
>     at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:944)
>     at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
>     at scala.Option.getOrElse(Option.scala:121)
>     at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:935)
>     at com.gree.test.SparkHbase$.main(SparkHbase.scala:34)
>     at com.gree.test.SparkHbase.main(SparkHbase.scala)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
>     at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:851)
>     at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
>     at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
>     at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
>     at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:926)
>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:935)
>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: org.apache.hadoop.hbase.HBaseIOException: com.google.protobuf.ServiceException: Error calling method hbase.pb.AuthenticationService.GetAuthenticationToken
>     at org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil.makeIOExceptionOfException(ProtobufUtil.java:363)
>     at org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil.handleRemoteException(ProtobufUtil.java:349)
>     at org.apache.hadoop.hbase.security.token.TokenUtil.obtainToken(TokenUtil.java:86)
>     ... 42 more
> Caused by: com.google.protobuf.ServiceException: Error calling method hbase.pb.AuthenticationService.GetAuthenticationToken
>     at org.apache.hadoop.hbase.client.SyncCoprocessorRpcChannel.callBlockingMethod(SyncCoprocessorRpcChannel.java:71)
>     at org.apache.hadoop.hbase.protobuf.generated.AuthenticationProtos$AuthenticationService$BlockingStub.getAuthenticationToken(AuthenticationProtos.java:4512)
>     at org.apache.hadoop.hbase.security.token.TokenUtil.obtainToken(TokenUtil.java:81)
>     ... 42 more
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
