rubenssoto opened a new issue #2588:
URL: https://github.com/apache/hudi/issues/2588


   Hello People,
   
   Emr 6.1
   Spark 3.0.0
   Hudi 0.7.0
   
   My EMR cluster is configured to use the Glue catalog as its metastore. Most of the time my Hudi jobs work great, but sometimes the Hive connection fails:
   ```
   21/02/21 09:30:10 ERROR Client: Application diagnostics message: User class threw exception: java.lang.Exception: Error on Table: mobile_checkout, Error Message: org.apache.hudi.hive.HoodieHiveSyncException: Cannot create hive connection jdbc:hive2://ip-10-0-49-168.us-west-2.compute.internal:10000/
        at jobs.TableProcessor.start(TableProcessor.scala:95)
        at TableProcessorWrapper$.$anonfun$main$2(TableProcessorWrapper.scala:23)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
        at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
        at scala.util.Success.$anonfun$map$1(Try.scala:255)
        at scala.util.Success.map(Try.scala:213)
        at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
        at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
        at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
        at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
        at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1402)
        at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
        at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
        at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
        at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)

   Exception in thread "main" org.apache.spark.SparkException: Application application_1613496813774_2019 finished with failed status
        at org.apache.spark.deploy.yarn.Client.run(Client.scala:1191)
        at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1582)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:936)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1015)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1024)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   21/02/21 09:30:10 INFO ShutdownHookManager: Shutdown hook called
   ```
   
   I opened a ticket with AWS; it has been open for more than two weeks and has not been solved yet.
   I know this is probably not a Hudi problem, but do you have any idea how to solve it?
   Is there another way in Hudi to sync with Hive?
   I have been using Hive with Spark for some months and I have never seen an error like this.
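   On the "another way to sync" question: Hudi 0.7.0 can sync through the Hive metastore client instead of the HiveServer2 JDBC endpoint by setting `hoodie.datasource.hive_sync.use_jdbc` to `false`, which on an EMR cluster configured with the Glue catalog should go to Glue directly and avoid the `jdbc:hive2://` connection in the stack trace above. A minimal sketch of the write options, where the database and partition field names are placeholders, not anything from this issue:

   ```scala
   // Hive-sync options for a Hudi write that bypass the JDBC route.
   // With use_jdbc=false, Hudi 0.7.0 syncs via the Hive metastore client
   // rather than HiveServer2. Database and partition names are examples.
   val hiveSyncOptions = Map(
     "hoodie.datasource.hive_sync.enable"           -> "true",
     "hoodie.datasource.hive_sync.use_jdbc"         -> "false", // skip jdbc:hive2://...
     "hoodie.datasource.hive_sync.database"         -> "my_database",      // placeholder
     "hoodie.datasource.hive_sync.table"            -> "mobile_checkout",
     "hoodie.datasource.hive_sync.partition_fields" -> "dt"                // placeholder
   )

   // These would be passed alongside the usual Hudi write options, e.g.:
   //   df.write.format("hudi").options(hiveSyncOptions). ... .save(basePath)
   ```

   Whether Glue accepts every sync operation this way may depend on the EMR/Glue setup, so treat this as something to try rather than a confirmed fix.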
   
   
   Thank you so much!!!


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

