tjtoll commented on issue #5636:
URL: https://github.com/apache/hudi/issues/5636#issuecomment-1137491859

   I’m not sure what to try next; we do not convert to DynamicFrames before we write.
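   
   To illustrate the kind of write we mean, here is a minimal sketch of a direct DataFrame-to-Hudi write with no DynamicFrame conversion (illustrative only: the table name, key fields, and S3 path are placeholders, and `spark`/`df` are assumed to come from the Glue job):
   
       hudi_options = {
           "hoodie.table.name": "my_table",                           # placeholder
           "hoodie.datasource.write.recordkey.field": "id",           # placeholder
           "hoodie.datasource.write.precombine.field": "updated_at",  # placeholder
           "hoodie.datasource.write.operation": "upsert",
       }
       # Write the Spark DataFrame straight to the Hudi table path;
       # nothing in the job goes through a Glue DynamicFrame.
       (df.write.format("hudi")
          .options(**hudi_options)
          .mode("append")
          .save("s3://my-bucket/hudi/my_table/"))                     # placeholder path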
   
   > On May 25, 2022, at 8:20 AM, Kapil Kumar Joshi ***@***.***> wrote:
   > 
   > 
   > Tried with the below combination of jars; it's failing now with a new error. Please let me know if I'm still missing something here.
   > 
   > hudi-spark3.1-bundle_2.12-0.11.0.jar
   > spark-avro_2.12-3.1.2.jar
   > calcite-core-1.16.0.jar
   > Error while writing to dynamic frame:
   > 
   > 2022-05-25 12:13:39,886 WARN [Thread-12] metadata.Hive 
(Hive.java:registerAllFunctionsOnce(237)): Failed to register all functions.
   > java.lang.RuntimeException: Unable to instantiate 
org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
   >    at 
org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1709)
   >    at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:87)
   >    at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:137)
   >    at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:108)
   >    at 
org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClientFactory.createMetaStoreClient(SessionHiveMetaStoreClientFactory.java:50)
   >    at 
org.apache.hadoop.hive.ql.metadata.HiveUtils.createMetaStoreClient(HiveUtils.java:507)
   >    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3746)
   >    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3726)
   >    at 
org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3988)
   >    at 
org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:251)
   >    at 
org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:234)
   >    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:402)
   >    at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:335)
   >    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:315)
   >    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:291)
   >    at 
org.apache.hudi.hive.ddl.HMSDDLExecutor.<init>(HMSDDLExecutor.java:69)
   >    at 
org.apache.hudi.hive.HoodieHiveClient.<init>(HoodieHiveClient.java:73)
   >    at org.apache.hudi.hive.HiveSyncTool.initClient(HiveSyncTool.java:95)
   >    at org.apache.hudi.hive.HiveSyncTool.<init>(HiveSyncTool.java:89)
   >    at org.apache.hudi.hive.HiveSyncTool.<init>(HiveSyncTool.java:80)
   >    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
   >    at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
   >    at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
   >    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
   >    at 
org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:89)
   >    at 
org.apache.hudi.sync.common.util.SyncUtilHelpers.instantiateMetaSyncTool(SyncUtilHelpers.java:78)
   >    at 
org.apache.hudi.sync.common.util.SyncUtilHelpers.runHoodieMetaSync(SyncUtilHelpers.java:59)
   >    at 
org.apache.hudi.HoodieSparkSqlWriter$.$anonfun$metaSync$2(HoodieSparkSqlWriter.scala:622)
   >    at 
org.apache.hudi.HoodieSparkSqlWriter$.$anonfun$metaSync$2$adapted(HoodieSparkSqlWriter.scala:621)
   >    at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
   >    at 
org.apache.hudi.HoodieSparkSqlWriter$.metaSync(HoodieSparkSqlWriter.scala:621)
   >    at 
org.apache.hudi.HoodieSparkSqlWriter$.commitAndPerformPostOperations(HoodieSparkSqlWriter.scala:680)
   >    at 
org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:313)
   >    at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:163)
   >    at 
org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:46)
   >    at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
   >    at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
   >    at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:90)
   >    at 
org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:185)
   >    at 
org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:223)
   >    at 
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
   >    at 
org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:220)
   >    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:181)
   >    at 
org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:134)
   >    at 
org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:133)
   >    at 
org.apache.spark.sql.DataFrameWriter.$anonfun$runCommand$1(DataFrameWriter.scala:989)
   >    at 
org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:107)
   >    at 
org.apache.spark.sql.execution.SQLExecution$.withTracker(SQLExecution.scala:232)
   >    at 
org.apache.spark.sql.execution.SQLExecution$.executeQuery$1(SQLExecution.scala:110)
   >    at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:135)
   >    at 
org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:107)
   >    at 
org.apache.spark.sql.execution.SQLExecution$.withTracker(SQLExecution.scala:232)
   >    at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:135)
   >    at 
org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:253)
   >    at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:134)
   >    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
   >    at 
org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:68)
   >    at 
org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:989)
   >    at 
org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:438)
   >    at 
org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:415)
   >    at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:301)
   >    at 
com.amazonaws.services.glue.marketplace.connector.SparkCustomDataSink.writeDynamicFrame(CustomDataSink.scala:45)
   >    at 
com.amazonaws.services.glue.DataSink.pyWriteDynamicFrame(DataSink.scala:64)
   >    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   >    at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   >    at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   >    at java.lang.reflect.Method.invoke(Method.java:498)
   >    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
   >    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
   >    at py4j.Gateway.invoke(Gateway.java:282)
   >    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
   >    at py4j.commands.CallCommand.execute(CallCommand.java:79)
   >    at py4j.GatewayConnection.run(GatewayConnection.java:238)
   >    at java.lang.Thread.run(Thread.java:750)
   > Caused by: java.lang.reflect.InvocationTargetException
   >    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
   >    at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
   >    at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
   >    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
   >    at 
org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1707)
   >    ... 73 more
   > Caused by: MetaException(message:Could not connect to meta store using any 
of the URIs provided. Most recent failure: 
org.apache.thrift.transport.TTransportException: java.net.ConnectException: 
Connection refused (Connection refused)
   >    at org.apache.thrift.transport.TSocket.open(TSocket.java:226)
   >    at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:480)
   >    at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:247)
   >    at 
org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
   >    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
   >    at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
   >    at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
   >    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
   >    at 
org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1707)
   >    at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:87)
   >    at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:137)
   >    at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:108)
   >    at 
org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClientFactory.createMetaStoreClient(SessionHiveMetaStoreClientFactory.java:50)
   >    at 
org.apache.hadoop.hive.ql.metadata.HiveUtils.createMetaStoreClient(HiveUtils.java:507)
   >    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3746)
   >    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3726)
   >    at 
org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3988)
   >    at 
org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:251)
   >    at 
org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:234)
   >    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:402)
   >    at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:335)
   >    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:315)
   >    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:291)
   >    at 
org.apache.hudi.hive.ddl.HMSDDLExecutor.<init>(HMSDDLExecutor.java:69)
   >    at 
org.apache.hudi.hive.HoodieHiveClient.<init>(HoodieHiveClient.java:73)
   >    at org.apache.hudi.hive.HiveSyncTool.initClient(HiveSyncTool.java:95)
   >    at org.apache.hudi.hive.HiveSyncTool.<init>(HiveSyncTool.java:89)
   >    at org.apache.hudi.hive.HiveSyncTool.<init>(HiveSyncTool.java:80)
   >    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
   >    at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
   >    at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
   >    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
   >    at 
org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:89)
   >    at 
org.apache.hudi.sync.common.util.SyncUtilHelpers.instantiateMetaSyncTool(SyncUtilHelpers.java:78)
   >    at 
org.apache.hudi.sync.common.util.SyncUtilHelpers.runHoodieMetaSync(SyncUtilHelpers.java:59)
   >    at 
org.apache.hudi.HoodieSparkSqlWriter$.$anonfun$metaSync$2(HoodieSparkSqlWriter.scala:622)
   >    at 
org.apache.hudi.HoodieSparkSqlWriter$.$anonfun$metaSync$2$adapted(HoodieSparkSqlWriter.scala:621)
   >    at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
   >    at 
org.apache.hudi.HoodieSparkSqlWriter$.metaSync(HoodieSparkSqlWriter.scala:621)
   >    at 
org.apache.hudi.HoodieSparkSqlWriter$.commitAndPerformPostOperations(HoodieSparkSqlWriter.scala:680)
   >    at 
org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:313)
   >    at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:163)
   >    at 
org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:46)
   >    at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
   >    at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
   >    at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:90)
   >    at 
org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:185)
   >    at 
org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:223)
   >    at 
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
   >    at 
org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:220)
   >    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:181)
   >    at 
org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:134)
   >    at 
org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:133)
   >    at 
org.apache.spark.sql.DataFrameWriter.$anonfun$runCommand$1(DataFrameWriter.scala:989)
   >    at 
org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:107)
   >    at 
org.apache.spark.sql.execution.SQLExecution$.withTracker(SQLExecution.scala:232)
   >    at 
org.apache.spark.sql.execution.SQLExecution$.executeQuery$1(SQLExecution.scala:110)
   >    at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:135)
   >    at 
org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:107)
   >    at 
org.apache.spark.sql.execution.SQLExecution$.withTracker(SQLExecution.scala:232)
   >    at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:135)
   >    at 
org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:253)
   >    at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:134)
   >    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
   >    at 
org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:68)
   >    at 
org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:989)
   >    at 
org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:438)
   >    at 
org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:415)
   >    at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:301)
   >    at 
com.amazonaws.services.glue.marketplace.connector.SparkCustomDataSink.writeDynamicFrame(CustomDataSink.scala:45)
   >    at 
com.amazonaws.services.glue.DataSink.pyWriteDynamicFrame(DataSink.scala:64)
   >    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   >    at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   >    at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   >    at java.lang.reflect.Method.invoke(Method.java:498)
   >    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
   >    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
   >    at py4j.Gateway.invoke(Gateway.java:282)
   >    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
   >    at py4j.commands.CallCommand.execute(CallCommand.java:79)
   >    at py4j.GatewayConnection.run(GatewayConnection.java:238)
   >    at java.lang.Thread.run(Thread.java:750)
   > Caused by: java.net.ConnectException: Connection refused (Connection 
refused)
   >    at java.net.PlainSocketImpl.socketConnect(Native Method)
   >    at 
java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
   >    at 
java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
   >    at 
java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
   >    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
   >    at java.net.Socket.connect(Socket.java:607)
   >    at org.apache.thrift.transport.TSocket.open(TSocket.java:221)
   >    ... 81 more
   > )
   >    at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:529)
   >    at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:247)
   >    at 
org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70)
   >    ... 78 more
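   For what it's worth, the root cause at the bottom of that trace is the Hive sync step failing to reach a Thrift metastore ("Connection refused"), not the data write itself. A Glue job normally has no standalone Hive metastore to connect to, so one direction worth checking (a config sketch, not a confirmed fix) is whether the sync is pointed at the Glue Data Catalog instead of a thrift:// URI:
   
       # Hedged sketch: standard Hudi 0.11 hive_sync options; the database and
       # table values are placeholders. Assumes the Glue job has
       # --enable-glue-datacatalog set, so the Hive metastore client resolves
       # to the Glue Data Catalog rather than a standalone Thrift metastore.
       hive_sync_options = {
           "hoodie.datasource.hive_sync.enable": "true",
           "hoodie.datasource.hive_sync.mode": "hms",
           "hoodie.datasource.hive_sync.database": "my_db",    # placeholder
           "hoodie.datasource.hive_sync.table": "my_table",    # placeholder
       }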
   

