15663671003 opened a new issue, #6483:
URL: https://github.com/apache/hudi/issues/6483

   **Describe the problem you faced**
   
   Hive sync fails: the writer can't connect to Hive when syncing the table.
   
   **Expected behavior**
   
   I installed Spark 3.2.2 in a CDH 6.2.1 environment and ran a job that writes to Hudi 0.12.0; the Hive sync step fails. I suspect Hudi 0.12.0 may only support Hive 3.x. What should I do? Please help.
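   
   For context, the sync-related options in play are sketched below (the full script is in the Stacktrace section). The `hms` variant is only a guess: assuming `hoodie.datasource.hive_sync.mode` and `hoodie.datasource.hive_sync.metastore.uris` are honored in 0.12.0, syncing through the metastore instead of the HiveServer2 JDBC endpoint might avoid the failing handshake; `thrift://metastore-host:9083` below is a placeholder, not the real URI.
   
   ```
   # Sketch only, not verified on this cluster.
   # Options currently in effect (JDBC-based sync, which fails with TApplicationException):
   hive_sync_jdbc = {
       'hoodie.datasource.hive_sync.enable': 'true',
       'hoodie.datasource.hive_sync.jdbcurl': 'jdbc:hive2://hive.dwhtest.com:10000',
   }
   
   # Untried alternative (assumption): sync through the Hive metastore directly,
   # bypassing the HiveServer2 Thrift/JDBC handshake that raises the error.
   hive_sync_hms = {
       'hoodie.datasource.hive_sync.enable': 'true',
       'hoodie.datasource.hive_sync.mode': 'hms',
       'hoodie.datasource.hive_sync.metastore.uris': 'thrift://metastore-host:9083',  # placeholder
   }
   ```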
   
   **Environment Description**
   
   * Hudi version : 0.12.0
   
   * Spark version : 3.2.2
   
   * Hive version : 2.1.1
   
   * Hadoop version : 3.0.0-cdh6.2.1
   
   * Storage (HDFS/S3/GCS..) : HDFS
   
   * Running on Docker? (yes/no) : no
   
   
   **Stacktrace**
   
   ```
   [root@gt4 test]# sudo -u admin /opt/spark-3.2.2-bin-3.0.0-cdh6.2.1/bin/pyspark \
       --num-executors 200 --executor-cores 1 --executor-memory 8g --driver-memory 4g \
       --conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog' \
       --conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension' \
       --jars /home/test/hudi-spark3.2-bundle_2.12-0.12.0.jar
   Python 3.6.8 (default, Nov 16 2020, 16:55:22)
   [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] on linux
   Type "help", "copyright", "credits" or "license" for more information.
   Setting default log level to "WARN".
   To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use 
setLogLevel(newLevel).
   22/08/24 14:30:46 WARN conf.HiveConf: HiveConf of name 
hive.vectorized.use.checked.expressions does not exist
   22/08/24 14:30:46 WARN conf.HiveConf: HiveConf of name 
hive.strict.checks.no.partition.filter does not exist
   22/08/24 14:30:46 WARN conf.HiveConf: HiveConf of name 
hive.strict.checks.orderby.no.limit does not exist
   22/08/24 14:30:46 WARN conf.HiveConf: HiveConf of name 
hive.vectorized.input.format.excludes does not exist
   /opt/spark-3.2.2-bin-3.0.0-cdh6.2.1/python/pyspark/context.py:238: 
FutureWarning: Python 3.6 support is deprecated in Spark 3.2.
     FutureWarning
   Welcome to
         ____              __
        / __/__  ___ _____/ /__
       _\ \/ _ \/ _ `/ __/  '_/
      /__ / .__/\_,_/_/ /_/\_\   version 3.2.2
         /_/
   
   Using Python version 3.6.8 (default, Nov 16 2020 16:55:22)
   Spark context Web UI available at http://gt4.dwh.antiytip.com:4040
   Spark context available as 'sc' (master = yarn, app id = 
application_1660892079989_11440).
   SparkSession available as 'spark'.
   >>> def write():
   ...     import datetime
   ...     df = spark.createDataFrame(
   ...         [
   ...             [i, i, i, datetime.datetime.now().strftime("%Y%m%d%H%M%S"), 'a', 'b', ]
   ...             for i in range(1)
   ...         ],
   ...         ['k1', 'k2', 'v', 'cmp_key', 'pt1', 'pt2', ]
   ...     )
   ...     table_name = "hudi_0_12_0_test"
   ...     db_name = "user_test"
   ...     path = f"/user/hive/warehouse/{db_name}.db/{table_name}"
   ...     hudi_options = {
   ...         'hoodie.table.name': table_name,
   ...         'hoodie.datasource.write.recordkey.field': "k1,k2",
   ...         'hoodie.datasource.write.table.name': table_name,
   ...         'hoodie.datasource.write.operation': "upsert",
   ...         'hoodie.datasource.write.precombine.field': "cmp_key",
   ...         'hoodie.datasource.write.table.type': "COPY_ON_WRITE",
   ...         'hoodie.upsert.shuffle.parallelism': 2000,
   ...         'hoodie.bulkinsert.shuffle.parallelism': 2000,
   ...         'hoodie.insert.shuffle.parallelism': 2000,
   ...         'hoodie.cleaner.policy': 'KEEP_LATEST_COMMITS',
   ...         'hoodie.cleaner.fileversions.retained': 6,
   ...         'hoodie.parquet.max.file.size': 1024*1024*100,
   ...         'hoodie.parquet.small.file.limit': 1024*1024*60,
   ...         'hoodie.parquet.compression.codec': 'snappy',
   ...         'hoodie.bloom.index.parallelism': 4321,
   ...         'hoodie.datasource.write.payload.class': "org.apache.hudi.common.model.DefaultHoodieRecordPayload",
   ...         'hoodie.datasource.hive_sync.enable': 'true',
   ...         'hoodie.datasource.hive_sync.database': db_name,
   ...         'hoodie.datasource.hive_sync.table': table_name,
   ...         'hoodie.datasource.hive_sync.jdbcurl': "jdbc:hive2://hive.dwhtest.com:10000",
   ...         'hoodie.datasource.write.hive_style_partitioning': "true",
   ...         'hoodie.datasource.write.partitionpath.field': "pt1,pt2",
   ...         'hoodie.datasource.hive_sync.partition_extractor_class': 'org.apache.hudi.hive.MultiPartKeysValueExtractor',
   ...         'hoodie.datasource.write.keygenerator.class': 'org.apache.hudi.keygen.ComplexKeyGenerator'
   ...     }
   ...     df.write.format("hudi").options(
   ...         **hudi_options
   ...     ).save(path)
   ...
   >>> write()
   22/08/24 14:33:23 WARN metadata.HoodieBackedTableMetadata: Metadata table was not found at path /user/hive/warehouse/user_test.db/hudi_0_12_0_test/.hoodie/metadata
   22/08/24 14:33:41 ERROR scheduler.AsyncEventQueue: Dropping event from queue eventLog. This likely means one of the listeners is too slow and cannot keep up with the rate at which tasks are being started by the scheduler.
   22/08/24 14:33:41 WARN scheduler.AsyncEventQueue: Dropped 1 events from 
eventLog since the application started.
   00:23  WARN: Timeline-server-based markers are not supported for HDFS: base path /user/hive/warehouse/user_test.db/hudi_0_12_0_test.  Falling back to direct markers.
   00:28  WARN: Timeline-server-based markers are not supported for HDFS: base path /user/hive/warehouse/user_test.db/hudi_0_12_0_test.  Falling back to direct markers.
   22/08/24 14:33:53 WARN conf.HiveConf: HiveConf of name 
hive.vectorized.use.checked.expressions does not exist
   22/08/24 14:33:53 WARN conf.HiveConf: HiveConf of name 
hive.strict.checks.no.partition.filter does not exist
   22/08/24 14:33:53 WARN conf.HiveConf: HiveConf of name 
hive.strict.checks.orderby.no.limit does not exist
   22/08/24 14:33:53 WARN conf.HiveConf: HiveConf of name 
hive.vectorized.input.format.excludes does not exist
   22/08/24 14:33:53 ERROR jdbc.HiveConnection: Error opening session
   org.apache.thrift.TApplicationException: Required field 'client_protocol' is unset! Struct:TOpenSessionReq(client_protocol:null, configuration:{set:hiveconf:hive.server2.thrift.resultset.default.fetch.size=1000, use:database=default})
           at 
org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:79)
           at 
org.apache.hive.service.rpc.thrift.TCLIService$Client.recv_OpenSession(TCLIService.java:176)
           at 
org.apache.hive.service.rpc.thrift.TCLIService$Client.OpenSession(TCLIService.java:163)
           at 
org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:680)
           at 
org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:200)
           at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
           at java.sql.DriverManager.getConnection(DriverManager.java:664)
           at java.sql.DriverManager.getConnection(DriverManager.java:247)
           at 
org.apache.hudi.hive.ddl.JDBCExecutor.createHiveConnection(JDBCExecutor.java:104)
           at org.apache.hudi.hive.ddl.JDBCExecutor.<init>(JDBCExecutor.java:59)
           at 
org.apache.hudi.hive.HoodieHiveSyncClient.<init>(HoodieHiveSyncClient.java:91)
           at 
org.apache.hudi.hive.HiveSyncTool.initSyncClient(HiveSyncTool.java:101)
           at org.apache.hudi.hive.HiveSyncTool.<init>(HiveSyncTool.java:95)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
Method)
           at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at 
org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:89)
           at 
org.apache.hudi.sync.common.util.SyncUtilHelpers.instantiateMetaSyncTool(SyncUtilHelpers.java:75)
           at 
org.apache.hudi.sync.common.util.SyncUtilHelpers.runHoodieMetaSync(SyncUtilHelpers.java:56)
           at 
org.apache.hudi.HoodieSparkSqlWriter$.$anonfun$metaSync$2(HoodieSparkSqlWriter.scala:648)
           at 
org.apache.hudi.HoodieSparkSqlWriter$.$anonfun$metaSync$2$adapted(HoodieSparkSqlWriter.scala:647)
           at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
           at 
org.apache.hudi.HoodieSparkSqlWriter$.metaSync(HoodieSparkSqlWriter.scala:647)
           at 
org.apache.hudi.HoodieSparkSqlWriter$.commitAndPerformPostOperations(HoodieSparkSqlWriter.scala:734)
           at 
org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:338)
           at 
org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:183)
           at 
org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
           at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
           at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
           at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
           at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:97)
           at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
           at 
org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
           at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
           at 
org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
           at 
org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
           at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:97)
           at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:93)
           at 
org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:481)
           at 
org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:82)
           at 
org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:481)
           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
           at 
org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
           at 
org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
           at 
org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
           at 
org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
           at 
org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:457)
           at 
org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:93)
           at 
org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:80)
           at 
org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:78)
           at 
org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:115)
           at 
org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:848)
           at 
org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:382)
           at 
org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:355)
           at 
org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:239)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
           at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
           at py4j.Gateway.invoke(Gateway.java:282)
           at 
py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
           at py4j.commands.CallCommand.execute(CallCommand.java:79)
           at 
py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
           at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
           at java.lang.Thread.run(Thread.java:748)
   22/08/24 14:33:53 WARN jdbc.HiveConnection: Failed to connect to 
hive.dwhtest.com:10000
   Traceback (most recent call last):
     File "<stdin>", line 1, in <module>
     File "<stdin>", line 41, in write
     File 
"/opt/spark-3.2.2-bin-3.0.0-cdh6.2.1/python/pyspark/sql/readwriter.py", line 
740, in save
       self._jwrite.save(path)
     File 
"/opt/spark-3.2.2-bin-3.0.0-cdh6.2.1/python/lib/py4j-0.10.9.5-src.zip/py4j/java_gateway.py",
 line 1322, in __call__
     File "/opt/spark-3.2.2-bin-3.0.0-cdh6.2.1/python/pyspark/sql/utils.py", 
line 111, in deco
       return f(*a, **kw)
     File 
"/opt/spark-3.2.2-bin-3.0.0-cdh6.2.1/python/lib/py4j-0.10.9.5-src.zip/py4j/protocol.py",
 line 328, in get_return_value
   py4j.protocol.Py4JJavaError: An error occurred while calling o158.save.
   : org.apache.hudi.exception.HoodieException: Could not sync using the meta 
sync class org.apache.hudi.hive.HiveSyncTool
           at 
org.apache.hudi.sync.common.util.SyncUtilHelpers.runHoodieMetaSync(SyncUtilHelpers.java:58)
           at 
org.apache.hudi.HoodieSparkSqlWriter$.$anonfun$metaSync$2(HoodieSparkSqlWriter.scala:648)
           at 
org.apache.hudi.HoodieSparkSqlWriter$.$anonfun$metaSync$2$adapted(HoodieSparkSqlWriter.scala:647)
           at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
           at 
org.apache.hudi.HoodieSparkSqlWriter$.metaSync(HoodieSparkSqlWriter.scala:647)
           at 
org.apache.hudi.HoodieSparkSqlWriter$.commitAndPerformPostOperations(HoodieSparkSqlWriter.scala:734)
           at 
org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:338)
           at 
org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:183)
           at 
org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
           at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
           at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
           at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
           at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:97)
           at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
           at 
org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
           at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
           at 
org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
           at 
org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
           at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:97)
           at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:93)
           at 
org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:481)
           at 
org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:82)
           at 
org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:481)
           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
           at 
org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
           at 
org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
           at 
org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
           at 
org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
           at 
org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:457)
           at 
org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:93)
           at 
org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:80)
           at 
org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:78)
           at 
org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:115)
           at 
org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:848)
           at 
org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:382)
           at 
org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:355)
           at 
org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:239)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
           at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
           at py4j.Gateway.invoke(Gateway.java:282)
           at 
py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
           at py4j.commands.CallCommand.execute(CallCommand.java:79)
           at 
py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
           at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
           at java.lang.Thread.run(Thread.java:748)
   Caused by: org.apache.hudi.exception.HoodieException: Unable to instantiate 
class org.apache.hudi.hive.HiveSyncTool
           at 
org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:91)
           at 
org.apache.hudi.sync.common.util.SyncUtilHelpers.instantiateMetaSyncTool(SyncUtilHelpers.java:75)
           at 
org.apache.hudi.sync.common.util.SyncUtilHelpers.runHoodieMetaSync(SyncUtilHelpers.java:56)
           ... 48 more
   Caused by: java.lang.reflect.InvocationTargetException
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
Method)
           at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at 
org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:89)
           ... 50 more
   Caused by: org.apache.hudi.hive.HoodieHiveSyncException: Got runtime 
exception when hive syncing
           at 
org.apache.hudi.hive.HiveSyncTool.initSyncClient(HiveSyncTool.java:106)
           at org.apache.hudi.hive.HiveSyncTool.<init>(HiveSyncTool.java:95)
           ... 55 more
   Caused by: org.apache.hudi.hive.HoodieHiveSyncException: Failed to create 
HiveMetaStoreClient
           at 
org.apache.hudi.hive.HoodieHiveSyncClient.<init>(HoodieHiveSyncClient.java:95)
           at 
org.apache.hudi.hive.HiveSyncTool.initSyncClient(HiveSyncTool.java:101)
           ... 56 more
   Caused by: org.apache.hudi.hive.HoodieHiveSyncException: Cannot create hive 
connection jdbc:hive2://hive.dwhtest.com:10000/
           at 
org.apache.hudi.hive.ddl.JDBCExecutor.createHiveConnection(JDBCExecutor.java:107)
           at org.apache.hudi.hive.ddl.JDBCExecutor.<init>(JDBCExecutor.java:59)
           at 
org.apache.hudi.hive.HoodieHiveSyncClient.<init>(HoodieHiveSyncClient.java:91)
           ... 57 more
   Caused by: java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://hive.dwhtest.com:10000: Could not establish connection to jdbc:hive2://hive.dwhtest.com:10000: Required field 'client_protocol' is unset! Struct:TOpenSessionReq(client_protocol:null, configuration:{set:hiveconf:hive.server2.thrift.resultset.default.fetch.size=1000, use:database=default})
           at 
org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:224)
           at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
           at java.sql.DriverManager.getConnection(DriverManager.java:664)
           at java.sql.DriverManager.getConnection(DriverManager.java:247)
           at 
org.apache.hudi.hive.ddl.JDBCExecutor.createHiveConnection(JDBCExecutor.java:104)
           ... 59 more
   Caused by: java.sql.SQLException: Could not establish connection to jdbc:hive2://hive.dwhtest.com:10000: Required field 'client_protocol' is unset! Struct:TOpenSessionReq(client_protocol:null, configuration:{set:hiveconf:hive.server2.thrift.resultset.default.fetch.size=1000, use:database=default})
           at 
org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:699)
           at 
org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:200)
           ... 63 more
   Caused by: org.apache.thrift.TApplicationException: Required field 'client_protocol' is unset! Struct:TOpenSessionReq(client_protocol:null, configuration:{set:hiveconf:hive.server2.thrift.resultset.default.fetch.size=1000, use:database=default})
           at 
org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:79)
           at 
org.apache.hive.service.rpc.thrift.TCLIService$Client.recv_OpenSession(TCLIService.java:176)
           at 
org.apache.hive.service.rpc.thrift.TCLIService$Client.OpenSession(TCLIService.java:163)
           at 
org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:680)
           ... 64 more
   
   >>>
   ```
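   
   As an isolation step (not a fix), and assuming the built-in `spark` session above was created with Hive support wired to the cluster's hive-site.xml, the metastore itself can be queried directly from the same shell. If the statements below work while the Hudi sync still fails, the problem is limited to the HiveServer2 JDBC handshake used by the sync, not to Hive connectivity in general.
   
   ```
   # Sketch of a quick metastore reachability check from the same pyspark shell
   # (assumes the SparkSession has Hive support enabled).
   spark.sql("show databases").show(truncate=False)
   spark.sql("show tables in user_test").show(truncate=False)
   ```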

