xuFabius edited a comment on issue #757: spark-hoodie-bundle using hive-serde to sync hive table (Hive 2.3.5)
URL: https://github.com/apache/incubator-hudi/issues/757#issuecomment-511704470
 
 
   Hi @cdmikechen, I got the exception below with the XML you posted when running `HoodieJavaApp` with `enableHiveSync=true`:
   
   ```shell
   Exception in thread "main" java.lang.VerifyError: Bad type on operand stack
   Exception Details:
     Location:
       com/uber/hoodie/org/apache/hadoop_hive/metastore/HiveMetaStore.startMetaStore(ILorg/apache/hadoop/hive/thrift/HadoopThriftAuthBridge;Lcom/uber/hoodie/org/apache/hadoop_hive/conf/HiveConf;Ljava/util/concurrent/locks/Lock;Ljava/util/concurrent/locks/Condition;Ljava/util/concurrent/atomic/AtomicBoolean;)V @317: invokespecial
     Reason:
       Type 'org/apache/hadoop/hive/thrift/TUGIContainingTransport$Factory' (current frame, stack[3]) is not assignable to 'org/apache/thrift/transport/TTransportFactory'
     Current Frame:
       bci: @317
       flags: { }
       locals: { integer, 'org/apache/hadoop/hive/thrift/HadoopThriftAuthBridge', 'com/uber/hoodie/org/apache/hadoop_hive/conf/HiveConf', 'java/util/concurrent/locks/Lock', 'java/util/concurrent/locks/Condition', 'java/util/concurrent/atomic/AtomicBoolean', long, long_2nd, integer, integer, integer, integer, integer, integer, 'org/apache/thrift/protocol/TProtocolFactory', 'org/apache/thrift/protocol/TProtocolFactory', 'com/uber/hoodie/org/apache/hadoop_hive/metastore/HiveMetaStore$HMSHandler', 'com/uber/hoodie/org/apache/hadoop_hive/metastore/IHMSHandler', 'org/apache/thrift/transport/TServerSocket' }
       stack: { uninitialized 298, uninitialized 298, 'org/apache/thrift/transport/TFramedTransport$Factory', 'org/apache/hadoop/hive/thrift/TUGIContainingTransport$Factory', null }
     Bytecode:
       0x0000000: 04b3 00a9 2cb2 018e b601 9237 062c b201
       0x0000010: 95b6 0199 3608 2cb2 019c b601 9936 092c
       0x0000020: b201 9fb6 0137 360a 2cb2 01a2 b601 3736
       0x0000030: 0b2c b201 a5b6 0137 360c 2cb2 01a8 b601
       0x0000040: 3736 0d2c b201 abb6 0137 b301 ad15 0c99
       0x0000050: 001c bb00 3659 b701 ae3a 0ebb 0036 5916
       0x0000060: 0616 06b7 01b1 3a0f a700 1bbb 003b 59b7
       0x0000070: 01b4 3a0e bb00 3b59 0404 1606 1606 b701
       0x0000080: b73a 0fbb 000c 5913 01bb 2c03 b700 8f3a
       0x0000090: 1019 102c b801 bd3a 1101 3a12 b201 ad99
       0x00000a0: 007c 150b 9900 0ebb 01bf 5913 01c1 b701
       0x00000b0: c3bf 2b2c b201 cab6 01ce 2cb2 01d1 b601
       0x00000c0: ceb6 01d5 b301 d7bb 009a 59b7 01d8 b300
       0x00000d0: 98b2 0098 2c19 10b2 01dc b601 e0b2 01d7
       0x00000e0: b200 98b6 01e4 b601 e8b2 01d7 2cb8 01ee
       0x00000f0: b601 f23a 13b2 01d7 bb00 3f59 1911 b701
       0x0000100: f5b6 01f9 3a14 011a b801 ff3a 12b2 00e2
       0x0000110: 1302 01b9 00fe 0200 a701 1b2c b202 04b6
       0x0000120: 0137 9900 4315 0b99 001c bb00 1259 bb00
       0x0000130: 4259 b702 05bb 0046 59b7 0206 01b7 0209
       0x0000140: a700 0abb 0046 59b7 0206 3a13 bb02 0d59
       0x0000150: 1911 b702 0e3a 14b2 00e2 1302 10b9 00fe
       0x0000160: 0200 a700 3115 0b99 000d bb00 4259 b702
       0x0000170: 05a7 000a bb02 0b59 b702 113a 13bb 0213
       0x0000180: 5919 11b7 0214 3a14 b200 e213 0216 b900
       0x0000190: fe02 00bb 021a 59b7 021b 3a15 2cb2 021e
       0x00001a0: b601 ce13 0220 b602 243a 1619 16be 3617
       0x00001b0: 0336 1815 1815 17a2 001a 1916 1518 323a
       0x00001c0: 1919 1519 19b9 022a 0200 5784 1801 a7ff
       0x00001d0: e515 0d9a 000d 011a b801 ff3a 12a7 0056
       0x00001e0: 2cb2 022d b601 ceb6 0230 3a16 1916 b602
       0x00001f0: 3399 0024 bb02 3559 bb00 ea59 b700 ebb2
       0x0000200: 0238 b402 3bb6 00f1 1302 3db6 00f1 b600
       0x0000210: fbb7 023e bfb8 0242 2cb2 0238 b402 3bb9
       0x0000220: 0248 0300 3a17 011a 1916 1917 1915 b802
       0x0000230: 4c3a 1215 0a99 000e bb02 4e59 1912 b702
       0x0000240: 513a 12bb 0031 5919 12b7 0254 1914 b602
       0x0000250: 58c0 0031 1913 b602 5cc0 0031 190e b602
       0x0000260: 60c0 0031 190f b602 63c0 0031 1508 b602
       0x0000270: 6715 09b6 026a 3a15 bb00 3359 1915 b702
       0x0000280: 6d3a 16bb 0019 59b7 026e 3a17 1916 1917
       0x0000290: b602 72b2 00d9 bb00 ea59 b700 eb13 0274
       0x00002a0: b600 f11a b600 f813 0276 b600 f1b6 00fb
       0x00002b0: b900 fe02 00b2 00d9 bb00 ea59 b700 eb13
       0x00002c0: 0278 b600 f115 08b6 00f8 b600 fbb9 00fe
       0x00002d0: 0200 b200 d9bb 00ea 59b7 00eb 1302 7ab6
       0x00002e0: 00f1 1509 b600 f8b6 00fb b900 fe02 00b2
       0x00002f0: 00d9 bb00 ea59 b700 eb13 027c b600 f115
       0x0000300: 0ab6 027f b600 fbb9 00fe 0200 2dc6 000d
       0x0000310: 1916 2d19 0419 05b8 0283 1916 b602 88a7
       0x0000320: 001a 3a1a 191a b602 8bb2 00d9 191a b802
       0x0000330: 91b9 0293 0200 191a bfb1               
     Exception Handler Table:
       bci [0, 799] => handler: 802
     Stackmap Table:
       full_frame(@107,{Integer,Object[#33],Object[#76],Object[#342],Object[#435],Object[#348],Long,Integer,Integer,Integer,Integer,Integer,Integer},{})
       append_frame(@131,Object[#441],Object[#441])
       append_frame(@178,Object[#12],Object[#453],Object[#455])
       same_frame_extended(@283)
       same_frame(@323)
       same_locals_1_stack_item_frame(@330,Object[#523])
       same_frame(@357)
       same_frame(@372)
       same_locals_1_stack_item_frame(@379,Object[#523])
       append_frame(@403,Object[#523],Object[#536])
       full_frame(@435,{Integer,Object[#33],Object[#76],Object[#342],Object[#435],Object[#348],Long,Integer,Integer,Integer,Integer,Integer,Integer,Object[#441],Object[#441],Object[#12],Object[#453],Object[#455],Object[#523],Object[#536],Object[#550],Object[#213],Integer,Integer},{})
       chop_frame(@465,3)
       same_frame(@480)
       append_frame(@533,Object[#265])
       chop_frame(@563,2)
       same_frame(@579)
       append_frame(@794,Object[#49],Object[#84],Object[#645])
       full_frame(@802,{Integer,Object[#33],Object[#76],Object[#342],Object[#435],Object[#348]},{Object[#177]})
       same_frame(@825)
   
        at com.uber.hoodie.org.apache.hadoop_hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:164)
        at com.uber.hoodie.org.apache.hadoop_hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:129)
        at com.uber.hoodie.hive.HoodieHiveClient.<init>(HoodieHiveClient.java:102)
        at com.uber.hoodie.hive.HiveSyncTool.<init>(HiveSyncTool.java:61)
        at com.uber.hoodie.HoodieSparkSqlWriter$.syncHive(HoodieSparkSqlWriter.scala:239)
        at com.uber.hoodie.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:173)
        at com.uber.hoodie.DefaultSource.createRelation(DefaultSource.scala:90)
        at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
        at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
        at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
        at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
        at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:656)
        at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:656)
        at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
        at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:656)
        at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:273)
        at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:267)
        at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:225)
        at HoodieJavaApp.run(HoodieJavaApp.java:147)
        at HoodieJavaApp.main(HoodieJavaApp.java:93)
   ```
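   
   For what it's worth, the `Reason` line suggests a shading mismatch rather than a plain classpath problem: the relocated metastore classes (`com.uber.hoodie.org.apache.hadoop_hive.*`) are being verified against the unshaded `org.apache.hadoop.hive.thrift.TUGIContainingTransport$Factory`, so the `TTransportFactory` hierarchy the verifier sees no longer lines up. If the bundle's shade config relocates the Hive packages but leaves the thrift auth-bridge classes out, relocating them together might keep the hierarchy consistent. A rough maven-shade sketch of that idea; the patterns below are my assumptions, not the actual spark-hoodie-bundle pom:
   
   ```xml
   <!-- Hypothetical maven-shade-plugin fragment, NOT the real bundle config.
        Idea: relocate the Hive thrift bridge alongside the metastore classes
        so TUGIContainingTransport$Factory and TTransportFactory resolve within
        one consistent (shaded) hierarchy. -->
   <plugin>
     <groupId>org.apache.maven.plugins</groupId>
     <artifactId>maven-shade-plugin</artifactId>
     <configuration>
       <relocations>
         <relocation>
           <!-- Matches the relocation visible in the trace above -->
           <pattern>org.apache.hadoop.hive.</pattern>
           <shadedPattern>com.uber.hoodie.org.apache.hadoop_hive.</shadedPattern>
         </relocation>
         <relocation>
           <!-- Assumed addition: shade the thrift transport classes too -->
           <pattern>org.apache.thrift.</pattern>
           <shadedPattern>com.uber.hoodie.org.apache.thrift.</shadedPattern>
         </relocation>
       </relocations>
     </configuration>
   </plugin>
   ```
   
   I haven't verified this against the Hive 2.3.5 jars, so treat it as a direction to check rather than a fix.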
   
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
