vinothchandar commented on issue #894: Getting java.lang.NoSuchMethodError while doing Hive sync
URL: https://github.com/apache/incubator-hudi/issues/894#issuecomment-531988780

Hmmm. I tested with:

- Hive version: 2.3.3
- Spark version: 2.4.4

by simply using Spark 2.4.4 to do this step on the docker demo (I had a Spark installation unzipped into `docker` and did something like):

```
root@adhoc-2:/opt# /var/hoodie/ws/docker/spark-2.4.4-bin-hadoop2.7/bin/spark-submit \
  --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer \
  /var/hoodie/ws/docker/hoodie/hadoop/hive_base/target/hoodie-utilities.jar \
  --storage-type COPY_ON_WRITE \
  --source-class org.apache.hudi.utilities.sources.JsonDFSSource \
  --source-ordering-field ts \
  --target-base-path /user/hive/warehouse/stock_ticks_cow \
  --target-table stock_ticks_cow \
  --props /var/demo/config/dfs-source.properties \
  --schemaprovider-class org.apache.hudi.utilities.schema.FilebasedSchemaProvider \
  --enable-hive-sync \
  --hoodie-conf hoodie.datasource.hive_sync.jdbcurl=jdbc:hive2://hiveserver:10000 \
  --hoodie-conf hoodie.datasource.hive_sync.username=hive \
  --hoodie-conf hoodie.datasource.hive_sync.password=hive \
  --hoodie-conf hoodie.datasource.hive_sync.partition_fields=dt \
  --hoodie-conf hoodie.datasource.hive_sync.database=default \
  --hoodie-conf hoodie.datasource.hive_sync.table=stock_ticks_cow
```

and it seems to work. So I suspect this has something to do with the Amazon-specific Hive version? Can you try building a local docker image with that Hive version and see if you can reproduce this? https://hudi.apache.org/docker_demo.html#building-local-docker-containers
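For reference, the reproduction step I'm suggesting would look roughly like the sketch below. The script names (`build_local_docker_images.sh`, `setup_demo.sh`) and the `-Dhive.version` property are assumptions based on the linked docker demo docs, so check them against your checkout; the sketch only *prints* the commands (dry-run) rather than executing a long docker build:

```shell
#!/usr/bin/env bash
# Dry-run sketch: print each command instead of running it, so the steps
# are visible without kicking off a multi-minute build.
run() { echo "+ $*"; }

# 1. Rebuild Hudi against the Hive version in question (property name assumed).
run mvn clean package -DskipTests -Dhive.version=2.3.3

# 2. Rebuild the local demo docker images (script name assumed; see the
#    "building local docker containers" section of the docs linked above).
run ./docker/build_local_docker_images.sh

# 3. Bring the demo cluster back up, then re-run the spark-submit above.
run ./docker/setup_demo.sh
```

If the NoSuchMethodError reproduces with that Hive version in the local images, that points at a client/server Hive version mismatch rather than anything EMR-specific.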