Hello,

I'm a newbie and just trying to run my very first example with Hudi -- to
ingest a CSV DFS source into a Hudi table. I hit the following problems
along the way.

1. ClassNotFoundException for HiveConf. I didn't enable Hive sync but
still got this error; I guess it's because the class is imported in
DeltaSync. I solved this by adding hive-common to the classpath. (I tried
hive-exec at first, but that caused conflicts with Parquet.)

2. NoSuchMethodError for Jetty's SessionHandler::setHttpOnly. It's
caused by a jetty-server version conflict between Hudi and my Hadoop. I
worked around it by setting spark.driver.userClassPathFirst=true.
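For reference, here's a sketch of the spark-submit invocation combining both workarounds. The jar version, paths, and table/source settings are placeholders from my setup, not something prescribed by Hudi:

```shell
# Sketch only: jar versions and paths below are examples from my
# environment, not recommended values.
spark-submit \
  --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer \
  --jars /path/to/hive-common-2.3.4.jar \          # workaround for problem 1
  --conf spark.driver.userClassPathFirst=true \    # workaround for problem 2
  /path/to/hudi-utilities-bundle.jar \
  --table-type COPY_ON_WRITE \
  --source-class org.apache.hudi.utilities.sources.CsvDFSSource \
  --source-ordering-field ts \
  --target-base-path /tmp/hudi/my_table \
  --target-table my_table
```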

Although I've managed to make the program run successfully, I wonder
whether I'm doing it right and what the recommended way to add
dependencies is.

The components I'm using:
Spark 2.4.6 w/o Hadoop
Hadoop 3.0.3
Hive 2.3.4
Hudi latest master code

Thanks in advance!

-- 
Cheers,
Rui Li
