sassai commented on issue #1646:
URL: https://github.com/apache/incubator-hudi/issues/1646#issuecomment-632666310


   Hi @bvaradar,
   
   thanks for the reply. After some digging I found a solution.
   
   The problem was that Spark did not load the jars specified in `HIVE_AUX_JARS_PATH`, because on the Cloudera Data Platform `spark-defaults.conf` sets the following property:
   
   ```console
   spark.sql.hive.metastore.jars=${env:HADOOP_COMMON_HOME}/../hive/lib/*:${env:HADOOP_COMMON_HOME}/client/*
   ```
   
   To fix this, I edited `spark-defaults.conf` via Cloudera Manager and appended the path to the hudi-mr-bundle jars.
   
   Here is a brief description on how to resolve the issue on CDP:
   
   1. Go to Cloudera Manager > Cluster > Spark > Configuration and search for "safety".
   
   2. Edit the snippet for `spark-conf/spark-defaults.conf` and add
   
   ```console
   spark.sql.hive.metastore.jars=${env:HADOOP_COMMON_HOME}/../hive/lib/*:${env:HADOOP_COMMON_HOME}/client/*:/shared/jars/hive/*
   ```
   
   > Note: `/shared/jars/hive/*` is the path containing the hudi-mr-bundle jars (`HIVE_AUX_JARS_PATH`).
   
   3. Restart Spark
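
   To illustrate what the edited property resolves to, here is a minimal Python sketch of how Spark-style `${env:NAME}` placeholders expand into concrete classpath entries. This is not Spark's own code, and the CDH parcel path used for `HADOOP_COMMON_HOME` is an assumption for illustration:

   ```python
   import re

   def expand_spark_env_refs(value: str, env: dict) -> str:
       """Expand Spark-style ${env:NAME} placeholders (sketch, not Spark's implementation)."""
       return re.sub(
           r"\$\{env:([A-Za-z_][A-Za-z0-9_]*)\}",
           lambda m: env.get(m.group(1), ""),
           value,
       )

   # Assumed CDP layout: HADOOP_COMMON_HOME points at the Hadoop parcel directory
   env = {"HADOOP_COMMON_HOME": "/opt/cloudera/parcels/CDH/lib/hadoop"}
   jars = (
       "${env:HADOOP_COMMON_HOME}/../hive/lib/*:"
       "${env:HADOOP_COMMON_HOME}/client/*:"
       "/shared/jars/hive/*"
   )

   # Print each resolved classpath entry on its own line
   for entry in expand_spark_env_refs(jars, env).split(":"):
       print(entry)
   ```

   The last entry is the appended `/shared/jars/hive/*` glob, which is what lets the metastore client pick up the hudi-mr-bundle jar alongside the stock Hive libraries.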

