Are you using Spark 3.4?
Under $SPARK_HOME/jars, get a listing of the Hive and Hadoop jar files so you
can check their versions. The listing below is from version 3.4.0:

 /opt/spark/jars> ltr *hive* *hadoop*
-rw-r--r--. 1 hduser hadoop   717820 Apr  7 03:43 spark-hive_2.12-3.4.0.jar
-rw-r--r--. 1 hduser hadoop   563632 Apr  7 03:43 spark-hive-thriftserver_2.12-3.4.0.jar
-rw-r--r--. 1 hduser hadoop   990925 Apr  7 03:43 parquet-hadoop-1.12.3.jar
-rw-r--r--. 1 hduser hadoop   258346 Apr  7 03:43 hive-storage-api-2.8.1.jar
-rw-r--r--. 1 hduser hadoop    12923 Apr  7 03:43 hive-shims-scheduler-2.3.9.jar
-rw-r--r--. 1 hduser hadoop   120293 Apr  7 03:43 hive-shims-common-2.3.9.jar
-rw-r--r--. 1 hduser hadoop     8786 Apr  7 03:43 hive-shims-2.3.9.jar
-rw-r--r--. 1 hduser hadoop    53902 Apr  7 03:43 hive-shims-0.23-2.3.9.jar
-rw-r--r--. 1 hduser hadoop  1679366 Apr  7 03:43 hive-service-rpc-3.1.3.jar
-rw-r--r--. 1 hduser hadoop   916630 Apr  7 03:43 hive-serde-2.3.9.jar
-rw-r--r--. 1 hduser hadoop  8195966 Apr  7 03:43 hive-metastore-2.3.9.jar
-rw-r--r--. 1 hduser hadoop   326585 Apr  7 03:43 hive-llap-common-2.3.9.jar
-rw-r--r--. 1 hduser hadoop   116364 Apr  7 03:43 hive-jdbc-2.3.9.jar
-rw-r--r--. 1 hduser hadoop 10840949 Apr  7 03:43 hive-exec-2.3.9-core.jar
-rw-r--r--. 1 hduser hadoop   436169 Apr  7 03:43 hive-common-2.3.9.jar
-rw-r--r--. 1 hduser hadoop    44704 Apr  7 03:43 hive-cli-2.3.9.jar
-rw-r--r--. 1 hduser hadoop   183633 Apr  7 03:43 hive-beeline-2.3.9.jar
-rw-r--r--. 1 hduser hadoop    56812 Apr  7 03:43 hadoop-yarn-server-web-proxy-3.3.4.jar
-rw-r--r--. 1 hduser hadoop  3362359 Apr  7 03:43 hadoop-shaded-guava-1.1.1.jar
-rw-r--r--. 1 hduser hadoop 30085504 Apr  7 03:43 hadoop-client-runtime-3.3.4.jar
-rw-r--r--. 1 hduser hadoop 19458635 Apr  7 03:43 hadoop-client-api-3.3.4.jar
-rw-r--r--. 1 hduser hadoop    15935 Apr  7 03:43 dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar
-rwxr--r--. 1 hduser hadoop 17663298 Apr 20 09:37 gcs-connector-hadoop3-2.2.0-shaded.jar
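
If you want to cross-check this from a running Spark session rather than the
file system, the Hive client version Spark expects is also exposed through a
SQL config. A minimal sketch from spark-shell (the key is standard, but the
value returned depends on your build and any overrides in spark-defaults.conf
or at submit time):

```scala
// Sketch: read the Hive metastore client version Spark is configured to use.
// On stock Spark 3.3.x / 3.4.x builds this defaults to 2.3.9 unless overridden.
spark.conf.get("spark.sql.hive.metastore.version")
```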

HTH

Mich Talebzadeh,
Solutions Architect/Engineering Lead
Palantir Technologies Limited
London
United Kingdom

On Tue, 11 Jul 2023 at 18:29, Yeachan Park <yeachan...@gmail.com> wrote:

> Hi all,
>
> We made some changes to Hive which require changes to the Hive jars that
> Spark is bundled with. Since Spark 3.3.1 comes bundled with Hive 2.3.9
> jars, we built our changes in Hive 2.3.9 and put the necessary jars under
> $SPARK_HOME/jars (replacing the original jars that were there), and
> everything works fine.
>
> However, since I wanted to make use of spark.jars.packages to download
> jars at runtime, I thought it would also work if I deleted the original
> Hive jars from $SPARK_HOME/jars and downloaded the same jars at runtime.
> Apparently spark.jars.packages should add these jars to the classpath.
> Instead, I get a NoClassDefFoundError when downloading the same jars:
>
> ```
> Caused by: java.lang.reflect.InvocationTargetException: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/ql/metadata/HiveException
>   at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
>   at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
>   at java.base/java.lang.reflect.Constructor.newInstance(Unknown Source)
>   at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:227)
>   ... 87 more
> Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/ql/metadata/HiveException
>   at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:75)
>   ... 92 more
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.metadata.HiveException
>   at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(Unknown Source)
>   at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(Unknown Source)
>   at java.base/java.lang.ClassLoader.loadClass(Unknown Source)
> ```
>
> The class HiveException should already be available in the jars that have
> been supplied by spark.jars.packages... Any idea what could be wrong?
>
> Thanks,
> Yeachan
>
>
>
