[ https://issues.apache.org/jira/browse/SPARK-38516?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yuming Wang updated SPARK-38516:
--------------------------------
    Summary: Add log4j-core, log4j-api and log4j-slf4j-impl to classpath if active hadoop-provided  (was: Add log4j-core and log4j-api to classpath if active hadoop-provided)

> Add log4j-core, log4j-api and log4j-slf4j-impl to classpath if active hadoop-provided
> --------------------------------------------------------------------------------------
>
>                 Key: SPARK-38516
>                 URL: https://issues.apache.org/jira/browse/SPARK-38516
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 3.3.0
>            Reporter: Yuming Wang
>            Priority: Major
>
> {noformat}
> Error: A JNI error has occurred, please check your installation and try again
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/logging/log4j/core/Filter
>     at java.lang.Class.getDeclaredMethods0(Native Method)
>     at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
>     at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
>     at java.lang.Class.getMethod0(Class.java:3018)
>     at java.lang.Class.getMethod(Class.java:1784)
>     at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
>     at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
> Caused by: java.lang.ClassNotFoundException: org.apache.logging.log4j.core.Filter
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>     ... 7 more
> {noformat}
> {noformat}
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/logging/log4j/LogManager
>       at org.apache.spark.deploy.yarn.SparkRackResolver.<init>(SparkRackResolver.scala:42)
>       at org.apache.spark.deploy.yarn.SparkRackResolver$.get(SparkRackResolver.scala:114)
>       at org.apache.spark.scheduler.cluster.YarnScheduler.<init>(YarnScheduler.scala:31)
>       at org.apache.spark.scheduler.cluster.YarnClusterManager.createTaskScheduler(YarnClusterManager.scala:35)
>       at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2985)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:563)
>       at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2704)
>       at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
>       at scala.Option.getOrElse(Option.scala:189)
>       at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)
>       at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:54)
>       at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:327)
>       at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:159)
>       at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
>       at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
>       at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
>       at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
>       at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
>       at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
>       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
>       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.ClassNotFoundException: org.apache.logging.log4j.LogManager
>       at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>       ... 26 more
> {noformat}
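>
> The two traces are plain class-linkage failures: with the hadoop-provided profile active, the log4j 2 jars (log4j-core, log4j-api, log4j-slf4j-impl) are not on the classpath, so the first class that touches the log4j 2 API fails to load. As a minimal, hypothetical illustration (Log4jProbe is a made-up class, not Spark code): any class that statically references org.apache.logging.log4j.LogManager fails with the same NoClassDefFoundError when log4j-api is missing at runtime.
> {code:java}
> // Hypothetical probe class, for illustration only (not part of Spark).
> import org.apache.logging.log4j.LogManager;
> import org.apache.logging.log4j.Logger;
>
> public class Log4jProbe {
>     // Calling LogManager.getLogger here forces the log4j-api classes to be
>     // resolved during class initialization, before main() runs.
>     private static final Logger LOG = LogManager.getLogger(Log4jProbe.class);
>
>     public static void main(String[] args) {
>         // Reaching this line means log4j-api was found; if only log4j-core is
>         // missing, the API instead falls back to its console SimpleLogger and
>         // prints a StatusLogger error.
>         LOG.info("log4j-api resolved on the classpath");
>     }
> }
> {code}
> Run without the log4j-api jar on the classpath, this fails with java.lang.NoClassDefFoundError: org/apache/logging/log4j/LogManager, matching the second trace. org.apache.logging.log4j.core.Filter in the first trace lives in log4j-core, and log4j-slf4j-impl is the slf4j binding that routes Spark's slf4j logging to log4j 2, which is why all three jars need to be on the classpath even for a hadoop-provided build.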


