[ https://issues.apache.org/jira/browse/SPARK-30619?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17026612#comment-17026612 ]
Abhishek Rao commented on SPARK-30619:
--------------------------------------

Here are the 2 exceptions:

SLF4J:

Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
        at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
        at java.lang.Class.getMethod0(Class.java:3018)
        at java.lang.Class.getMethod(Class.java:1784)
        at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 7 more

commons-collections:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/collections/map/ReferenceMap
        at org.apache.spark.broadcast.BroadcastManager.<init>(BroadcastManager.scala:58)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:302)
        at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:185)
        at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
        at org.apache.spark.examples.JavaWordCount.main(JavaWordCount.java:28)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:850)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:925)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:934)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.collections.map.ReferenceMap
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        ... 19 more
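Both traces point at classes that the hadoop-provided profile deliberately leaves out: per this ticket, slf4j and commons-collections are not built into the -without-hadoop distribution, which instead assumes a Hadoop installation supplies them on the classpath at runtime. A minimal sketch of the documented setup for such "Hadoop free" builds, assuming a local Hadoop install with the hadoop binary on the PATH:

    # conf/spark-env.sh
    # Pull Hadoop's jars (which ship slf4j-api and commons-collections)
    # onto Spark's launch classpath
    export SPARK_DIST_CLASSPATH=$(hadoop classpath)

With that set, spark-submit should find org.slf4j.Logger and org.apache.commons.collections.map.ReferenceMap without bundling them into the application jar.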
> org.slf4j.Logger and org.apache.commons.collections classes not built as part
> of hadoop-provided profile
> --------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-30619
>                 URL: https://issues.apache.org/jira/browse/SPARK-30619
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 2.4.2, 2.4.4
>         Environment: Spark on Kubernetes
>            Reporter: Abhishek Rao
>            Priority: Major
>
> We're using spark-2.4.4-bin-without-hadoop.tgz and executing the Java word
> count example (org.apache.spark.examples.JavaWordCount) on local files.
> But we're seeing that it expects the org.slf4j.Logger and
> org.apache.commons.collections classes to be available in order to run.
> We expected the binary to work as-is for local files. Is there anything
> we're missing?
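For the "work as-is on local files" expectation: one possible workaround for purely local runs, without a full Hadoop install, is to put just the two missing jars on SPARK_DIST_CLASSPATH, since the launch scripts append that variable to the JVM classpath. This is an untested sketch; the jar paths and versions below are assumptions (chosen to match Spark 2.4.x's usual dependencies), not from this ticket, and further provided-scope classes could turn out to be missing as well:

    # untested sketch -- jar paths/versions are assumptions
    export SPARK_DIST_CLASSPATH="/opt/libs/slf4j-api-1.7.16.jar:/opt/libs/commons-collections-3.2.2.jar"
    ./bin/spark-submit \
        --class org.apache.spark.examples.JavaWordCount \
        --master local[*] \
        examples/jars/spark-examples_2.11-2.4.4.jar \
        /path/to/input.txt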