[ https://issues.apache.org/jira/browse/SPARK-10944?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14944539#comment-14944539 ]
Pranas Baliuka edited comment on SPARK-10944 at 10/6/15 5:42 AM:
-----------------------------------------------------------------
If one wants to deploy Spark without Hadoop, it should be possible. Currently even the path and jar names conflict with each other:
{quote}
spark-1.5.1-bin-without-hadoop/lib/spark-assembly-1.5.1-hadoop2.2.0.jar
{quote}
Long-term solution: remove mentions of Hadoop from the paths and jar names.

was (Author: pranas):
If one wants to deploy Spark without Hadoop, it should be possible. Currently even the path and jar names conflict with each other:
{quote}
spark-1.5.1-bin-without-hadoop/lib/spark-assembly-1.5.1-hadoop2.2.0.jar
{quote}

> org/slf4j/Logger is not provided in
> spark-1.5.1-bin-without-hadoop/lib/spark-assembly-1.5.1-hadoop2.2.0.jar
> -----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-10944
>                 URL: https://issues.apache.org/jira/browse/SPARK-10944
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 1.5.1
>        Environment: Mac OS/Java 8/Spark 1.5.1 without Hadoop
>            Reporter: Pranas Baliuka
>            Priority: Blocker
>              Labels: easyfix, patch
>   Original Estimate: 2h
>  Remaining Estimate: 2h
>
> An attempt to run a Spark cluster on a Mac OS machine fails.
> Invocation:
> {code}
> # cd $SPARK_HOME
> Imin:spark-1.5.1-bin-without-hadoop pranas$ ./sbin/start-master.sh
> {code}
> Output:
> {code}
> starting org.apache.spark.deploy.master.Master, logging to /Users/pranas/Apps/spark-1.5.1-bin-without-hadoop/sbin/../logs/spark-pranas-org.apache.spark.deploy.master.Master-1-Imin.local.out
> failed to launch org.apache.spark.deploy.master.Master:
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>         ... 7 more
> full log in /Users/pranas/Apps/spark-1.5.1-bin-without-hadoop/sbin/../logs/spark-pranas-org.apache.spark.deploy.master.Master-1-Imin.local.out
> {code}
> Log:
> {code}
> # Options read when launching programs locally with
> # ./bin/run-example or ./bin/spark-submit
> Spark Command: /Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/bin/java -cp /Users/pranas/Apps/spark-1.5.1-bin-without-hadoop/sbin/../conf/:/Users/pranas/Apps/spark-1.5.1-bin-without-hadoop/lib/spark-assembly-1.5.1-hadoop2.2.0.jar -Xms1g -Xmx1g org.apache.spark.deploy.master.Master --ip Imin.local --port 7077 --webui-port 8080
> ========================================
> Error: A JNI error has occurred, please check your installation and try again
> Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
>         at java.lang.Class.getDeclaredMethods0(Native Method)
>         at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
>         at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
>         at java.lang.Class.getMethod0(Class.java:3018)
>         at java.lang.Class.getMethod(Class.java:1784)
>         at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
>         at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
> Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
> {code}
> Proposed short-term fix: bundle all required third-party libs into the uberjar, and/or fix the start-up script to include the required third-party libs.
> Long-term quality improvement proposal: introduce integration tests that check the distribution before releasing.
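A possible workaround sketch, not an official fix from this issue: the "without hadoop" distribution deliberately omits Hadoop and its dependencies (including slf4j), so Spark's launch scripts can be pointed at a separately installed Hadoop's jars via the `SPARK_DIST_CLASSPATH` variable in `conf/spark-env.sh`. This assumes a local Hadoop installation with the `hadoop` command on `PATH`; the path shown is illustrative.

```shell
# conf/spark-env.sh (workaround sketch, assumes a local Hadoop install)
# Prepend the jars of an existing Hadoop installation, including its
# slf4j jars, to Spark's distribution classpath so start-master.sh
# no longer fails with NoClassDefFoundError: org/slf4j/Logger.
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
```

After setting this and re-running `./sbin/start-master.sh`, the generated `java -cp ...` command should include the Hadoop jars alongside the Spark assembly.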
--
This message was sent by Atlassian JIRA (v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org