[ https://issues.apache.org/jira/browse/SPARK-10944?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen closed SPARK-10944.
-----------------------------

> Provide self-contained deployment not tightly coupled with Hadoop
> -----------------------------------------------------------------
>
>                 Key: SPARK-10944
>                 URL: https://issues.apache.org/jira/browse/SPARK-10944
>             Project: Spark
>          Issue Type: New Feature
>          Components: Deploy
>    Affects Versions: 1.5.1
>         Environment: Mac OS/Java 8/Spark 1.5.1 without Hadoop
>            Reporter: Pranas Baliuka
>            Priority: Minor
>              Labels: patch
>
> An attempt to run a Spark cluster on a Mac OS machine fails if Hadoop is not 
> installed. There should be no real need to install a full-blown Hadoop 
> distribution just to run Spark.
> Current situation:
> {code}
> # cd $SPARK_HOME
> Imin:spark-1.5.1-bin-without-hadoop pranas$ ./sbin/start-master.sh
> {code}
> Output:
> {code}
> starting org.apache.spark.deploy.master.Master, logging to /Users/pranas/Apps/spark-1.5.1-bin-without-hadoop/sbin/../logs/spark-pranas-org.apache.spark.deploy.master.Master-1-Imin.local.out
> failed to launch org.apache.spark.deploy.master.Master:
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>       ... 7 more
> full log in /Users/pranas/Apps/spark-1.5.1-bin-without-hadoop/sbin/../logs/spark-pranas-org.apache.spark.deploy.master.Master-1-Imin.local.out
> {code}
> Log:
> {code}
> # Options read when launching programs locally with
> # ./bin/run-example or ./bin/spark-submit
> Spark Command: /Library/Java/JavaVirtualMachines/jdk1.8.0_40.jdk/Contents/Home/bin/java -cp /Users/pranas/Apps/spark-1.5.1-bin-without-hadoop/sbin/../conf/:/Users/pranas/Apps/spark-1.5.1-bin-without-hadoop/lib/spark-assembly-1.5.1-hadoop2.2.0.jar -Xms1g -Xmx1g org.apache.spark.deploy.master.Master --ip Imin.local --port 7077 --webui-port 8080
> ========================================
> Error: A JNI error has occurred, please check your installation and try again
> Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
>         at java.lang.Class.getDeclaredMethods0(Native Method)
>         at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
>         at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
>         at java.lang.Class.getMethod0(Class.java:3018)
>         at java.lang.Class.getMethod(Class.java:1784)
>         at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
>         at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
> Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
> {code}
> Proposed short-term fix:
> Bundle all required third-party libraries into the uber-jar, and/or fix the 
> start-up scripts to include the required third-party libraries.
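> As a possible interim workaround (my own sketch, not something verified as part of this report): the "without hadoop" distribution expects the Hadoop-provided classes, including SLF4J, to be supplied externally, and the launch scripts pick up extra jars from the SPARK_DIST_CLASSPATH environment variable. A minimal conf/spark-env.sh entry, with placeholder paths:
> {code}
> # conf/spark-env.sh -- hypothetical workaround; all paths are placeholders.
> # Option A: a local Hadoop installation exists -- hand its full classpath
> # (which bundles SLF4J) to Spark:
> export SPARK_DIST_CLASSPATH=$(/path/to/hadoop/bin/hadoop classpath)
>
> # Option B: no Hadoop at all -- point Spark at the SLF4J jars directly.
> # This only covers the logging classes missing in the trace above; other
> # Hadoop classes may still be needed by some features.
> # export SPARK_DIST_CLASSPATH="/path/to/slf4j-api-1.7.10.jar:/path/to/slf4j-log4j12-1.7.10.jar"
> {code}
> With either setting in place, ./sbin/start-master.sh should pick the jars up through the regular spark-class launcher.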
> Long-term quality-improvement proposal: introduce integration tests that 
> verify each binary distribution before it is released.
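> To illustrate the long-term proposal (a sketch only, not an existing test): a release smoke test could unpack each binary distribution, start and stop the standalone master, and fail on any class-loading errors in the logs, for example:
> {code}
> #!/usr/bin/env bash
> # Hypothetical release smoke test -- unpack a distribution, start the
> # standalone master, and fail if the logs show class-loading errors.
> set -euo pipefail
>
> DIST_TARBALL="$1"          # e.g. spark-1.5.1-bin-without-hadoop.tgz
> WORKDIR=$(mktemp -d)
> tar -xzf "$DIST_TARBALL" -C "$WORKDIR"
> SPARK_HOME=$(echo "$WORKDIR"/spark-*)
>
> "$SPARK_HOME/sbin/start-master.sh"
> sleep 10
> "$SPARK_HOME/sbin/stop-master.sh"
>
> # Any NoClassDefFoundError/ClassNotFoundException in the logs fails the run.
> if grep -RE "NoClassDefFoundError|ClassNotFoundException" "$SPARK_HOME/logs"; then
>   echo "Smoke test FAILED for $DIST_TARBALL" >&2
>   exit 1
> fi
> echo "Smoke test passed for $DIST_TARBALL"
> {code}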


