Hi,

 

I have seen emails stating that users have managed to build Spark 1.3 to
work with Hive. I tried Spark 1.5.2 but had no luck.

 

I downloaded the Spark 1.3 source code (spark-1.3.0.tar) and built it as
follows:

 

./make-distribution.sh --name "hadoop2-without-hive" --tgz
"-Pyarn,hadoop-provided,hadoop-2.4,parquet-provided"

 

This completed successfully and created the tarball. I then created the
Spark 1.3 tree from that file. $SPARK_HOME is /usr/lib/spark.
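
In case it helps, the contents of that assembly can be listed to see what the hadoop-provided build actually packaged, for example:

# list any slf4j classes bundled in the assembly (path as per my layout above)
jar tf /usr/lib/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar | grep -i slf4j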

 

Other steps that I performed:

 

1. Copied spark-assembly-1.3.0-hadoop2.4.0.jar into $HIVE_HOME/lib

2. Created a symlink in $SPARK_HOME/conf to /usr/lib/hive/conf/hive-site.xml
(rough commands below)
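
In other words, roughly the following, assuming $HIVE_HOME is /usr/lib/hive as it is on this box:

# 1. make the Spark assembly visible to Hive
cp /usr/lib/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar $HIVE_HOME/lib/

# 2. let Spark pick up the Hive configuration
ln -s /usr/lib/hive/conf/hive-site.xml $SPARK_HOME/conf/hive-site.xml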

 

Then I tried to start the Spark master node:

 

/usr/lib/spark/sbin/start-master.sh

 

I get the following error:

 

 

cat /usr/lib/spark/sbin/../logs/spark-hduser-org.apache.spark.deploy.master.Master-1-rhes564.out

Spark Command: /usr/java/latest/bin/java -cp :/usr/lib/spark/sbin/../conf:/usr/lib/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar:/home/hduser/hadoop-2.6.0/etc/hadoop -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m org.apache.spark.deploy.master.Master --ip rhes564 --port 7077 --webui-port 8080

========================================

 

Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Class.java:2521)
        at java.lang.Class.getMethod0(Class.java:2764)
        at java.lang.Class.getMethod(Class.java:1653)
        at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
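
My reading of this is that because the assembly was built with -Phadoop-provided, slf4j (which normally arrives via the Hadoop jars) is not inside spark-assembly-1.3.0-hadoop2.4.0.jar, and nothing else on the -cp line above supplies it. One thing I intend to try is pointing the daemons at the Hadoop jars through SPARK_DIST_CLASSPATH in conf/spark-env.sh; I believe 1.3 honours that variable for "hadoop provided" builds, but I would welcome confirmation:

# /usr/lib/spark/conf/spark-env.sh
# Put the Hadoop jars (which bundle slf4j and friends) on the daemons' classpath,
# assuming the hadoop command from /home/hduser/hadoop-2.6.0 is on the PATH
export SPARK_DIST_CLASSPATH=$(hadoop classpath)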

 

I also noticed that in /usr/lib/spark/lib I only have the following JAR
files:

 

-rw-r--r-- 1 hduser hadoop 98795479 Dec  3 09:03 spark-examples-1.3.0-hadoop2.4.0.jar
-rw-r--r-- 1 hduser hadoop 98187168 Dec  3 09:03 spark-assembly-1.3.0-hadoop2.4.0.jar
-rw-r--r-- 1 hduser hadoop  4136760 Dec  3 09:03 spark-1.3.0-yarn-shuffle.jar

 

Whereas in the pre-built download, /usr/lib/spark-1.3.0-bin-hadoop2.4,
there are additional JAR files:

 

-rw-rw-r-- 1 hduser hadoop   1890075 Mar  6  2015 datanucleus-core-3.2.10.jar
-rw-rw-r-- 1 hduser hadoop 112446389 Mar  6  2015 spark-examples-1.3.0-hadoop2.4.0.jar
-rw-rw-r-- 1 hduser hadoop 159319006 Mar  6  2015 spark-assembly-1.3.0-hadoop2.4.0.jar
-rw-rw-r-- 1 hduser hadoop   4136744 Mar  6  2015 spark-1.3.0-yarn-shuffle.jar
-rw-rw-r-- 1 hduser hadoop   1809447 Mar  6  2015 datanucleus-rdbms-3.2.9.jar
-rw-rw-r-- 1 hduser hadoop    339666 Mar  6  2015 datanucleus-api-jdo-3.2.6.jar
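
If I understand correctly, the datanucleus-* jars come from a build that includes the Hive profile and are what Spark SQL uses to reach the Hive metastore via JDO, and the much larger pre-built assembly (159 MB versus 98 MB) presumably reflects the Hive and Hadoop classes bundled into it. Purely as an experiment I was going to copy them across from the pre-built tree, though I am not sure they are actually needed for this setup:

# copy the DataNucleus jars from the pre-built distribution (experiment only)
cp /usr/lib/spark-1.3.0-bin-hadoop2.4/lib/datanucleus-*.jar /usr/lib/spark/lib/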

 

Any ideas what is missing? I am sure someone has sorted this one out
before.

 

 

Thanks,

 

Mich

 

 

 


 
