*java:*
java version "1.7.0_60"
Java(TM) SE Runtime Environment (build 1.7.0_60-b19)
Java HotSpot(TM) 64-Bit Server VM (build 24.60-b09, mixed mode)

*scala:*
Scala code runner version 2.10.5 -- Copyright 2002-2013, LAMP/EPFL

*hadoop cluster:*
51 servers running hadoop-2.3-cdh-5.1.0, with Snappy set up:
ls /home/cluster/apps/hadoop/lib/native/libsnappy*
/home/cluster/apps/hadoop/lib/native/libsnappyjava.so 
/home/cluster/apps/hadoop/lib/native/libsnappy.so.1
/home/cluster/apps/hadoop/lib/native/libsnappy.so     
/home/cluster/apps/hadoop/lib/native/libsnappy.so.1.1.3
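
In case it helps with diagnosis: a quick sanity check of those files on a worker node, using only standard Linux tooling, would be something like

# confirm they are 64-bit shared objects and that their own dependencies resolve
file /home/cluster/apps/hadoop/lib/native/libsnappyjava.so
ldd /home/cluster/apps/hadoop/lib/native/libsnappy.so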

When I submit a Hadoop wordcount job with SnappyCodec, it runs successfully:

*./bin/hadoop jar ~/opt/project/hadoop-demo/out/artifacts/hadoop_demo/hadoop-demo.jar com.hadoop.demo.mapreduce.base.WordCountWithSnappyCodec -Dmapred.map.output.compression.codec=org.apache.hadoop.io.compress.SnappyCodec /data/hadoop/wordcount/input output13*

But when I submit a Spark job in yarn-cluster mode:

*/home/dp/spark/spark-1.4/spark-1.4.1-test/bin/spark-submit --master yarn-cluster --executor-memory 3g --driver-memory 1g --class org.apache.spark.examples.SparkPi /home/dp/spark/spark-1.4/spark-1.4.1-test/examples/target/spark-examples_2.10-1.4.1.jar 10*
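
I am not sure whether the YARN containers can see that native directory at all. If that turns out to be the issue, this is the kind of variant I could try (spark.driver.extraLibraryPath and spark.executor.extraLibraryPath are the standard Spark 1.4 options; the path is just my cluster's native directory from above):

/home/dp/spark/spark-1.4/spark-1.4.1-test/bin/spark-submit --master yarn-cluster \
  --executor-memory 3g --driver-memory 1g \
  --conf spark.driver.extraLibraryPath=/home/cluster/apps/hadoop/lib/native \
  --conf spark.executor.extraLibraryPath=/home/cluster/apps/hadoop/lib/native \
  --class org.apache.spark.examples.SparkPi \
  /home/dp/spark/spark-1.4/spark-1.4.1-test/examples/target/spark-examples_2.10-1.4.1.jar 10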

PS: As far as I know, Spark uses snappy-java 1.1.1.7 while hadoop-2.3-cdh-5.1.0 uses snappy 1.0.4.1, so I also tried replacing the snappy in hadoop-2.3-cdh-5.1.0 with version 1.1.1.7, but that does not work either.
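
In case anyone wants to double-check the versions on their own cluster, the jars can be located like this (on the Spark side this may come up empty when snappy-java is rolled into the assembly jar):

find /home/cluster/apps/hadoop -name 'snappy-java-*.jar'
find /home/dp/spark/spark-1.4/spark-1.4.1-test -name 'snappy-java-*.jar'
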
Either way, the job fails with the error below:

15/07/22 10:29:55 DEBUG component.AbstractLifeCycle: STARTED o.s.j.s.ServletContextHandler{/metrics/json,null}
java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:317)
        at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:219)
        at org.xerial.snappy.Snappy.<clinit>(Snappy.java:44)
        at org.apache.spark.io.SnappyCompressionCodec.<init>(CompressionCodec.scala:150)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:68)
        at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:60)
        at org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:69)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:513)
        at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:28)
        at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:483)
Caused by: java.lang.UnsatisfiedLinkError: no snappyjava in java.library.path
        at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1886)
        at java.lang.Runtime.loadLibrary0(Runtime.java:849)
        at java.lang.System.loadLibrary(System.java:1088)
        at org.xerial.snappy.SnappyNativeLoader.loadLibrary(SnappyNativeLoader.java:52)
        ... 23 more
15/07/22 10:29:55 ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:68)
        at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:60)
        at org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:69)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:513)
        at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:28)
        at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:483)
Caused by: java.lang.IllegalArgumentException
        at org.apache.spark.io.SnappyCompressionCodec.<init>(CompressionCodec.scala:155)
        ... 15 more
15/07/22 10:29:55 DEBUG component.AbstractLifeCycle: stopping org.spark-project.jetty.server.Server@3902cacd
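
The "no snappyjava in java.library.path" part suggests snappy-java ended up falling back to System.loadLibrary after it could not use its bundled native library. A quick way to see which java.library.path a plain JVM on a node starts with (the YARN containers may of course override it) is:

java -XshowSettings:properties -version 2>&1 | grep java.library.path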


