Re: Error: no snappyjava in java.library.path

2015-02-26 Thread Marcelo Vanzin
Hi Dan,

This is a CDH issue, so I'd recommend using cdh-u...@cloudera.org for
those questions.

This is an issue that was fixed in recent CM 5.3 updates; if you're not
using CM, or want a workaround, you can manually configure
"spark.driver.extraLibraryPath" and "spark.executor.extraLibraryPath"
to include the path to the $HADOOP_HOME/lib/native/ directory.

(Note this is not a classpath issue, but a native library issue.)
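
For example, a minimal sketch of the workaround when launching spark-shell
(assuming $HADOOP_HOME points at your hadoop-2.5.0-cdh5.2.0 install; adjust
the path for your layout):

  spark-shell \
    --conf spark.driver.extraLibraryPath=$HADOOP_HOME/lib/native \
    --conf spark.executor.extraLibraryPath=$HADOOP_HOME/lib/native

The same two properties can also go in conf/spark-defaults.conf; write the
path out in full there, since that file is not shell-expanded.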


On Thu, Feb 26, 2015 at 2:44 PM, Dan Dong  wrote:
> Hi, All,
>   When I run a small program in spark-shell, I get the following error:
> ...
> Caused by: java.lang.UnsatisfiedLinkError: no snappyjava in
> java.library.path
> at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1886)
> at java.lang.Runtime.loadLibrary0(Runtime.java:849)
> at java.lang.System.loadLibrary(System.java:1088)
> at
> org.xerial.snappy.SnappyNativeLoader.loadLibrary(SnappyNativeLoader.java:52)
> ... 29 more
> ...
>
> I see the file is actually there under my Hadoop installation dir, e.g.:
> ./hadoop-2.5.0-cdh5.2.0/share/hadoop/mapreduce2/lib/snappy-java-1.0.4.1.jar
> ./hadoop-2.5.0-cdh5.2.0/share/hadoop/mapreduce1/lib/snappy-java-1.0.4.1.jar
> ./hadoop-2.5.0-cdh5.2.0/share/hadoop/kms/tomcat/webapps/kms/WEB-INF/lib/snappy-java-1.0.4.1.jar
> ./hadoop-2.5.0-cdh5.2.0/share/hadoop/common/lib/snappy-java-1.0.4.1.jar
> ./hadoop-2.5.0-cdh5.2.0/share/hadoop/tools/lib/snappy-java-1.0.4.1.jar
> ./hadoop-2.5.0-cdh5.2.0/share/hadoop/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib/snappy-java-1.0.4.1.jar
>
> But even after I included one of the above paths in $CLASSPATH, the error
> is still there. So how should I set the *PATH*s to resolve it? Thanks!
>
> $ echo $CLASSPATH
> /home/ubuntu/hadoop-2.5.0-cdh5.2.0/share/hadoop/mapreduce/lib
>
> Cheers,
> Dan
>



-- 
Marcelo



Error: no snappyjava in java.library.path

2015-02-26 Thread Dan Dong
Hi, All,
  When I run a small program in spark-shell, I get the following error:
...
Caused by: java.lang.UnsatisfiedLinkError: no snappyjava in
java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1886)
at java.lang.Runtime.loadLibrary0(Runtime.java:849)
at java.lang.System.loadLibrary(System.java:1088)
at
org.xerial.snappy.SnappyNativeLoader.loadLibrary(SnappyNativeLoader.java:52)
... 29 more
...

I see the file is actually there under my Hadoop installation dir, e.g.:
./hadoop-2.5.0-cdh5.2.0/share/hadoop/mapreduce2/lib/snappy-java-1.0.4.1.jar
./hadoop-2.5.0-cdh5.2.0/share/hadoop/mapreduce1/lib/snappy-java-1.0.4.1.jar
./hadoop-2.5.0-cdh5.2.0/share/hadoop/kms/tomcat/webapps/kms/WEB-INF/lib/snappy-java-1.0.4.1.jar
./hadoop-2.5.0-cdh5.2.0/share/hadoop/common/lib/snappy-java-1.0.4.1.jar
./hadoop-2.5.0-cdh5.2.0/share/hadoop/tools/lib/snappy-java-1.0.4.1.jar
./hadoop-2.5.0-cdh5.2.0/share/hadoop/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib/snappy-java-1.0.4.1.jar

But even after I included one of the above paths in $CLASSPATH, the
error is still there. So how should I set the *PATH*s to resolve it? Thanks!

$ echo $CLASSPATH
/home/ubuntu/hadoop-2.5.0-cdh5.2.0/share/hadoop/mapreduce/lib

Cheers,
Dan