Hi,

I removed export CLASSPATH="$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar" from my spark-env.sh.

It works, THANK YOU!!
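
For reference, the CLASSPATH part of my spark-env.sh now looks roughly like the sketch below (assuming nothing else on the machine pre-sets CLASSPATH), so Spark picks up the snappy it ships with (snappy-0.2.jar, per Jerry's note) instead of hadoop-snappy:

# removed: export CLASSPATH="$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar"
export CLASSPATH="$CLASSPATH:$HIVE_HOME/lib/mysql-connector-java-5.1.31-bin.jar"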

Regards 
Arthur
 

On 23 Oct, 2014, at 1:00 pm, Shao, Saisai <saisai.s...@intel.com> wrote:

> It seems you just added the snappy library to your classpath:
>  
> export CLASSPATH="$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar"
>  
> But Spark itself depends on snappy-0.2.jar. Is there any possibility that 
> this problem is caused by a different version of snappy?
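>  
> A quick way to check for that (the jar locations below are just the ones mentioned in this thread; adjust to your layout):
>  
> # list every snappy-related jar that Spark and HBase can see
> find $SPARK_HOME $HBASE_HOME/lib -name '*snappy*.jar' 2>/dev/null
> # two different versions showing up here would point to the conflict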
>  
> Thanks
> Jerry
>  
> From: arthur.hk.c...@gmail.com [mailto:arthur.hk.c...@gmail.com] 
> Sent: Thursday, October 23, 2014 11:32 AM
> To: Shao, Saisai
> Cc: arthur.hk.c...@gmail.com; user
> Subject: Re: Spark Hive Snappy Error
>  
> Hi,
>  
> Please find the attached file.
>  
>  
>  
> my spark-defaults.conf
> # Default system properties included when running spark-submit.
> # This is useful for setting default environmental settings.
> #
> # Example:
> # spark.master            spark://master:7077
> # spark.eventLog.enabled  true
> # spark.eventLog.dir     hdfs://namenode:8021/directory
> # spark.serializer        org.apache.spark.serializer.KryoSerializer
> #
> spark.executor.memory           2048m
> spark.shuffle.spill.compress    false
> spark.io.compression.codec     org.apache.spark.io.SnappyCompressionCodec
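>  
> (Side note: these two compression settings could equivalently be passed per job via spark-submit --conf instead of spark-defaults; a rough sketch, with the class and jar names as placeholders:)
>  
> spark-submit \
>   --conf spark.shuffle.spill.compress=false \
>   --conf spark.io.compression.codec=org.apache.spark.io.SnappyCompressionCodec \
>   --class <main-class> <your-app.jar>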
>  
>  
>  
> my spark-env.sh
> #!/usr/bin/env bash
> export CLASSPATH="$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar"
> export CLASSPATH="$CLASSPATH:$HIVE_HOME/lib/mysql-connector-java-5.1.31-bin.jar"
> export JAVA_LIBRARY_PATH="$HADOOP_HOME/lib/native/Linux-amd64-64"
> export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/etc/hadoop"}
> export SPARK_WORKER_DIR="/edh/hadoop_data/spark_work/"
> export SPARK_LOG_DIR="/edh/hadoop_logs/spark"
> export SPARK_LIBRARY_PATH="$HADOOP_HOME/lib/native/Linux-amd64-64"
> export SPARK_CLASSPATH="$SPARK_HOME/lib_managed/jars/mysql-connector-java-5.1.31-bin.jar"
> export SPARK_CLASSPATH="$SPARK_CLASSPATH:$HBASE_HOME/lib/*:$HIVE_HOME/csv-serde-1.1.2-0.11.0-all.jar:"
> export SPARK_WORKER_MEMORY=2g
> export HADOOP_HEAPSIZE=2000
> export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER -Dspark.deploy.zookeeper.url=m35:2181,m33:2181,m37:2181"
> export SPARK_JAVA_OPTS=" -XX:+UseConcMarkSweepGC"
>  
>  
> ll $HADOOP_HOME/lib/native/Linux-amd64-64
> -rw-rw-r--. 1 tester tester        50523 Aug 27 14:12 hadoop-auth-2.4.1.jar
> -rw-rw-r--. 1 tester tester      1062640 Aug 27 12:19 libhadoop.a
> -rw-rw-r--. 1 tester tester      1487564 Aug 27 11:14 libhadooppipes.a
> lrwxrwxrwx. 1 tester tester           24 Aug 27 07:08 libhadoopsnappy.so -> libhadoopsnappy.so.0.0.1
> lrwxrwxrwx. 1 tester tester           24 Aug 27 07:08 libhadoopsnappy.so.0 -> libhadoopsnappy.so.0.0.1
> -rwxr-xr-x. 1 tester tester        54961 Aug 27 07:08 libhadoopsnappy.so.0.0.1
> -rwxrwxr-x. 1 tester tester       630328 Aug 27 12:19 libhadoop.so
> -rwxrwxr-x. 1 tester tester       630328 Aug 27 12:19 libhadoop.so.1.0.0
> -rw-rw-r--. 1 tester tester       582472 Aug 27 11:14 libhadooputils.a
> -rw-rw-r--. 1 tester tester       298626 Aug 27 11:14 libhdfs.a
> -rwxrwxr-x. 1 tester tester       200370 Aug 27 11:14 libhdfs.so
> -rwxrwxr-x. 1 tester tester       200370 Aug 27 11:14 libhdfs.so.0.0.0
> lrwxrwxrwx. 1 tester tester           55 Aug 27 07:08 libjvm.so -> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
> lrwxrwxrwx. 1 tester tester           25 Aug 27 07:08 libprotobuf-lite.so -> libprotobuf-lite.so.8.0.0
> lrwxrwxrwx. 1 tester tester           25 Aug 27 07:08 libprotobuf-lite.so.8 -> libprotobuf-lite.so.8.0.0
> -rwxr-xr-x. 1 tester tester       964689 Aug 27 07:08 libprotobuf-lite.so.8.0.0
> lrwxrwxrwx. 1 tester tester           20 Aug 27 07:08 libprotobuf.so -> libprotobuf.so.8.0.0
> lrwxrwxrwx. 1 tester tester           20 Aug 27 07:08 libprotobuf.so.8 -> libprotobuf.so.8.0.0
> -rwxr-xr-x. 1 tester tester      8300050 Aug 27 07:08 libprotobuf.so.8.0.0
> lrwxrwxrwx. 1 tester tester           18 Aug 27 07:08 libprotoc.so -> libprotoc.so.8.0.0
> lrwxrwxrwx. 1 tester tester           18 Aug 27 07:08 libprotoc.so.8 -> libprotoc.so.8.0.0
> -rwxr-xr-x. 1 tester tester      9935810 Aug 27 07:08 libprotoc.so.8.0.0
> -rw-r--r--. 1 tester tester       233554 Aug 27 15:19 libsnappy.a
> lrwxrwxrwx. 1 tester tester           23 Aug 27 11:32 libsnappy.so -> /usr/lib64/libsnappy.so
> lrwxrwxrwx. 1 tester tester           23 Aug 27 11:33 libsnappy.so.1 -> /usr/lib64/libsnappy.so
> -rwxr-xr-x. 1 tester tester       147726 Aug 27 07:08 libsnappy.so.1.2.0
> drwxr-xr-x. 2 tester tester         4096 Aug 27 07:08 pkgconfig
>  
>  
> Regards
> Arthur
>  
>  
> On 23 Oct, 2014, at 10:57 am, Shao, Saisai <saisai.s...@intel.com> wrote:
> 
> 
> Hi Arthur,
>  
> I think your problem might be different from the one mentioned in 
> SPARK-3958 (https://issues.apache.org/jira/browse/SPARK-3958); it looks 
> more like a library linking problem. Would you mind checking your Spark 
> runtime to see whether snappy.so is loaded or not (e.g. through lsof -p)?
>  
> I guess your problem is more likely to be a library-not-found problem.
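>  
> For example, something along these lines should show whether a snappy native library is mapped into the process (the PID is a placeholder; use your actual Spark executor/worker PID):
>  
> lsof -p <spark-executor-pid> | grep -i snappy
> # no output here would suggest libsnappy / libhadoopsnappy was never loaded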
>  
>  
> Thanks
> Jerry
