From: arthur.hk.c...@gmail.com [mailto:arthur.hk.c...@gmail.com]
Sent: Friday, October 17, 2014 7:13 AM
To: user
Cc: arthur.hk.c...@gmail.com
Subject: Spark Hive Snappy Error
Hi,
When trying Spark with Hive table, I got the "java.lang.UnsatisfiedLinkError: org.xerial.snappy.SnappyNative.maxCompressedLength(I)I" error.
From: arthur.hk.c...@gmail.com [mailto:arthur.hk.c...@gmail.com]
Sent: Wednesday, October 22, 2014 8:35 PM
To: Shao, Saisai
Cc: arthur.hk.c...@gmail.com; user
Subject: Re: Spark Hive Snappy Error
Hi,
Yes, I can always reproduce the issue. Regarding my workload, Spark configuration, JDK version and OS version: I ran SparkPi 1000.
java -version
java version "1.7.0_67"
Java(TM) SE Runtime Environment (build 1.7.0_67-b01)
Java HotSpot(TM) 64-Bit Server VM (build
Hi
May I know where to configure Spark to load libhadoop.so?
Regards
Arthur
On 23 Oct, 2014, at 11:31 am, arthur.hk.c...@gmail.com wrote:
Hi,
Please find the attached file.
lsof.rtf
my spark-default.xml
# Default system properties included when running spark-submit.
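On the libhadoop.so question: in Spark 1.x the usual knobs are the extra library path properties, which prepend a directory to the native library search path of the driver and executor JVMs. A minimal sketch for spark-defaults.conf, assuming the Hadoop native libraries live under /usr/local/hadoop/lib/native (a placeholder path, not taken from this thread):

```properties
# spark-defaults.conf -- make libhadoop.so and libsnappy.so visible
# to both the driver and the executor JVMs (path is a placeholder)
spark.driver.extraLibraryPath    /usr/local/hadoop/lib/native
spark.executor.extraLibraryPath  /usr/local/hadoop/lib/native
```

Equivalently, exporting LD_LIBRARY_PATH in conf/spark-env.sh before starting the daemons has the same effect for standalone-mode workers.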
From: arthur.hk.c...@gmail.com [mailto:arthur.hk.c...@gmail.com]
Sent: Thursday, October 23, 2014 11:32 AM
To: Shao, Saisai
Cc: arthur.hk.c...@gmail.com; user
Subject: Re: Spark Hive Snappy Error
Hi,
Please find the attached file.
my spark-default.xml
# Default system properties included when running spark-submit
Hi,
When trying Spark with Hive table, I got the "java.lang.UnsatisfiedLinkError: org.xerial.snappy.SnappyNative.maxCompressedLength(I)I" error:
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext.sql("select count(1) from q8_national_market_share")
Could you share more about your workload, Spark configuration, JDK version and OS version?
Thanks
Jerry
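For context, an UnsatisfiedLinkError on org.xerial.snappy.SnappyNative.maxCompressedLength(I)I typically means snappy-java loaded, but could not bind its JNI symbol against the native Snappy library it found, often because an older libsnappy or a mismatched snappy-java jar is earlier on the library path. While diagnosing that, one common hedge is to switch Spark's internal compression codec away from Snappy; a sketch for spark-defaults.conf (codec names per the Spark 1.x configuration docs):

```properties
# spark-defaults.conf -- temporarily avoid the Snappy native path
# while debugging the UnsatisfiedLinkError (lzf is pure-Java in Spark 1.x)
spark.io.compression.codec   lzf
```

Note this only changes Spark's internal block/shuffle compression; data already stored Snappy-compressed in a Hive table still requires a working native Snappy to read.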