Re: [Spark 2.0] ClassNotFoundException is thrown when using Hive

2016-08-18 Thread Aditya

Try passing --files /path/of/hive-site.xml to spark-submit and run again.
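A minimal sketch of that submit command (the conf path and `your_app.jar` are placeholders; adjust to your layout):

```bash
# --files ships hive-site.xml into the working directory of the driver and
# executors, so Spark's Hive support can pick up the metastore settings.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --files /data0/facai/lib/hive-0.13.1/conf/hive-site.xml \
  your_app.jar
```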

On Thursday 18 August 2016 05:26 PM, Diwakar Dhanuskodi wrote:

Hi

Can you cross-check by providing the same library path in --jars of
spark-submit and rerunning?



Sent from Samsung Mobile.


 Original message 
From: "颜发才(Yan Facai)" 
Date:18/08/2016 15:17 (GMT+05:30)
To: "user.spark" 
Cc:
Subject: [Spark 2.0] ClassNotFoundException is thrown when using Hive

Hi, all.

I copied hdfs-site.xml, core-site.xml and hive-site.xml to
$SPARK_HOME/conf.
spark-submit is used to submit the task to YARN, running in **client**
mode.
However, a ClassNotFoundException is thrown.

Some details of the logs are listed below:
```
16/08/12 17:07:32 INFO hive.HiveUtils: Initializing HiveMetastoreConnection version 0.13.1 using file:/data0/facai/lib/hive-0.13.1/lib:file:/data0/facai/lib/hadoop-2.4.1/share/hadoop
16/08/12 17:07:32 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.ClassNotFoundException: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/ql/session/SessionState when creating Hive client using classpath: file:/data0/facai/lib/hive-0.13.1/lib, file:/data0/facai/lib/hadoop-2.4.1/share/hadoop
```

In fact, all the jars needed by Hive are in the directory:
```bash
[hadoop@h107713699 spark_test]$ ls /data0/facai/lib/hive-0.13.1/lib/ | grep hive
hive-ant-0.13.1.jar
hive-beeline-0.13.1.jar
hive-cli-0.13.1.jar
hive-common-0.13.1.jar
...
```

So, my question is:
why can't Spark find the jars it needs?

Any help would be appreciated. Thanks.
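One hedged observation on the log above: the HiveUtils line suggests spark.sql.hive.metastore.jars is set to bare directories. In Spark 2.0 that property takes a standard classpath, and directory entries generally need a trailing /* to match the jar files inside them; worth verifying against the Spark SQL docs for your exact version. A sketch for spark-defaults.conf, reusing the paths from the log:

```properties
spark.sql.hive.metastore.version  0.13.1
# Append /* so each entry matches jar files rather than the directory itself.
# Note: hadoop-2.4.1/share/hadoop contains subdirectories (common, hdfs, ...),
# so each relevant subdirectory may need its own /* entry.
spark.sql.hive.metastore.jars     /data0/facai/lib/hive-0.13.1/lib/*:/data0/facai/lib/hadoop-2.4.1/share/hadoop/common/*
```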







RE: [Spark 2.0] ClassNotFoundException is thrown when using Hive

2016-08-18 Thread Diwakar Dhanuskodi
Hi

Can you cross-check by providing the same library path in --jars of
spark-submit and rerunning?
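Note that --jars takes a comma-separated list of jar files, not a directory, so passing the lib directory itself will not work. A sketch using the paths from the earlier log (`your_app.jar` is a placeholder):

```bash
# Build a comma-separated list of all Hive jars and hand it to spark-submit.
# --jars expects individual jar files, not a bare directory.
HIVE_JARS=$(ls /data0/facai/lib/hive-0.13.1/lib/*.jar | paste -sd, -)
spark-submit \
  --master yarn \
  --jars "$HIVE_JARS" \
  your_app.jar
```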


Sent from Samsung Mobile.




Re: [Spark 2.0] ClassNotFoundException is thrown when using Hive

2016-08-18 Thread Mich Talebzadeh
When you start spark-shell, does it work, or does this issue occur only
with spark-submit?
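For a quick non-interactive check, something like the following should show whether the Hive metastore client loads from spark-shell (assuming the same conf files are in $SPARK_HOME/conf):

```bash
# Pipe a single Hive query into spark-shell; if the metastore client loads,
# this should print the list of databases instead of the exception above.
echo 'spark.sql("show databases").show()' | spark-shell --master yarn
```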

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.


