When you start spark-shell, does it work, or does this issue occur only with
spark-submit?
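
One thing worth checking (a sketch, not a verified fix): `spark.sql.hive.metastore.jars` is documented to take a standard Java classpath, and a classpath entry that is a bare directory only picks up `.class` files, not jars — a `/*` wildcard is needed for jars. The paths below are taken from the logs in your message; the app jar name and the wildcard entries are assumptions to adapt to your setup:

```shell
# Hypothetical spark-submit invocation; paths come from the quoted logs.
# Note the /* wildcards: without them, jars in those directories are
# invisible to the metastore classloader (standard Java classpath rules).
spark-submit \
  --master yarn \
  --deploy-mode client \
  --conf spark.sql.hive.metastore.version=0.13.1 \
  --conf "spark.sql.hive.metastore.jars=/data0/facai/lib/hive-0.13.1/lib/*:/data0/facai/lib/hadoop-2.4.1/share/hadoop/*" \
  your_app.jar   # placeholder for your application jar
```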

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 18 August 2016 at 10:47, 颜发才(Yan Facai) <yaf...@gmail.com> wrote:

> Hi, all.
>
> I copied hdfs-site.xml, core-site.xml and hive-site.xml to
> $SPARK_HOME/conf.
> spark-submit is used to submit the task to YARN, running in **client**
> mode.
> However, a ClassNotFoundException is thrown.
>
> Some details of the logs are listed below:
> ```
> 16/08/12 17:07:32 INFO hive.HiveUtils: Initializing
> HiveMetastoreConnection version 0.13.1 using file:/data0/facai/lib/hive-0.13.1/lib:file:/data0/facai/lib/hadoop-2.4.1/share/hadoop
> 16/08/12 17:07:32 ERROR yarn.ApplicationMaster: User class threw
> exception: java.lang.ClassNotFoundException:
> java.lang.NoClassDefFoundError: org/apache/hadoop/hive/ql/session/SessionState
> when creating Hive client using classpath: 
> file:/data0/facai/lib/hive-0.13.1/lib,
> file:/data0/facai/lib/hadoop-2.4.1/share/hadoop
> ```
>
> In fact, all the jars needed by Hive are in the directory:
> ```Bash
> [hadoop@h107713699 spark_test]$ ls /data0/facai/lib/hive-0.13.1/lib/ |
> grep hive
> hive-ant-0.13.1.jar
> hive-beeline-0.13.1.jar
> hive-cli-0.13.1.jar
> hive-common-0.13.1.jar
> ...
> ```
>
> So, my question is:
> why can't Spark find the required jars?
>
> Any help will be appreciated, thanks.
>
>
