Hi, everyone.

According to the official guide, I copied hdfs-site.xml, core-site.xml and
hive-site.xml to $SPARK_HOME/conf, and wrote the code below:

```Java
import org.apache.spark.sql.SparkSession;

        // Build a session against an external Hive 0.13.1 metastore,
        // pointing Spark at the Hive and Hadoop jar directories.
        SparkSession spark = SparkSession
                .builder()
                .appName("Test Hive for Spark")
                .config("spark.sql.hive.metastore.version", "0.13.1")
                .config("spark.sql.hive.metastore.jars",
                        "/data0/facai/lib/hive-0.13.1/lib:/data0/facai/lib/hadoop-2.4.1/share/hadoop")
                .enableHiveSupport()
                .getOrCreate();
```
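In case it matters, this is how I could build the jar list explicitly instead of passing bare directories (a minimal sketch; `listJars` is a hypothetical helper of mine, assuming standard Java classpath semantics, where a directory entry does not pick up the jars inside it):

```Java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class MetastoreJars {
    // Join every *.jar under the given directories with ':' so the result
    // can be used as the spark.sql.hive.metastore.jars value.
    public static String listJars(String... dirs) {
        List<String> jars = new ArrayList<>();
        for (String dir : dirs) {
            File[] files = new File(dir).listFiles(
                    (d, name) -> name.endsWith(".jar"));
            if (files != null) {
                for (File f : files) {
                    jars.add(f.getAbsolutePath());
                }
            }
        }
        return String.join(":", jars);
    }
}
```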


When I use spark-submit to submit the task to YARN and run in **client**
mode, a ClassNotFoundException is thrown; the relevant log lines are listed
below:
```
16/08/12 17:07:28 INFO execution.SparkSqlParser: Parsing command: SHOW TABLES
16/08/12 17:07:30 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(null) (10.77.113.154:52806) with ID 1
16/08/12 17:07:31 INFO storage.BlockManagerMasterEndpoint: Registering block manager h113154.mars.grid.sina.com.cn:44756 with 912.3 MB RAM, BlockManagerId(1, h113154.mars.grid.sina.com.cn, 44756)
16/08/12 17:07:32 INFO hive.HiveUtils: Initializing HiveMetastoreConnection version 0.13.1 using file:/data0/facai/lib/hive-0.13.1/lib:file:/data0/facai/lib/hadoop-2.4.1/share/hadoop
16/08/12 17:07:32 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.ClassNotFoundException: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/ql/session/SessionState when creating Hive client using classpath: file:/data0/facai/lib/hive-0.13.1/lib, file:/data0/facai/lib/hadoop-2.4.1/share/hadoop
```

However, all the jars needed by Hive are indeed in that directory:
```Bash
[hadoop@h107713699 spark_test]$ ls /data0/facai/lib/hive-0.13.1/lib/ | grep hive
hive-ant-0.13.1.jar
hive-beeline-0.13.1.jar
hive-cli-0.13.1.jar
hive-common-0.13.1.jar
hive-contrib-0.13.1.jar
hive-exec-0.13.1.jar
hive-hbase-handler-0.13.1.jar
hive-hwi-0.13.1.jar
hive-jdbc-0.13.1.jar
hive-metastore-0.13.1.jar
hive-serde-0.13.1.jar
hive-service-0.13.1.jar
hive-shims-0.13.1.jar
hive-shims-0.20-0.13.1.jar
hive-shims-0.20S-0.13.1.jar
hive-shims-0.23-0.13.1.jar
hive-shims-common-0.13.1.jar
hive-shims-common-secure-0.13.1.jar
hive-testutils-0.13.1.jar
```

So I wonder: why can't Spark find the jars it needs?
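For what it's worth, my current guess (just a sketch, assuming the Hive client loader resolves classes the way a plain URLClassLoader does): a directory entry on a Java-style classpath exposes only .class files, so the jars sitting inside that directory would stay invisible. The snippet below is a hypothetical, standalone reproduction, not Spark's actual loader:

```Java
import java.net.URL;
import java.net.URLClassLoader;

public class LoaderCheck {
    public static void main(String[] args) throws Exception {
        // A trailing '/' makes URLClassLoader treat this as a directory of
        // .class files; jars inside the directory are not searched.
        URL dir = new URL("file:/data0/facai/lib/hive-0.13.1/lib/");
        try (URLClassLoader loader = new URLClassLoader(new URL[]{dir}, null)) {
            // Throws ClassNotFoundException: the class lives inside
            // hive-exec-0.13.1.jar, which the directory URL does not expose.
            loader.loadClass("org.apache.hadoop.hive.ql.session.SessionState");
        }
    }
}
```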

Any help will be appreciated, thanks.
