Hi,
I think you need to include the phoenix-client jar instead of
phoenix-spark.jar. :-)
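For example, the client jar can be passed on the spark-submit command line, something like the sketch below. The jar path and version here are only an illustration (substitute the actual path of the phoenix-*-client.jar from your Phoenix installation), and the application class/jar names are taken from the code in the original question:

```shell
# Sketch: put the phoenix-client jar (which bundles
# org.apache.phoenix.jdbc.PhoenixDriver) on both the driver and
# executor classpaths. Paths below are assumptions -- adjust to
# your Phoenix installation.
PHOENIX_CLIENT_JAR=/usr/lib/phoenix/phoenix-client.jar

spark-submit \
  --class com.inndata.spark.sparkphoenix.SparkConnection \
  --jars "$PHOENIX_CLIENT_JAR" \
  --conf spark.driver.extraClassPath="$PHOENIX_CLIENT_JAR" \
  --conf spark.executor.extraClassPath="$PHOENIX_CLIENT_JAR" \
  spark-phoenix-app.jar
```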

2017-03-16 16:44 GMT+09:00 Sateesh Karuturi <sateesh.karutu...@gmail.com>:

> Hello folks,
>
> I am trying to run a sample Phoenix Spark application, and when I run it I
> get the following exception:
>
> Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/phoenix/jdbc/PhoenixDriver
>
>
> Here is my sample code:
>
>
> package com.inndata.spark.sparkphoenix;
>
> import org.apache.spark.SparkConf;
> import org.apache.spark.api.java.JavaSparkContext;
> import org.apache.spark.sql.DataFrame;
> import org.apache.spark.sql.SQLContext;
>
> import com.google.common.collect.ImmutableMap;
>
> import java.io.Serializable;
>
> public class SparkConnection implements Serializable {
>
>     public static void main(String[] args) {
>         SparkConf sparkConf = new SparkConf();
>         sparkConf.setAppName("spark-phoenix-df");
>         sparkConf.setMaster("local[*]");
>         JavaSparkContext sc = new JavaSparkContext(sparkConf);
>         SQLContext sqlContext = new SQLContext(sc);
>
>         /*DataFrame df = sqlContext.read()
>                 .format("org.apache.phoenix.spark")
>                 .option("table", "TABLE1")
>                 .option("zkUrl", "localhost:2181")
>                 .load();
>         df.count();*/
>
>         DataFrame fromPhx = sqlContext.read().format("jdbc")
>                 .options(ImmutableMap.of(
>                         "driver", "org.apache.phoenix.jdbc.PhoenixDriver",
>                         "url", "jdbc:phoenix:ZK_QUORUM:2181:/hbase-secure",
>                         "dbtable", "TABLE1"))
>                 .load();
>
>         fromPhx.show();
>     }
> }
>
>
> I have added the phoenix-spark jar to the Spark library directory as well
> as to the spark-submit command, and I also set spark.executor.extraClassPath
> and spark.driver.extraClassPath in spark-env.sh.
>
