Re: Getting Exception in thread "main" java.lang.ClassNotFoundException: Failed to find data source: org.apache.phoenix.spark. Please find packages at http://spark-packages.org Exception

2017-03-16 Thread Josh Mahonin
Hi Sateesh,

It seems you are missing the import that gives Spark visibility into the
"org.apache.phoenix.spark" data source. From the documentation page:

*import org.apache.phoenix.spark._*

I'm not entirely sure how this works in Java, however. You might have some
luck with:

*import static org.apache.phoenix.spark.*;*

If you do get this working, please update the list. It would be nice to add
this to the existing documentation.
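
One caveat on the Java side: "import static" only applies to the static
members of a named class, so the line above may not compile as written. In
the Java DataFrame API the import shouldn't actually be required at all,
since format("org.apache.phoenix.spark") is resolved by class name at
runtime, which makes this look more like a classpath problem. A minimal
sanity check, as a sketch (the DefaultSource class name is an assumption
based on the phoenix-spark source):

// Throws ClassNotFoundException if the phoenix-spark data source
// is not visible on the driver classpath.
Class.forName("org.apache.phoenix.spark.DefaultSource");

If that throws, shipping the Phoenix client jar with the job, e.g.
spark-submit --jars /path/to/phoenix-4.8.0-HBase-1.2-client.jar (the path
is an assumption), may be the quickest fix.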

Josh



On Wed, Mar 15, 2017 at 2:06 PM, Sateesh Karuturi <
sateesh.karutu...@gmail.com> wrote:

> Hello folks,
>
> I am trying to execute a sample Spark-Phoenix application, but I am
> getting the following exception:
>
> Exception in thread "main" java.lang.ClassNotFoundException: Failed to
> find data source: org.apache.phoenix.spark. Please find packages at
> http://spark-packages.org
>
> here is my code:
>
> package com.inndata.spark.sparkphoenix;
>
> import org.apache.spark.SparkConf;
> import org.apache.spark.api.java.JavaSparkContext;
> import org.apache.spark.sql.DataFrame;
> import org.apache.spark.sql.SQLContext;
>
> import java.io.Serializable;
>
> public class SparkConnection implements Serializable {
>
>     public static void main(String[] args) {
>         SparkConf sparkConf = new SparkConf();
>         sparkConf.setAppName("spark-phoenix-df");
>         sparkConf.setMaster("local[*]");
>         JavaSparkContext sc = new JavaSparkContext(sparkConf);
>         SQLContext sqlContext = new SQLContext(sc);
>
>         // Read the ORDERS table from Phoenix as a DataFrame.
>         DataFrame df = sqlContext.read()
>             .format("org.apache.phoenix.spark")
>             .option("table", "ORDERS")
>             .option("zkUrl", "localhost:2181")
>             .load();
>         df.count();
>     }
> }
>
> and here is my pom.xml:
>
> <dependencies>
>   <dependency>
>     <groupId>org.apache.phoenix</groupId>
>     <artifactId>phoenix-core</artifactId>
>     <version>4.8.0-HBase-1.2</version>
>   </dependency>
>   <dependency>
>     <groupId>org.scala-lang</groupId>
>     <artifactId>scala-library</artifactId>
>     <version>2.10.6</version>
>     <scope>provided</scope>
>   </dependency>
>   <dependency>
>     <groupId>org.apache.phoenix</groupId>
>     <artifactId>phoenix-spark</artifactId>
>     <version>4.8.0-HBase-1.2</version>
>   </dependency>
>   <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-core_2.10</artifactId>
>     <version>1.6.2</version>
>   </dependency>
>   <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-sql_2.10</artifactId>
>     <version>1.6.2</version>
>   </dependency>
>   <dependency>
>     <groupId>org.apache.hadoop</groupId>
>     <artifactId>hadoop-client</artifactId>
>     <version>2.7.3</version>
>   </dependency>
>   <dependency>
>     <groupId>org.apache.hadoop</groupId>
>     <artifactId>hadoop-common</artifactId>
>     <version>2.7.3</version>
>   </dependency>
>   <dependency>
>     <groupId>org.apache.hadoop</groupId>
>     <artifactId>hadoop-hdfs</artifactId>
>     <version>2.7.3</version>
>   </dependency>
>   <dependency>
>     <groupId>org.apache.hbase</groupId>
>     <artifactId>hbase-client</artifactId>
>     <version>1.2.4</version>
>   </dependency>
>   <dependency>
>     <groupId>org.apache.hbase</groupId>
>     <artifactId>hbase-hadoop-compat</artifactId>
>     <version>1.2.4</version>
>   </dependency>
>   <dependency>
>     <groupId>org.apache.hbase</groupId>
>     <artifactId>hbase-hadoop2-compat</artifactId>
>     <version>1.2.4</version>
>   </dependency>
>   <dependency>
>     <groupId>org.apache.hbase</groupId>
>     <artifactId>hbase-server</artifactId>
>     <version>1.2.4</version>
>   </dependency>
>   <dependency>
>     <groupId>org.apache.hbase</groupId>
>     <artifactId>hbase-it</artifactId>
>     <version>1.2.4</version>
>     <type>test-jar</type>
>   </dependency>
>   <dependency>
>     <groupId>junit</groupId>
>     <artifactId>junit</artifactId>
>     <version>3.8.1</version>
>     <scope>test</scope>
>   </dependency>
> </dependencies>
>
>
> Here is the Stack Overflow link:
>
> http://stackoverflow.com/questions/42816998/getting-failed-to-find-data-source-org-apache-phoenix-spark-please-find-packag
>
> Please help me out.
>
>


Re: Getting Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/phoenix/jdbc/PhoenixDriver Exception

2017-03-16 Thread NaHeon Kim
Hi,
I think you need to include the phoenix-client jar instead of the
phoenix-spark jar. :-)
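
For reference, a sketch of what that might look like with spark-submit (the
jar path, Phoenix version, and application jar name are assumptions; adjust
them to your installation):

spark-submit \
  --class com.inndata.spark.sparkphoenix.SparkConnection \
  --master local[*] \
  --jars /usr/lib/phoenix/phoenix-4.8.0-HBase-1.2-client.jar \
  spark-phoenix-example.jar

Or via the extraClassPath properties, which belong in spark-defaults.conf
rather than spark-env.sh:

spark.driver.extraClassPath   /usr/lib/phoenix/phoenix-4.8.0-HBase-1.2-client.jar
spark.executor.extraClassPath /usr/lib/phoenix/phoenix-4.8.0-HBase-1.2-client.jar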

2017-03-16 16:44 GMT+09:00 Sateesh Karuturi :

> Hello folks,
>
> I am trying to run a sample Phoenix-Spark application, and when I run it I
> get the following exception:
>
> Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/phoenix/jdbc/PhoenixDriver
>
>
> Here is my sample code:
>
>
> package com.inndata.spark.sparkphoenix;
>
> import org.apache.spark.SparkConf;
> import org.apache.spark.api.java.JavaSparkContext;
> import org.apache.spark.sql.DataFrame;
> import org.apache.spark.sql.SQLContext;
>
> import com.google.common.collect.ImmutableMap;
>
> import java.io.Serializable;
>
> public class SparkConnection implements Serializable {
>
>     public static void main(String[] args) {
>         SparkConf sparkConf = new SparkConf();
>         sparkConf.setAppName("spark-phoenix-df");
>         sparkConf.setMaster("local[*]");
>         JavaSparkContext sc = new JavaSparkContext(sparkConf);
>         SQLContext sqlContext = new SQLContext(sc);
>
>         /*DataFrame df = sqlContext.read()
>             .format("org.apache.phoenix.spark")
>             .option("table", "TABLE1")
>             .option("zkUrl", "localhost:2181")
>             .load();
>         df.count();*/
>
>         // Read TABLE1 from Phoenix through the plain JDBC data source,
>         // loading PhoenixDriver explicitly.
>         DataFrame fromPhx = sqlContext.read().format("jdbc")
>             .options(ImmutableMap.of(
>                 "driver", "org.apache.phoenix.jdbc.PhoenixDriver",
>                 "url", "jdbc:phoenix:ZK_QUORUM:2181:/hbase-secure",
>                 "dbtable", "TABLE1"))
>             .load();
>
>         fromPhx.show();
>     }
> }
>
>
> I have added the phoenix-spark jar to the Spark library directory as well
> as to the spark-submit command, and I also set spark.executor.extraClassPath
> and spark.driver.extraClassPath in spark-env.sh.
>

