Re: Accessing Phoenix table from Spark 2.0., any cure!

2016-10-24 Thread Josh Mahonin
Hi Mich, If you're having the exact same classpath error as that link, it's likely that you're not including the Phoenix client JAR in your Spark driver/executor classpath settings. In previous Phoenix releases, it was necessary to use a specific phoenix-client-spark assembly JAR, but as of 4.8.0
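
A minimal sketch of the classpath settings Josh describes, assuming the Phoenix client JAR lives under /usr/local/phoenix (the path is illustrative; use your own install location):

```
# spark-defaults.conf — point both the driver and the executors at the Phoenix client JAR
spark.driver.extraClassPath    /usr/local/phoenix/phoenix-4.8.1-HBase-1.2-client.jar
spark.executor.extraClassPath  /usr/local/phoenix/phoenix-4.8.1-HBase-1.2-client.jar
```

Per the reply above, releases before 4.8.0 needed the separate phoenix-client-spark assembly JAR here instead.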

Re: Accessing Phoenix table from Spark 2.0., any cure!

2016-10-24 Thread Mich Talebzadeh
Hi Ted, No joy even after adding hbase-common-1.2.3.jar to HADOOP_CLASSPATH and CLASSPATH. Still getting the error. This link shows the same issue. Thanks

Re: Accessing Phoenix table from Spark 2.0., any cure!

2016-10-24 Thread Ted Yu
HBaseConfiguration is in the hbase-common module. See if the hbase-common jar is on the classpath.

On Mon, Oct 24, 2016 at 8:22 AM, Mich Talebzadeh wrote:
> My stack is this
>
> Spark: Spark 2.0.0
> Zookeeper: ZooKeeper 3.4.6
> Hbase: hbase-1.2.3
> Phoenix: apache-phoenix-4.8.1-HBase-1.2-bin
>
> I am r
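
A quick way to check Ted's suggestion from a spark-shell or Scala script — a sketch that probes for the class he names without needing a running HBase:

```scala
// Probe whether HBaseConfiguration (from hbase-common) is visible on the classpath.
// Catches the failure so it can run anywhere and just report the result.
try {
  Class.forName("org.apache.hadoop.hbase.HBaseConfiguration")
  println("hbase-common is on the classpath")
} catch {
  case _: ClassNotFoundException =>
    println("hbase-common is missing: add e.g. hbase-common-1.2.3.jar to the driver/executor classpath")
}
```

Note that for Spark, the jar must be on the driver and executor classpaths (e.g. via spark.driver.extraClassPath / spark.executor.extraClassPath), not only in HADOOP_CLASSPATH.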

Accessing Phoenix table from Spark 2.0., any cure!

2016-10-24 Thread Mich Talebzadeh
My stack is this:

Spark: Spark 2.0.0
Zookeeper: ZooKeeper 3.4.6
Hbase: hbase-1.2.3
Phoenix: apache-phoenix-4.8.1-HBase-1.2-bin

I am running this simple code:

scala> val df = sqlContext.load("org.apache.phoenix.spark",
     |   Map("table" -> "MARKETDATAHBASE", "zkUrl" -> "rhes564:2181")
     | )
ja
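
For reference, sqlContext.load is deprecated in Spark 2.0; the DataFrameReader form of the same read is a sketch like this (assumes a SparkSession named spark and the Phoenix client JAR on the classpath, with the table and zkUrl taken from the post above):

```scala
// Spark 2.0-style read of a Phoenix table via the phoenix-spark data source
val df = spark.read
  .format("org.apache.phoenix.spark")
  .option("table", "MARKETDATAHBASE")
  .option("zkUrl", "rhes564:2181")
  .load()

df.printSchema()
```

This needs a reachable ZooKeeper/HBase quorum at the given zkUrl, so it will still fail if the classpath problem discussed in the replies is not fixed first.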

Phoenix driver 4.8.1-HBase-1.1, doesn't wrap schema

2016-10-24 Thread Vivek Paranthaman
Hi, With the Phoenix driver and a Phoenix connection, I tried to get the metadata, and it gives me "null". Is this the expected behaviour, or is it an issue in the driver? How is this NoSQL (HBase) table/schema converted to JDBC instances? Java 1.7 API. Thanks & Regards, Vivek
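
For context, the JDBC spec says Connection.getMetaData on an open connection returns a non-null DatabaseMetaData, so a null usually points at the connection itself. A minimal sketch of the metadata lookup (the JDBC URL is illustrative; requires the Phoenix client JAR and a running cluster):

```scala
import java.sql.DriverManager

// Open a Phoenix JDBC connection and list the tables it exposes.
// Phoenix surfaces its HBase-backed tables as ordinary JDBC metadata rows.
val conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181")
try {
  val md = conn.getMetaData               // non-null per the JDBC contract
  val rs = md.getTables(null, null, "%", null)
  while (rs.next())
    println(rs.getString("TABLE_SCHEM") + "." + rs.getString("TABLE_NAME"))
} finally conn.close()
```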