HBaseConfiguration lives in the hbase-common module.

Check whether the hbase-common jar is on your Spark classpath.
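One way to put it there is to pass the jar explicitly when launching spark-shell. A minimal sketch — the install paths and jar names below are assumptions for illustration (adjust them to your layout; the Phoenix client jar is included as well since phoenix-spark needs it at runtime):

```shell
# Hypothetical paths -- substitute your actual HBase/Phoenix install locations.
# --jars ships the jars to the executors; --driver-class-path makes them
# visible to the driver, where the NoClassDefFoundError was thrown.
spark-shell \
  --jars /usr/lib/hbase/lib/hbase-common-1.2.3.jar,/usr/lib/phoenix/phoenix-4.8.1-HBase-1.2-client.jar \
  --driver-class-path /usr/lib/hbase/lib/hbase-common-1.2.3.jar:/usr/lib/phoenix/phoenix-4.8.1-HBase-1.2-client.jar
```

Alternatively, the same jars can be added permanently via spark.driver.extraClassPath and spark.executor.extraClassPath in spark-defaults.conf.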

On Mon, Oct 24, 2016 at 8:22 AM, Mich Talebzadeh <[email protected]>
wrote:

> My stack is this
>
> Spark: Spark 2.0.0
> Zookeeper: ZooKeeper 3.4.6
> Hbase: hbase-1.2.3
> Phoenix: apache-phoenix-4.8.1-HBase-1.2-bin
>
> I am running this simple code
>
> scala> val df = sqlContext.load("org.apache.phoenix.spark",
>      | Map("table" -> "MARKETDATAHBASE", "zkUrl" -> "rhes564:2181")
>      | )
>
> java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
>   at org.apache.phoenix.spark.PhoenixRDD.getPhoenixConfiguration(
> PhoenixRDD.scala:71)
>   at org.apache.phoenix.spark.PhoenixRDD.phoenixConf$
> lzycompute(PhoenixRDD.scala:39)
>   at org.apache.phoenix.spark.PhoenixRDD.phoenixConf(PhoenixRDD.scala:38)
>   at org.apache.phoenix.spark.PhoenixRDD.<init>(PhoenixRDD.scala:42)
>   at org.apache.phoenix.spark.PhoenixRelation.schema(
> PhoenixRelation.scala:50)
>   at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(
> LogicalRelation.scala:40)
>   at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(
> SparkSession.scala:382)
>   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:143)
>   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:122)
>   at org.apache.spark.sql.SQLContext.load(SQLContext.scala:958)
>   ... 54 elided
>
> Thanks
>
> Dr Mich Talebzadeh
>
>
>
> LinkedIn:
> https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
>
>
> http://talebzadehmich.wordpress.com
>
>
