It still seems that Spark is unable to find all of the Phoenix/HBase
classes that are necessary.

As a reference, I've got a Docker image that might help:

https://github.com/jmahonin/docker-phoenix/tree/phoenix_spark

The versions of Phoenix and Spark it uses are a bit out of date, but it
shows the necessary classes and settings to make Spark happy.
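The usual fix for "class not found" errors here is to put the Phoenix client jar on both the driver and executor classpaths. A minimal sketch, assuming a local Phoenix install (the jar path is an example; substitute your own):

```shell
# Make the fat Phoenix client jar visible to Spark on both sides.
# The path below is an assumption -- adjust to your installation.
PHOENIX_CLIENT_JAR=/opt/phoenix/phoenix-client.jar

spark-shell \
  --conf "spark.driver.extraClassPath=${PHOENIX_CLIENT_JAR}" \
  --conf "spark.executor.extraClassPath=${PHOENIX_CLIENT_JAR}"
```

The same two `--conf` settings work with `spark-submit`; the key point is that the jar has to reach the executors, not just the driver.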

Good luck!

Josh

On Thu, Feb 16, 2017 at 4:10 AM, Dequn Zhang <[email protected]>
wrote:

> Please check whether your table was created by Phoenix itself (i.e. it is
> not a *mapping* onto an existing HBase table). You can follow the sample
> on the Phoenix official site; just *change the version to the latest*,
> use the *phoenix-client* jar instead, and make sure the *schemas
> correspond*. Create a new table to test with, use simple data, and keep
> it *independent of your business logic*.
>
> We don’t know your background or code, so that’s all the help I can offer.
>
>
> On 16 February 2017 at 16:57:26, Nimrod Oren ([email protected]) wrote:
>
> org.apache.hadoop.hbase.HTableDescriptor.setValue
>
>
