ing this problem for users.
>
> On 9/17/18 3:27 PM, Saif Addin wrote:
> > Thanks for the patience, sorry, maybe I sent incomplete information. We
> > are loading the following jars and still getting: (executor 1):
> > java.lang.NoClassDefFoundError: Could not initialize class
> > which one could I be missing?
On Fri, Sep 14, 2018 at 7:34 PM Josh Elser wrote:
> Uh, you're definitely not using the right JARs :)
>
> You'll want the phoenix-client.jar for the Phoenix JDBC driver and the
> phoenix-spark.jar for the Phoenix RDD.
>
> On 9/14/
> Pretty sure we ran tests with Spark 2.3 with Phoenix 5.0. Not sure if
> Spark has already moved beyond that.
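
For anyone hitting the same NoClassDefFoundError: the usual fix is to ship
both jars to every executor. A minimal sketch in Scala; the jar paths are
placeholders for wherever your Phoenix install keeps them, and passing
--jars to spark-submit is the equivalent (and more common) route:

    import org.apache.spark.sql.SparkSession

    // Sketch only: jar paths below are placeholders.
    val spark = SparkSession.builder()
      .appName("phoenix-write")
      .config("spark.jars", "/path/to/phoenix-client.jar,/path/to/phoenix-spark.jar")
      .getOrCreate()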
>
> On 9/12/18 11:00 PM, Saif Addin wrote:
> > Thanks, we'll try the Spark Connector then. Thought it didn't support the
> > newest Spark versions.
> >
> >
e an HBase table
> and uses Phoenix to map it.
>
>
> Jaanai Zhang
> Best regards!
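
If the HBase table already exists, the usual way to get Phoenix to read it is
a view whose columns are mapped onto the existing column family and
qualifiers. A rough sketch over JDBC, with every name here hypothetical:

    import java.sql.DriverManager

    // The family ("0") and qualifier ("col1") must match the bytes actually
    // written to HBase, and the declared types must match how those bytes
    // were encoded.
    val conn = DriverManager.getConnection("jdbc:phoenix:zk-host:2181")
    conn.createStatement().execute(
      """CREATE VIEW "my_hbase_table" (
        |  pk VARCHAR PRIMARY KEY,
        |  "0"."col1" VARCHAR
        |)""".stripMargin)
    conn.close()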
>
>
>
> Thomas D'Silva wrote on Thursday, September 13, 2018 at 6:03 AM:
>
>> Is there a reason you didn't use the spark-connector to serialize your
>
> to interact with Phoenix?
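
Since the spark-connector keeps coming up: writing a DataFrame through
phoenix-spark, so that Phoenix handles the serialization itself, usually
looks roughly like this. The table name and ZooKeeper quorum are
placeholders:

    import org.apache.spark.sql.{DataFrame, SaveMode}

    // df is an existing DataFrame whose column names match the Phoenix
    // table's columns.
    def writeThroughPhoenix(df: DataFrame): Unit =
      df.write
        .format("org.apache.phoenix.spark")
        .mode(SaveMode.Overwrite)  // phoenix-spark requires Overwrite; it upserts
        .option("table", "MY_TABLE")
        .option("zkUrl", "zk-host:2181")
        .save()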
>
> Sounds to me that Phoenix is expecting more data at the head of your
> rowkey. Maybe a salt bucket that you've defined on the table but not
> created?
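
For context on the salt-bucket guess: when a Phoenix table is declared with
SALT_BUCKETS, Phoenix prepends a hash-derived salt byte to every rowkey, so
rows written straight to HBase without that byte never line up with what
Phoenix scans for. A sketch with hypothetical names:

    import java.sql.DriverManager

    // Hypothetical DDL: every rowkey in this table gets one extra leading
    // salt byte that raw HBase writes won't have.
    val conn = DriverManager.getConnection("jdbc:phoenix:zk-host:2181")
    conn.createStatement().execute(
      "CREATE TABLE my_table (pk VARCHAR PRIMARY KEY, v VARCHAR) SALT_BUCKETS = 4")
    conn.close()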
>
> On 9/12/18 4:32 PM, Saif Addin wrote:
Hi all,
We're trying to write tables with all-string columns from Spark.
We are not using the Spark Connector; instead we are directly writing byte
arrays from RDDs.
The process works fine: HBase receives the data correctly, and the content
is consistent.
However, reading the table from Phoenix,
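
A plausible culprit, offered tentatively since the message cuts off here:
Phoenix serializes values and rowkeys with its own type codecs, so byte
arrays produced with plain Bytes.toBytes can land in HBase fine yet not
decode the way Phoenix expects. A minimal Scala illustration, assuming
phoenix-core on the classpath:

    import org.apache.hadoop.hbase.util.Bytes
    import org.apache.phoenix.schema.types.{PInteger, PVarchar}

    // A lone VARCHAR happens to encode identically both ways...
    PVarchar.INSTANCE.toBytes("hello").sameElements(Bytes.toBytes("hello")) // true
    // ...but fixed-width types do not: Phoenix flips the sign bit on INTEGER
    // so byte-wise ordering matches numeric ordering, and composite rowkeys
    // additionally get separator bytes between variable-length fields.
    PInteger.INSTANCE.toBytes(42).sameElements(Bytes.toBytes(42))           // false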