Hi,
Spark is unable to load the Phoenix classes it needs. If you're using a
recent version of Phoenix, please ensure the "fat" *client* JAR (or for
older versions of Phoenix, the Phoenix *client*-spark JAR) is on your Spark
driver and executor classpath [1]. The 'phoenix-spark' JAR is not
sufficient on its own at runtime; it does not bundle the client-side
dependencies Spark needs.
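As a sketch, a submission might look like the following. The JAR path and application names here are placeholders; substitute the client JAR shipped with your Phoenix distribution:

```shell
# Hypothetical paths -- replace /opt/phoenix/phoenix-client.jar with the
# actual fat client JAR from your Phoenix installation. Both the driver
# and the executors need it on their classpath.
spark-submit \
  --jars /opt/phoenix/phoenix-client.jar \
  --conf spark.driver.extraClassPath=/opt/phoenix/phoenix-client.jar \
  --conf spark.executor.extraClassPath=/opt/phoenix/phoenix-client.jar \
  --class com.example.MyApp \
  my-app.jar
```

Alternatively, the two `extraClassPath` settings can be placed in `spark-defaults.conf` so every job picks them up.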
Hi,
I'm trying to write a simple dataframe to Phoenix:
df.save("org.apache.phoenix.spark", SaveMode.Overwrite,
Map("table" -> "TEST_SAVE", "zkUrl" -> "zk.internal:2181"))
I have the following in my pom.xml:
<dependency>
  <groupId>org.apache.phoenix</groupId>
  <artifactId>phoenix-spark</artifactId>
</dependency>