0: jdbc:phoenix:master> select count(1) from STORE_SALES;
+-----------+
| COUNT(1)  |
+-----------+
java.lang.RuntimeException:
org.apache.phoenix.exception.PhoenixIOException:
Robert:
When using `spark-submit`, the application jar along with any jars included
with the `--jars` option
will be automatically transferred to the cluster. URLs supplied after
`--jars` must be separated by commas. That list is included in the driver
and executor classpaths. Directory expansion does not work with `--jars`.
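A sketch of the two mechanisms being compared in this thread; the class name, jar names, and paths below are hypothetical placeholders:

```shell
# Option 1: --jars on the command line. The comma-separated jars are
# shipped to the cluster and land on BOTH driver and executor classpaths.
spark-submit \
  --class com.example.MyApp \
  --jars /opt/hbase/lib/hbase-common.jar,/opt/phoenix/phoenix-client.jar \
  my-app.jar

# Option 2: spark.driver.extraClassPath in spark-defaults.conf. This only
# prepends entries to the DRIVER's classpath; nothing is shipped to
# executors, which is a common reason this route appears to fail.
# spark.driver.extraClassPath /opt/hbase/lib/hbase-common.jar
```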
Hey Robert,
Probably a better question to ask over at u...@spark.apache.org.
hbase-common.jar would be the artifact you’d wanna put on the class path,
though.
-Dima
On Tue, Jul 5, 2016 at 3:39 PM, Robert James wrote:
I'm using spark-shell. The perplexing thing is that if I load it via
spark-shell --jars, it seems to work. However, if I load it via
spark.driver.extraClassPath in the config file, it seems to fail.
What is the difference between --jars (command line) and
spark.driver.extraClassPath (config)?
Hey Robert,
HBaseConfiguration is part of the hbase-common module of the HBase project.
Are you using Maven to provide dependencies or just running java -cp?
-Dima
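If Maven is in play, the artifact Dima points at would be declared roughly like this (the version is a placeholder; match it to the HBase version on your cluster):

```xml
<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-common</artifactId>
  <!-- placeholder version: use your cluster's HBase release -->
  <version>1.2.6</version>
</dependency>
```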
On Monday, July 4, 2016, Robert James wrote:
When trying to load HBase via Spark, I get NoClassDefFoundError
org/apache/hadoop/hbase/HBaseConfiguration errors.
How do I provide that class to Spark?
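One common way to supply the class is to hand the jar to spark-shell directly so Spark ships it to the driver and executors; the path below is hypothetical and depends on your HBase install:

```shell
# hbase-common contains org.apache.hadoop.hbase.HBaseConfiguration;
# --jars puts it on both driver and executor classpaths.
spark-shell --jars /opt/hbase/lib/hbase-common.jar
```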