Hi Jeroen,
No problem. I think there's some magic involved in how the Spark
classloader(s) work, especially with regard to the HBase dependencies. I
know there's probably a more lightweight solution that doesn't require
customizing the Spark setup, but that's the most straightforward way.
Hi Josh,
That worked! Thank you so much! (I can't believe it was something so obvious
;) )
If you care about such a thing, you could answer my question here for the bounty:
http://stackoverflow.com/questions/30639659/apache-phoenix-4-3-1-and-4-4-0-hbase-0-98-on-spark-1-3-1-classnotfoundexceptio
Hi Jeroen,
Rather than bundling the Phoenix client JAR with your app, are you able to
include it in a static location, either on the SPARK_CLASSPATH or by setting
the conf values below? (I use SPARK_CLASSPATH myself, though it's deprecated.)
spark.driver.extraClassPath
spark.executor.extraClassPath
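As a sketch, those two settings can go in conf/spark-defaults.conf; the JAR path below is an assumption, so substitute wherever your Phoenix client JAR actually lives on each node:

```
# Hypothetical path to the Phoenix client JAR; must exist on every node
spark.driver.extraClassPath    /opt/phoenix/phoenix-client.jar
spark.executor.extraClassPath  /opt/phoenix/phoenix-client.jar
```

Note that spark.executor.extraClassPath is resolved on the worker machines, so the JAR has to be present at that same path on every executor host.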
Hi Josh,
Thank you for your effort. Looking at your code, I feel that mine is
semantically the same, just written in Java. The dependencies in the pom.xml
all have scope "provided". The job is submitted as follows:
$ rm spark.log MASTER=spark://maprdemo:7077
Hi,
I posted a question regarding Phoenix and Spark Streaming on
StackOverflow [1]. Please find a copy of the question below the first stack
trace in this email. I have also contacted the Phoenix mailing list and tried
the suggestion of setting spark.driver.userClassPathFirst.
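For reference, that suggestion amounts to something like the following spark-submit invocation (a sketch only: the main class and JAR name are placeholders, not from the original thread):

```
spark-submit \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --class com.example.MyStreamingJob \
  my-app-assembly.jar
```

The flag tells Spark to prefer classes from the user's JAR over Spark's own bundled dependencies, which can help with version conflicts but can also cause new linkage errors when the user JAR shades classes Spark itself needs.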
This may or may not be helpful for your classpath issues, but I wanted to
verify that basic functionality worked, so I made a sample app here:
https://github.com/jmahonin/spark-streaming-phoenix
This consumes events off a Kafka topic using Spark Streaming and writes
out event counts to Phoenix.