This may or may not be helpful for your classpath issues, but I wanted to
verify that basic functionality worked, so I made a sample app here:
https://github.com/jmahonin/spark-streaming-phoenix
It consumes events off a Kafka topic using Spark Streaming and writes
event counts out to Phoenix.
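For anyone following along, the shape of such a job looks roughly like the sketch below. This is a hedged illustration, not the sample app's actual code: the topic name (`events`), ZooKeeper quorum (`zk-host:2181`), and the `EVENT_COUNTS` table with `EVENT_TYPE`/`CNT` columns are all placeholder assumptions, and it presumes the phoenix-spark integration's `saveToPhoenix` on RDDs of tuples.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils
// Brings saveToPhoenix into scope on RDDs of tuples (phoenix-spark module)
import org.apache.phoenix.spark._

object EventCounter {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("spark-streaming-phoenix-sketch")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Placeholder quorum, consumer group, and topic name
    val events = KafkaUtils
      .createStream(ssc, "zk-host:2181", "event-counter", Map("events" -> 1))
      .map(_._2)

    events
      .map(event => (event, 1L))   // count occurrences per event string
      .reduceByKey(_ + _)
      .foreachRDD { rdd =>
        // Hypothetical Phoenix table: EVENT_COUNTS(EVENT_TYPE VARCHAR, CNT BIGINT)
        rdd.saveToPhoenix("EVENT_COUNTS", Seq("EVENT_TYPE", "CNT"),
          zkUrl = Some("zk-host:2181"))
      }

    ssc.start()
    ssc.awaitTermination()
  }
}
```

The actual sample app linked above is the authoritative version; this sketch only shows where the Phoenix write slots into the streaming pipeline.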
Hi Jeroen,
Have you tried using the phoenix-client uber JAR in the Spark classpath?
I think that strategy is the simplest and most straightforward, although
it may not be appropriate for all projects.
With your setup, though, my guess is that Spark is preferring its own
versions of Hadoop.
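To make the uber-JAR suggestion concrete, a `spark-submit` invocation along these lines puts the phoenix-client JAR ahead on both driver and executor classpaths. The paths, version, and class name here are illustrative placeholders, not values from this thread:

```shell
# Illustrative only: adjust the JAR path/version and main class for your build
spark-submit \
  --class com.example.EventCounter \
  --conf "spark.driver.extraClassPath=/opt/phoenix/phoenix-client.jar" \
  --conf "spark.executor.extraClassPath=/opt/phoenix/phoenix-client.jar" \
  your-app-assembly.jar
```

Putting the client JAR on `extraClassPath` (rather than only `--jars`) is what lets it win over Spark's bundled Hadoop/HBase versions when classes conflict.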
Hi,
I posted a question regarding Phoenix and Spark Streaming on
StackOverflow [1] and realized that I might have more luck trying it
here. I have copied the complete question into this email as well (see below).
If you deem it necessary, I can also try my luck on the Spark mailing
list.