Thanks Alonso,
I think this gives me some ideas.
My code is written in Python, and I use spark-submit to submit it.
I am not sure what code is written in Scala. Maybe the Phoenix driver, based on
the stack trace?
How do I tell which version of Scala it was compiled against?
Is there a jar
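One way to probe a jar from Python for clues about its Scala lineage (a sketch; guess_scala_version is a hypothetical helper, not a Phoenix or Spark API, and the heuristics are best-effort): shaded/uber jars that bundle scala-library carry a library.properties file recording the exact version, and bytecode produced by Scala 2.11 or earlier contains trait helper classes named like Foo$class.class, which is exactly the naming pattern of the missing scala.Product$class.

```python
import zipfile

def guess_scala_version(jar_path):
    """Best-effort guess at the Scala version a jar was built against.

    Heuristic 1: a bundled scala-library 'library.properties' (common in
    shaded jars) records the exact version number.
    Heuristic 2: trait helper classes ending in '$class.class' are emitted
    only by Scala 2.11 and earlier, so their presence flags a pre-2.12 build.
    """
    with zipfile.ZipFile(jar_path) as jar:
        names = jar.namelist()
        if "library.properties" in names:
            props = jar.read("library.properties").decode("utf-8", "replace")
            for line in props.splitlines():
                if line.startswith("version.number"):
                    return line.split("=", 1)[1].strip()
        if any(n.endswith("$class.class") for n in names):
            return "2.11 or earlier (found $class trait helpers)"
    return "unknown"
```

Running this against each jar you copied into the Spark lib directory should narrow it down; a _2.11 or _2.12 suffix in an artifact's file name is another quick clue.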
The error message Caused by: java.lang.ClassNotFoundException:
scala.Product$class indicates that the Spark job is trying to load a class
that is not available on the classpath. This typically happens when part of
the job is compiled against a different version of Scala than the one Spark
itself runs on. In particular, names ending in $class are trait helper
classes that only Scala 2.11 and earlier emit, so this error usually means a
jar built for Scala 2.11 is being loaded by a Scala 2.12 or 2.13 runtime.
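To confirm which Scala version the Spark runtime side uses, spark-submit --version prints it in its startup banner; from inside a PySpark session you can also ask the JVM directly (a sketch assuming the usual spark session object provided by the pyspark shell):

```python
# Inside a PySpark session: ask Spark's JVM which Scala version it runs on.
# scala.util.Properties.versionString is part of the Scala standard library.
scala_version = spark.sparkContext._jvm.scala.util.Properties.versionString()
print(scala_version)  # e.g. "version 2.12.x" on a stock Spark 3.5.0 build
```

Any jar on the classpath then needs to match that Scala binary version (2.12 vs 2.13).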
I am getting the error below when I try to run a Spark job connecting to
Phoenix. It seems like some part of the code was compiled against a different
Scala version than the one I am running.
I am using Spark 3.5.0, and I have copied these Phoenix jars into the Spark
lib directory:
phoenix-server-hbase-2.5-5.1.3.jar