The easiest way should be to add both jars to SPARK_CLASSPATH as a colon-separated
string.
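For example (a rough sketch -- the jar names and paths below are placeholders, substitute wherever your actual MySQL JDBC and Cassandra connector jars live):

```shell
#!/bin/sh
# Hypothetical jar locations -- replace with your real paths.
MYSQL_JAR=/opt/jars/mysql-connector-java-5.1.36.jar
CASSANDRA_JAR=/opt/jars/spark-cassandra-connector_2.10-1.4.0.jar

# SPARK_CLASSPATH is a colon-separated list, like any Java classpath,
# and is added to both the driver and executor classpaths.
SPARK_CLASSPATH="$MYSQL_JAR:$CASSANDRA_JAR"
export SPARK_CLASSPATH
echo "$SPARK_CLASSPATH"

# Then submit the job as usual, e.g.:
#   spark-submit migrate.py
```

Note that --jars takes a comma-separated list while --driver-class-path takes a colon-separated one, so mixing the two delimiters up is an easy way to get ClassNotFoundException.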
On 10 Aug 2015 06:20, "Jonathan Haddad" <j...@jonhaddad.com> wrote:

> I'm trying to write a simple job for Pyspark 1.4 migrating data from MySQL
> to Cassandra.  I can work with either the MySQL JDBC jar or the cassandra
> jar separately without issue, but when I try to reference both of them it
> throws an exception:
>
> Py4JJavaError: An error occurred while calling o32.save.
> : java.lang.NoSuchMethodError:
> scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
>
> I'm not sure if I'm including the jars correctly, as --jars says it's comma
> separated and --driver-class-path seems to take a colon-delimited
> classpath.  If I separate the list in --driver-class-path with a comma, I
> get a class not found exception, so I'm thinking colon is right.
>
> The job, params for submission, and exception are here.  Help getting this
> going would be deeply appreciated.
>
> https://gist.github.com/rustyrazorblade/9a38a9499a7531eefd1e
>
>
