Re: multiple dependency jars using pyspark

2015-08-10 Thread Jonathan Haddad
I figured out the issue: it had to do with the Cassandra jar I had compiled; I had been testing a previous build. Using --jars (comma-separated) and --driver-class-path (colon-separated) is working. On Mon, Aug 10, 2015 at 1:08 AM ayan guha guha.a...@gmail.com wrote: Easiest way should be to add
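For reference, a working invocation along those lines might look like the sketch below; the jar paths and script name are placeholders rather than the ones from the original job:

    spark-submit \
      --jars /path/to/mysql-connector-java.jar,/path/to/spark-cassandra-connector.jar \
      --driver-class-path /path/to/mysql-connector-java.jar:/path/to/spark-cassandra-connector.jar \
      migrate.py

--jars takes a comma-separated list and ships the jars to the executors, while --driver-class-path is an ordinary Java classpath (colon-separated on Unix) for the driver JVM, which is why the two flags use different separators.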

Re: multiple dependency jars using pyspark

2015-08-10 Thread ayan guha
Easiest way should be to add both jars to SPARK_CLASSPATH as a colon-separated string. On 10 Aug 2015 06:20, Jonathan Haddad j...@jonhaddad.com wrote: I'm trying to write a simple job for PySpark 1.4 migrating data from MySQL to Cassandra. I can work with either the MySQL JDBC jar or the
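A minimal sketch of that suggestion, assuming both jars sit under /path/to (note that Spark 1.x already prints a deprecation warning for SPARK_CLASSPATH, pointing at --driver-class-path and the extraClassPath settings instead):

    # Make both jars visible via the legacy environment variable,
    # then submit the job as usual.
    export SPARK_CLASSPATH=/path/to/mysql-connector-java.jar:/path/to/spark-cassandra-connector.jar
    spark-submit migrate.py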

multiple dependency jars using pyspark

2015-08-09 Thread Jonathan Haddad
I'm trying to write a simple job for PySpark 1.4 migrating data from MySQL to Cassandra. I can work with either the MySQL JDBC jar or the Cassandra jar separately without issue, but when I try to reference both of them it throws an exception: Py4JJavaError: An error occurred while calling
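One classpath pitfall worth ruling out with symptoms like this (a hedged guess, not a diagnosis from the thread): the two flags expect different separators, so mixing them up silently drops a jar:

    # Wrong: --jars expects a comma-separated list, so the colon-joined
    # string below is read as a single (nonexistent) jar path and skipped,
    # leaving one of the two dependencies off the classpath.
    spark-submit --jars /path/to/mysql-connector-java.jar:/path/to/spark-cassandra-connector.jar migrate.py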