Re: Write to Cassandra table from pyspark fails with scala reflect error

2016-09-15 Thread neil90
You need to use 2.0.0-M2-s_2.11 since Spark 2.0 is compiled with Scala 2.11
by default.
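A minimal sketch of how that artifact is typically passed on the command line (the coordinate is the spark-packages form neil90 names; the script name is a placeholder):

```shell
# Spark-packages coordinate for the connector built against Scala 2.11.
CONNECTOR="datastax:spark-cassandra-connector:2.0.0-M2-s_2.11"

# Build the launch command; my_job.py is a hypothetical script name.
CMD="spark-submit --packages $CONNECTOR my_job.py"
echo "$CMD"
```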



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Write-to-Cassandra-table-from-pyspark-fails-with-scala-reflect-error-tp27723p27729.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.




Re: Write to Cassandra table from pyspark fails with scala reflect error

2016-09-15 Thread Russell Spitzer
If the download fails, you have to work out whether you have network
issues or whether your local cache is corrupted :( I would see if you can
manually pull that artifact, or try running through spark-shell first
to see if that gives more verbose output.
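One way to test the "manually pull that artifact" suggestion: the URL below follows Maven Central's standard repository layout for the scala-reflect 2.11.8 jar that failed to download (the fetch command is echoed here rather than executed):

```shell
# Maven Central path for the jar spark-submit failed to download.
URL="https://repo1.maven.org/maven2/org/scala-lang/scala-reflect/2.11.8/scala-reflect-2.11.8.jar"

# Echo the fetch command; run it directly to surface network/proxy errors.
echo curl -fLO "$URL"
```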


Re: Write to Cassandra table from pyspark fails with scala reflect error [RESOLVED]

2016-09-15 Thread Trivedi Amit
It turned out to be an environment issue. I removed all environment variables,
used the 2.11 artifact, and it worked.
Thanks for the help.
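For anyone hitting the same thing, a sketch of launching with a stripped environment to confirm an env-var culprit (the kept variables and the fallback path are assumptions, not from the thread):

```shell
# Launch pyspark with only HOME and PATH preserved; all other env vars dropped.
SPARK_HOME="${SPARK_HOME:-/opt/spark}"   # assumed default install location
CMD="env -i HOME=$HOME PATH=/usr/bin:/bin $SPARK_HOME/bin/pyspark"
echo "$CMD"
```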


Re: Write to Cassandra table from pyspark fails with scala reflect error

2016-09-15 Thread Trivedi Amit
Thanks Russell. I didn't build this myself. I tried with Scala 2.11 
com.datastax.spark:spark-cassandra-connector_2.11:2.0.0-M(1-3) and I am getting
```
Exception in thread "main" java.lang.RuntimeException: [download failed: org.scala-lang#scala-reflect;2.11.8!scala-reflect.jar]
    at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1066)
    at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:294)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:158)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Traceback (most recent call last):
  File "$SPARK_HOME/python/pyspark/shell.py", line 38, in <module>
    SparkContext._ensure_initialized()
  File "$SPARK_HOME/python/pyspark/context.py", line 243, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "$SPARK_HOME/python/pyspark/java_gateway.py", line 94, in launch_gateway
    raise Exception("Java gateway process exited before sending the driver its port number")
Exception: Java gateway process exited before sending the driver its port number
```
I deleted my .m2 directory to avoid any conflicts with cached or older
versions. Of the environment variables related to Spark and Python, I only
have SPARK_HOME set.
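Worth noting: `spark-submit --packages` resolves dependencies through Ivy, whose cache lives under `~/.ivy2` by default rather than Maven's `~/.m2`, so clearing `.m2` alone may not reset a corrupted download. A sketch assuming the default cache location (the removal command is echoed, not run):

```shell
# Default Ivy cache used by spark-submit's --packages resolution.
IVY_CACHE="$HOME/.ivy2/cache"

# Echo the removal of the cached scala-reflect entry; run it to force a re-download.
echo rm -rf "$IVY_CACHE/org.scala-lang/scala-reflect"
```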


Re: Write to Cassandra table from pyspark fails with scala reflect error

2016-09-14 Thread Russell Spitzer
Spark 2.0 defaults to Scala 2.11, so if you didn't build it yourself you
need the 2.11 artifact for the Spark Cassandra Connector.
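A quick way to confirm which Scala version a Spark install was built against is `spark-submit --version`, which prints a "Using Scala version ..." line in its banner; the grep pipeline below is a sketch of that check:

```shell
# Command that prints Spark's build banner, including its Scala version;
# match the connector's _2.10 / _2.11 artifact suffix to that version.
CHECK="spark-submit --version"
echo "$CHECK 2>&1 | grep 'Using Scala version'"
```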



Re: Write to Cassandra table from pyspark fails with scala reflect error

2016-09-14 Thread Trivedi Amit
Hi,

I am testing a pyspark program that reads from a CSV file and writes the data
into a Cassandra table. I am using pyspark with spark-cassandra-connector
2.10:2.0.0-M3 and Spark v2.0.0.

While executing the command below:

```
df.write.format("org.apache.spark.sql.cassandra").mode('append').options(
    table="test_table", keyspace="test").save()
```

I am getting
```
py4j.protocol.Py4JJavaError: An error occurred while calling o47.save.
: java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
        at com.datastax.spark.connector.types.TypeConverter$.<init>(TypeConverter.scala:116)
        at com.datastax.spark.connector.types.TypeConverter$.<clinit>(TypeConverter.scala)
        at com.datastax.spark.connector.types.BigIntType$.converterToCassandra(PrimitiveColumnType.scala:50)
        at com.datastax.spark.connector.types.BigIntType$.converterToCassandra(PrimitiveColumnType.scala:46)
        at com.datastax.spark.connector.writer.SqlRowWriter$$anonfun$3.apply(SqlRowWriter.scala:18)
        at com.datastax.spark.connector.writer.SqlRowWriter$$anonfun$3.apply(SqlRowWriter.scala:18)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.Iterator$class.foreach(Iterator.scala:893)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
        at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
        at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
        at scala.collection.AbstractTraversable.map(Traversable.scala:104)
        at com.datastax.spark.connector.writer.SqlRowWriter.<init>(SqlRowWriter.scala:18)
        at com.datastax.spark.connector.writer.SqlRowWriter$Factory$.rowWriter(SqlRowWriter.scala:36)
        at com.datastax.spark.connector.writer.SqlRowWriter$Factory$.rowWriter(SqlRowWriter.scala:34)
        at com.datastax.spark.connector.writer.TableWriter$.apply(TableWriter.scala:271)
        at com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:36)
        at org.apache.spark.sql.cassandra.CassandraSourceRelation.insert(CassandraSourceRelation.scala:66)
        at org.apache.spark.sql.cassandra.DefaultSource.createRelation(DefaultSource.scala:85)
        at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:429)
        at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:211)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:237)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:280)
        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:128)
        at py4j.commands.CallCommand.execute(CallCommand.java:79)
        at py4j.GatewayConnection.run(GatewayConnection.java:211)
        at java.lang.Thread.run(Thread.java:745)
```

A Google search for this error led to threads where folks were talking about
using Scala version 2.10 instead of 2.11. However, I am not using Scala
directly, and I am assuming that the 2.10 in spark-cassandra-connector refers
to the Scala version.

I don't know how to fix or get around this issue. I'd appreciate any help.

Thanks,
AT