Thanks for the reply. Unfortunately that is the highest version available for 
the Cassandra connector.
One thing I don't quite understand is that it worked perfectly under Spark 
2.4.0. I thought support for Scala 2.11 only became deprecated starting with 
Spark 2.4.1 and will be removed after Spark 3.0.
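(Side note: if I understand correctly, a quick way to confirm which Scala version a 
given Spark distribution was built against is to run spark-submit --version; the 
version banner it prints includes the Scala version. That would presumably show 
2.11 for the 2.4.0 install that works and 2.12 for the 2.4.2 one that fails.)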




On Monday, May 6, 2019, 18:34, Russell Spitzer <russell.spit...@gmail.com> 
wrote:

Scala version mismatch.
Spark is shown at 2.12, but the connector only has a 2.11 release.
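For reference, a minimal sketch of what aligned dependencies could look like if you 
stay on Scala 2.11 (assuming the Spark runtime you submit to is also a 2.11 build; 
all the _2.11 / _2.12 artifact suffixes have to agree):

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.0</version>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.4.0</version>
</dependency>
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.11</artifactId>
    <version>2.4.1</version>
</dependency>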



On Mon, May 6, 2019, 7:59 PM Richard Xin <richardxin...@yahoo.com.invalid> 
wrote:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>2.4.0</version>
    <scope>compile</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>2.4.0</version>
</dependency>
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.11</artifactId>
    <version>2.4.1</version>
</dependency>

When I run spark-submit on Spark 2.4.2 I get the following exception; it works fine 
when running spark-submit under Spark 2.4.0 with exactly the same command-line 
call. Any idea how I can fix this? Thanks a lot!
Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product$class
    at com.datastax.spark.connector.util.ConfigParameter.<init>(ConfigParameter.scala:7)
    at com.datastax.spark.connector.rdd.ReadConf$.<init>(ReadConf.scala:33)
    at com.datastax.spark.connector.rdd.ReadConf$.<clinit>(ReadConf.scala)
    at org.apache.spark.sql.cassandra.DefaultSource$.<init>(DefaultSource.scala:134)
    at org.apache.spark.sql.cassandra.DefaultSource$.<clinit>(DefaultSource.scala)
    at org.apache.spark.sql.cassandra.DefaultSource.createRelation(DefaultSource.scala:55)
    at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:318)
    at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
    at com.apple.jmet.pallas.data_migration.DirectMigrationWConfig.main(DirectMigrationWConfig.java:76)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
