The 2.4.3 binary is out now, and they did change the default back to Scala 2.11:
https://www.apache.org/dyn/closer.lua/spark/spark-2.4.3/spark-2.4.3-bin-hadoop2.7.tgz
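
For anyone double-checking which Scala a downloaded distribution was built
against, the version banner prints it. Illustrative, output trimmed:

  $ ./bin/spark-submit --version
  ...
  Using Scala version 2.11.12, ...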

On Mon, May 6, 2019 at 9:21 PM Russell Spitzer <russell.spit...@gmail.com>
wrote:

> Spark 2.4.2 was incorrectly released with the default package binaries
> built against Scala 2.12
> <https://lists.apache.org/thread.html/af556c307b1ff9672f400964c8d13b081c4edb9821cf30c22aaac8a1@%3Cdev.spark.apache.org%3E>
> instead of Scala 2.11.12, which was supposed to be the case. See the 2.4.3
> vote
> <https://lists.apache.org/thread.html/609a820ea4dc56a31b9766142834b6954bb9c567ea85adca9ea099c8@%3Cdev.spark.apache.org%3E>
> that is happening at this very moment. While Spark can be built against
> 2.12, the correct default for the binaries was supposed to be 2.11. So
> either build Spark 2.4.2 with 2.11 yourself or wait for the 2.4.3 release,
> which will be very soon.
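>
> If you do build it yourself, a rough sketch, assuming the stock Spark
> 2.4.2 source tree and the dev scripts it ships with (the profile flags
> will depend on your environment):
>
>   # switch the build to Scala 2.11, then cut a binary distribution
>   ./dev/change-scala-version.sh 2.11
>   ./dev/make-distribution.sh --name scala-2.11 --tgz -Phadoop-2.7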
>
> So the reason your 2.4.0 build works is that its default binaries were
> built against 2.11: even though your build specified 2.12 (at least as far
> as I can tell), your runtime was the prebuilt 2.4.0 version, so there were
> no linkage errors at runtime. 2.4.1 similarly shipped default binaries
> built with 2.11; only 2.4.2 switched the Scala minor version. The 2.4.3
> release will switch this back.
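>
> A quick way to confirm which Scala your runtime is actually on is from
> spark-shell (illustrative session):
>
>   scala> scala.util.Properties.versionString
>   res0: String = version 2.11.12
>
> If that prints 2.12.x while your job was built against _2.11 artifacts,
> you get exactly this kind of linkage error.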
>
> On Mon, May 6, 2019, 9:06 PM Richard Xin <richardxin...@yahoo.com> wrote:
>
>> Thanks for the reply.
>> Unfortunately, this is the highest version available for the Cassandra
>> connector.
>>
>> One thing I don't quite understand is that it worked perfectly under
>> Spark 2.4.0. I thought support for Scala 2.11 only became deprecated
>> starting with Spark 2.4.1 and would only be removed after Spark 3.0.
>>
>> On Monday, May 6, 2019, 18:34, Russell Spitzer <russell.spit...@gmail.com>
>> wrote:
>>
>> Scala version mismatch.
>>
>> Your Spark artifacts are at Scala 2.12, but the connector only has a 2.11
>> release. The scala/Product$class in the stack trace is the tell: Scala
>> 2.12 changed the trait encoding and no longer emits the old Trait$class
>> implementation classes, so code compiled against 2.11 fails to link.
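>>
>> Roughly, every org.apache.spark artifact needs the same Scala suffix as
>> the connector; a sketch against your POM below, keeping your versions
>> (only the artifactId suffixes change):
>>
>> <dependency>
>>     <groupId>org.apache.spark</groupId>
>>     <artifactId>spark-core_2.11</artifactId>
>>     <version>2.4.0</version>
>>     <scope>compile</scope>
>> </dependency>
>> <dependency>
>>     <groupId>org.apache.spark</groupId>
>>     <artifactId>spark-sql_2.11</artifactId>
>>     <version>2.4.0</version>
>> </dependency>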
>>
>>
>>
>> On Mon, May 6, 2019, 7:59 PM Richard Xin <richardxin...@yahoo.com.invalid>
>> wrote:
>>
>> <dependency>
>>     <groupId>org.apache.spark</groupId>
>>     <artifactId>spark-core_2.12</artifactId>
>>     <version>2.4.0</version>
>>     <scope>compile</scope>
>> </dependency>
>> <dependency>
>>     <groupId>org.apache.spark</groupId>
>>     <artifactId>spark-sql_2.12</artifactId>
>>     <version>2.4.0</version>
>> </dependency>
>> <dependency>
>>     <groupId>com.datastax.spark</groupId>
>>     <artifactId>spark-cassandra-connector_2.11</artifactId>
>>     <version>2.4.1</version>
>> </dependency>
>>
>>
>>
>> When I run spark-submit on Spark 2.4.2 I get the following exception; it
>> works fine when running spark-submit under Spark 2.4.0 with exactly the
>> same command-line call. Any idea how I can fix this? Thanks a lot!
>>
>> Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product$class
>> at com.datastax.spark.connector.util.ConfigParameter.<init>(ConfigParameter.scala:7)
>> at com.datastax.spark.connector.rdd.ReadConf$.<init>(ReadConf.scala:33)
>> at com.datastax.spark.connector.rdd.ReadConf$.<clinit>(ReadConf.scala)
>> at org.apache.spark.sql.cassandra.DefaultSource$.<init>(DefaultSource.scala:134)
>> at org.apache.spark.sql.cassandra.DefaultSource$.<clinit>(DefaultSource.scala)
>> at org.apache.spark.sql.cassandra.DefaultSource.createRelation(DefaultSource.scala:55)
>> at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:318)
>> at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
>> at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
>> at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
>> at com.apple.jmet.pallas.data_migration.DirectMigrationWConfig.main(DirectMigrationWConfig.java:76)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
>> at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
>> at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
>> at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
>> at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
>> at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>
>>
