Re: spark-cassandra-connector_2.1 caused java.lang.NoClassDefFoundError under Spark 2.4.2?

2019-05-09 Thread Russell Spitzer
The 2.4.3 binary is out now and they did change back to 2.11: https://www.apache.org/dyn/closer.lua/spark/spark-2.4.3/spark-2.4.3-bin-hadoop2.7.tgz
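
A pom that keeps every artifact on the _2.11 suffix lines up with that binary. A minimal sketch of a consistent dependency set, assuming the Spark artifacts are also bumped to 2.4.3 to match the download; the connector coordinates are the ones already used in this thread:

  <!-- Spark artifacts built for Scala 2.11, matching the spark-2.4.3-bin-hadoop2.7 binary -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.3</version>
    <scope>compile</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.4.3</version>
  </dependency>
  <!-- spark-cassandra-connector is only published for Scala 2.11 -->
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.11</artifactId>
    <version>2.4.1</version>
  </dependency>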

Re: spark-cassandra-connector_2.1 caused java.lang.NoClassDefFoundError under Spark 2.4.2?

2019-05-06 Thread Russell Spitzer
Actually, I just checked the release: they only changed the PySpark part, so the download on the website will still be 2.12. You'll need to build the Scala 2.11 version of Spark if you want to use the connector, or submit a PR for Scala 2.12 support.

Re: spark-cassandra-connector_2.1 caused java.lang.NoClassDefFoundError under Spark 2.4.2?

2019-05-06 Thread Russell Spitzer
Spark 2.4.2 was incorrectly released with the default package binaries set to Scala 2.12 instead of Scala 2.11.12, which was supposed to be the case. See the 2.4.3 vote.

Re: spark-cassandra-connector_2.1 caused java.lang.NoClassDefFoundError under Spark 2.4.2?

2019-05-06 Thread Richard Xin
Thanks for the reply. Unfortunately this is the highest version available for the Cassandra connector. One thing I don’t quite understand is that it worked perfectly under Spark 2.4.0. I thought support for Scala 2.11 only became deprecated starting with Spark 2.4.1 and will be removed after Spark 3.0.

Re: spark-cassandra-connector_2.1 caused java.lang.NoClassDefFoundError under Spark 2.4.2?

2019-05-06 Thread Russell Spitzer
Scala version mismatch: Spark is shown at 2.12, but the connector only has a 2.11 release.
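
The minimal fix to the pom in the original message below is to keep every artifact on the _2.11 suffix so it matches the connector. A sketch, assuming the versions stay as in the original pom and only the Spark artifact suffixes change:

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>  <!-- was spark-core_2.12 -->
    <version>2.4.0</version>
    <scope>compile</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>  <!-- was spark-sql_2.12 -->
    <version>2.4.0</version>
  </dependency>
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.11</artifactId>
    <version>2.4.1</version>
  </dependency>

The pom alone does not resolve the runtime side: spark-submit also has to run against a Scala 2.11 build of Spark (2.4.0, 2.4.1, or a 2.4.3 binary built for 2.11), since the connector's classes will not load on a Scala 2.12 runtime.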

spark-cassandra-connector_2.1 caused java.lang.NoClassDefFoundError under Spark 2.4.2?

2019-05-06 Thread Richard Xin
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.12</artifactId>
  <version>2.4.0</version>
  <scope>compile</scope>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.12</artifactId>
  <version>2.4.0</version>
</dependency>
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector_2.11</artifactId>
  <version>2.4.1</version>
</dependency>

When I run spark-submit on Spark 2.4.2 I get java.lang.NoClassDefFoundError exceptions; it works fine when running spark-submit under Spark 2.4.0.