Re: Spark 3.2.0 upgrade

2022-01-22 Thread Amit Sharma
Alex, please find the build.sbt below. I am using the assembly command to
create a fat jar. These things were working fine until I changed the
Scala version, the Spark Cassandra connector, and the other dependent jars.
I also get the same issue when I run my job locally in IntelliJ.

Thanks
Amit


name := """cft-common"""

ThisBuild / version := "0.0.1-SNAPSHOT"

ThisBuild / scalaVersion := "2.12.15"
Test / fork := true
Test / envVars := Map("env" -> "qa")

val jacksonVersion = "2.13.1"
val AkkaVersion = "2.6.17"

val sparkVersion = "3.1.0"


libraryDependencies += "com.typesafe" % "config" % "1.3.3"
libraryDependencies += "com.typesafe.play" %% "play-json" % "2.9.2" // 2021-10-29 Chuck replaced cake solutions with apache recent version
libraryDependencies += "com.typesafe.play" %% "play-json-joda" % "2.9.2" // 2021-12-06 Chuck updated play-json to latest to match
libraryDependencies += "org.json4s" %% "json4s-native" % "3.6.12"
libraryDependencies += "com.datastax.cassandra" % "java-driver-core" % "3.9.0"
libraryDependencies += "io.getquill" %% "quill-cassandra" % "3.4.9"
libraryDependencies += "org.scalaj" %% "scalaj-http" % "2.4.2"
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "2.8.1" // 2021-10-29 Chuck replaced cake solutions with apache recent version
libraryDependencies += "com.typesafe.akka" %% "akka-stream-kafka" % "2.1.1" // 2021-11-11 Chuck replaced cake solutions with akka recent version
libraryDependencies += "com.typesafe.akka" %% "akka-slf4j" % AkkaVersion // 2021-11-11 Chuck replaced cake solutions with akka recent version
libraryDependencies += "com.typesafe.akka" %% "akka-stream" % AkkaVersion // 2021-11-11 Chuck replaced cake solutions with akka recent version
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % AkkaVersion // 2021-11-11 Chuck replaced cake solutions with akka recent version
libraryDependencies += "org.apache.logging.log4j" % "log4j-api" % "2.17.0"
libraryDependencies += "org.apache.logging.log4j" % "log4j-core" % "2.17.0"

/** pin jackson libraries to a single specific version */
val jacksonDeps = Seq(
  "com.fasterxml.jackson.core" % "jackson-core" % jacksonVersion,
  "com.fasterxml.jackson.core" % "jackson-annotations" % jacksonVersion,
  "com.fasterxml.jackson.core" % "jackson-databind" % jacksonVersion,
  "com.fasterxml.jackson.datatype" % "jackson-datatype-jdk8" % jacksonVersion,
  "com.fasterxml.jackson.datatype" % "jackson-datatype-jsr310" % jacksonVersion,
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % jacksonVersion,
  "com.fasterxml.jackson.module" % "jackson-module-paranamer" % jacksonVersion
).filter(_.name != "jackson-annotations")
  .map(_.exclude("com.fasterxml.jackson.core", "jackson-annotations"))

def excludeJackson(fromDependency: ModuleID): ModuleID =
  jacksonDeps.foldLeft(fromDependency) { (libDep, jacksonDep) =>
    libDep.exclude(jacksonDep.organization, jacksonDep.name)
  }
// https://mvnrepository.com/artifact/com.google.guava/guava
//libraryDependencies += "com.google.guava" % "guava" % "16.0.1"



libraryDependencies ++= Seq(
  "org.scalatest" %% "scalatest" % "3.0.1" % Test,
  "org.scalacheck" %% "scalacheck" % "1.13.4" % Test
).map(excludeJackson) ++ jacksonDeps

//dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.9"
//dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % "2.8.9"


// https://mvnrepository.com/artifact/org.julienrf/play-json-derived-codecs
libraryDependencies += "org.julienrf" %% "play-json-derived-codecs" % "5.0.0"
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "3.0.0" % Test
libraryDependencies += "net.liftweb" %% "lift-json" % "3.4.2"

libraryDependencies += ("com.fasterxml.jackson.core" % "jackson-annotations" % jacksonVersion).force()
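A side note on the jackson-annotations pin above: `force()` is deprecated in sbt 1.x, and the same intent can be expressed with `dependencyOverrides`. This is only a sketch, equivalent in intent to the `force()` call:

```
// Sketch: dependencyOverrides is the sbt 1.x replacement for force(); it pins
// jackson-annotations to jacksonVersion across the whole dependency graph.
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-annotations" % jacksonVersion
```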


val sparkDependencies = Seq(
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-catalyst" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-mllib" % sparkVersion,
  "com.datastax.spark" %% "spark-cassandra-connector" % "3.1.0", // this includes cassandra-driver
  "org.apache.spark" %% "spark-hive" % sparkVersion,
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion // 2021-11-11 updating streaming from 0.8 to 0.10 -- replaced cake solutions with akka recent version
).map(_.exclude("org.slf4j", "slf4j-api"))
  .map(_.exclude("org.slf4j", "jul-to-slf4j"))
  .map(_.exclude("org.slf4j", "jcl-over-slf4j"))
  .map(_.exclude("org.slf4j", "slf4j-log4j12"))
  //.map(_.exclude("log4j", "log4j"))
  .map(_.exclude("org.apache.kafka", "kafka-clients"))

libraryDependencies ++= sparkDependencies.map(excludeJackson) ++ jacksonDeps

assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("com.google.**" -> "shade.com.google.@1").inAll
)
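Since the runtime failure is a missing `com.codahale.metrics.JmxReporter`, one thing that might help is relocating the Dropwizard metrics classes as well, so the version the Cassandra driver expects does not clash with the metrics 4.x jars pulled in by Spark 3.x. This is only a sketch under that assumption; the `shade.` prefix is an arbitrary choice, matching the existing Guava rule:

```
// Hypothetical extension of the existing shade rules: relocate both Guava and
// the Dropwizard metrics packages inside the fat jar, so the driver's expected
// metrics 3.x classes don't collide with the ones Spark brings.
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("com.google.**" -> "shade.com.google.@1").inAll,
  ShadeRule.rename("com.codahale.metrics.**" -> "shade.com.codahale.metrics.@1").inAll
)
```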



On Sat, Jan 22, 2022 at 7:55 AM Alex Ott  wrote:

> Show how do you execute your code - either you 

Re: Spark 3.2.0 upgrade

2022-01-22 Thread Alex Ott
Show how you execute your code - either you didn't package it as an uberjar,
or you didn't provide all the necessary dependencies if you're using the
`--jars` option. You may try the `-assembly` variant when submitting your
application.
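Assuming the `-assembly` variant here refers to the connector's assembly artifact, the build.sbt change might look like this sketch (the assembly artifact bundles the Java driver and its dependencies, pre-shaded, into a single jar):

```
// Sketch: swap the plain connector for its assembly artifact, which ships
// the Java driver and its shaded dependencies in one jar.
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector-assembly" % "3.1.0"
```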

Amit Sharma  at "Fri, 21 Jan 2022 11:17:38 -0500" wrote:
 AS> Hello, I tried using the cassandra unshaded connector and the normal connector; both give the same error at runtime while connecting to cassandra.

 AS> "com.datastax.spark" %% "spark-cassandra-connector-unshaded" % "2.4.2"

 AS> Or

 AS> "com.datastax.spark" %% "spark-cassandra-connector" % "3.1.0"

 AS> Russ, a similar issue is reported here as well, but with no solution:

 AS> https://community.datastax.com/questions/3519/issue-with-spring-boot-starter-data-cassandra-and.html

 AS> Caused by: java.lang.ClassNotFoundException: com.codahale.metrics.JmxReporter
 AS> at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
 AS> at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
 AS> at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)

 AS> On Thu, Jan 20, 2022 at 5:17 PM Amit Sharma  wrote:

 AS> Hello, I am trying to upgrade my project from spark 2.3.3 to spark 3.2.0. While running the application locally I am getting the below error.
 AS>
 AS> Could you please let me know which version of the cassandra connector I should use. I am using the connector below, but I think it is causing the issue:

 AS> "com.datastax.spark" %% "spark-cassandra-connector-unshaded" % "2.4.2"

 AS> Caused by: java.lang.ClassNotFoundException: com.codahale.metrics.JmxReporter
 AS> at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
 AS> at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
 AS> at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)

 AS> Thanks
 AS> 
 AS> Amit


-- 
With best wishes,
Alex Ott
http://alexott.net/
Twitter: alexott_en (English), alexott (Russian)

-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org