[ https://issues.apache.org/jira/browse/HUDI-3965?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Raymond Xu updated HUDI-3965:
-----------------------------
    Fix Version/s: 0.13.0
                       (was: 0.12.1)

> Spark sql dml w/ spark2 and scala12 fails w/ ClassNotFoundException for SparkSQLCLIDriver
> -----------------------------------------------------------------------------------------
>
>                 Key: HUDI-3965
>                 URL: https://issues.apache.org/jira/browse/HUDI-3965
>             Project: Apache Hudi
>          Issue Type: Task
>          Components: spark-sql
>            Reporter: sivabalan narayanan
>            Priority: Critical
>             Fix For: 0.13.0
>
>
> spark-sql DML, when launched with the spark2 and scala12 profile, fails with ClassNotFoundException: org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver on both 0.10.0 and 0.11.0.
>
> {code:java}
> java.lang.ClassNotFoundException: org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
>     at java.lang.Class.forName0(Native Method)
>     at java.lang.Class.forName(Class.java:348)
>     at org.apache.spark.util.Utils$.classForName(Utils.scala:238)
>     at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:816)
>     at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
>     at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
>     at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
>     at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:930)
>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:939)
>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Failed to load main class org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.
>
> // launch command
> ./bin/spark-sql --jars /home/hadoop/hudi-spark2.4-bundle_2.12-0.11.0-rc3.jar \
>   --packages org.apache.spark:spark-avro_2.12:2.4.8 \
>   --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
>   --conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
> {code}

--
This message was sent by Atlassian Jira
(v8.20.10#820010)
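For context, the stack trace shows the failure originating in SparkSubmit's main-class lookup (`Utils.classForName`), which resolves the `spark-sql` entry point via `Class.forName`. The sketch below illustrates that lookup mechanism in isolation; `ClassProbe` is an illustrative name and not Hudi or Spark code:

```java
// Minimal sketch of the failing lookup: SparkSubmit resolves its main class
// with Class.forName, which throws ClassNotFoundException when the running
// Spark distribution does not bundle the hive-thriftserver module.
// "ClassProbe" is a hypothetical helper, not part of Spark or Hudi.
public class ClassProbe {
    // Returns true if the named class can be loaded by this classloader.
    public static boolean isPresent(String className) {
        try {
            Class.forName(className, false, ClassProbe.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String cli = "org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver";
        System.out.println(cli + " present: " + isPresent(cli));
    }
}
```

Run against a classpath that lacks the thriftserver classes, the probe returns false for `SparkSQLCLIDriver`, which is the same condition that surfaces as the "Failed to load main class" error above.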