xzwDavid opened a new issue, #9159:
URL: https://github.com/apache/hudi/issues/9159

   **_Tips before filing an issue_**
   
   - Have you gone through our [FAQs](https://hudi.apache.org/learn/faq/)? Yes
   
   - Join the mailing list to engage in conversations and get faster support at 
dev-subscr...@hudi.apache.org.
   
   - If you have triaged this as a bug, then file an 
[issue](https://issues.apache.org/jira/projects/HUDI/issues) directly.
   
   **Describe the problem you faced**
   
   When I downloaded Hudi and compiled it manually on my local machine, I tried to use the jars I had compiled and they threw an exception.
   
   **To Reproduce**
   
   Steps to reproduce the behavior:
   
   1. Run the command:
   ```
   spark-shell --driver-memory 4G --packages 
org.apache.hudi:hudi-spark3-bundle_2.12:0.11.0,org.apache.spark:spark-avro_2.12:3.2.4
 --conf "spark.driver.maxResultSize=0" --conf 
"spark.serializer=org.apache.spark.serializer.KryoSerializer" --conf 
"spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension" 
--conf 
"spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog"
 --conf 
"spark.sql.catalog.hudi=org.apache.spark.sql.hudi.catalog.HoodieCatalog" --conf 
"spark.hadoop.fs.s3.useRequesterPaysHeader=true" --conf 
"spark.benchmarkId=20230710-130143-tpcds-1gb-hudi-load"  --jars 
~/logs/20230710-130143-tpcds-1gb-hudi-load-benchmarks.jar -I 
20230710-130143-tpcds-1gb-hudi-load_shell_init.scala
   ```
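
   The `ClassNotFoundException` below usually means the Hudi bundle on the classpath was not built for Spark 3.2: `Spark3_2Adapter` only ships in the Spark 3.2-specific module. Since Hudi 0.11 the Spark bundles are published per Spark minor version, so a likely fix is to request the matching artifact (the coordinates below follow the Hudi 0.11 quickstart; verify them against Maven Central for your exact version):
   ```
   spark-shell --driver-memory 4G \
     --packages org.apache.hudi:hudi-spark3.2-bundle_2.12:0.11.0,org.apache.spark:spark-avro_2.12:3.2.4 \
     --conf "spark.serializer=org.apache.spark.serializer.KryoSerializer" \
     --conf "spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension" \
     --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog"
   ```
   When compiling locally, the equivalent is building with the profile matching your Spark version, e.g. something like `mvn clean package -DskipTests -Dspark3.2 -Dscala-2.12` (profile flag names are an assumption; check the build section of the Hudi README for your checkout).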
   
   
   **Expected behavior**
   
   The shell should run the benchmark successfully; instead, it cannot find `Spark3_2Adapter`.
   
   **Here is the exception:**
   org.apache.spark.sql.adapter.Spark3_2Adapter
   java.lang.ClassNotFoundException: 
org.apache.spark.sql.adapter.Spark3_2Adapter
        
   **Environment Description**
   
   * Hudi version : 0.11.0 / 0.12.0
   
   * Spark version : 3.2.4
   
   * Hive version : 3.1.2
   
   * Hadoop version : 3.3.1
   
   * Storage (HDFS/S3/GCS..) : HDFS
   
   * Running on Docker? (yes/no) : no
   
   
   **Additional context**
   
   Add any other context about the problem here.
   
   **Stacktrace**
   ANTLR Tool version 4.7 used for code generation does not match the current runtime version 4.8
   ANTLR Runtime version 4.7 used for parser compilation does not match the current runtime version 4.8
   ANTLR Tool version 4.7 used for code generation does not match the current runtime version 4.8
   ANTLR Runtime version 4.7 used for parser compilation does not match the current runtime version 4.8
   2023-07-10T05:02:03.501 ERROR: drop-database
   org.apache.spark.sql.adapter.Spark3_2Adapter
   java.lang.ClassNotFoundException: 
org.apache.spark.sql.adapter.Spark3_2Adapter
        at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        at 
org.apache.hudi.SparkAdapterSupport.sparkAdapter(SparkAdapterSupport.scala:37)
        at 
org.apache.hudi.SparkAdapterSupport.sparkAdapter$(SparkAdapterSupport.scala:29)
        at 
org.apache.spark.sql.parser.HoodieCommonSqlParser.sparkAdapter$lzycompute(HoodieCommonSqlParser.scala:34)
        at 
org.apache.spark.sql.parser.HoodieCommonSqlParser.sparkAdapter(HoodieCommonSqlParser.scala:34)
        at 
org.apache.spark.sql.parser.HoodieCommonSqlParser.sparkExtendedParser$lzycompute(HoodieCommonSqlParser.scala:38)
        at 
org.apache.spark.sql.parser.HoodieCommonSqlParser.sparkExtendedParser(HoodieCommonSqlParser.scala:38)
        at 
org.apache.spark.sql.parser.HoodieCommonSqlParser.$anonfun$parsePlan$1(HoodieCommonSqlParser.scala:44)
        at 
org.apache.spark.sql.parser.HoodieCommonSqlParser.parse(HoodieCommonSqlParser.scala:84)
        at 
org.apache.spark.sql.parser.HoodieCommonSqlParser.parsePlan(HoodieCommonSqlParser.scala:41)
        at 
org.apache.spark.sql.SparkSession.$anonfun$sql$2(SparkSession.scala:616)
        at 
org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
        at 
org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:616)
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
        at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:613)
        at benchmark.Benchmark.runQuery(Benchmark.scala:146)
        at benchmark.TPCDSDataLoad.runInternal(TPCDSDataLoad.scala:91)
        at benchmark.Benchmark.run(Benchmark.scala:120)
        at benchmark.TPCDSDataLoad$.$anonfun$main$1(TPCDSDataLoad.scala:149)
        at 
benchmark.TPCDSDataLoad$.$anonfun$main$1$adapted(TPCDSDataLoad.scala:148)
        at scala.Option.foreach(Option.scala:407)
        at benchmark.TPCDSDataLoad$.main(TPCDSDataLoad.scala:148)
        at 
$line14.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:22)
        at 
$line14.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:29)
        at 
$line14.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:31)
        at 
$line14.$read$$iw$$iw$$iw$$iw$$iw.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:33)
        at 
$line14.$read$$iw$$iw$$iw$$iw.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:35)
        at 
$line14.$read$$iw$$iw$$iw.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:37)
        at 
$line14.$read$$iw$$iw.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:39)
        at 
$line14.$read$$iw.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:41)
        at 
$line14.$read.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:43)
        at 
$line14.$read$.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:47)
        at 
$line14.$read$.<clinit>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala)
        at 
$line14.$eval$.$print$lzycompute(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:7)
        at 
$line14.$eval$.$print(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:6)
        at 
$line14.$eval.$print(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:747)
        at 
scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1020)
        at 
scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:568)
        at 
scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
        at 
scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
        at 
scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
        at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:567)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:594)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:564)
        at 
scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:865)
        at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:733)
        at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:435)
        at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:456)
        at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:451)
        at 
scala.tools.nsc.interpreter.ILoop.$anonfun$interpretAllFrom$3(ILoop.scala:468)
        at scala.reflect.io.Streamable$Chars.applyReader(Streamable.scala:118)
        at scala.reflect.io.Streamable$Chars.applyReader$(Streamable.scala:116)
        at scala.reflect.io.File.applyReader(File.scala:50)
        at 
scala.tools.nsc.interpreter.ILoop.$anonfun$interpretAllFrom$2(ILoop.scala:464)
        at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:97)
        at 
scala.tools.nsc.interpreter.ILoop.$anonfun$interpretAllFrom$1(ILoop.scala:464)
        at scala.tools.nsc.interpreter.ILoop.savingReader(ILoop.scala:102)
        at scala.tools.nsc.interpreter.ILoop.interpretAllFrom(ILoop.scala:463)
        at 
scala.tools.nsc.interpreter.ILoop.$anonfun$loadCommand$1(ILoop.scala:629)
        at 
scala.tools.nsc.interpreter.ILoop.$anonfun$withFile$1(ILoop.scala:622)
        at scala.tools.nsc.interpreter.IMain.withLabel(IMain.scala:111)
        at scala.tools.nsc.interpreter.ILoop.withFile(ILoop.scala:621)
        at scala.tools.nsc.interpreter.ILoop.run$3(ILoop.scala:628)
        at scala.tools.nsc.interpreter.ILoop.loadCommand(ILoop.scala:635)
        at 
org.apache.spark.repl.SparkILoop.$anonfun$process$7(SparkILoop.scala:173)
        at 
org.apache.spark.repl.SparkILoop.$anonfun$process$7$adapted(SparkILoop.scala:172)
        at scala.collection.immutable.List.foreach(List.scala:431)
        at 
org.apache.spark.repl.SparkILoop.loadInitFiles$1(SparkILoop.scala:172)
        at 
org.apache.spark.repl.SparkILoop.$anonfun$process$4(SparkILoop.scala:166)
        at 
scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
        at scala.tools.nsc.interpreter.ILoop.$anonfun$mumly$1(ILoop.scala:166)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:206)
        at scala.tools.nsc.interpreter.ILoop.mumly(ILoop.scala:163)
        at org.apache.spark.repl.SparkILoop.loopPostInit$1(SparkILoop.scala:153)
        at 
org.apache.spark.repl.SparkILoop.$anonfun$process$10(SparkILoop.scala:221)
        at 
org.apache.spark.repl.SparkILoop.withSuppressedSettings$1(SparkILoop.scala:189)
        at org.apache.spark.repl.SparkILoop.startup$1(SparkILoop.scala:201)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:236)
        at org.apache.spark.repl.Main$.doMain(Main.scala:78)
        at org.apache.spark.repl.Main$.main(Main.scala:58)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at 
org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:966)
        at 
org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:191)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:214)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at 
org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1054)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1063)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   
   
   
   
   
   
   
   org.apache.hudi.exception.HoodieException: Unable to load class
        at 
org.apache.hudi.common.util.ReflectionUtils.getClass(ReflectionUtils.java:57)
        at 
org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:89)
        at 
org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:118)
        at 
org.apache.spark.sql.hudi.analysis.HoodieAnalysis$.$anonfun$extraResolutionRules$1(HoodieAnalysis.scala:59)
        at 
org.apache.spark.sql.hudi.HoodieSparkSessionExtension.$anonfun$apply$3(HoodieSparkSessionExtension.scala:38)
        at 
org.apache.spark.sql.SparkSessionExtensions.$anonfun$buildResolutionRules$1(SparkSessionExtensions.scala:174)
        at 
scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
        at 
scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
        at 
scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
        at scala.collection.TraversableLike.map(TraversableLike.scala:286)
        at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
        at scala.collection.AbstractTraversable.map(Traversable.scala:108)
        at 
org.apache.spark.sql.SparkSessionExtensions.buildResolutionRules(SparkSessionExtensions.scala:174)
        at 
org.apache.spark.sql.internal.BaseSessionStateBuilder.customResolutionRules(BaseSessionStateBuilder.scala:212)
        at 
org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:84)
        at 
org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:75)
        at 
org.apache.spark.sql.internal.BaseSessionStateBuilder.$anonfun$build$2(BaseSessionStateBuilder.scala:357)
        at 
org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:87)
        at 
org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:87)
        at 
org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:75)
        at 
org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
        at 
org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:183)
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
        at 
org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:183)
        at 
org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:75)
        at 
org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:73)
        at 
org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:65)
        at org.apache.spark.sql.Dataset$.$anonfun$ofRows$1(Dataset.scala:90)
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
        at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:88)
        at 
org.apache.spark.sql.SparkSession.emptyDataFrame$lzycompute(SparkSession.scala:288)
        at 
org.apache.spark.sql.SparkSession.emptyDataFrame(SparkSession.scala:288)
        at benchmark.Benchmark.runQuery(Benchmark.scala:161)
        at benchmark.TPCDSDataLoad.runInternal(TPCDSDataLoad.scala:91)
        at benchmark.Benchmark.run(Benchmark.scala:120)
        at benchmark.TPCDSDataLoad$.$anonfun$main$1(TPCDSDataLoad.scala:149)
        at 
benchmark.TPCDSDataLoad$.$anonfun$main$1$adapted(TPCDSDataLoad.scala:148)
        at scala.Option.foreach(Option.scala:407)
        at benchmark.TPCDSDataLoad$.main(TPCDSDataLoad.scala:148)
        at 
$line14.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:22)
        at 
$line14.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:29)
        at 
$line14.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:31)
        at 
$line14.$read$$iw$$iw$$iw$$iw$$iw.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:33)
        at 
$line14.$read$$iw$$iw$$iw$$iw.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:35)
        at 
$line14.$read$$iw$$iw$$iw.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:37)
        at 
$line14.$read$$iw$$iw.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:39)
        at 
$line14.$read$$iw.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:41)
        at 
$line14.$read.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:43)
        at 
$line14.$read$.<init>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:47)
        at 
$line14.$read$.<clinit>(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala)
        at 
$line14.$eval$.$print$lzycompute(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:7)
        at 
$line14.$eval$.$print(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala:6)
        at 
$line14.$eval.$print(20230710-130143-tpcds-1gb-hudi-load_shell_init.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:747)
        at 
scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1020)
        at 
scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:568)
        at 
scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
        at 
scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
        at 
scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
        at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:567)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:594)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:564)
        at 
scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:865)
        at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:733)
        at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:435)
        at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:456)
        at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:451)
        at 
scala.tools.nsc.interpreter.ILoop.$anonfun$interpretAllFrom$3(ILoop.scala:468)
        at scala.reflect.io.Streamable$Chars.applyReader(Streamable.scala:118)
        at scala.reflect.io.Streamable$Chars.applyReader$(Streamable.scala:116)
        at scala.reflect.io.File.applyReader(File.scala:50)
        at 
scala.tools.nsc.interpreter.ILoop.$anonfun$interpretAllFrom$2(ILoop.scala:464)
        at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:97)
        at 
scala.tools.nsc.interpreter.ILoop.$anonfun$interpretAllFrom$1(ILoop.scala:464)
        at scala.tools.nsc.interpreter.ILoop.savingReader(ILoop.scala:102)
        at scala.tools.nsc.interpreter.ILoop.interpretAllFrom(ILoop.scala:463)
        at 
scala.tools.nsc.interpreter.ILoop.$anonfun$loadCommand$1(ILoop.scala:629)
        at 
scala.tools.nsc.interpreter.ILoop.$anonfun$withFile$1(ILoop.scala:622)
        at scala.tools.nsc.interpreter.IMain.withLabel(IMain.scala:111)
        at scala.tools.nsc.interpreter.ILoop.withFile(ILoop.scala:621)
        at scala.tools.nsc.interpreter.ILoop.run$3(ILoop.scala:628)
        at scala.tools.nsc.interpreter.ILoop.loadCommand(ILoop.scala:635)
        at 
org.apache.spark.repl.SparkILoop.$anonfun$process$7(SparkILoop.scala:173)
        at 
org.apache.spark.repl.SparkILoop.$anonfun$process$7$adapted(SparkILoop.scala:172)
        at scala.collection.immutable.List.foreach(List.scala:431)
        at 
org.apache.spark.repl.SparkILoop.loadInitFiles$1(SparkILoop.scala:172)
        at 
org.apache.spark.repl.SparkILoop.$anonfun$process$4(SparkILoop.scala:166)
        at 
scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
        at scala.tools.nsc.interpreter.ILoop.$anonfun$mumly$1(ILoop.scala:166)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:206)
        at scala.tools.nsc.interpreter.ILoop.mumly(ILoop.scala:163)
        at org.apache.spark.repl.SparkILoop.loopPostInit$1(SparkILoop.scala:153)
        at 
org.apache.spark.repl.SparkILoop.$anonfun$process$10(SparkILoop.scala:221)
        at 
org.apache.spark.repl.SparkILoop.withSuppressedSettings$1(SparkILoop.scala:189)
        at org.apache.spark.repl.SparkILoop.startup$1(SparkILoop.scala:201)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:236)
        at org.apache.spark.repl.Main$.doMain(Main.scala:78)
        at org.apache.spark.repl.Main$.main(Main.scala:58)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at 
org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at 
org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:966)
        at 
org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:191)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:214)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at 
org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1054)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1063)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.ClassNotFoundException: 
org.apache.spark.sql.hudi.analysis.HoodieSpark3Analysis
        at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at 
org.apache.hudi.common.util.ReflectionUtils.getClass(ReflectionUtils.java:54)
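
   A quick way to confirm which bundle is actually on the classpath is to check whether the jar contains the missing class entry. A minimal sketch (the class and method names here are hypothetical diagnostic helpers, not Hudi APIs):
   ```java
   import java.io.IOException;
   import java.util.jar.JarFile;

   public class JarClassCheck {
       // Returns true if the given jar contains the .class entry
       // for the fully-qualified class name.
       public static boolean jarContainsClass(String jarPath, String className)
               throws IOException {
           try (JarFile jar = new JarFile(jarPath)) {
               return jar.getEntry(className.replace('.', '/') + ".class") != null;
           }
       }

       public static void main(String[] args) throws IOException {
           // Path is illustrative; point it at the bundle spark-shell resolved.
           if (args.length > 0) {
               System.out.println(jarContainsClass(args[0],
                       "org.apache.spark.sql.adapter.Spark3_2Adapter"));
           }
       }
   }
   ```
   If this prints `false` for the bundle jar, the jar was built against a different Spark minor version than the runtime.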
   
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@hudi.apache.org.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
