tonkane opened a new issue, #4445:
URL: https://github.com/apache/incubator-seatunnel/issues/4445

   ### Search before asking
   
   - [X] I had searched in the [issues](https://github.com/apache/incubator-seatunnel/issues?q=is%3Aissue+label%3A%22bug%22) and found no similar issues.
   
   
   ### What happened
   
   I got an error when running start-seatunnel-spark-connector-v2.sh.
   The command was: ./bin/start-seatunnel-spark-connector-v2.sh --master local[*] --deploy-mode client --config ./config/seatunnel.streaming.conf.template (I just followed the steps from https://seatunnel.apache.org/docs/start-v2/locally/quick-start-spark).
   The console error was:
   23/03/29 11:26:22 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/sources/v2/DataSourceV2
   I am not sure whether this is a SeaTunnel problem or a Spark problem.
   
   ### SeaTunnel Version
   
   2.3.0-beta
   
   ### SeaTunnel Config
   
   ```conf
   env {
     execution.parallelism = 1
     job.mode = "BATCH"
     spark.app.name = "SeaTunnel"
     spark.executor.instances = 2
     spark.executor.cores = 1
     spark.executor.memory = "1g"
     spark.stream.batchDuration = 5
   }
   
   source {
       FakeSource {
         result_table_name = "fake"
         row.num = 16
         schema = {
           fields {
             name = "string"
             age = "int"
           }
         }
       }
   }
   
   transform {
   
   }
   
   sink {
     Console {}
   }
   ```
   
   
   ### Running Command
   
   ```shell
   ./bin/start-seatunnel-spark-connector-v2.sh --master local[*] --deploy-mode client --config ./config/seatunnel.streaming.conf.template
   ```
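   For triage, a quick way to confirm which Spark the launcher script picks up (a sketch; `SPARK_HOME` and the `/opt/module/spark-3.0.1` fallback path are from my environment, adjust as needed):
   
   ```shell
   # Sketch: print the version spark-submit reports, since the starter jar
   # was built against a specific Spark line; fall back to a message if
   # spark-submit is not found at the assumed location.
   SPARK_SUBMIT="${SPARK_HOME:-/opt/module/spark-3.0.1}/bin/spark-submit"
   if [ -x "$SPARK_SUBMIT" ]; then
     "$SPARK_SUBMIT" --version
   else
     echo "spark-submit not found at $SPARK_SUBMIT"
   fi
   ```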
   
   
   ### Error Exception
   
   ```log
   [root@hadoop101 apache-seatunnel-incubating-2.3.0-beta]# ./bin/start-seatunnel-spark-connector-v2.sh --master local[*] --deploy-mode client --config ./config/seatunnel.streaming.conf.template
   log4j:WARN No appenders could be found for logger (org.apache.seatunnel.core.starter.config.ConfigBuilder).
   log4j:WARN Please initialize the log4j system properly.
   log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
   Execute SeaTunnel Spark Job: ${SPARK_HOME}/bin/spark-submit --class "org.apache.seatunnel.core.starter.spark.SeatunnelSpark" --name "SeaTunnel" --master "local[*]" --deploy-mode "client" --jars "/root/seatunnel/apache-seatunnel-incubating-2.3.0-beta/connectors/seatunnel/connector-console-2.3.0-beta.jar,/root/seatunnel/apache-seatunnel-incubating-2.3.0-beta/connectors/seatunnel/connector-fake-2.3.0-beta.jar" --conf "spark.executor.memory=1g" --conf "job.mode=BATCH" --conf "execution.parallelism=1" --conf "spark.executor.cores=1" --conf "spark.app.name=SeaTunnel" --conf "spark.stream.batchDuration=5" --conf "spark.executor.instances=2" /root/seatunnel/apache-seatunnel-incubating-2.3.0-beta/lib/seatunnel-spark-starter.jar --master local[*] --deploy-mode client --config ./config/seatunnel.streaming.conf.template
   SLF4J: Class path contains multiple SLF4J bindings.
   SLF4J: Found binding in [jar:file:/opt/module/spark-3.0.1/jars/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
   SLF4J: Found binding in [jar:file:/opt/module/hadoop-3.1.3/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
   SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
   SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
   Warning: Ignoring non-Spark config property: execution.parallelism
   Warning: Ignoring non-Spark config property: job.mode
   23/03/29 11:26:22 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/sources/v2/DataSourceV2
           at java.lang.ClassLoader.defineClass1(Native Method)
           at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
           at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
           at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
           at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
           at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
           at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
           at java.security.AccessController.doPrivileged(Native Method)
           at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
           at java.lang.Class.forName0(Native Method)
           at java.lang.Class.forName(Class.java:348)
           at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
           at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
           at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
           at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:44)
           at scala.collection.Iterator.foreach(Iterator.scala:941)
           at scala.collection.Iterator.foreach$(Iterator.scala:941)
           at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
           at scala.collection.IterableLike.foreach(IterableLike.scala:74)
           at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
           at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
           at scala.collection.TraversableLike.filterImpl(TraversableLike.scala:255)
           at scala.collection.TraversableLike.filterImpl$(TraversableLike.scala:249)
           at scala.collection.AbstractTraversable.filterImpl(Traversable.scala:108)
           at scala.collection.TraversableLike.filter(TraversableLike.scala:347)
           at scala.collection.TraversableLike.filter$(TraversableLike.scala:347)
           at scala.collection.AbstractTraversable.filter(Traversable.scala:108)
           at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:649)
           at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSourceV2(DataSource.scala:733)
           at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:248)
           at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:221)
           at org.apache.seatunnel.core.starter.spark.execution.SourceExecuteProcessor.execute(SourceExecuteProcessor.java:70)
           at org.apache.seatunnel.core.starter.spark.execution.SparkExecution.execute(SparkExecution.java:54)
           at org.apache.seatunnel.core.starter.spark.command.SparkApiTaskExecuteCommand.execute(SparkApiTaskExecuteCommand.java:52)
           at org.apache.seatunnel.core.starter.Seatunnel.run(Seatunnel.java:39)
           at org.apache.seatunnel.core.starter.spark.SeatunnelSpark.main(SeatunnelSpark.java:34)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:497)
           at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.sources.v2.DataSourceV2
           at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
           ... 50 more
   ```
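   
   For what it's worth, I checked whether the missing class is shipped by the installed Spark at all. A sketch (the jar path pattern and fallback `SPARK_HOME` are assumptions from my setup; I believe the `org.apache.spark.sql.sources.v2` API was removed in Spark 3.x, so on 3.0.1 I would expect it to be absent):
   
   ```shell
   # Sketch: look for the old DataSource V2 interface inside the first
   # spark-sql jar found under SPARK_HOME/jars; report present or absent.
   SQL_JAR=$(ls "${SPARK_HOME:-/opt/module/spark-3.0.1}"/jars/spark-sql_*.jar 2>/dev/null | head -n 1)
   if [ -n "$SQL_JAR" ] && unzip -l "$SQL_JAR" | grep -q 'sources/v2/DataSourceV2'; then
     echo "DataSourceV2 present"
   else
     echo "DataSourceV2 absent (or Spark jars not found)"
   fi
   ```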
   
   
   ### Flink or Spark Version
   
   spark : 3.0.1
   
   ### Java or Scala Version
   
   openjdk version "1.8.0_322"
   
   ### Screenshots
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
   

