sprybee opened a new issue, #5053: URL: https://github.com/apache/seatunnel/issues/5053
### Search before asking

- [X] I had searched in the [issues](https://github.com/apache/seatunnel/issues?q=is%3Aissue+label%3A%22bug%22) and found no similar issues.

### What happened

I placed the connector in the target directory and ran `install-plugin.sh`, but when I use the SqlServer-CDC connector the job still fails (it reports that the engine cannot be found).

### SeaTunnel Version

2.3.1

### SeaTunnel Config

```conf
env {
  job.name = "SeaTunnel"
  spark.executor.instances = 1
  spark.executor.cores = 1
  spark.executor.memory = "1g"
  spark.master = local
}

source {
  SqlServer-CDC {
    result_table_name = "customers"
    username = "sa"
    password = "123."
    database-names = ["MyDB"]
    table-names = ["MyDB.dbo.MyTable"]
    base-url = "jdbc:sqlserver://xxxxxx.me:1433;databaseName=MyDB"
  }
}

transform {
}

sink {
  # choose stdout output plugin to output data to console
  Console {
  }

  # you can also use other output plugins, such as sql
  # hdfs {
  #   path = "hdfs://hadoop-cluster-01/nginx/accesslog_processed"
  #   save_mode = "append"
  # }

  # If you would like to get more information about how to configure seatunnel and see full list of output plugins,
  # please go to https://seatunnel.apache.org/docs/category/sink-v2
}
```

### Running Command

```shell
bin/start-seatunnel-spark-3-connector-v2.sh --config /home/zhaohongzhou/test.conf -master local
```

### Error Exception

```log
23/07/09 17:01:08 INFO AbstractPluginDiscovery: Load plugin: PluginIdentifier{engineType='seatunnel', pluginType='source', pluginName='SqlServer-CDC'} from classpath
23/07/09 17:01:08 ERROR SparkTaskExecuteCommand: Run SeaTunnel on spark failed.
java.lang.NullPointerException: null
	at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:191) ~[guava-14.0.1.jar:?]
	at org.apache.seatunnel.connectors.seatunnel.cdc.sqlserver.source.config.SqlServerSourceConfigFactory.create(SqlServerSourceConfigFactory.java:49) ~[connector-cdc-sqlserver-2.3.1.jar:2.3.1]
	at org.apache.seatunnel.connectors.seatunnel.cdc.sqlserver.source.source.SqlServerDialect.<init>(SqlServerDialect.java:55) ~[connector-cdc-sqlserver-2.3.1.jar:2.3.1]
	at org.apache.seatunnel.connectors.seatunnel.cdc.sqlserver.source.source.SqlServerIncrementalSource.createDataSourceDialect(SqlServerIncrementalSource.java:118) ~[connector-cdc-sqlserver-2.3.1.jar:2.3.1]
	at org.apache.seatunnel.connectors.cdc.base.source.IncrementalSource.prepare(IncrementalSource.java:117) ~[connector-cdc-sqlserver-2.3.1.jar:2.3.1]
	at org.apache.seatunnel.core.starter.spark.execution.SourceExecuteProcessor.initializePlugins(SourceExecuteProcessor.java:104) ~[seatunnel-spark-3-starter.jar:2.3.1]
	at org.apache.seatunnel.core.starter.spark.execution.SparkAbstractPluginExecuteProcessor.<init>(SparkAbstractPluginExecuteProcessor.java:49) ~[seatunnel-spark-3-starter.jar:2.3.1]
	at org.apache.seatunnel.core.starter.spark.execution.SourceExecuteProcessor.<init>(SourceExecuteProcessor.java:51) ~[seatunnel-spark-3-starter.jar:2.3.1]
	at org.apache.seatunnel.core.starter.spark.execution.SparkExecution.<init>(SparkExecution.java:57) ~[seatunnel-spark-3-starter.jar:2.3.1]
	at org.apache.seatunnel.core.starter.spark.command.SparkTaskExecuteCommand.execute(SparkTaskExecuteCommand.java:59) ~[seatunnel-spark-3-starter.jar:2.3.1]
	at org.apache.seatunnel.core.starter.SeaTunnel.run(SeaTunnel.java:40) ~[seatunnel-spark-3-starter.jar:2.3.1]
	at org.apache.seatunnel.core.starter.spark.SeaTunnelSpark.main(SeaTunnelSpark.java:35) ~[seatunnel-spark-3-starter.jar:2.3.1]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_352]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_352]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_352]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_352]
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52) ~[spark-core_2.12-3.3.1.jar:3.3.1]
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958) ~[spark-core_2.12-3.3.1.jar:3.3.1]
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180) ~[spark-core_2.12-3.3.1.jar:3.3.1]
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203) ~[spark-core_2.12-3.3.1.jar:3.3.1]
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90) ~[spark-core_2.12-3.3.1.jar:3.3.1]
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046) ~[spark-core_2.12-3.3.1.jar:3.3.1]
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055) ~[spark-core_2.12-3.3.1.jar:3.3.1]
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) ~[spark-core_2.12-3.3.1.jar:3.3.1]
23/07/09 17:01:08 ERROR SeaTunnel:
```

### Flink or Spark Version

Spark 3

### Java or Scala Version

Java

### Screenshots

### Are you willing to submit PR?

- [ ] Yes I am willing to submit a PR!

### Code of Conduct

- [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at: [email protected]
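The trace above shows the `NullPointerException` originating in Guava's `Preconditions.checkNotNull` inside `SqlServerSourceConfigFactory.create`, which typically means a required option resolved to `null` before validation rather than the plugin jar itself being missing (the plugin was in fact loaded, per the `Load plugin ... from classpath` line). A minimal sketch of that failure pattern, using the JDK's equivalent `Objects.requireNonNull` in place of Guava (the class, option names, and which option the connector actually checks are illustrative assumptions, not taken from the SeaTunnel source):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;

// Hypothetical sketch of the failure mode in the stack trace: a config
// factory validates a required option with a checkNotNull-style call,
// so a missing option surfaces as a bare NullPointerException instead
// of a descriptive configuration error.
public class ConfigFactorySketch {

    // Returns the value for requiredKey, or throws NullPointerException
    // if the option is absent -- mirroring Guava's Preconditions.checkNotNull.
    static String requireOption(Map<String, String> options, String requiredKey) {
        return Objects.requireNonNull(options.get(requiredKey),
                "required option '" + requiredKey + "' is missing");
    }

    public static void main(String[] args) {
        Map<String, String> options = new HashMap<>();
        options.put("base-url", "jdbc:sqlserver://host:1433;databaseName=MyDB");

        // Present option: the value is returned.
        System.out.println(requireOption(options, "base-url"));

        // Missing option: NullPointerException, as seen in the report.
        try {
            requireOption(options, "hostname"); // hypothetical missing key
        } catch (NullPointerException e) {
            System.out.println("NullPointerException for missing option");
        }
    }
}
```

In SeaTunnel 2.3.1 the Guava call at `Preconditions.java:191` carries no message, which is why the log shows only `java.lang.NullPointerException: null`; checking the connector's required options against the documented SqlServer-CDC option list is the usual next step.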
