[ 
https://issues.apache.org/jira/browse/SPARK-43865?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yuming Wang resolved SPARK-43865.
---------------------------------
    Resolution: Not A Problem

> spark cluster deploy mode cannot initialize metastore java.sql.SQLException: 
> No suitable driver found for jdbc:mysql
> --------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-43865
>                 URL: https://issues.apache.org/jira/browse/SPARK-43865
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.4.0
>            Reporter: pin_zhang
>            Priority: Major
>
> 1. Test with JDK 11 + Spark 3.4.0:
> import org.apache.spark.SparkConf
> import org.apache.spark.sql.SparkSession
> import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2
>
> object BugHS {
>   def main(args: Array[String]): Unit = {
>     val conf = new SparkConf()
>     conf.set("javax.jdo.option.ConnectionURL", "jdbc:mysql://mysql:3306/hive_ms_spark3?useSSL=false")
>     conf.set("javax.jdo.option.ConnectionDriverName", "com.mysql.jdbc.Driver")
>     conf.set("javax.jdo.option.ConnectionUserName", "**")
>     conf.set("javax.jdo.option.ConnectionPassword", "**")
>     conf.set("spark.sql.hive.thriftServer.singleSession", "false")
>     conf.set("spark.sql.warehouse.dir", "hdfs://hadoop/warehouse_spark3")
>     val spark = SparkSession
>       .builder()
>       .appName("Test")
>       .config(conf)
>       .enableHiveSupport()
>       .getOrCreate()
>     HiveThriftServer2.startWithContext(spark.sqlContext)
>     spark.sql("create table IF NOT EXISTS test2 (id int) USING parquet")
>   }
> }
> 2. Submit in cluster mode:
>    a. spark_config.properties:
>             spark.master=spark://master:6066
>             spark.jars=hdfs://hadoop/tmp/test_bug/mysql-connector-java-5.1.47.jar
>             spark.master.rest.enabled=true
>    b. spark-submit2.cmd --deploy-mode cluster --properties-file spark_config.properties --class com.test.BugHS "hdfs://hadoop/tmp/test_bug/bug_classloader.jar"
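>
>       A likely workaround (an assumption on the editor's part, not stated
>       in the original report) is to put the connector on the driver
>       process classpath directly, instead of only in spark.jars, so that
>       java.sql.DriverManager can see it when the metastore initializes.
>       The jar path below is illustrative and must exist on every node:
>
> ```properties
> # hypothetical local path; spark.driver.extraClassPath is a standard Spark config
> spark.driver.extraClassPath=/path/to/mysql-connector-java-5.1.47.jar
> ```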
> 3. The job then fails with a "No suitable driver found" exception: on 
> JDK 11, the JDBC driver shipped via spark.jars is loaded by a different 
> classloader than the metastore jars, so java.sql.DriverManager cannot 
> find it.
> java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:mysql://mysql:3306/hive_ms_spark3?useSSL=false, username = root. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
> java.sql.SQLException: No suitable driver found for jdbc:mysql://mysql:3306/hive_ms_spark3?useSSL=false
>       at java.sql/java.sql.DriverManager.getConnection(DriverManager.java:702)
>       at java.sql/java.sql.DriverManager.getConnection(DriverManager.java:189)
>       at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
>       at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
>       at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
>       at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483)
>       at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:297)
>       at jdk.internal.reflect.GeneratedConstructorAccessor77.newInstance(Unknown Source)
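
The failure above stems from java.sql.DriverManager's caller-classloader check: a driver class that is only visible to Spark's spark.jars classloader never registers with (and is ignored by) DriverManager when the metastore code path calls it. The registration mechanics can be sketched in plain Java, since the mechanism lives in the JDK; StubMySqlDriver below is a hypothetical stand-in for com.mysql.jdbc.Driver, not a real connector:

```java
import java.sql.Connection;
import java.sql.Driver;
import java.sql.DriverManager;
import java.sql.DriverPropertyInfo;
import java.sql.SQLException;
import java.util.Properties;
import java.util.logging.Logger;

// Hypothetical stand-in for com.mysql.jdbc.Driver: just enough of the
// java.sql.Driver contract to demonstrate DriverManager registration.
class StubMySqlDriver implements Driver {
    public boolean acceptsURL(String url) {
        return url != null && url.startsWith("jdbc:mysql:");
    }
    public Connection connect(String url, Properties info) throws SQLException {
        if (!acceptsURL(url)) return null;  // not our URL: DriverManager tries the next driver
        throw new SQLException("stub driver: no real connection");
    }
    public int getMajorVersion() { return 0; }
    public int getMinorVersion() { return 0; }
    public DriverPropertyInfo[] getPropertyInfo(String url, Properties info) {
        return new DriverPropertyInfo[0];
    }
    public boolean jdbcCompliant() { return false; }
    public Logger getParentLogger() { return Logger.getGlobal(); }
}

public class DriverRegistrationDemo {
    public static void main(String[] args) throws SQLException {
        String url = "jdbc:mysql://mysql:3306/hive_ms_spark3?useSSL=false";
        try {
            // No driver is registered for jdbc:mysql yet, so this throws --
            // the same "No suitable driver" the bug report hits.
            DriverManager.getDriver(url);
            System.out.println("before: found");
        } catch (SQLException e) {
            System.out.println("before: " + e.getMessage());
        }
        // Explicit registration from the application classloader. This is the
        // step that never happens when the connector arrives only via a
        // classloader that DriverManager's caller cannot see.
        DriverManager.registerDriver(new StubMySqlDriver());
        System.out.println("after: " + DriverManager.getDriver(url).getClass().getSimpleName());
    }
}
```

A driver on spark.driver.extraClassPath is loaded by the same classloader hierarchy as the metastore code, which is why it passes this check while a spark.jars-only driver does not.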



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
