https://issues.apache.org/jira/browse/SPARK-40405?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17603861#comment-17603861
Hyukjin Kwon commented on SPARK-40405:
--------------------------------------

[~ghsea] Seems like it's a classpath problem. How do you reproduce this issue?

> sparksql throws exception while reading by jdbc
> -----------------------------------------------
>
>                 Key: SPARK-40405
>                 URL: https://issues.apache.org/jira/browse/SPARK-40405
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.2.1
>            Reporter: ghsea
>            Priority: Major
>
> The sample code from the JDBC data source documentation
> (https://spark.apache.org/docs/3.2.1/sql-data-sources-jdbc.html) throws an
> exception while reading data over JDBC:
>
> Dataset<Row> jdbcDF = spark.read()
>     .format("jdbc")
>     .option("url", "jdbc:postgresql:dbserver")
>     .option("dbtable", "schema.tablename")
>     .option("user", "username")
>     .option("password", "password")
>     .load();
>
> Exception:
>
> java.lang.NoClassDefFoundError: org/apache/spark/sql/sources/v2/DataSourceV2
>   at java.lang.ClassLoader.defineClass1(Native Method)
>   at java.lang.ClassLoader.defineClass(ClassLoader.java:757)
>   at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>   at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
>   at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
>   at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
>   at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
>   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:406)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:406)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
>   at java.lang.Class.forName0(Native Method)
>   at java.lang.Class.forName(Class.java:348)
>   at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
>   at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
>   at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
>   at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:46)
>   at scala.collection.Iterator.foreach(Iterator.scala:943)
>   at scala.collection.Iterator.foreach$(Iterator.scala:943)
>   at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
>   at scala.collection.IterableLike.foreach(IterableLike.scala:74)
>   at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
>   at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
>   at scala.collection.TraversableLike.filterImpl(TraversableLike.scala:303)
>   at scala.collection.TraversableLike.filterImpl$(TraversableLike.scala:297)
>   at scala.collection.AbstractTraversable.filterImpl(Traversable.scala:108)
>   at scala.collection.TraversableLike.filter(TraversableLike.scala:395)
>   at scala.collection.TraversableLike.filter$(TraversableLike.scala:395)
>   at scala.collection.AbstractTraversable.filter(Traversable.scala:108)
>   at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:652)
>   at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSourceV2(DataSource.scala:720)
>   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:210)
>   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:174)
>   ... 47 elided
> Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.sources.v2.DataSourceV2
>   at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
>   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
>   ... 83 more
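
For what it's worth, org.apache.spark.sql.sources.v2.DataSourceV2 is the old
Spark 2.x data source API that was removed in Spark 3.0, so this trace usually
points to a connector jar built against Spark 2.x still sitting on the Spark
3.2.1 classpath, where the ServiceLoader scan visible above trips over it. A
minimal diagnostic sketch, assuming it runs with the same classpath as the
failing application (the class name here is made up for illustration), that
prints which jars register a Spark data source:

    // Lists every META-INF/services registration of Spark's DataSourceRegister
    // visible on the current classpath; each printed URL names the jar that
    // ships it, so the jar whose implementation still targets the removed
    // sources.v2 API can be identified and upgraded or removed.
    import java.net.URL;
    import java.util.Enumeration;

    public class ListDataSourceRegistrations {
        public static void main(String[] args) throws Exception {
            Enumeration<URL> urls = Thread.currentThread().getContextClassLoader()
                .getResources("META-INF/services/org.apache.spark.sql.sources.DataSourceRegister");
            while (urls.hasMoreElements()) {
                // Each URL embeds the path of the providing jar.
                System.out.println(urls.nextElement());
            }
        }
    }

Any registration coming from a jar outside the stock Spark 3.2.1 distribution
would be the usual suspect.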