Gankun Luo created SPARK-5239:
---------------------------------

             Summary: JdbcRDD throws "java.lang.AbstractMethodError: oracle.jdbc.driver.xxxxxx.isClosed()Z"
                 Key: SPARK-5239
                 URL: https://issues.apache.org/jira/browse/SPARK-5239
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.2.0, 1.1.1
         Environment: centos6.4 + ojdbc14
            Reporter: Gankun Luo
            Priority: Minor
I tried to use JdbcRDD to operate on a table in an Oracle database, but it failed. My test code is as follows:

{code}
import java.sql.DriverManager

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.rdd.JdbcRDD

object JdbcRDD4Oracle {
  def main(args: Array[String]) {
    val sc = new SparkContext(
      new SparkConf().setAppName("JdbcRDD4Oracle").setMaster("local[2]"))
    val rdd = new JdbcRDD(sc, () => getConnection, getSQL, 12987, 13055, 3,
      r => (r.getObject("HISTORY_ID"), r.getObject("APPROVE_OPINION")))
    println(rdd.collect.toList)
    sc.stop()
  }

  def getConnection() = {
    Class.forName("oracle.jdbc.driver.OracleDriver").newInstance()
    DriverManager.getConnection("jdbc:oracle:thin:@hadoop000:1521/ORCL", "scott", "tiger")
  }

  def getSQL() = {
    "select HISTORY_ID,APPROVE_OPINION from CI_APPROVE_HISTORY WHERE HISTORY_ID >=? AND HISTORY_ID <=?"
  }
}
{code}

When I run the example, I get the following exception:

{code}
09:56:48,302 [Executor task launch worker-0] ERROR Logging$class : Error in TaskCompletionListener
java.lang.AbstractMethodError: oracle.jdbc.driver.OracleResultSetImpl.isClosed()Z
	at org.apache.spark.rdd.JdbcRDD$$anon$1.close(JdbcRDD.scala:99)
	at org.apache.spark.util.NextIterator.closeIfNeeded(NextIterator.scala:63)
	at org.apache.spark.rdd.JdbcRDD$$anon$1$$anonfun$1.apply(JdbcRDD.scala:71)
	at org.apache.spark.rdd.JdbcRDD$$anon$1$$anonfun$1.apply(JdbcRDD.scala:71)
	at org.apache.spark.TaskContext$$anon$1.onTaskCompletion(TaskContext.scala:85)
	at org.apache.spark.TaskContext$$anonfun$markTaskCompleted$1.apply(TaskContext.scala:110)
	at org.apache.spark.TaskContext$$anonfun$markTaskCompleted$1.apply(TaskContext.scala:108)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
	at org.apache.spark.TaskContext.markTaskCompleted(TaskContext.scala:108)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:64)
	at org.apache.spark.scheduler.Task.run(Task.scala:54)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:181)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:744)
09:56:48,302 [Executor task launch worker-1] ERROR Logging$class : Error in TaskCompletionListener
{code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
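A note on the likely root cause: {{ResultSet.isClosed()}} was only added to the JDBC interfaces in JDBC 4.0 (Java 6). The ojdbc14 driver targets JDBC 3.0 and does not implement it, so the call in {{JdbcRDD}}'s close handler fails with {{AbstractMethodError}} at runtime. Upgrading to a JDBC-4.0-capable driver (e.g. ojdbc6) should avoid the error. As an illustration only (this is a hypothetical helper, not Spark's actual fix), a close routine can be made defensive against pre-JDBC-4.0 drivers like so:

{code}
import java.sql.ResultSet

object SafeClose {
  // Hypothetical helper for illustration: guard the JDBC 4.0 isClosed()
  // call so that a JDBC 3.0 driver (e.g. ojdbc14), which lacks the method,
  // does not kill the task with AbstractMethodError.
  def closeResultSet(rs: ResultSet): Unit = {
    if (rs != null) {
      try {
        val alreadyClosed =
          try rs.isClosed
          catch { case _: AbstractMethodError => false } // old driver: assume open
        if (!alreadyClosed) rs.close()
      } catch {
        case e: Exception =>
          // Closing should never fail the task; just log and continue.
          System.err.println(s"Exception while closing ResultSet: $e")
      }
    }
  }
}
{code}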