[ https://issues.apache.org/jira/browse/SPARK-10625?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14746450#comment-14746450 ]

Peng Cheng commented on SPARK-10625:
------------------------------------

branch: https://github.com/Schedule1/spark/tree/SPARK-10625
I will submit a pull request after I have fixed all the tests.
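
One way to sidestep the problem (a minimal sketch; the helper name below is
illustrative and not necessarily what the branch does) is to copy only the
string-valued entries of the connection Properties into a fresh instance
before it is captured by the task closure, so that objects injected by a
driver, like the UnserializableH2Driver in the quoted test below, never get
serialized:

  import java.util.Properties
  import scala.collection.JavaConverters._

  // Sketch only: stringPropertyNames() returns just the keys whose value is
  // a String, so non-String objects added via put(...) are silently dropped.
  def stringPropertiesOnly(props: Properties): Properties = {
    val copy = new Properties()
    props.stringPropertyNames().asScala.foreach { key =>
      copy.setProperty(key, props.getProperty(key))
    }
    copy
  }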

> Spark SQL JDBC read/write is unable to handle JDBC drivers that add 
> unserializable objects into connection properties
> ----------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-10625
>                 URL: https://issues.apache.org/jira/browse/SPARK-10625
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.1, 1.5.0
>         Environment: Ubuntu 14.04
>            Reporter: Peng Cheng
>              Labels: jdbc, spark, sparksql
>
> Some JDBC drivers (e.g. SAP HANA) try to optimize connection pooling by 
> adding new objects into the connection properties, which Spark then reuses 
> and ships to the workers. When some of these new objects are not 
> serializable, this triggers an org.apache.spark.SparkException: Task not 
> serializable. The following test code snippet demonstrates the problem 
> using a modified H2 driver:
>   test("INSERT to JDBC Datasource with UnserializableH2Driver") {
>     object UnserializableH2Driver extends org.h2.Driver {
>       override def connect(url: String, info: Properties): Connection = {
>         val result = super.connect(url, info)
>         info.put("unserializableDriver", this)
>         result
>       }
>       override def getParentLogger: Logger = ???
>     }
>     import scala.collection.JavaConversions._
>     val oldDrivers = 
> DriverManager.getDrivers.filter(_.acceptsURL("jdbc:h2:")).toSeq
>     oldDrivers.foreach{
>       DriverManager.deregisterDriver
>     }
>     DriverManager.registerDriver(UnserializableH2Driver)
>     sql("INSERT INTO TABLE PEOPLE1 SELECT * FROM PEOPLE")
>     assert(2 === sqlContext.read.jdbc(url1, "TEST.PEOPLE1", properties).count)
>     assert(2 === sqlContext.read.jdbc(url1, "TEST.PEOPLE1", 
> properties).collect()(0).length)
>     DriverManager.deregisterDriver(UnserializableH2Driver)
>     oldDrivers.foreach{
>       DriverManager.registerDriver
>     }
>   }


