Hi,

I need to access a JDBC data source from my Java Spark code, and I am thinking of using JdbcRDD as documented at
http://spark.incubator.apache.org/docs/0.8.0/api/core/org/apache/spark/rdd/JdbcRDD.html

I have a few questions.

When does JdbcRDD decide to close the connection? The documentation for the getConnection parameter says:

    getConnection - a function that returns an open Connection. The RDD
    takes care of closing the connection.
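For context, here is roughly what I have in mind (a minimal Scala sketch modeled on the linked test suite; the Derby URL and the FOO table are placeholders, not my real source):

    import java.sql.{DriverManager, ResultSet}
    import org.apache.spark.SparkContext
    import org.apache.spark.rdd.JdbcRDD

    object JdbcExample {
      def main(args: Array[String]) {
        val sc = new SparkContext("local", "JdbcExample")

        // getConnection is invoked when a partition's task runs; JdbcRDD
        // closes the returned Connection when the task completes.
        val rdd = new JdbcRDD(
          sc,
          () => DriverManager.getConnection("jdbc:derby:target/testdb"), // placeholder URL
          "SELECT DATA FROM FOO WHERE ? <= ID AND ID <= ?", // the two ?s are bound per partition
          1, 100, 3,                                        // lowerBound, upperBound, numPartitions
          (r: ResultSet) => r.getInt(1))                    // map each row to a value

        println(rdd.count())
        sc.stop()
      }
    }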

Is there any setting that tells Spark to keep JdbcRDD connections open for the next query, instead of opening a new connection to the same JDBC source each time?
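For example, would handing JdbcRDD a pooled getConnection be the recommended approach? An untested sketch, assuming Commons DBCP is on the classpath; since close() on a pooled connection returns it to the pool rather than tearing it down, JdbcRDD's close should be harmless:

    import java.sql.Connection
    import org.apache.commons.dbcp.BasicDataSource

    // One pool per executor JVM. When JdbcRDD calls close() on a pooled
    // connection, DBCP returns it to the pool instead of dropping it.
    object ConnectionPool {
      private lazy val ds: BasicDataSource = {
        val d = new BasicDataSource
        d.setDriverClassName("org.apache.derby.jdbc.EmbeddedDriver") // placeholder driver
        d.setUrl("jdbc:derby:target/testdb")                         // placeholder URL
        d.setMaxActive(8)
        d
      }
      def getConnection(): Connection = ds.getConnection()
    }

    // Usage: new JdbcRDD(sc, () => ConnectionPool.getConnection(), ...)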

Also, looking at
https://github.com/apache/incubator-spark/blob/branch-0.8/core/src/test/scala/org/apache/spark/rdd/JdbcRDDSuite.scala


I see that the suite explicitly shuts the database down in the after { } block. If the RDD takes care of closing the connection, why do we have to explicitly invoke

     DriverManager.getConnection("jdbc:derby:;shutdown=true")
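If I read the suite correctly, the after { } block looks roughly like this (paraphrased), and the shutdown=true URL stops the embedded Derby engine itself rather than closing an individual JdbcRDD connection:

    after {
      try {
        // Shuts down the whole embedded Derby engine, not a single
        // connection; Derby signals a successful engine shutdown by
        // throwing an SQLException with SQLState XJ015.
        DriverManager.getConnection("jdbc:derby:;shutdown=true")
      } catch {
        case se: SQLException if se.getSQLState == "XJ015" => // expected on clean shutdown
      }
    }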

Thanks,
Hussam


