Hi all,

Has anyone tried using a user-defined database API for Postgres on Spark
1.5.0 onwards?
I have a build that uses:

Spark           = 1.5.1
ScalikeJDBC     = 2.3+
Postgres driver = postgresql-9.3-1102-jdbc41.jar

Writing a DataFrame to Postgres through the Spark SQL API works, but
writing a Spark RDD to Postgres using ScalikeJDBC does not. My code gets
past the Class.forName("org.postgresql.Driver") line, which means the
driver is loaded (the Spark SQL API also works).
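For what it's worth, DriverManager only raises "No suitable driver found for <url>" when no *registered* driver accepts the URL string, even if the driver class itself is loaded. A minimal sketch that reproduces the exact message locally, independent of Spark (the credentials are placeholders, and whether this is the actual cause here is an open question):

```scala
import java.sql.{DriverManager, SQLException}

// Every registered driver must *accept* the URL; a bare host without the
// "jdbc:postgresql://" prefix is rejected by all of them, so DriverManager
// reports "No suitable driver found". User/password are placeholders.
val msg =
  try {
    DriverManager.getConnection("10.224.36.151", "user", "pass")
    "connected"
  } catch {
    case e: SQLException => e.getMessage
  }
println(msg)  // "No suitable driver found for 10.224.36.151"
```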

Do I need to load the jar differently so that non-Spark code will see it?

Things I have tried
===================
1) Shading the Postgres jar into my assembly, without any success.
2) Providing the Postgres jar via the extra classpath to the executors.
3) Took a sip of coffee and cola in an alternating fashion.
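For reference, the failing write path looks roughly like this (host, database, credentials and table are placeholders, not the real values); the driver is loaded inside the partition closure, so registration should happen in each executor JVM rather than only on the driver:

```scala
import org.apache.spark.rdd.RDD
import scalikejdbc._

// Sketch of the RDD-to-Postgres write using ScalikeJDBC. All connection
// details below are placeholders for illustration.
def writeToPostgres(rdd: RDD[String]): Unit =
  rdd.foreachPartition { rows =>
    Class.forName("org.postgresql.Driver")  // register driver on this executor
    ConnectionPool.singleton(
      "jdbc:postgresql://dbhost:5432/mydb", "user", "pass")  // full JDBC URL
    DB.localTx { implicit session =>
      rows.foreach { r =>
        sql"insert into events (payload) values ($r)".update.apply()
      }
    }
  }
```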

The stack trace is given below.

java.sql.SQLException: No suitable driver found for 10.224.36.151
        at java.sql.DriverManager.getConnection(DriverManager.java:596)
        at java.sql.DriverManager.getConnection(DriverManager.java:215)
        at org.apache.commons.dbcp2.DriverManagerConnectionFactory.createConnection(DriverManagerConnectionFactory.java:77)
        at org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:256)
        at org.apache.commons.pool2.impl.GenericObjectPool.create(GenericObjectPool.java:868)
        at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:435)
        at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:363)
        at org.apache.commons.dbcp2.PoolingDataSource.getConnection(PoolingDataSource.java:134)
        at scalikejdbc.Commons2ConnectionPool.borrow(Commons2ConnectionPool.scala:41)
        at scalikejdbc.DB$.localTx(DB.scala:257)
        at com.exactearth.lvi.db.TODB$$anonfun$insertTODB$1.apply(TODB.scala:71)
        at com.exactearth.lvi.db.TODB$$anonfun$insertTODB$1.apply(TODB.scala:69)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:898)
        at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$29.apply(RDD.scala:898)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1848)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1848)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
        at org.apache.spark.scheduler.Task.run(Task.scala:88)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Possible-bug-in-Spark-1-5-0-onwards-while-loading-Postgres-JDBC-driver-tp25579.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
