Hi All,
I am currently using Spark 1.4.1, and my Spark job has to fetch data from
a PostgreSQL database using JdbcRDD.
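For context, the RDD is built roughly as in the sketch below (the URL,
credentials, table, and column names are placeholders). The explicit
Class.forName call inside the connection factory is a commonly suggested
workaround, since java.sql.DriverManager only uses drivers visible to the
calling classloader and may not see a driver shipped via --jars:

import java.sql.{DriverManager, ResultSet}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.JdbcRDD

object PgFetch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("PgFetch"))

    val rows = new JdbcRDD(
      sc,
      () => {
        // Register the driver explicitly: DriverManager performs a caller
        // classloader check, so a driver loaded only through --jars may
        // otherwise not be found on the executors.
        Class.forName("org.postgresql.Driver")
        DriverManager.getConnection(
          "jdbc:postgresql://dbhost:5432/mydb", "user", "password")
      },
      "SELECT id, name FROM mytable WHERE id >= ? AND id <= ?",
      1, 1000, 4, // lower bound, upper bound, number of partitions
      (r: ResultSet) => (r.getInt(1), r.getString(2)))

    rows.collect().foreach(println)
    sc.stop()
  }
}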
I am submitting my Spark job with the --jars option to pass the PostgreSQL
JDBC driver, but I am still getting the error below:

"java.sql.SQLException: No suitable driver found for PostgreSQL JDBC"

When the same job is run through the Spark shell, it works fine.

Several blogs mention that this was fixed in Spark 1.4.1 by simply passing
the JDBC driver through the --jars option, but I am still stuck.

I have tried the options below (a sketch of the combined spark-submit
command follows the list):


   1. SPARK_CLASSPATH=/path/postgresql.jar in spark/conf/spark-defaults.conf

   2. --driver-class-path /path/postgresql.jar and --conf
   spark.executor.extraClassPath=/path/postgresql.jar

   3. --jars /path/postgresql.jar

   4. Currently trying to add SPARK_CLASSPATH in the file
   "compute-classpath.sh" on each node of the cluster

Please let me know if you have any inputs on how to proceed further.

Regards
Satish Chandra
