Hi Stephen,

I forgot to mention that I added the lines below to spark-defaults.conf on the 
node running the Spark SQL Thrift JDBC/ODBC server, and then restarted it.

spark.driver.extraClassPath=/usr/share/java/postgresql-9.3-1104.jdbc41.jar
spark.executor.extraClassPath=/usr/share/java/postgresql-9.3-1104.jdbc41.jar
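
In case it helps, the same jar can also be passed when (re)starting the Thrift 
server itself, instead of (or in addition to) the spark-defaults.conf settings. 
This is just a sketch — it assumes a standard Spark layout under $SPARK_HOME 
and the same jar path as above:

```shell
# Restart the Thrift server with the driver jar shipped explicitly.
# Assumes SPARK_HOME is set and the jar path matches your install.
$SPARK_HOME/sbin/stop-thriftserver.sh
$SPARK_HOME/sbin/start-thriftserver.sh \
  --jars /usr/share/java/postgresql-9.3-1104.jdbc41.jar
```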

I read in another thread that this should work. I was able to create the table, 
and it shows up in my SHOW TABLES list. But when I query the table, I get the 
same "No suitable driver found" error. It feels like I'm getting close.
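
For reference, some threads also suggest naming the driver class explicitly 
with the `driver` option in the table definition, in case DriverManager can't 
locate it on its own. A sketch, with the same placeholders as in my original 
post:

```sql
CREATE TEMPORARY TABLE impressions
USING org.apache.spark.sql.jdbc
OPTIONS (
  url "jdbc:postgresql://<PostgreSQL_Hostname_IP>/<database_name>",
  dbtable "impressions",
  driver "org.postgresql.Driver"
);
```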

Can you think of anything else I need to do?

Thanks,
Ben


> On Dec 22, 2015, at 6:25 PM, Stephen Boesch <java...@gmail.com> wrote:
> 
> The PostgreSQL JDBC driver needs to be added to the classpath of your Spark 
> workers. You can search for how to do that (there are multiple ways).
> 
> 2015-12-22 17:22 GMT-08:00 b2k70 <bbuil...@gmail.com>:
> I see in the Spark SQL documentation that a temporary table can be created
> directly onto a remote PostgreSQL table.
> 
> CREATE TEMPORARY TABLE <table_name>
> USING org.apache.spark.sql.jdbc
> OPTIONS (
>   url "jdbc:postgresql://<PostgreSQL_Hostname_IP>/<database_name>",
>   dbtable "impressions"
> );
> 
> When I run this against our PostgreSQL server, I get the following error.
> 
> Error: java.sql.SQLException: No suitable driver found for
> jdbc:postgresql://<PostgreSQL_Hostname_IP>/<database_name> (state=,code=0)
> 
> Can someone help me understand why this is?
> 
> Thanks, Ben
> 
> 
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-1-5-2-missing-JDBC-driver-for-PostgreSQL-tp25773.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 
> 
