Hi All,

I’m currently having trouble getting Spark to read Postgres tables 
that have uuid-type columns through the PySpark shell.

I can connect and read tables which do not have a uuid column, but I get the 
error "java.sql.SQLException: Unsupported type 1111" when I try to read a 
table which does have a uuid column. Is there any way I can access these?
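In case it helps anyone hitting the same thing: type 1111 is java.sql.Types.OTHER, which is what the Postgres JDBC driver reports for uuid. A common workaround (untested on my end, and the table/column names below are made up) is to pass a subquery as `dbtable` that casts the uuid column to text:

```python
# Sketch of a workaround: cast uuid columns to text inside a subquery so the
# JDBC driver reports a type Spark understands. Table/column names here
# ("events", "event_id", etc.) are hypothetical.
def uuid_safe_dbtable(table, uuid_cols, other_cols):
    """Build a JDBC dbtable subquery that casts uuid columns to text."""
    select = ", ".join(
        ["%s::text AS %s" % (c, c) for c in uuid_cols] + list(other_cols)
    )
    return "(SELECT %s FROM %s) AS tmp" % (select, table)

dbtable = uuid_safe_dbtable("events", ["event_id"], ["payload", "created_at"])

# Then in the PySpark shell (sqlContext is predefined there; URL/credentials
# are placeholders):
# df = sqlContext.read.format("jdbc").options(
#     url="jdbc:postgresql://localhost:5432/mydb",
#     dbtable=dbtable,
#     driver="org.postgresql.Driver",
# ).load()
```

Spark wraps whatever string you pass as `dbtable` in its own SELECT, so a parenthesized subquery with an alias is accepted in place of a plain table name.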

See the pastebin: http://pastebin.com/VbpU4uRU 
for more info and the PySpark shell readout.

I’m using Postgres 9.4, Spark 1.5.1, OpenJDK 1.7.0_79, and the 
postgresql-9.4-1206-jdbc41 JDBC driver.

Chris
