Hello,

When I use PySpark to save a DataFrame to a PostgreSQL database, I run
into an error where the INSERT statements for uuid columns are not
constructed properly: the values are sent as character varying, which
Postgres rejects for a uuid column. There are several questions on
Stack Overflow about the same issue, for example:

https://stackoverflow.com/questions/64671739/pyspark-nullable-uuid-type-uuid-but-expression-is-of-type-character-varying
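
For reference, here is a minimal sketch that reproduces the error; the
table name, database, and credentials below are placeholders. The
stringtype=unspecified connection option (a common workaround for this)
avoids the failure, but it changes how every string column is sent:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("uuid-repro").getOrCreate()

    # Target table created in Postgres as:  CREATE TABLE events (id uuid);
    df = spark.createDataFrame(
        [("123e4567-e89b-12d3-a456-426614174000",)], ["id"])

    props = {"user": "postgres", "password": "postgres",
             "driver": "org.postgresql.Driver"}

    # Fails: Spark maps the string column to VARCHAR, and Postgres
    # rejects inserting character varying into a uuid column.
    df.write.jdbc("jdbc:postgresql://localhost:5432/mydb",
                  "events", mode="append", properties=props)

    # Workaround: stringtype=unspecified makes the driver send strings
    # as untyped literals, so Postgres casts them to uuid server-side.
    df.write.jdbc(
        "jdbc:postgresql://localhost:5432/mydb?stringtype=unspecified",
        "events", mode="append", properties=props)

The workaround relies on server-side casts for all string columns, so
it feels like a stopgap rather than real UUID support.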

I would like to add support for saving UUIDs to PostgreSQL in PySpark.

How do I identify what is causing this error? Is this something that
needs to be fixed in the PySpark code, the core Apache Spark code, or
the PostgreSQL JDBC driver? Does anyone have advice on how I should
approach fixing this issue?

Thanks,
Denise
