Github user maropu commented on the issue:

    https://github.com/apache/spark/pull/19999
  
    I noticed that, in the current master, Spark throws a runtime exception if the type of a given partition column does not match the actual column type:
    ```
    scala> jdbcTable.show
    17/12/16 17:47:59 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
    org.postgresql.util.PSQLException: ERROR: operator does not exist: text < integer
      Hint: No operator matches the given name and argument type(s). You might need to add explicit type casts.
      Position: 83
            at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2182)
            at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1911)
            at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:173)
            at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:616)
            at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:466)
            at org.postgresql.jdbc2.AbstractJdbc2Statement.executeQuery(AbstractJdbc2Statement.java:351)
            at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD.compute(JDBCRDD.scala:301)
            at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
            at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
    ```
    IMHO we'd better check for this type mismatch as early as possible, before execution.
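    One possible shape for such an early check (a minimal sketch only: the `DataType`/`StructType` hierarchy here is a simplified stand-in for Spark's own types, and `validatePartitionColumn` is a hypothetical helper, not an existing Spark API):

    ```scala
    // Minimal sketch: fail fast when the partition column is not numeric,
    // instead of letting the database reject the generated
    // "col >= lower AND col < upper" predicates at runtime.
    // These types are simplified stand-ins for Spark's StructType/StructField.
    sealed trait DataType
    case object IntegerType extends DataType
    case object LongType    extends DataType
    case object StringType  extends DataType

    case class StructField(name: String, dataType: DataType)
    case class StructType(fields: Seq[StructField])

    // Hypothetical analysis-time check, run before any partitioned query is built.
    def validatePartitionColumn(schema: StructType, partitionColumn: String): Unit = {
      val field = schema.fields.find(_.name == partitionColumn).getOrElse {
        throw new IllegalArgumentException(
          s"Partition column '$partitionColumn' not found in schema")
      }
      field.dataType match {
        case IntegerType | LongType => // numeric bounds are comparable; OK
        case other => throw new IllegalArgumentException(
          s"Partition column '$partitionColumn' has type $other; a numeric type is required")
      }
    }
    ```

    Doing this at analysis time would surface a clear error message instead of the driver-specific `PSQLException` above.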

