[ https://issues.apache.org/jira/browse/SPARK-25453?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Xiao Li resolved SPARK-25453.
-----------------------------
    Resolution: Fixed
 Fix Version/s: 2.4.0

> OracleIntegrationSuite IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
> ---------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-25453
>                 URL: https://issues.apache.org/jira/browse/SPARK-25453
>             Project: Spark
>          Issue Type: Test
>          Components: Tests
>    Affects Versions: 2.4.0
>            Reporter: Yuming Wang
>            Assignee: Chenxiao Mao
>            Priority: Major
>             Fix For: 2.4.0
>
>
> {noformat}
> - SPARK-22814 support date/timestamp types in partitionColumn *** FAILED ***
>   java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
>   at java.sql.Timestamp.valueOf(Timestamp.java:204)
>   at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.toInternalBoundValue(JDBCRelation.scala:183)
>   at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.columnPartition(JDBCRelation.scala:88)
>   at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:36)
>   at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:318)
>   at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
>   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
>   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
>   at org.apache.spark.sql.jdbc.OracleIntegrationSuite$$anonfun$18.apply(OracleIntegrationSuite.scala:445)
>   at org.apache.spark.sql.jdbc.OracleIntegrationSuite$$anonfun$18.apply(OracleIntegrationSuite.scala:427)
>   ...
> {noformat}
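For context, the stack trace shows JDBCRelation$.toInternalBoundValue handing the partition bound string to java.sql.Timestamp.valueOf, which only accepts the JDBC escape format yyyy-[m]m-[d]d hh:mm:ss[.f...]. The snippet below is a minimal sketch of that parsing constraint outside Spark; the literal timestamp strings are illustrative only, not the actual bound values used by OracleIntegrationSuite.

{code:scala}
import java.sql.Timestamp

// Accepted: JDBC escape format yyyy-[m]m-[d]d hh:mm:ss[.f...]
val ok = Timestamp.valueOf("2018-07-06 08:19:08.231")

// Rejected: any other layout, e.g. ISO-8601 with a 'T' separator,
// triggers the same IllegalArgumentException seen in the test failure.
try {
  Timestamp.valueOf("2018-07-06T08:19:08")
} catch {
  case e: IllegalArgumentException =>
    // Prints: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
    println(e.getMessage)
}
{code}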