Repository: spark
Updated Branches:
  refs/heads/branch-2.4 7b1094b54 -> 82990e5ef


[SPARK-25453][SQL][TEST] OracleIntegrationSuite IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]

## What changes were proposed in this pull request?
This PR aims to fix the failing `OracleIntegrationSuite` test, which throws `IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]` because the Oracle DATE partition column is resolved as a Catalyst Timestamp, so the date-only partition bounds cannot be evaluated.
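
For a bit of added context (an inference from the error message, not a quote of Spark internals): the exception in the title is the standard `java.sql.Timestamp.valueOf` message, which surfaces when a date-only partition bound is evaluated for a column that was resolved as a timestamp. A minimal REPL-style sketch of that failure mode:

```scala
// Minimal sketch of the failure mode (assumption: the message originates from
// java.sql.Timestamp.valueOf; this is inferred from the exception text).
val bound = "2018-07-06"

// Fails: Timestamp.valueOf requires "yyyy-[m]m-[d]d hh:mm:ss[.f...]".
// java.sql.Timestamp.valueOf(bound)
// => java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]

// Works: the same string is a valid java.sql.Date, which is why keeping the Oracle
// DATE column as a date (oracle.jdbc.mapDateToTimestamp=false) avoids the error.
val d: java.sql.Date = java.sql.Date.valueOf(bound)
```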

## How was this patch tested?
Existing integration tests.

Closes #22461 from seancxmao/SPARK-25453.

Authored-by: seancxmao <seancx...@gmail.com>
Signed-off-by: gatorsmile <gatorsm...@gmail.com>
(cherry picked from commit 21f0b73dbcd94f9eea8cbc06a024b0e899edaf4c)
Signed-off-by: gatorsmile <gatorsm...@gmail.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/82990e5e
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/82990e5e
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/82990e5e

Branch: refs/heads/branch-2.4
Commit: 82990e5efba3693cdaf02f325ca677cb5f7425fc
Parents: 7b1094b
Author: seancxmao <seancx...@gmail.com>
Authored: Sun Sep 30 22:49:14 2018 -0700
Committer: gatorsmile <gatorsm...@gmail.com>
Committed: Sun Sep 30 22:49:27 2018 -0700

----------------------------------------------------------------------
 docs/sql-programming-guide.md                               | 2 +-
 .../org/apache/spark/sql/jdbc/OracleIntegrationSuite.scala  | 9 +++++++++
 2 files changed, 10 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/82990e5e/docs/sql-programming-guide.md
----------------------------------------------------------------------
diff --git a/docs/sql-programming-guide.md b/docs/sql-programming-guide.md
index 2546064..d525405 100644
--- a/docs/sql-programming-guide.md
+++ b/docs/sql-programming-guide.md
@@ -1500,7 +1500,7 @@ See the [Apache Avro Data Source Guide](avro-data-source-guide.html).
 
 * The JDBC driver class must be visible to the primordial class loader on the client session and on all executors. This is because Java's DriverManager class does a security check that results in it ignoring all drivers not visible to the primordial class loader when one goes to open a connection. One convenient way to do this is to modify compute_classpath.sh on all worker nodes to include your driver JARs.
 * Some databases, such as H2, convert all names to upper case. You'll need to use upper case to refer to those names in Spark SQL.
-
+ * Users can pass vendor-specific JDBC connection properties through the data source options when a driver needs special treatment. For example, `spark.read.format("jdbc").option("url", oracleJdbcUrl).option("oracle.jdbc.mapDateToTimestamp", "false")`. `oracle.jdbc.mapDateToTimestamp` defaults to true; users often need to disable this flag to avoid Oracle DATE values being resolved as timestamps.
 
 # Performance Tuning
 

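The documentation bullet added above is essentially a complete reader call. As a hedged illustration, a minimal sketch of passing such a vendor-specific property through the data source options (`oracleJdbcUrl` is the placeholder name used in the bullet; the `dbtable` value below is likewise hypothetical):

```scala
// Sketch only. Options the Spark JDBC source does not recognize itself are forwarded
// to the JDBC driver as connection properties, which is how oracle.jdbc.mapDateToTimestamp
// reaches the Oracle driver.
val df = spark.read
  .format("jdbc")
  .option("url", oracleJdbcUrl)                       // e.g. jdbc:oracle:thin:@//host:1521/service
  .option("dbtable", "my_schema.my_table")            // hypothetical table name
  .option("oracle.jdbc.mapDateToTimestamp", "false")  // keep Oracle DATE as a date, not a timestamp
  .load()
```
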
http://git-wip-us.apache.org/repos/asf/spark/blob/82990e5e/external/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/OracleIntegrationSuite.scala
----------------------------------------------------------------------
diff --git a/external/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/OracleIntegrationSuite.scala b/external/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/OracleIntegrationSuite.scala
index 09a2cd8..70d294d 100644
--- a/external/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/OracleIntegrationSuite.scala
+++ b/external/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/OracleIntegrationSuite.scala
@@ -442,6 +442,12 @@ class OracleIntegrationSuite extends DockerJDBCIntegrationSuite with SharedSQLCo
       .option("lowerBound", "2018-07-06")
       .option("upperBound", "2018-07-20")
       .option("numPartitions", 3)
+      // oracle.jdbc.mapDateToTimestamp defaults to true. If this flag is not disabled, column d
+      // (Oracle DATE) will be resolved as Catalyst Timestamp, which will fail bound evaluation of
+      // the partition column. E.g. 2018-07-06 cannot be evaluated as Timestamp, and the error
+      // message says: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff].
+      .option("oracle.jdbc.mapDateToTimestamp", "false")
+      .option("sessionInitStatement", "ALTER SESSION SET NLS_DATE_FORMAT = 'YYYY-MM-DD'")
       .load()
 
     df1.logicalPlan match {
@@ -462,6 +468,9 @@ class OracleIntegrationSuite extends DockerJDBCIntegrationSuite with SharedSQLCo
       .option("lowerBound", "2018-07-04 03:30:00.0")
       .option("upperBound", "2018-07-27 14:11:05.0")
       .option("numPartitions", 2)
+      .option("oracle.jdbc.mapDateToTimestamp", "false")
+      .option("sessionInitStatement",
+        "ALTER SESSION SET NLS_TIMESTAMP_FORMAT = 'YYYY-MM-DD HH24:MI:SS.FF'")
       .load()
 
     df2.logicalPlan match {

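Reading the two hunks together, the date-partitioned load in the suite now looks roughly like the sketch below. This is a paraphrase, not the test source: only the options visible in the first hunk come from the diff, the partition column `d` is named in the added comment, and `jdbcUrl` plus the table name are placeholders.

```scala
// Rough sketch of the patched date-partitioned read; jdbcUrl and the dbtable value
// are placeholders, the partition column "d" is named in the added comment above.
val df1 = spark.read
  .format("jdbc")
  .option("url", jdbcUrl)
  .option("dbtable", "datetime_partition_test")  // hypothetical table name
  .option("partitionColumn", "d")                // Oracle DATE column
  .option("lowerBound", "2018-07-06")
  .option("upperBound", "2018-07-20")
  .option("numPartitions", 3)
  // Keep DATE as DateType so the date-only bounds above can be evaluated.
  .option("oracle.jdbc.mapDateToTimestamp", "false")
  // Have Oracle parse the pushed-down date literals with a matching format.
  .option("sessionInitStatement", "ALTER SESSION SET NLS_DATE_FORMAT = 'YYYY-MM-DD'")
  .load()
```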
