[GitHub] [spark] gengliangwang commented on a change in pull request #33640: [SPARK-36409][SQL][TESTS] Splitting test cases from datetime.sql

2021-08-05 Thread GitBox


gengliangwang commented on a change in pull request #33640:
URL: https://github.com/apache/spark/pull/33640#discussion_r683144515



##
File path: sql/core/src/test/resources/sql-tests/results/ansi/date.sql.out
##
@@ -0,0 +1,525 @@
+-- Automatically generated by SQLQueryTestSuite
+-- Number of queries: 60
+
+
+-- !query
+select date '2019-01-01\t'
+-- !query schema
+struct
+-- !query output
+2019-01-01
+
+
+-- !query
+select date '2020-01-01中文'
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.sql.catalyst.parser.ParseException
+
+Cannot parse the DATE value: 2020-01-01中文(line 1, pos 7)
+
+== SQL ==
+select date '2020-01-01中文'
+---^^^
+
+
+-- !query
+select date'99-03-18', date'-0001-1-28', date'0015'
+-- !query schema
+struct
+-- !query output
++99-03-18  -0001-01-28 0015-01-01
+
+
+-- !query
+select date'015'
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.sql.catalyst.parser.ParseException
+
+Cannot parse the DATE value: 015(line 1, pos 7)
+
+== SQL ==
+select date'015'
+---^^^
+
+
+-- !query
+select date'2021-4294967297-11'
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.sql.catalyst.parser.ParseException
+
+Cannot parse the DATE value: 2021-4294967297-11(line 1, pos 7)
+
+== SQL ==
+select date'2021-4294967297-11'
+---^^^
+
+
+-- !query
+select current_date = current_date
+-- !query schema
+struct<(current_date() = current_date()):boolean>
+-- !query output
+true
+
+
+-- !query
+select current_date() = current_date()
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.sql.catalyst.parser.ParseException
+
+no viable alternative at input 'current_date'(line 1, pos 7)
+
+== SQL ==
+select current_date() = current_date()
+---^^^
+
+
+-- !query
+select DATE_FROM_UNIX_DATE(0), DATE_FROM_UNIX_DATE(1000), DATE_FROM_UNIX_DATE(null)
+-- !query schema
+struct
+-- !query output
+1970-01-01 1972-09-27  NULL
+
+
+-- !query
+select UNIX_DATE(DATE('1970-01-01')), UNIX_DATE(DATE('2020-12-04')), UNIX_DATE(null)
+-- !query schema
+struct
+-- !query output
+0  18600   NULL
+
+
+-- !query
+select to_date(null), to_date('2016-12-31'), to_date('2016-12-31', 'yyyy-MM-dd')
+-- !query schema
+struct
+-- !query output
+NULL   2016-12-31  2016-12-31
+
+
+-- !query
+select to_date("16", "dd")
+-- !query schema
+struct
+-- !query output
+1970-01-16
+
+
+-- !query
+select to_date("02-29", "MM-dd")
+-- !query schema
+struct<>
+-- !query output
+java.time.DateTimeException
+Invalid date 'February 29' as '1970' is not a leap year
+
+
+-- !query
+select dayofweek('2007-02-03'), dayofweek('2009-07-30'), dayofweek('2017-05-27'),
+  dayofweek(null), dayofweek('1582-10-15 13:10:15'), dayofweek(timestamp'1582-10-15 13:10:15')
+-- !query schema
+struct
+-- !query output
+7  5   7   NULL6   6
+
+
+-- !query
+select weekday('2007-02-03'), weekday('2009-07-30'), weekday('2017-05-27'),
+  weekday(null), weekday('1582-10-15 13:10:15'), weekday(timestamp'1582-10-15 13:10:15')
+-- !query schema
+struct
+-- !query output
+5  3   5   NULL4   4
+
+
+-- !query
+select year('1500-01-01'), year('1582-10-15 13:10:15'), year(timestamp'1582-10-15 13:10:15')
+-- !query schema
+struct
+-- !query output
+1500   15821582
+
+
+-- !query
+select month('1500-01-01'), month('1582-10-15 13:10:15'), month(timestamp'1582-10-15 13:10:15')
+-- !query schema
+struct
+-- !query output
+1  10  10
+
+
+-- !query
+select dayOfYear('1500-01-01'), dayOfYear('1582-10-15 13:10:15'), dayOfYear(timestamp'1582-10-15 13:10:15')
+-- !query schema
+struct
+-- !query output
+1  288 288
+
+
+-- !query
+select next_day("2015-07-23", "Mon")
+-- !query schema
+struct
+-- !query output
+2015-07-27
+
+
+-- !query
+select next_day("2015-07-23", "xx")
+-- !query schema
+struct<>
+-- !query output
+java.lang.IllegalArgumentException
+Illegal input for day of week: xx
+
+
+-- !query
+select next_day("2015-07-23 12:12:12", "Mon")
+-- !query schema
+struct
+-- !query output
+2015-07-27
+
+
+-- !query
+select next_day(timestamp"2015-07-23 12:12:12", "Mon")

Review comment:
   I tried on Oracle:
   ```
   create table foo(ts timestamp);
   insert into foo values(timestamp'2015-07-23 12:12:12');
   select next_day(ts, 'Mon') from foo;
   ```
   The result is 
   ```
   2015-07-27T12:12:12Z
   ```
   Spark's current implementation for next_day(timestamp) is wrong.
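   For comparison, a minimal Spark SQL sketch of the same case (hypothetical; the expected value assumes the Oracle-style semantics above, where the time-of-day is preserved instead of being truncated to a date):
   ```
   -- sketch only, not part of the PR
   select next_day(timestamp'2015-07-23 12:12:12', 'Mon');
   -- Oracle-style result keeps the time portion: 2015-07-27 12:12:12
   -- a date-typed result (2015-07-27) would silently drop it
   ```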
   

##
File path: sql/core/src/test/resources/sql-tests/results/timestampNTZ/timestamp.sql.out
##
@@ -0,0 +1,579 @@
+-- Automatically generated by SQLQueryTestSuite
+-- Number of queries: 68
+
+
+-- !query
+select timestamp '2019-01-01\t'
+-- !query schema
+struct
+-- !query output
+2019-01-01 00:00:00
+
+
+-- !query
+select timestamp '2019-01-01中文'
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.sql.catalyst.parser.ParseException
+
+Cannot parse the TIMESTAMP value: 2019-01-01中文(line 1, pos 7)
+
+== SQL ==
+select 

[GitHub] [spark] gengliangwang commented on a change in pull request #33640: [SPARK-36409][SQL][TESTS] Splitting test cases from datetime.sql

2021-08-05 Thread GitBox


gengliangwang commented on a change in pull request #33640:
URL: https://github.com/apache/spark/pull/33640#discussion_r683188298



##
File path: sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/ThriftServerQueryTestSuite.scala
##
@@ -83,9 +83,12 @@ class ThriftServerQueryTestSuite extends SQLQueryTestSuite with SharedThriftServ
 "postgreSQL/case.sql",
 // SPARK-28624
 "date.sql",
-"datetime.sql",
+"timestamp.sql",
 "datetime-legacy.sql",
-"ansi/datetime.sql",
+"ansi/date.sql",

Review comment:
   Do we need to ignore all of these tests? I believe some of them can pass.







[GitHub] [spark] gengliangwang commented on a change in pull request #33640: [SPARK-36409][SQL][TESTS] Splitting test cases from datetime.sql

2021-08-05 Thread GitBox


gengliangwang commented on a change in pull request #33640:
URL: https://github.com/apache/spark/pull/33640#discussion_r683187391



##
File path: sql/core/src/test/resources/sql-tests/inputs/timestamp_ltz.sql
##
@@ -0,0 +1,31 @@
+-- timestamp_ltz literals and constructors
+--CONFIG_DIM1 spark.sql.timestampType=TIMESTAMP_LTZ
+--CONFIG_DIM1 spark.sql.timestampType=TIMESTAMP_NTZ
+
+select timestamp_ltz'2016-12-31 00:12:00', timestamp_ltz'2016-12-31';
+
+select to_timestamp_ltz(null), to_timestamp_ltz('2016-12-31 00:12:00'), to_timestamp_ltz('2016-12-31', 'yyyy-MM-dd');
+-- `to_timestamp_ltz` can also take date input
+select to_timestamp_ltz(to_date(null)), to_timestamp_ltz(to_date('2016-12-31'));
+-- `to_timestamp_ltz` can also take timestamp_ntz input
+select to_timestamp_ltz(to_timestamp_ntz(null)), to_timestamp_ltz(to_timestamp_ntz('2016-12-31 00:12:00'));
+
+-- TimestampLTZ numeric fields constructor
+SELECT make_timestamp_ltz(2021, 07, 11, 6, 30, 45.678);
+SELECT make_timestamp_ltz(2021, 07, 11, 6, 30, 45.678, 'CET');
+SELECT make_timestamp_ltz(2021, 07, 11, 6, 30, 60.007);
+
+-- [SPARK-31710] TIMESTAMP_SECONDS, TIMESTAMP_MILLISECONDS and TIMESTAMP_MICROSECONDS that always create timestamp_ltz
+select TIMESTAMP_SECONDS(1230219000),TIMESTAMP_SECONDS(-1230219000),TIMESTAMP_SECONDS(null);

Review comment:
   Shall we move these test cases to `timestamp.sql`? That way this file only covers the functions with the `_ltz` suffix, and developers can judge more easily where new cases should go.
   Also, there are overflow test cases below. Even though the current behavior is not affected by ANSI mode, we had better cover ANSI mode as well.
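   A sketch (not from the PR) of the kind of ANSI-mode check being suggested, assuming the `--SET` directive of SQLQueryTestSuite and a seconds value large enough to overflow the internal microsecond range:
   ```
   --SET spark.sql.ansi.enabled=true
   -- hypothetical overflow case: 1230219000123123 seconds does not fit into microseconds
   select TIMESTAMP_SECONDS(1230219000123123);
   ```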







[GitHub] [spark] gengliangwang commented on a change in pull request #33640: [SPARK-36409][SQL][TESTS] Splitting test cases from datetime.sql

2021-08-05 Thread GitBox


gengliangwang commented on a change in pull request #33640:
URL: https://github.com/apache/spark/pull/33640#discussion_r683158809



##
File path: sql/core/src/test/resources/sql-tests/inputs/date.sql
##
@@ -0,0 +1,106 @@
+-- date literals, functions and operations
+
+select date '2019-01-01\t';
+select date '2020-01-01中文';
+
+-- date with year outside [0000-9999]
+select date'99-03-18', date'-0001-1-28', date'0015';
+-- invalid: year field must have at least 4 digits
+select date'015';
+-- invalid: month field can have at most 2 digits
+select date'2021-4294967297-11';
+
+select current_date = current_date;
+-- under ANSI mode, `current_date` can't be a function name.
+select current_date() = current_date();
+
+-- conversions between date and unix_date (number of days from epoch)
+select DATE_FROM_UNIX_DATE(0), DATE_FROM_UNIX_DATE(1000), DATE_FROM_UNIX_DATE(null);
+select UNIX_DATE(DATE('1970-01-01')), UNIX_DATE(DATE('2020-12-04')), UNIX_DATE(null);
+
+select to_date(null), to_date('2016-12-31'), to_date('2016-12-31', 'yyyy-MM-dd');
+
+-- missing fields in `to_date`
+select to_date("16", "dd");
+-- invalid: there is no 29 in February, 1970
+select to_date("02-29", "MM-dd");
+
+-- `dayofweek` accepts both date and timestamp inputs.
+select dayofweek('2007-02-03'), dayofweek('2009-07-30'), dayofweek('2017-05-27'),
+  dayofweek(null), dayofweek('1582-10-15 13:10:15'), dayofweek(timestamp'1582-10-15 13:10:15');
+
+-- `weekday` accepts both date and timestamp inputs.
+select weekday('2007-02-03'), weekday('2009-07-30'), weekday('2017-05-27'),
+  weekday(null), weekday('1582-10-15 13:10:15'), weekday(timestamp'1582-10-15 13:10:15');
+
+-- `year` accepts both date and timestamp inputs.
+select year('1500-01-01'), year('1582-10-15 13:10:15'), year(timestamp'1582-10-15 13:10:15');
+
+-- `month` accepts both date and timestamp inputs.
+select month('1500-01-01'), month('1582-10-15 13:10:15'), month(timestamp'1582-10-15 13:10:15');
+
+-- `dayOfYear` accepts both date and timestamp inputs.
+select dayOfYear('1500-01-01'), dayOfYear('1582-10-15 13:10:15'), dayOfYear(timestamp'1582-10-15 13:10:15');

Review comment:
   ditto
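   A sketch of the extra timestamp_ntz cases implied here (hypothetical, mirroring the existing timestamp inputs):
   ```
   select year(timestamp_ntz'1582-10-15 13:10:15'),
          month(timestamp_ntz'1582-10-15 13:10:15'),
          dayOfYear(timestamp_ntz'1582-10-15 13:10:15');
   ```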







[GitHub] [spark] gengliangwang commented on a change in pull request #33640: [SPARK-36409][SQL][TESTS] Splitting test cases from datetime.sql

2021-08-05 Thread GitBox


gengliangwang commented on a change in pull request #33640:
URL: https://github.com/apache/spark/pull/33640#discussion_r683158333



##
File path: sql/core/src/test/resources/sql-tests/inputs/date.sql
##
@@ -0,0 +1,106 @@
+-- date literals, functions and operations
+
+select date '2019-01-01\t';
+select date '2020-01-01中文';
+
+-- date with year outside [0000-9999]
+select date'99-03-18', date'-0001-1-28', date'0015';
+-- invalid: year field must have at least 4 digits
+select date'015';
+-- invalid: month field can have at most 2 digits
+select date'2021-4294967297-11';
+
+select current_date = current_date;
+-- under ANSI mode, `current_date` can't be a function name.
+select current_date() = current_date();
+
+-- conversions between date and unix_date (number of days from epoch)
+select DATE_FROM_UNIX_DATE(0), DATE_FROM_UNIX_DATE(1000), DATE_FROM_UNIX_DATE(null);
+select UNIX_DATE(DATE('1970-01-01')), UNIX_DATE(DATE('2020-12-04')), UNIX_DATE(null);
+
+select to_date(null), to_date('2016-12-31'), to_date('2016-12-31', 'yyyy-MM-dd');
+
+-- missing fields in `to_date`
+select to_date("16", "dd");
+-- invalid: there is no 29 in February, 1970
+select to_date("02-29", "MM-dd");
+
+-- `dayofweek` accepts both date and timestamp inputs.
+select dayofweek('2007-02-03'), dayofweek('2009-07-30'), dayofweek('2017-05-27'),
+  dayofweek(null), dayofweek('1582-10-15 13:10:15'), dayofweek(timestamp'1582-10-15 13:10:15');
+
+-- `weekday` accepts both date and timestamp inputs.
+select weekday('2007-02-03'), weekday('2009-07-30'), weekday('2017-05-27'),
+  weekday(null), weekday('1582-10-15 13:10:15'), weekday(timestamp'1582-10-15 13:10:15');
+
+-- `year` accepts both date and timestamp inputs.
+select year('1500-01-01'), year('1582-10-15 13:10:15'), year(timestamp'1582-10-15 13:10:15');

Review comment:
   ditto







[GitHub] [spark] gengliangwang commented on a change in pull request #33640: [SPARK-36409][SQL][TESTS] Splitting test cases from datetime.sql

2021-08-05 Thread GitBox


gengliangwang commented on a change in pull request #33640:
URL: https://github.com/apache/spark/pull/33640#discussion_r683158241



##
File path: sql/core/src/test/resources/sql-tests/inputs/date.sql
##
@@ -0,0 +1,106 @@
+-- date literals, functions and operations
+
+select date '2019-01-01\t';
+select date '2020-01-01中文';
+
+-- date with year outside [0000-9999]
+select date'99-03-18', date'-0001-1-28', date'0015';
+-- invalid: year field must have at least 4 digits
+select date'015';
+-- invalid: month field can have at most 2 digits
+select date'2021-4294967297-11';
+
+select current_date = current_date;
+-- under ANSI mode, `current_date` can't be a function name.
+select current_date() = current_date();
+
+-- conversions between date and unix_date (number of days from epoch)
+select DATE_FROM_UNIX_DATE(0), DATE_FROM_UNIX_DATE(1000), DATE_FROM_UNIX_DATE(null);
+select UNIX_DATE(DATE('1970-01-01')), UNIX_DATE(DATE('2020-12-04')), UNIX_DATE(null);
+
+select to_date(null), to_date('2016-12-31'), to_date('2016-12-31', 'yyyy-MM-dd');
+
+-- missing fields in `to_date`
+select to_date("16", "dd");
+-- invalid: there is no 29 in February, 1970
+select to_date("02-29", "MM-dd");
+
+-- `dayofweek` accepts both date and timestamp inputs.
+select dayofweek('2007-02-03'), dayofweek('2009-07-30'), dayofweek('2017-05-27'),
+  dayofweek(null), dayofweek('1582-10-15 13:10:15'), dayofweek(timestamp'1582-10-15 13:10:15');
+
+-- `weekday` accepts both date and timestamp inputs.
+select weekday('2007-02-03'), weekday('2009-07-30'), weekday('2017-05-27'),
+  weekday(null), weekday('1582-10-15 13:10:15'), weekday(timestamp'1582-10-15 13:10:15');

Review comment:
   We need to test timestamp_ntz input as well.
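   For example (a sketch, not part of the PR), mirroring the timestamp inputs above:
   ```
   select dayofweek(timestamp_ntz'1582-10-15 13:10:15'),
          weekday(timestamp_ntz'1582-10-15 13:10:15');
   ```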







[GitHub] [spark] gengliangwang commented on a change in pull request #33640: [SPARK-36409][SQL][TESTS] Splitting test cases from datetime.sql

2021-08-05 Thread GitBox


gengliangwang commented on a change in pull request #33640:
URL: https://github.com/apache/spark/pull/33640#discussion_r683158166



##
File path: sql/core/src/test/resources/sql-tests/inputs/date.sql
##
@@ -0,0 +1,106 @@
+-- date literals, functions and operations
+
+select date '2019-01-01\t';
+select date '2020-01-01中文';
+
+-- date with year outside [0000-9999]
+select date'99-03-18', date'-0001-1-28', date'0015';
+-- invalid: year field must have at least 4 digits
+select date'015';
+-- invalid: month field can have at most 2 digits
+select date'2021-4294967297-11';
+
+select current_date = current_date;
+-- under ANSI mode, `current_date` can't be a function name.
+select current_date() = current_date();
+
+-- conversions between date and unix_date (number of days from epoch)
+select DATE_FROM_UNIX_DATE(0), DATE_FROM_UNIX_DATE(1000), DATE_FROM_UNIX_DATE(null);
+select UNIX_DATE(DATE('1970-01-01')), UNIX_DATE(DATE('2020-12-04')), UNIX_DATE(null);
+
+select to_date(null), to_date('2016-12-31'), to_date('2016-12-31', 'yyyy-MM-dd');
+
+-- missing fields in `to_date`
+select to_date("16", "dd");
+-- invalid: there is no 29 in February, 1970
+select to_date("02-29", "MM-dd");
+
+-- `dayofweek` accepts both date and timestamp inputs.
+select dayofweek('2007-02-03'), dayofweek('2009-07-30'), dayofweek('2017-05-27'),
+  dayofweek(null), dayofweek('1582-10-15 13:10:15'), dayofweek(timestamp'1582-10-15 13:10:15');
+
+-- `weekday` accepts both date and timestamp inputs.
+select weekday('2007-02-03'), weekday('2009-07-30'), weekday('2017-05-27'),
+  weekday(null), weekday('1582-10-15 13:10:15'), weekday(timestamp'1582-10-15 13:10:15');
+
+-- `year` accepts both date and timestamp inputs.
+select year('1500-01-01'), year('1582-10-15 13:10:15'), year(timestamp'1582-10-15 13:10:15');
+
+-- `month` accepts both date and timestamp inputs.
+select month('1500-01-01'), month('1582-10-15 13:10:15'), month(timestamp'1582-10-15 13:10:15');
+
+-- `dayOfYear` accepts both date and timestamp inputs.
+select dayOfYear('1500-01-01'), dayOfYear('1582-10-15 13:10:15'), dayOfYear(timestamp'1582-10-15 13:10:15');
+
+-- next_day
+select next_day("2015-07-23", "Mon");
+select next_day("2015-07-23", "xx");
+select next_day("2015-07-23 12:12:12", "Mon");
+-- next_day does not accept timestamp input
+select next_day(timestamp"2015-07-23 12:12:12", "Mon");

Review comment:
   We need to test timestamp_ntz input as well.
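   Something like the following sketch (hypothetical case, same pattern as the timestamp literal above) would cover it:
   ```
   select next_day(timestamp_ntz'2015-07-23 12:12:12', 'Mon');
   ```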




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] gengliangwang commented on a change in pull request #33640: [SPARK-36409][SQL][TESTS] Splitting test cases from datetime.sql

2021-08-05 Thread GitBox


gengliangwang commented on a change in pull request #33640:
URL: https://github.com/apache/spark/pull/33640#discussion_r683153484



##
File path: sql/core/src/test/resources/sql-tests/results/timestampNTZ/timestamp.sql.out
##
@@ -0,0 +1,579 @@
+-- Automatically generated by SQLQueryTestSuite
+-- Number of queries: 68
+
+
+-- !query
+select timestamp '2019-01-01\t'
+-- !query schema
+struct
+-- !query output
+2019-01-01 00:00:00
+
+
+-- !query
+select timestamp '2019-01-01中文'
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.sql.catalyst.parser.ParseException
+
+Cannot parse the TIMESTAMP value: 2019-01-01中文(line 1, pos 7)
+
+== SQL ==
+select timestamp '2019-01-01中文'
+---^^^
+
+
+-- !query
+select timestamp'-1969-12-31 16:00:00', timestamp'0015-03-18 16:00:00', timestamp'-01', timestamp'9-03-18T12:03:17'
+-- !query schema
+struct
+-- !query output
+-1969-12-31 16:00:00   0015-03-18 16:00:00 -0001-01-01 00:00:00
+9-03-18 12:03:17
+
+
+-- !query
+select timestamp'4294967297'
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.sql.catalyst.parser.ParseException
+
+Cannot parse the TIMESTAMP value: 4294967297(line 1, pos 7)
+
+== SQL ==
+select timestamp'4294967297'
+---^^^
+
+
+-- !query
+select timestamp'2021-01-01T12:30:4294967297.123456'
+-- !query schema
+struct<>
+-- !query output
+org.apache.spark.sql.catalyst.parser.ParseException
+
+Cannot parse the TIMESTAMP value: 2021-01-01T12:30:4294967297.123456(line 1, pos 7)
+
+== SQL ==
+select timestamp'2021-01-01T12:30:4294967297.123456'
+---^^^
+
+
+-- !query
+select current_timestamp = current_timestamp
+-- !query schema
+struct<(current_timestamp() = current_timestamp()):boolean>
+-- !query output
+true
+
+
+-- !query
+select current_timestamp() = current_timestamp()
+-- !query schema
+struct<(current_timestamp() = current_timestamp()):boolean>
+-- !query output
+true
+
+
+-- !query
+select localtimestamp() = localtimestamp()
+-- !query schema
+struct<(localtimestamp() = localtimestamp()):boolean>
+-- !query output
+true
+
+
+-- !query
+SELECT make_timestamp(2021, 07, 11, 6, 30, 45.678)
+-- !query schema
+struct
+-- !query output
+2021-07-11 06:30:45.678
+
+
+-- !query
+SELECT make_timestamp(2021, 07, 11, 6, 30, 45.678, 'CET')
+-- !query schema
+struct
+-- !query output
+2021-07-11 06:30:45.678
+
+
+-- !query
+SELECT make_timestamp(2021, 07, 11, 6, 30, 60.007)
+-- !query schema
+struct
+-- !query output
+NULL
+
+
+-- !query
+create temporary view ttf1 as select * from values
+  (1, 2),
+  (2, 3)
+  as ttf1(current_date, current_timestamp)
+-- !query schema
+struct<>
+-- !query output
+
+
+
+-- !query
+select current_date, current_timestamp from ttf1
+-- !query schema
+struct
+-- !query output
+1  2
+2  3
+
+
+-- !query
+create temporary view ttf2 as select * from values
+  (1, 2),
+  (2, 3)
+  as ttf2(a, b)
+-- !query schema
+struct<>
+-- !query output
+
+
+
+-- !query
+select current_date = current_date(), current_timestamp = current_timestamp(), a, b from ttf2
+-- !query schema
+struct<(current_date() = current_date()):boolean,(current_timestamp() = current_timestamp()):boolean,a:int,b:int>
+-- !query output
+true   true1   2
+true   true2   3
+
+
+-- !query
+select a, b from ttf2 order by a, current_date
+-- !query schema
+struct
+-- !query output
+1  2
+2  3
+
+
+-- !query
+select UNIX_SECONDS(timestamp'2020-12-01 14:30:08Z'), UNIX_SECONDS(timestamp'2020-12-01 14:30:08.99Z'), UNIX_SECONDS(null)
+-- !query schema
+struct
+-- !query output
+1606833008 1606833008  NULL
+
+
+-- !query
+select UNIX_MILLIS(timestamp'2020-12-01 14:30:08Z'), UNIX_MILLIS(timestamp'2020-12-01 14:30:08.99Z'), UNIX_MILLIS(null)
+-- !query schema
+struct
+-- !query output
+1606833008000  1606833008999   NULL
+
+
+-- !query
+select UNIX_MICROS(timestamp'2020-12-01 14:30:08Z'), UNIX_MICROS(timestamp'2020-12-01 14:30:08.99Z'), UNIX_MICROS(null)
+-- !query schema
+struct
+-- !query output
+160683300800   160683300899NULL
+
+
+-- !query
+select to_timestamp(null), to_timestamp('2016-12-31 00:12:00'), to_timestamp('2016-12-31', 'yyyy-MM-dd')
+-- !query schema
+struct
+-- !query output
+NULL   2016-12-31 00:12:00 2016-12-31 00:00:00
+
+
+-- !query
+select to_timestamp('2019-10-06 10:11:12.', '-MM-dd HH:mm:ss.SS[zzz]')
+-- !query schema
+struct
+-- !query output
+NULL
+
+
+-- !query
+select to_timestamp('2019-10-06 10:11:12.0', '-MM-dd HH:mm:ss.SS[zzz]')
+-- !query schema
+struct
+-- !query output
+2019-10-06 10:11:12
+
+
+-- !query
+select to_timestamp('2019-10-06 10:11:12.1', '-MM-dd HH:mm:ss.SS[zzz]')
+-- !query schema
+struct
+-- !query output
+2019-10-06 10:11:12.1
+
+
+-- !query
+select to_timestamp('2019-10-06 10:11:12.12', '-MM-dd HH:mm:ss.SS[zzz]')
+-- !query schema
+struct
+-- !query output
+2019-10-06 10:11:12.12
+
+
+-- !query
+select to_timestamp('2019-10-06 10:11:12.123UTC', 
