[ https://issues.apache.org/jira/browse/SPARK-32133?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17148201#comment-17148201 ]

Apache Spark commented on SPARK-32133:
--------------------------------------

User 'TJX2014' has created a pull request for this issue:
https://github.com/apache/spark/pull/28926

> Forbid time field steps for date start/end in Sequence
> ------------------------------------------------------
>
>                 Key: SPARK-32133
>                 URL: https://issues.apache.org/jira/browse/SPARK-32133
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: JinxinTang
>            Priority: Major
>             Fix For: 3.1.0
>
>
> *A sequence with a time-field step over date start/end values behaves strangely in Spark:*
> scala> sql("select explode(sequence(cast('2011-03-01' as date), cast('2011-05-01' as date), interval 1 second))").head(3)
> res26: Array[org.apache.spark.sql.Row] = Array([2011-03-01], [2011-03-01], [2011-03-01])
> scala> sql("select explode(sequence(cast('2011-03-01' as date), cast('2011-05-01' as date), interval 1 minute))").head(3)
> res27: Array[org.apache.spark.sql.Row] = Array([2011-03-01], [2011-03-01], [2011-03-01])
> scala> sql("select explode(sequence(cast('2011-03-01' as date), cast('2011-05-01' as date), interval 1 hour))").head(3)
> res28: Array[org.apache.spark.sql.Row] = Array([2011-03-01], [2011-03-01], [2011-03-01])
> *The same query in Presto, by contrast, behaves sensibly:*
> presto> select sequence(date('2011-03-01'),date('2011-03-02'),interval '1' hour);
> Query 20200624_122744_00002_pehix failed: sequence step must be a day interval if start and end values are dates
> presto> select sequence(date('2011-03-01'),date('2011-03-02'),interval '1' day);
> _col0
> [2011-03-01, 2011-03-02]
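The duplicated rows above presumably arise because the sequence is generated at sub-day resolution and each element is then truncated back to a date. Below is a minimal, self-contained Scala sketch of the kind of validation this issue asks for, mirroring Presto's error message; the IntervalStep case class and validateDateSequenceStep helper are hypothetical stand-ins for Spark's internal CalendarInterval handling, not the actual change in the pull request above.

  // Hypothetical stand-in for Spark's CalendarInterval: month and day
  // fields plus a sub-day remainder stored in microseconds.
  case class IntervalStep(months: Int, days: Int, microseconds: Long)

  // A sequence over DATE bounds only makes sense with day- or
  // month-granularity steps, so reject any non-zero sub-day component.
  def validateDateSequenceStep(step: IntervalStep): Unit =
    require(step.microseconds == 0L,
      "sequence step must be a day interval if start and end values are dates")

  validateDateSequenceStep(IntervalStep(0, 1, 0L))          // ok: interval 1 day
  validateDateSequenceStep(IntervalStep(0, 0, 3600000000L)) // fails: interval 1 hour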


