LuciferYang commented on code in PR #38507:
URL: https://github.com/apache/spark/pull/38507#discussion_r1014785958


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala:
##########
@@ -2810,15 +2879,15 @@ case class Sequence(
     if (typesCorrect) {
       TypeCheckResult.TypeCheckSuccess
     } else {
-      TypeCheckResult.TypeCheckFailure(
-        s"""
-           |$prettyName uses the wrong parameter type. The parameter type must conform to:
-           |1. The start and stop expressions must resolve to the same type.
-           |2. If start and stop expressions resolve to the 'date' or 'timestamp' type
-           |then the step expression must resolve to the 'interval' or
-           |'${YearMonthIntervalType.simpleString}' or '${DayTimeIntervalType.simpleString}' type,
-           |otherwise to the same type as the start and stop expressions.
-         """.stripMargin)
+      DataTypeMismatch(
+        errorSubClass = "SEQUENCE_WRONG_INPUT_TYPES",
+        messageParameters = Map(
+          "functionName" -> toSQLId(prettyName),
+          "startType" -> toSQLType(TypeCollection(TimestampType, TimestampNTZType, DateType)),
+          "stepType" -> toSQLType(

Review Comment:
   This is why I created SPARK-41011. 
   
   The original error message only considered `startType` being `TimestampType`, `TimestampNTZType`, or `DateType`.
   
   Let me add `IntegralType`-related content to the error message, and then refine it further in SPARK-41011.
   
   Or you can also consider completing SPARK-41011 together in this PR.
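   
   A rough sketch of what I have in mind (only the `startType` entry changes; whether `IntegralType` should be folded into the same `TypeCollection` or reported separately is something we can settle in SPARK-41011):
   
   ```scala
   DataTypeMismatch(
     errorSubClass = "SEQUENCE_WRONG_INPUT_TYPES",
     messageParameters = Map(
       "functionName" -> toSQLId(prettyName),
       // Sketch only: also report the integral start/stop types,
       // not just the date/time ones.
       "startType" -> toSQLType(TypeCollection(
         TimestampType, TimestampNTZType, DateType, IntegralType))
       // the stepType entry would stay as in the current diff
     ))
   ```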
   


