[ https://issues.apache.org/jira/browse/SPARK-36300?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-36300:
------------------------------------

    Assignee:     (was: Apache Spark)

> Refactor eleventh set of 20 query execution errors to use error classes
> -----------------------------------------------------------------------
>
>                 Key: SPARK-36300
>                 URL: https://issues.apache.org/jira/browse/SPARK-36300
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core, SQL
>    Affects Versions: 3.2.0
>            Reporter: Karen Feng
>            Priority: Major
>
> Refactor some exceptions in 
> [QueryExecutionErrors|https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryExecutionErrors.scala]
>  to use error classes.
> There are currently ~350 exceptions in this file, so this PR focuses only on 
> the eleventh set of 20.
> {code:java}
> expressionDecodingError
> expressionEncodingError
> classHasUnexpectedSerializerError
> cannotGetOuterPointerForInnerClassError
> userDefinedTypeNotAnnotatedAndRegisteredError
> invalidInputSyntaxForBooleanError
> unsupportedOperandTypeForSizeFunctionError
> unexpectedValueForStartInFunctionError
> unexpectedValueForLengthInFunctionError
> sqlArrayIndexNotStartAtOneError
> concatArraysWithElementsExceedLimitError
> flattenArraysWithElementsExceedLimitError
> createArrayWithElementsExceedLimitError
> unionArrayWithElementsExceedLimitError
> initialTypeNotTargetDataTypeError
> initialTypeNotTargetDataTypesError
> cannotConvertColumnToJSONError
> malformedRecordsDetectedInSchemaInferenceError
> malformedJSONError
> malformedRecordsDetectedInSchemaInferenceError
> {code}
> For more detail, see the parent ticket SPARK-36094.
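> As a rough, non-authoritative sketch of what "use error classes" means here (the exception subclass, the error class name, and the constructor signature below are assumptions for illustration, not taken from this ticket), the refactor typically moves the hard-coded message text out of the method and into a named error class in error-classes.json, so the thrown exception carries the class name plus its message parameters:
> {code:java}
> // Illustrative only; the identifiers below (SparkRuntimeException, the
> // "INVALID_INPUT_SYNTAX_FOR_BOOLEAN" error class, and the exact parameter
> // types) are assumptions, not confirmed by this ticket.
> import org.apache.spark.SparkRuntimeException
> import org.apache.spark.unsafe.types.UTF8String
>
> // Before (paraphrased): a plain exception built from an inline message string.
> // def invalidInputSyntaxForBooleanError(input: UTF8String): RuntimeException =
> //   new RuntimeException(s"invalid input syntax for type boolean: $input")
>
> // After (sketch): the message template lives in error-classes.json under a
> // named error class, and the exception carries that class name plus its
> // message parameters instead of a hard-coded string.
> def invalidInputSyntaxForBooleanError(input: UTF8String): SparkRuntimeException =
>   new SparkRuntimeException(
>     errorClass = "INVALID_INPUT_SYNTAX_FOR_BOOLEAN",
>     messageParameters = Array(input.toString))
> {code}
> Keeping the message templates centralized in error-classes.json is what lets each of the ~350 methods in QueryExecutionErrors report a stable, documented error class instead of an ad-hoc string.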



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
