[ https://issues.apache.org/jira/browse/SPARK-41118?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-41118:
------------------------------------

    Assignee:     (was: Apache Spark)

> to_number/try_to_number throws NullPointerException when format is null
> -----------------------------------------------------------------------
>
>                 Key: SPARK-41118
>                 URL: https://issues.apache.org/jira/browse/SPARK-41118
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.4.0, 3.3.1
>            Reporter: Bruce Robbins
>            Priority: Minor
>
> Example:
> {noformat}
> spark-sql> SELECT to_number('454', null);
> [INTERNAL_ERROR] The Spark SQL phase analysis failed with an internal error. Please, fill a bug report in, and provide the full stack trace.
> org.apache.spark.SparkException: [INTERNAL_ERROR] The Spark SQL phase analysis failed with an internal error. Please, fill a bug report in, and provide the full stack trace.
>       at org.apache.spark.SparkException$.internalError(SparkException.scala:88)
>       at org.apache.spark.sql.execution.QueryExecution$.toInternalError(QueryExecution.scala:498)
>       at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:510)
>       at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:185)
> ...
> Caused by: java.lang.NullPointerException
>       at org.apache.spark.sql.catalyst.expressions.ToNumber.numberFormat$lzycompute(numberFormatExpressions.scala:72)
>       at org.apache.spark.sql.catalyst.expressions.ToNumber.numberFormat(numberFormatExpressions.scala:72)
>       at org.apache.spark.sql.catalyst.expressions.ToNumber.numberFormatter$lzycompute(numberFormatExpressions.scala:73)
>       at org.apache.spark.sql.catalyst.expressions.ToNumber.numberFormatter(numberFormatExpressions.scala:73)
>       at org.apache.spark.sql.catalyst.expressions.ToNumber.checkInputDataTypes(numberFormatExpressions.scala:81)
> {noformat}
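> Judging by the trace, the NPE appears to come from the lazy {{numberFormat}}/{{numberFormatter}} fields in {{ToNumber}}, which seem to call {{toString}} on the evaluated format argument without a null guard, so {{checkInputDataTypes}} forces them and fails during analysis. A self-contained sketch of that suspected pattern (stand-in names only, not the actual Spark source):
> {noformat}
> import java.util.Locale
>
> // Stand-in for the format child expression; evaluating a null literal yields null.
> final case class LiteralStandIn(value: Any) {
>   def eval(): Any = value
> }
>
> // Mirrors the suspected shape of ToNumber: the lazy field dereferences the
> // evaluated format with no null check, so forcing it from checkInputDataTypes
> // surfaces a NullPointerException at analysis time.
> final class ToNumberStandIn(format: LiteralStandIn) {
>   private lazy val numberFormat: String =
>     format.eval().toString.toUpperCase(Locale.ROOT)   // NPE here when format is null
>
>   def checkInputDataTypes(): Unit =
>     require(numberFormat.nonEmpty, "format string must not be empty")
> }
>
> // SELECT to_number('454', null)  ->  the format argument evaluates to null
> new ToNumberStandIn(LiteralStandIn(null)).checkInputDataTypes()   // throws NullPointerException
> {noformat}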
> {{try_to_number}} fails the same way:
> {noformat}
> spark-sql> SELECT try_to_number('454', null);
> [INTERNAL_ERROR] The Spark SQL phase analysis failed with an internal error. Please, fill a bug report in, and provide the full stack trace.
> org.apache.spark.SparkException: [INTERNAL_ERROR] The Spark SQL phase analysis failed with an internal error. Please, fill a bug report in, and provide the full stack trace.
>       at org.apache.spark.SparkException$.internalError(SparkException.scala:88)
>       at org.apache.spark.sql.execution.QueryExecution$.toInternalError(QueryExecution.scala:498)
>       at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:510)
>       at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:185)
> ...
> Caused by: java.lang.NullPointerException
>       at org.apache.spark.sql.catalyst.expressions.ToNumber.numberFormat$lzycompute(numberFormatExpressions.scala:72)
>       at org.apache.spark.sql.catalyst.expressions.ToNumber.numberFormat(numberFormatExpressions.scala:72)
>       at org.apache.spark.sql.catalyst.expressions.ToNumber.numberFormatter$lzycompute(numberFormatExpressions.scala:73)
>       at org.apache.spark.sql.catalyst.expressions.ToNumber.numberFormatter(numberFormatExpressions.scala:73)
>       at org.apache.spark.sql.catalyst.expressions.ToNumber.checkInputDataTypes(numberFormatExpressions.scala:81)
>       at org.apache.spark.sql.catalyst.expressions.TryToNumber.checkInputDataTypes(numberFormatExpressions.scala:146)
> {noformat}
> Compare with {{to_binary}} and {{try_to_binary}}, which simply return NULL when the format argument is null:
> {noformat}
> spark-sql> SELECT to_binary('abc', null);
> NULL
> Time taken: 3.111 seconds, Fetched 1 row(s)
> spark-sql> SELECT try_to_binary('abc', null);
> NULL
> Time taken: 0.06 seconds, Fetched 1 row(s)
> spark-sql>
> {noformat}
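> Purely for illustration (not necessarily how this should or will be fixed), guarding the evaluated format before dereferencing it would let a null format be reported cleanly, or be mapped to a NULL result the way {{to_binary}}/{{try_to_binary}} behave. A sketch on the same stand-in shape as above:
> {noformat}
> import java.util.Locale
>
> // Illustrative null guard: a null format becomes a validation failure
> // (or could instead be mapped to a NULL result) rather than an NPE.
> final class NullSafeToNumberStandIn(formatValue: Any) {
>   private lazy val numberFormat: Option[String] =
>     Option(formatValue).map(_.toString.toUpperCase(Locale.ROOT))
>
>   def checkInputDataTypes(): Either[String, Unit] = numberFormat match {
>     case None     => Left("the format argument must not be null")
>     case Some("") => Left("the format string must not be empty")
>     case Some(_)  => Right(())
>   }
> }
>
> println(new NullSafeToNumberStandIn(null).checkInputDataTypes())
> // Left(the format argument must not be null)
> {noformat}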



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
