[ 
https://issues.apache.org/jira/browse/SPARK-31312?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17073407#comment-17073407
 ] 

Dongjoon Hyun commented on SPARK-31312:
---------------------------------------

It's for informing the users (and the downstream distributors) of the risk and 
recommending that they upgrade their versions. If we set 2.4.5 only, it can 
also be read as a bug introduced in 2.4.5 .

If we set it to 2.3.x at least, all 2.4.0 ~ 2.4.4 users will also understand the risk.

> Transforming Hive simple UDF (using JAR) expression may incur CNFE in later 
> evaluation
> --------------------------------------------------------------------------------------
>
>                 Key: SPARK-31312
>                 URL: https://issues.apache.org/jira/browse/SPARK-31312
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.5, 3.0.0
>            Reporter: Jungtaek Lim
>            Assignee: Jungtaek Lim
>            Priority: Major
>             Fix For: 3.0.0, 2.4.6
>
>
> In SPARK-26560, we ensured that Hive UDF using JAR is executed regardless of 
> current thread context classloader.
> [~cloud_fan] pointed out another potential issue in post-review of 
> SPARK-26560 - quoting the comment:
> {quote}
> Found a potential problem: here we call HiveSimpleUDF.dataType (which is a 
> lazy val) to force loading the class with the corrected classloader.
> However, if the expression gets transformed later, which copies 
> HiveSimpleUDF, then calling HiveSimpleUDF.dataType will re-trigger the class 
> loading, and at that time there is no guarantee that the corrected 
> classloader is used.
> I think we should materialize the loaded class in HiveSimpleUDF.
> {quote}
> This JIRA issue tracks the effort of verifying and fixing the potential 
> issue.
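The lazy-val pitfall quoted above can be sketched in a few lines of Scala. The names below (UdfLikeExpr, loadCount) are hypothetical stand-ins, not Spark's actual HiveSimpleUDF:

```scala
// Minimal sketch of the pitfall: a lazy val does not survive a case-class
// copy, so its initializer (here standing in for class loading) re-runs.
case class UdfLikeExpr(className: String) {
  // Counts how many times the simulated "class loading" side effect runs.
  var loadCount = 0
  // Simulates HiveSimpleUDF.dataType: a lazy val whose initializer would load
  // the UDF class via whatever classloader is current at force time.
  lazy val dataType: String = {
    loadCount += 1 // stands in for Class.forName via the context classloader
    s"type-of-$className"
  }
}

val original = UdfLikeExpr("com.example.MyUDF")
original.dataType              // forced once, with the corrected classloader
val copied = original.copy()   // an expression transformation copies the node
copied.dataType                // the copy's lazy val re-runs the initializer
```

Both instances end up with loadCount == 1: the copy re-triggered "class loading" independently, with no guarantee the corrected classloader was still in place. Materializing the loaded class, i.e. holding it in an eager val that is carried through copies, avoids the second load.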



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
