[ https://issues.apache.org/jira/browse/SPARK-33704?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yuming Wang resolved SPARK-33704.
---------------------------------
    Resolution: Duplicate

> Support latest version of initialize() in HiveGenericUDTF
> ---------------------------------------------------------
>
>                 Key: SPARK-33704
>                 URL: https://issues.apache.org/jira/browse/SPARK-33704
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.2.0, 2.3.4, 2.4.3
>            Reporter: chenliang
>            Priority: Major
>
> For HiveGenericUDTF, the underlying Hive GenericUDTF defines two initialize() methods:
> {code:java}
>   public StructObjectInspector initialize(StructObjectInspector argOIs)
>       throws UDFArgumentException {
>     List<? extends StructField> inputFields = argOIs.getAllStructFieldRefs();
>     ObjectInspector[] udtfInputOIs = new ObjectInspector[inputFields.size()];
>     for (int i = 0; i < inputFields.size(); i++) {
>       udtfInputOIs[i] = inputFields.get(i).getFieldObjectInspector();
>     }
>     return initialize(udtfInputOIs);
>   }
>   @Deprecated
>   public StructObjectInspector initialize(ObjectInspector[] argOIs)
>       throws UDFArgumentException {
>     throw new IllegalStateException("Should not be called directly");
>   }
> {code}
> As https://issues.apache.org/jira/browse/HIVE-5737 mentions, Hive now passes a 
> StructObjectInspector to UDTFs rather than an ObjectInspector[], but Spark SQL 
> still only calls the deprecated method.
> Before the fix, an exception is reported:
> Error in query: No handler for UDF/UDAF/UDTF 'FeatureParseUDTF1': 
> java.lang.IllegalStateException: Should not be called directly
> Please make sure your function overrides public StructObjectInspector 
> initialize(ObjectInspector[] args).; line 1 pos 7
>  
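> A minimal sketch of a UDTF that hits this: the class name FeatureParseUDTF1 is taken 
> from the error above, but the body below (a single string argument, one output column 
> named "value") is purely illustrative and not part of this ticket. Because only the 
> non-deprecated initialize(StructObjectInspector) is overridden, Spark SQL's call to the 
> deprecated initialize(ObjectInspector[]) falls through to the default body and throws 
> the IllegalStateException shown above:
> {code:java}
> import java.util.Collections;
> import java.util.List;
>
> import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
> import org.apache.hadoop.hive.ql.metadata.HiveException;
> import org.apache.hadoop.hive.ql.udf.generic.GenericUDTF;
> import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
> import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
> import org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector;
> import org.apache.hadoop.hive.serde2.objectinspector.StructField;
> import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
> import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
>
> // Illustrative UDTF: only the non-deprecated initialize(StructObjectInspector)
> // is overridden, so the deprecated initialize(ObjectInspector[]) keeps its
> // default "Should not be called directly" body, which Spark SQL invokes.
> public class FeatureParseUDTF1 extends GenericUDTF {
>
>   private transient PrimitiveObjectInspector inputOI;
>
>   @Override
>   public StructObjectInspector initialize(StructObjectInspector argOIs)
>       throws UDFArgumentException {
>     List<? extends StructField> fields = argOIs.getAllStructFieldRefs();
>     if (fields.size() != 1) {
>       throw new UDFArgumentException("FeatureParseUDTF1 takes exactly one argument");
>     }
>     inputOI = (PrimitiveObjectInspector) fields.get(0).getFieldObjectInspector();
>
>     // Output schema: a single string column named "value".
>     List<String> fieldNames = Collections.singletonList("value");
>     List<ObjectInspector> fieldOIs = Collections.singletonList(
>         (ObjectInspector) PrimitiveObjectInspectorFactory.javaStringObjectInspector);
>     return ObjectInspectorFactory.getStandardStructObjectInspector(fieldNames, fieldOIs);
>   }
>
>   @Override
>   public void process(Object[] args) throws HiveException {
>     // Emit the input value back as a one-column row.
>     forward(new Object[] { String.valueOf(inputOI.getPrimitiveJavaObject(args[0])) });
>   }
>
>   @Override
>   public void close() throws HiveException {
>     // no-op
>   }
> }
> {code}
> Registering such a class as a Hive UDTF and invoking it from Spark SQL (e.g. a 
> {{CREATE TEMPORARY FUNCTION}} followed by a SELECT) reproduces the 
> "No handler for UDF/UDAF/UDTF" error above until the newer initialize() variant is supported.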



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
