[ 
https://issues.apache.org/jira/browse/FLINK-20606?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

akisaya updated FLINK-20606:
----------------------------
    Description: 
With Flink version 1.12.0 (earlier versions are also affected):

I started a SQL CLI with a Hive catalog and specified a user jar file with the
-j option like this:
{code:java}
bin/sql-client.sh embedded -j /Users/akis/Desktop/flink-func/myfunc.jar
{code}
{color:#ff0000}When I tried to create a custom function using a class from 
myfunc.jar, the CLI reported a ClassNotFoundException.{color}

 
{code:java}
Flink SQL> use catalog myhive;

Flink SQL> create function myfunc1 as 'me.aki.flink.flinkudf.MyFunc';
[ERROR] Could not execute SQL statement. Reason:
java.lang.ClassNotFoundException: me.aki.flink.flinkudf.MyFunc
{code}

me.aki.flink.flinkudf.MyFunc is the identifier of the UDF, which is defined like this:

 
{code:java}
package me.aki.flink.flinkudf;

import org.apache.flink.table.functions.ScalarFunction;

public class MyFunc extends ScalarFunction {
    public String eval(String s) {
        return "myfunc_" + s;
    }
}
{code}
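As a quick sanity check that the problem is class loading rather than the UDF itself, the eval logic can be exercised outside Flink (a standalone re-declaration for illustration; the real class extends ScalarFunction):
{code:java}
public class MyFuncCheck {
    // Same logic as MyFunc.eval, copied here so it runs without Flink on the classpath.
    static String eval(String s) {
        return "myfunc_" + s;
    }

    public static void main(String[] args) {
        System.out.println(eval("abc")); // prints "myfunc_abc"
    }
}
{code}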

After walking through the related code, I believe this is a bug caused by using the 
wrong classloader.

 

When using a Hive catalog, Flink wraps the function in 
{color:#ff0000}CatalogFunctionImpl{color}. Its isGeneric() method uses 
{color:#ff0000}Class.forName(String clazzName){color}, which resolves the class 
with the current classloader (the one that loaded flink/lib).

 

However, with the -j option, the user jar is added to the ExecutionContext and 
loaded by a separate user classloader.

 

The fix is straightforward: pass the user classloader to Class.forName.
{code:java}
// Resolve the class via the context classloader, which includes the user jar
ClassLoader cl = Thread.currentThread().getContextClassLoader();
Class<?> c = Class.forName(className, true, cl);
{code}
After applying this fix and building a new Flink distribution, CREATE FUNCTION behaves correctly.
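To illustrate the difference, here is a minimal, self-contained sketch (not Flink code; the recording loader is hypothetical, standing in for the SQL client's user classloader). It shows that only the three-argument Class.forName consults the classloader you pass in; the one-argument form asks the caller's classloader and never sees the user jar:
{code:java}
import java.util.ArrayList;
import java.util.List;

public class ClassLoaderDemo {
    // A classloader that records every class name it is asked to find.
    static class RecordingClassLoader extends ClassLoader {
        final List<String> requested = new ArrayList<>();

        @Override
        protected Class<?> findClass(String name) throws ClassNotFoundException {
            requested.add(name);
            throw new ClassNotFoundException(name);
        }
    }

    public static void main(String[] args) {
        RecordingClassLoader userLoader = new RecordingClassLoader();
        try {
            // Class.forName(name) alone would use the caller's classloader
            // and never consult userLoader; the three-argument overload does.
            Class.forName("me.aki.flink.flinkudf.MyFunc", true, userLoader);
        } catch (ClassNotFoundException expected) {
            // userLoader was asked for the class, proving it is consulted:
            System.out.println(userLoader.requested.contains("me.aki.flink.flinkudf.MyFunc"));
        }
    }
}
{code}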


> sql client cannot create function using user classes from a jar specified by 
> the -j option   
> ----------------------------------------------------------------------------------------------
>
>                 Key: FLINK-20606
>                 URL: https://issues.apache.org/jira/browse/FLINK-20606
>             Project: Flink
>          Issue Type: Bug
>          Components: Connectors / Hive, Table SQL / API, Table SQL / Client
>    Affects Versions: 1.10.2, 1.12.0, 1.11.2
>            Reporter: akisaya
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
