[jira] [Commented] (FLINK-20606) sql client cannot create function using user classes from jar which specified by -j option

2020-12-15 Thread akisaya (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-20606?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17249738#comment-17249738
 ] 

akisaya commented on FLINK-20606:
-

[~lithium147], what do you think, and could you assign it to me?

> sql client cannot create function using user classes from jar which specified 
> by  -j option   
> --
>
> Key: FLINK-20606
> URL: https://issues.apache.org/jira/browse/FLINK-20606
> Project: Flink
>  Issue Type: Bug
>  Components: Connectors / Hive, Table SQL / API, Table SQL / Client
>Affects Versions: 1.10.2, 1.12.0, 1.11.2
>Reporter: akisaya
>Priority: Major
>
> With Flink 1.12.0 (earlier versions are also affected), I started the SQL CLI 
> with a Hive catalog and specified a user jar file with the -j option like this:
> {code:java}
> bin/sql-client.sh embedded -j /Users/akis/Desktop/flink-func/myfunc.jar
> {code}
> When I tried to create a custom function using a class from myfunc.jar, the 
> CLI reported a ClassNotFoundException.
>  
> {code:java}
> Flink SQL> use catalog myhive;
> Flink SQL> create function myfunc1 as 'me.aki.flink.flinkudf.MyFunc';
> [ERROR] Could not execute SQL statement. Reason:
> java.lang.ClassNotFoundException: me.aki.flink.flinkudf.MyFunc
> {code}
>  
>  
> me.aki.flink.flinkudf.MyFunc is the fully qualified class name of the UDF, which is defined like this:
>  
> {code:java}
> package me.aki.flink.flinkudf;
>
> import org.apache.flink.table.functions.ScalarFunction;
>
> public class MyFunc extends ScalarFunction {
>     public String eval(String s) {
>         return "myfunc_" + s;
>     }
> }
> {code}
>  
> After walking through the related code, I believe this is a bug caused by 
> using the wrong classloader.
>  
> When using a Hive catalog, Flink wraps the function in CatalogFunctionImpl. 
> Its isGeneric() method calls Class.forName(String clazzName), which uses the 
> current classloader (the one that loaded flink/lib) to resolve the class.
>  
> However, with the -j option, the user jar is added to the ExecutionContext and 
> loaded by a separate user classloader.
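>  
> A minimal, self-contained sketch of the mismatch (all names here are made up 
> for the demo; a class is compiled into a temp directory at runtime to stand 
> in for a jar passed via -j):
>  
> {code:java}
> import javax.tools.JavaCompiler;
> import javax.tools.ToolProvider;
> import java.net.URL;
> import java.net.URLClassLoader;
> import java.nio.file.Files;
> import java.nio.file.Path;
>
> public class UserJarDemo {
>     public static void main(String[] args) throws Exception {
>         // Compile a tiny class into a temp dir that is NOT on the
>         // classpath, standing in for the user jar given to -j.
>         Path dir = Files.createTempDirectory("userclasses");
>         Path src = dir.resolve("MyDemoFunc.java");
>         Files.write(src, ("public class MyDemoFunc {"
>                 + " public String eval(String s) { return \"myfunc_\" + s; } }")
>                 .getBytes());
>         JavaCompiler javac = ToolProvider.getSystemJavaCompiler();
>         javac.run(null, null, null, src.toString());
>
>         // Plain Class.forName uses the caller's classloader and fails:
>         try {
>             Class.forName("MyDemoFunc");
>         } catch (ClassNotFoundException e) {
>             System.out.println("not visible to the application classloader");
>         }
>
>         // A user classloader over that directory resolves it fine:
>         try (URLClassLoader userCl =
>                 new URLClassLoader(new URL[]{dir.toUri().toURL()})) {
>             Class<?> c = Class.forName("MyDemoFunc", true, userCl);
>             Object udf = c.getDeclaredConstructor().newInstance();
>             System.out.println(c.getMethod("eval", String.class).invoke(udf, "1"));
>         }
>     }
> }
> {code}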
>  
> The fix can be as simple as passing a classloader to Class.forName:
> {code:java}
> ClassLoader cl = Thread.currentThread().getContextClassLoader();
> Class c = Class.forName(className, true, cl);
> {code}
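>  
> A hedged sketch of what such a context-classloader-based check could look 
> like (illustrative only, not Flink's actual isGeneric() implementation):
>  
> {code:java}
> public class IsGenericSketch {
>     // Illustrative only: resolve the class through the thread's context
>     // classloader, which the caller can point at the user classloader.
>     static boolean resolvable(String className) {
>         try {
>             ClassLoader cl = Thread.currentThread().getContextClassLoader();
>             Class.forName(className, true, cl);
>             return true;
>         } catch (ClassNotFoundException e) {
>             return false;
>         }
>     }
>
>     public static void main(String[] args) {
>         System.out.println(resolvable("java.lang.String"));
>         // false here, since this demo has no user jar on its context loader:
>         System.out.println(resolvable("me.aki.flink.flinkudf.MyFunc"));
>     }
> }
> {code}
>  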
> After applying this fix and building a new Flink dist, create function behaves correctly:
>  
> {code:java}
> Flink SQL> select myfunc1('1');
> // output
>      EXPR$0
>      myfunc_1
> {code}
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-20606) sql client cannot create function using user classes from jar which specified by -j option

2020-12-14 Thread akisaya (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-20606?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17249398#comment-17249398
 ] 

akisaya commented on FLINK-20606:
-

[~jark], I would like to make a PR for this issue. Can you assign it to me?



