[jira] [Updated] (SPARK-30260) Spark-Shell throw ClassNotFoundException exception for more than one statement to use UDF jar

2020-01-06 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-30260?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-30260:
--
Fix Version/s: (was: 2.4.3)
   (was: 2.3.0)

> Spark-Shell throw ClassNotFoundException exception for more than one 
> statement to use UDF jar
> ---------------------------------------------------------------------
>
> Key: SPARK-30260
> URL: https://issues.apache.org/jira/browse/SPARK-30260
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Shell, SQL
>Affects Versions: 2.2.0, 2.3.0, 2.4.3, 2.4.4
>Reporter: chenliang
>Priority: Major
>
> When we start spark-shell and use the UDF in the first statement, it works. 
> But for any subsequent statement it fails to load the jar into the current 
> classpath and throws ClassNotFoundException. The problem can be reproduced 
> as described below.
> {code:java}
> scala> val res = spark.sql("select  bigdata_test.Add(1,2)").show()
>  +----------------------+
>  |bigdata_test.Add(1, 2)|
>  +----------------------+
>  |                     3|
>  +----------------------+
>  scala> val res = spark.sql("select  bigdata_test.Add(1,2)").show()
>  org.apache.spark.sql.AnalysisException: No handler for UDF/UDAF/UDTF 
> 'scala.didi.udf.Add': java.lang.ClassNotFoundException: scala.didi.udf.Add; 
> line 1 pos 8
>    at 
> scala.reflect.internal.util.AbstractFileClassLoader.findClass(AbstractFileClassLoader.scala:62)
>    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>    at 
> org.apache.spark.sql.hive.HiveShim$HiveFunctionWrapper.createFunction(HiveShim.scala:251)
>    at 
> org.apache.spark.sql.hive.HiveSimpleUDF.function$lzycompute(hiveUDFs.scala:56)
>    at org.apache.spark.sql.hive.HiveSimpleUDF.function(hiveUDFs.scala:56)
>    at 
> org.apache.spark.sql.hive.HiveSimpleUDF.method$lzycompute(hiveUDFs.scala:60)
>    at org.apache.spark.sql.hive.HiveSimpleUDF.method(hiveUDFs.scala:59)
>    at 
> org.apache.spark.sql.hive.HiveSimpleUDF.dataType$lzycompute(hiveUDFs.scala:77)
>    at org.apache.spark.sql.hive.HiveSimpleUDF.dataType(hiveUDFs.scala:77)
>    at 
> org.apache.spark.sql.hive.HiveSessionCatalog$$anonfun$makeFunctionExpression$3.apply(HiveSessionCatalog.scala:79)
>    at 
> org.apache.spark.sql.hive.HiveSessionCatalog$$anonfun$makeFunctionExpression$3.apply(HiveSessionCatalog.scala:71)
>    at scala.util.Try.getOrElse(Try.scala:79)
>    at 
> org.apache.spark.sql.hive.HiveSessionCatalog.makeFunctionExpression(HiveSessionCatalog.scala:71)
>    at 
> org.apache.spark.sql.catalyst.catalog.SessionCatalog$$anonfun$org$apache$spark$sql$catalyst$catalog$SessionCatalog$$makeFunctionBuilder$1.apply(SessionCatalog.scala:1133){code}
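As a sketch of how a session typically reaches the failure above: the report only shows the `bigdata_test.Add` function name and the `scala.didi.udf.Add` class from the stack trace, so the jar path and the registration statements below are assumptions, not the reporter's exact steps.

```scala
// Hypothetical setup, run inside spark-shell with Hive support enabled.
// The jar path is a placeholder.
spark.sql("ADD JAR /path/to/udf.jar")

// Register the Hive UDF class from the jar; the class name is taken
// from the stack trace in the report.
spark.sql("CREATE FUNCTION bigdata_test.Add AS 'scala.didi.udf.Add'")

// First statement: succeeds and prints 3.
spark.sql("select bigdata_test.Add(1,2)").show()

// Second statement: per the report, fails with
// AnalysisException ... ClassNotFoundException: scala.didi.udf.Add,
// i.e. the added jar is no longer visible to the current classloader.
spark.sql("select bigdata_test.Add(1,2)").show()
```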



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-30260) Spark-Shell throw ClassNotFoundException exception for more than one statement to use UDF jar

2020-01-06 Thread Dongjoon Hyun (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-30260?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dongjoon Hyun updated SPARK-30260:
--
Target Version/s:   (was: 2.3.0, 2.4.3)




[jira] [Updated] (SPARK-30260) Spark-Shell throw ClassNotFoundException exception for more than one statement to use UDF jar

2019-12-13 Thread chenliang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-30260?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

chenliang updated SPARK-30260:
--
Target Version/s: 2.4.3, 2.3.0  (was: 2.4.3)




[jira] [Updated] (SPARK-30260) Spark-Shell throw ClassNotFoundException exception for more than one statement to use UDF jar

2019-12-13 Thread chenliang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-30260?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

chenliang updated SPARK-30260:
--
Fix Version/s: 2.3.0




[jira] [Updated] (SPARK-30260) Spark-Shell throw ClassNotFoundException exception for more than one statement to use UDF jar

2019-12-13 Thread chenliang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-30260?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

chenliang updated SPARK-30260:
--
Description: 

[jira] [Updated] (SPARK-30260) Spark-Shell throw ClassNotFoundException exception for more than one statement to use UDF jar

2019-12-13 Thread chenliang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-30260?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

chenliang updated SPARK-30260:
--
Description: 

[jira] [Updated] (SPARK-30260) Spark-Shell throw ClassNotFoundException exception for more than one statement to use UDF jar

2019-12-13 Thread chenliang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-30260?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

chenliang updated SPARK-30260:
--
Description: 

[jira] [Updated] (SPARK-30260) Spark-Shell throw ClassNotFoundException exception for more than one statement to use UDF jar

2019-12-13 Thread chenliang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-30260?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

chenliang updated SPARK-30260:
--
Description: 



[jira] [Updated] (SPARK-30260) Spark-Shell throw ClassNotFoundException exception for more than one statement to use UDF jar

2019-12-13 Thread chenliang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-30260?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

chenliang updated SPARK-30260:
--
Description: When we start spark-shell and use the UDF for the first 
statement, it's OK. But for the other statements it failed to load the jar 
to the current classpath and would throw 
