Hello, 
I ran into a problem when using the Spark SQL CLI: a custom UDTF used with LATERAL VIEW
throws a ClassNotFoundException. I ran a few experiments in the same
environment (Spark version 1.1.1):
select + the same custom UDTF (passed)
select + lateral view + custom UDTF (ClassNotFoundException)
select + lateral view + built-in UDTF (passed)

I have done some googling these days and found one related Spark issue ticket,
https://issues.apache.org/jira/browse/SPARK-4811
("Custom UDTFs not working in Spark SQL").

It would be helpful to post the actual code here to reproduce the problem,
but corporate regulations prohibit it; sorry about that. Packaging the source
code of explode into a jar and registering it as a custom UDTF should
reproduce the issue anyway.
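For reference, the reproduction steps in the Spark SQL CLI look roughly like the following. The jar path, function name, class name, table, and column are placeholders, not the actual ones from my environment:

```sql
-- Register a custom UDTF from a jar (names below are placeholders).
ADD JAR /path/to/my-udtf.jar;
CREATE TEMPORARY FUNCTION my_explode AS 'com.example.MyExplodeUDTF';

-- Passed: plain select with the custom UDTF.
SELECT my_explode(array_col) FROM some_table;

-- ClassNotFoundException: the same custom UDTF via LATERAL VIEW.
SELECT t.col FROM some_table
LATERAL VIEW my_explode(array_col) t AS col;

-- Passed: built-in explode via LATERAL VIEW.
SELECT t.col FROM some_table
LATERAL VIEW explode(array_col) t AS col;
```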

Here is a portion of the stack trace at the point of the exception, just in case:
java.lang.ClassNotFoundException: XXX
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.apache.spark.sql.hive.HiveFunctionFactory$class.createFunction(hiveUdfs.scala:81)
        at org.apache.spark.sql.hive.HiveGenericUdtf.createFunction(hiveUdfs.scala:247)
        at org.apache.spark.sql.hive.HiveGenericUdtf.function$lzycompute(hiveUdfs.scala:254)
        at org.apache.spark.sql.hive.HiveGenericUdtf.function(hiveUdfs.scala:254)
        at org.apache.spark.sql.hive.HiveGenericUdtf.outputInspectors$lzycompute(hiveUdfs.scala:261)
        at org.apache.spark.sql.hive.HiveGenericUdtf.outputInspectors(hiveUdfs.scala:260)
        at org.apache.spark.sql.hive.HiveGenericUdtf.outputDataTypes$lzycompute(hiveUdfs.scala:265)
        at org.apache.spark.sql.hive.HiveGenericUdtf.outputDataTypes(hiveUdfs.scala:265)
        at org.apache.spark.sql.hive.HiveGenericUdtf.makeOutput(hiveUdfs.scala:269)
        at org.apache.spark.sql.catalyst.expressions.Generator.output(generators.scala:60)
        at org.apache.spark.sql.catalyst.plans.logical.Generate$$anonfun$1.apply(basicOperators.scala:50)
        at org.apache.spark.sql.catalyst.plans.logical.Generate$$anonfun$1.apply(basicOperators.scala:50)
        at scala.Option.map(Option.scala:145)
        at org.apache.spark.sql.catalyst.plans.logical.Generate.generatorOutput(basicOperators.scala:50)
        at org.apache.spark.sql.catalyst.plans.logical.Generate.output(basicOperators.scala:60)
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveChildren$1.apply(LogicalPlan.scala:79)
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveChildren$1.apply(LogicalPlan.scala:79)
        at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
        at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:251)
        at scala.collection.AbstractTraversable.flatMap(Traversable.scala:105)
        ... (the rest is omitted)

Thank you.

Shenghua

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Custom-UDTF-with-Lateral-View-throws-ClassNotFound-exception-in-Spark-SQL-CLI-tp20689.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
