Hi Guys,
One more problem I have encountered using Zeppelin,
running Spark 1.3.1 on YARN (Hadoop 2.4).

I'm trying to create and use a UDF (hc == z.sqlContext == HiveContext):
1. Create and register the UDF:
def getNum(): Int = {
    100
}

hc.udf.register("getNum", getNum _)
2. Then I try to use it on an existing table:
%sql select getNum() from filteredNc limit 1

Or:
3. Calling hc directly:
hc.sql("select getNum() from filteredNc limit 1").collect

Both fail with
*"java.lang.ClassNotFoundException:
org.apache.zeppelin.spark.ZeppelinContext"*
(full exception below).

My questions are:
1. Could it be that ZeppelinContext is not available on the Spark executor nodes?
2. Why does the UDF need ZeppelinContext at all? Why is it relevant here?
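My guess is that the closure serialized to the executors captures the interpreter's generated wrapper object, which holds a reference to ZeppelinContext. A sketch of a workaround I could try, moving the function into a standalone serializable object so nothing from the REPL scope is captured (the `GetNumHolder` name is my own, not part of any API, and I have not verified this fixes it):

```scala
// Hypothetical workaround (untested): defining the UDF inside a
// standalone object should keep the serialized closure from dragging
// in the REPL line object that references ZeppelinContext.
object GetNumHolder extends Serializable {
  def getNum(): Int = 100
}

hc.udf.register("getNum", GetNumHolder.getNum _)
```

If that avoids the ClassNotFoundException it would at least confirm the closure-capture theory, even if it doesn't explain why Zeppelin's jar isn't on the executor classpath.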

The exception:
 WARN [2015-06-28 08:43:53,850] ({task-result-getter-0} Logging.scala[logWarning]:71) - Lost task 0.2 in stage 23.0 (TID 1626, ip-10-216-204-246.ec2.internal): java.lang.NoClassDefFoundError: Lorg/apache/zeppelin/spark/ZeppelinContext;
    at java.lang.Class.getDeclaredFields0(Native Method)
    at java.lang.Class.privateGetDeclaredFields(Class.java:2499)
    at java.lang.Class.getDeclaredField(Class.java:1951)
    at java.io.ObjectStreamClass.getDeclaredSUID(ObjectStreamClass.java:1659)

<Many more ObjectStreamClass lines of the exception>

Caused by: java.lang.ClassNotFoundException: org.apache.zeppelin.spark.ZeppelinContext
    at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:69)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 103 more
Caused by: java.lang.ClassNotFoundException: org.apache.zeppelin.spark.ZeppelinContext
    at java.lang.ClassLoader.findClass(ClassLoader.java:531)
    at org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:34)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:30)
    at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:64)
    ... 105 more
