I have tested your code and cannot reproduce the problem.

Could you share your environment? How did you configure Zeppelin with Spark?
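
In the meantime, a guess: since the val function literal works and the def
fails, this looks like the usual REPL closure-capture issue. A def declared in
a notebook paragraph is a method of the interpreter's wrapper object, so
textFile.map(f) eta-expands into a closure that drags the whole wrapper along,
including its reference to ZeppelinContext, which is apparently not available
on the executors. A val holding a function literal carries no such outer
reference. A minimal sketch of the workaround, assuming that is indeed the
cause (the name `double` is just for illustration):

// Sketch only, assuming the def's closure captures the interpreter
// wrapper (which holds the ZeppelinContext reference). A function
// value is self-contained and serializes without the wrapper.
val textFile = sc.textFile("hdfs://somefile.txt") // path from your example

val double: String => String = s => s + s // standalone Function1
textFile.map(double).count                // ships only the function object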

Thanks,
moon

On Fri, Aug 21, 2015 at 2:25 AM David Salinas <david.salinas....@gmail.com>
wrote:

> Hi,
>
> I have a problem when using a Spark closure. This error did not appear
> with Spark 1.2.1.
>
> I have included a reproducible example that fails when the closure is
> captured (Zeppelin was built from the head of master with this command: mvn
> install -DskipTests -Pspark-1.4 -Dspark.version=1.4.1
> -Dhadoop.version=2.2.0 -Dprotobuf.version=2.5.0). Has anyone else
> encountered this problem? All my previous notebooks are broken by this :(
>
> ------------------------------
> val textFile = sc.textFile("hdfs://somefile.txt")
>
> val f = (s: String) => s+s
> textFile.map(f).count
> //works fine
> //res145: Long = 407
>
>
> def f(s: String) = {
>   s + s
> }
> textFile.map(f).count
>
> //fails ->
>
> org.apache.spark.SparkException: Job aborted due to stage failure:
> Task 566 in stage 87.0 failed 4 times, most recent failure: Lost task
> 566.3 in stage 87.0 (TID 43396, XXX.com):
> java.lang.NoClassDefFoundError: Lorg/apache/zeppelin/spark/ZeppelinContext;
>   at java.lang.Class.getDeclaredFields0(Native Method)
>   at java.lang.Class.privateGetDeclaredFields(Class.java:2583)
>   at java.lang.Class.getDeclaredField(Class.java:2068)
>   ...
>   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
>   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
>   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
>   at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
>   at ...
>
> Best,
>
> David
>