[ https://issues.apache.org/jira/browse/SPARK-5506?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15056413#comment-15056413 ]

Pavan Achanta commented on SPARK-5506:
--------------------------------------

I get the same exception when running a job from the IntelliJ IDEA IDE.

{code}

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class App {
    public static void main(String[] args) {
        String logFile = "/usr/local/spark-1.5.2/README.md"; // Should be some file on your system
        SparkConf conf = new SparkConf().setAppName("Simple Application")
                .set("spark.eventLog.enabled", "true")
                .set("spark.eventLog.dir", "/opt/logs/")
//                .setMaster("local")
                .setMaster("spark://localhost:7077")
                ;
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD<String> logData = sc.textFile(logFile).cache();

        long numAs = logData.filter(s -> s.contains("a")).count();
        long numBs = logData.filter(s -> s.contains("b")).count();

        System.out.println("Lines with a: " + numAs + ", lines with b: " + 
numBs);
    }
}
{code}



The exception I see is as follows:
{code}
15/12/13 23:47:58 INFO SparkDeploySchedulerBackend: Registered executor: 
AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@127.0.0.1:50873/user/Executor#-484673147])
 with ID 0
15/12/13 23:47:59 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 
127.0.0.1, PROCESS_LOCAL, 2146 bytes)
15/12/13 23:47:59 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, 
127.0.0.1, PROCESS_LOCAL, 2146 bytes)
15/12/13 23:47:59 INFO BlockManagerMasterEndpoint: Registering block manager 
127.0.0.1:50877 with 530.0 MB RAM, BlockManagerId(0, 127.0.0.1, 50877)
15/12/13 23:48:00 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 
127.0.0.1:50877 (size: 2.2 KB, free: 530.0 MB)
15/12/13 23:48:01 WARN TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, 
127.0.0.1): java.lang.ClassCastException: cannot assign instance of 
java.lang.invoke.SerializedLambda to field 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type 
org.apache.spark.api.java.function.Function in instance of 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1
        at 
java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2133)
        at 
java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1305)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2006)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
        at 
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:72)
        at 
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:98)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
        at org.apache.spark.scheduler.Task.run(Task.scala:88)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

15/12/13 23:48:01 INFO TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0) on 
executor 127.0.0.1: java.lang.ClassCastException (cannot assign instance of 
java.lang.invoke.SerializedLambda to field 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type 
org.apache.spark.api.java.function.Function in instance of 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 1]
15/12/13 23:48:01 INFO TaskSetManager: Starting task 0.1 in stage 0.0 (TID 2, 
127.0.0.1, PROCESS_LOCAL, 2146 bytes)
15/12/13 23:48:01 INFO TaskSetManager: Starting task 1.1 in stage 0.0 (TID 3, 
127.0.0.1, PROCESS_LOCAL, 2146 bytes)
15/12/13 23:48:01 INFO TaskSetManager: Lost task 1.1 in stage 0.0 (TID 3) on 
executor 127.0.0.1: java.lang.ClassCastException (cannot assign instance of 
java.lang.invoke.SerializedLambda to field 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type 
org.apache.spark.api.java.function.Function in instance of 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 2]
15/12/13 23:48:01 INFO TaskSetManager: Starting task 1.2 in stage 0.0 (TID 4, 
127.0.0.1, PROCESS_LOCAL, 2146 bytes)
15/12/13 23:48:01 INFO TaskSetManager: Lost task 0.1 in stage 0.0 (TID 2) on 
executor 127.0.0.1: java.lang.ClassCastException (cannot assign instance of 
java.lang.invoke.SerializedLambda to field 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type 
org.apache.spark.api.java.function.Function in instance of 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 3]
15/12/13 23:48:01 INFO TaskSetManager: Starting task 0.2 in stage 0.0 (TID 5, 
127.0.0.1, PROCESS_LOCAL, 2146 bytes)
15/12/13 23:48:01 INFO TaskSetManager: Lost task 0.2 in stage 0.0 (TID 5) on 
executor 127.0.0.1: java.lang.ClassCastException (cannot assign instance of 
java.lang.invoke.SerializedLambda to field 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type 
org.apache.spark.api.java.function.Function in instance of 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 4]
15/12/13 23:48:01 INFO TaskSetManager: Starting task 0.3 in stage 0.0 (TID 6, 
127.0.0.1, PROCESS_LOCAL, 2146 bytes)
15/12/13 23:48:01 INFO TaskSetManager: Lost task 1.2 in stage 0.0 (TID 4) on 
executor 127.0.0.1: java.lang.ClassCastException (cannot assign instance of 
java.lang.invoke.SerializedLambda to field 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type 
org.apache.spark.api.java.function.Function in instance of 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 5]
15/12/13 23:48:01 INFO TaskSetManager: Starting task 1.3 in stage 0.0 (TID 7, 
127.0.0.1, PROCESS_LOCAL, 2146 bytes)
15/12/13 23:48:01 INFO TaskSetManager: Lost task 1.3 in stage 0.0 (TID 7) on 
executor 127.0.0.1: java.lang.ClassCastException (cannot assign instance of 
java.lang.invoke.SerializedLambda to field 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type 
org.apache.spark.api.java.function.Function in instance of 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 6]
15/12/13 23:48:01 ERROR TaskSetManager: Task 1 in stage 0.0 failed 4 times; 
aborting job
15/12/13 23:48:01 INFO TaskSetManager: Lost task 0.3 in stage 0.0 (TID 6) on 
executor 127.0.0.1: java.lang.ClassCastException (cannot assign instance of 
java.lang.invoke.SerializedLambda to field 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type 
org.apache.spark.api.java.function.Function in instance of 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1) [duplicate 7]
15/12/13 23:48:01 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have 
all completed, from pool 
15/12/13 23:48:01 INFO TaskSchedulerImpl: Cancelling stage 0
15/12/13 23:48:01 INFO DAGScheduler: ResultStage 0 (count at App.java:21) 
failed in 6.109 s
15/12/13 23:48:01 INFO DAGScheduler: Job 0 failed: count at App.java:21, took 
6.390838 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to 
stage failure: Task 1 in stage 0.0 failed 4 times, most recent failure: Lost 
task 1.3 in stage 0.0 (TID 7, 127.0.0.1): java.lang.ClassCastException: cannot 
assign instance of java.lang.invoke.SerializedLambda to field 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type 
org.apache.spark.api.java.function.Function in instance of 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1
        at 
java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2133)
        at 
java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1305)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2006)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
        at 
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:72)
        at 
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:98)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
        at org.apache.spark.scheduler.Task.run(Task.scala:88)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:
        at 
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1283)
        at 
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1271)
        at 
org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1270)
        at 
scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at 
org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1270)
        at 
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
        at 
org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
        at scala.Option.foreach(Option.scala:236)
        at 
org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:697)
        at 
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1496)
        at 
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1458)
        at 
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1447)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
        at 
org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:567)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1824)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1837)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1850)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1921)
        at org.apache.spark.rdd.RDD.count(RDD.scala:1125)
        at 
org.apache.spark.api.java.JavaRDDLike$class.count(JavaRDDLike.scala:445)
        at 
org.apache.spark.api.java.AbstractJavaRDDLike.count(JavaRDDLike.scala:47)
        at com.sysomos.App.main(App.java:21)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.ClassCastException: cannot assign instance of 
java.lang.invoke.SerializedLambda to field 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type 
org.apache.spark.api.java.function.Function in instance of 
org.apache.spark.api.java.JavaRDD$$anonfun$filter$1
        at 
java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2133)
        at 
java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1305)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2006)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at 
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2000)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1924)
        at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
        at 
org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:72)
        at 
org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:98)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
        at org.apache.spark.scheduler.Task.run(Task.scala:88)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
15/12/13 23:48:01 INFO SparkContext: Invoking stop() from shutdown hook
15/12/13 23:48:01 INFO SparkUI: Stopped Spark web UI at http://192.168.1.2:4040
15/12/13 23:48:01 INFO DAGScheduler: Stopping DAGScheduler
15/12/13 23:48:01 INFO SparkDeploySchedulerBackend: Shutting down all executors
15/12/13 23:48:01 INFO SparkDeploySchedulerBackend: Asking each executor to 
shut down
15/12/13 23:48:01 INFO MapOutputTrackerMasterEndpoint: 
MapOutputTrackerMasterEndpoint stopped!
15/12/13 23:48:01 INFO MemoryStore: MemoryStore cleared
15/12/13 23:48:01 INFO BlockManager: BlockManager stopped
15/12/13 23:48:01 INFO BlockManagerMaster: BlockManagerMaster stopped
15/12/13 23:48:01 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: 
OutputCommitCoordinator stopped!
15/12/13 23:48:01 INFO SparkContext: Successfully stopped SparkContext
15/12/13 23:48:01 INFO ShutdownHookManager: Shutdown hook called
15/12/13 23:48:01 INFO ShutdownHookManager: Deleting directory 
/private/var/folders/vz/5xf39jjd6zg1db7pt4w3nhzdpc028g/T/spark-1e71e2df-740c-440a-911b-86b6987940f5

{code}
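
In case it helps anyone else hitting this from an IDE: my understanding (I have not dug into the Spark source to confirm it) is that the executor JVMs cannot find the class that backs the lambda, so the deserialized {{java.lang.invoke.SerializedLambda}} is never resolved back into an {{org.apache.spark.api.java.function.Function}}. The two workarounds that are usually suggested are to ship the application jar to the executors ({{setJars}}, or running through {{spark-submit}}), and/or to pass an explicit {{Function}} implementation instead of a lambda. A minimal sketch of both combined, where the jar path is only a placeholder for whatever your build produces:

{code}
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

public class AppWorkaround {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("Simple Application")
                .setMaster("spark://localhost:7077")
                // Placeholder path: point this at the jar your build produces, so the
                // executors can load the classes that back the closures.
                .setJars(new String[] { "target/simple-application-1.0.jar" });
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD<String> logData = sc.textFile("/usr/local/spark-1.5.2/README.md").cache();

        // Explicit Function implementation instead of a lambda: this is serialized as an
        // ordinary (anonymous) class rather than a java.lang.invoke.SerializedLambda.
        long numAs = logData.filter(new Function<String, Boolean>() {
            @Override
            public Boolean call(String s) {
                return s.contains("a");
            }
        }).count();

        System.out.println("Lines with a: " + numAs);
        sc.stop();
    }
}
{code}

Submitting the same class with {{spark-submit}} and the built jar, instead of launching it from the IDE, is another way of getting the class files onto the executor classpath.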


> java.lang.ClassCastException using lambda expressions in combination of spark 
> and Servlet
> -----------------------------------------------------------------------------------------
>
>                 Key: SPARK-5506
>                 URL: https://issues.apache.org/jira/browse/SPARK-5506
>             Project: Spark
>          Issue Type: Question
>          Components: Spark Core
>    Affects Versions: 1.2.0
>         Environment: spark server: Ubuntu 14.04 amd64
> $ java -version
> java version "1.8.0_25"
> Java(TM) SE Runtime Environment (build 1.8.0_25-b17)
> Java HotSpot(TM) 64-Bit Server VM (build 25.25-b02, mixed mode)
>            Reporter: Milad Khajavi
>
> I'm trying to build a web API for my Apache Spark jobs using the sparkjava.com 
> framework. My code is:
> @Override
> public void init() {
>     get("/hello",
>             (req, res) -> {
>                 String sourcePath = "hdfs://spark:54310/input/*";
>                 SparkConf conf = new SparkConf().setAppName("LineCount");
>                 conf.setJars(new String[] { "/home/sam/resin-4.0.42/webapps/test.war" });
>                 File configFile = new File("config.properties");
>                 String sparkURI = "spark://hamrah:7077";
>                 conf.setMaster(sparkURI);
>                 conf.set("spark.driver.allowMultipleContexts", "true");
>                 JavaSparkContext sc = new JavaSparkContext(conf);
>                 @SuppressWarnings("resource")
>                 JavaRDD<String> log = sc.textFile(sourcePath);
>                 JavaRDD<String> lines = log.filter(x -> {
>                     return true;
>                 });
>                 return lines.count();
>             });
> }
> If I remove the lambda expression, or put the job inside a plain jar rather than 
> a web service (i.e. a Servlet), it runs without any error. But using a lambda 
> expression inside a Servlet results in this exception:
> 15/01/28 10:36:33 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, 
> hamrah): java.lang.ClassCastException: cannot assign instance of 
> java.lang.invoke.SerializedLambda to field 
> org.apache.spark.api.java.JavaRDD$$anonfun$filter$1.f$1 of type 
> org.apache.spark.api.java.function.Function in instance of 
> org.apache.spark.api.java.JavaRDD$$anonfun$filter$1
> at 
> java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2089)
> at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1261)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1999)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
> at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
> at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)
> at 
> org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
> at 
> org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:57)
> at org.apache.spark.scheduler.Task.run(Task.scala:56)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:196)
> at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> P.S.: I tried combinations of Jersey and sparkjava with Jetty, Tomcat and Resin, 
> and all of them led me to the same result.
> Here is the same issue: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-on-YARN-java-lang-ClassCastException-SerializedLambda-to-org-apache-spark-api-java-function-Fu1-tt21261.html
> This is my colleague's question on Stack Overflow: 
> http://stackoverflow.com/questions/28186607/java-lang-classcastexception-using-lambda-expressions-in-spark-job-on-remote-ser


