Hello,

I am running a streaming app on Spark 1.2.1. When running locally everything
works fine. When I try yarn-cluster mode it fails, and I see a ClassCastException
in the log (see below). I can run non-streaming Spark apps on the cluster
with no problem.
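
For context, the app is structured roughly like this (the object name, input
source, host, and port below are placeholders, not the actual code), and it is
submitted with spark-submit using --master yarn-cluster:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.StreamingContext._   // pair DStream ops in 1.2.x

    object StreamingSkeleton {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("StreamingSkeleton")
        val ssc = new StreamingContext(conf, Seconds(10))

        // Placeholder input source; the real app reads from a different stream.
        val lines = ssc.socketTextStream("some-host", 9999)

        // A simple shuffle stage (reduceByKey); the exception below is thrown
        // from the ShuffleMapTask running this kind of stage.
        val counts = lines.flatMap(_.split(" "))
                          .map(word => (word, 1))
                          .reduceByKey(_ + _)
        counts.print()

        ssc.start()
        ssc.awaitTermination()
      }
    }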

Any ideas here? Thanks in advance!

WARN scheduler.TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, yarn-slave1): java.lang.ClassCastException:
org.apache.spark.storage.BlockManagerId cannot be cast to [B
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:61)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
        at org.apache.spark.scheduler.Task.run(Task.scala:56)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:200)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)


