Corrupted exception while deserializing task

2014-12-25 Thread WangTaoTheTonic
Hi guys, I hit an exception while running an application on the 1.2.0-snapshot version. It shows like this: 2014-12-23 07:45:36,333 | ERROR | [Executor task launch worker-0] | Exception in task 0.0 in stage 0.0 (TID 0) | org.apache.spark.Logging$class.logError(Logging.scala:96) java.io.StreamCorru…

Re: Who manages the log4j appender while running Spark on YARN?

2014-12-22 Thread WangTaoTheTonic
After some discussion with the Hadoop guys, I understand how the mechanism works. If we don't add -Dlog4j.configuration to the java options of the container (AM or executors), it will use the log4j.properties (if any) found on the container's classpath (extraClasspath plus yarn.application.classpath). If we want to custom…
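The mechanism described above suggests two ways to supply a custom log4j configuration: put a log4j.properties on the container classpath, or pass -Dlog4j.configuration explicitly. A minimal sketch of the explicit route, assuming a hypothetical custom-log4j.properties file and application jar (the --files, spark.driver.extraJavaOptions, and spark.executor.extraJavaOptions options are standard spark-submit configuration; the file and jar names are placeholders):

```shell
# Sketch: ship a custom log4j.properties to the YARN containers with --files,
# then point each container JVM at it via -Dlog4j.configuration.
# "custom-log4j.properties" and "yourApp.jar" are hypothetical names.
spark-submit \
  --master yarn \
  --files /path/to/custom-log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=custom-log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=custom-log4j.properties" \
  yourApp.jar
```

Files distributed with --files land in each container's working directory, which is on the classpath, so the bare filename in -Dlog4j.configuration resolves there. Without that system property, log4j falls back to whatever log4j.properties it finds first on the container classpath, as described above.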

Who manages the log4j appender while running Spark on YARN?

2014-12-19 Thread WangTaoTheTonic
Hi guys, I recently ran Spark on YARN and found that Spark didn't set any log4j properties file in configuration or code, and the log4j logs were being written to the stderr file under ${yarn.nodemanager.log-dirs}/application_${appid}. I want to know which side (Spark or Hadoop) controls the appender. Have found…