Problem About Worker System.out
I have used Spark 1.4 for 6 months. Thanks to all the members of this community for your great work. I have a question about logging that I hope can be solved.

The program runs under this configuration: YARN cluster, yarn-client mode.

In Scala, code like:

    rdd.map( a => println(a) )

prints the value of a to our console.

However, in Java (1.7), this:

    rdd.map( new Function<Integer, Integer>() {
        @Override
        public Integer call(Integer a) throws Exception {
            System.out.println(a);
            return a;
        }
    });

prints nothing to our console. The configuration is the same.

I have tried this code, but it does not work either:

    rdd.map( new Function<Integer, Integer>() {
        @Override
        public Integer call(Integer a) throws Exception {
            org.apache.log4j.Logger log = Logger.getLogger(this.getClass());
            log.info(a);
            log.warn(a);
            log.error(a);
            log.fatal(a);
            return a;
        }
    });

No output from this either:

    final org.apache.log4j.Logger log = Logger.getLogger(this.getClass());
    rdd.map( new Function<Integer, Integer>() {
        @Override
        public Integer call(Integer a) throws Exception {
            log.info(a);
            log.warn(a);
            log.error(a);
            log.fatal(a);
            return a;
        }
    });

It seems that stdout on a worker is not sent back to our driver. I wonder why it works in Scala but not in Java. Is there a simple way to make Java behave like Scala?

Thanks.
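As an aside, the Java snippet as posted cannot compile: Function<Integer, Integer>.call must return an Integer, and the posted body has no return. A minimal Spark-free sketch of the corrected function follows; the nested Function interface here is a local stand-in with the same shape as Spark's org.apache.spark.api.java.function.Function, used only so the file compiles on its own.

```java
public class MapFunctionSketch {
    // Local stand-in for org.apache.spark.api.java.function.Function,
    // so this sketch compiles without Spark on the classpath.
    interface Function<T, R> {
        R call(T t) throws Exception;
    }

    static final Function<Integer, Integer> PRINT_AND_PASS =
        new Function<Integer, Integer>() {
            @Override
            public Integer call(Integer a) throws Exception {
                // On a cluster, this print lands in the executor's
                // stdout log, not in the driver console.
                System.out.println(a);
                return a; // the return the original snippet was missing
            }
        };

    public static void main(String[] args) throws Exception {
        System.out.println("returned: " + PRINT_AND_PASS.call(42));
    }
}
```

With the return in place the code matches the Scala closure's behavior; the remaining difference discussed in this thread is where the printed output ends up, not the language.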
Re: Problem About Worker System.out
Stdout will not be sent back to the driver, no matter whether you use Scala or Java. You must be doing something that makes you think this is the expected behavior.

On Mon, Dec 28, 2015 at 5:33 PM, David John wrote:
> [original message trimmed]
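To add a bit of context for the archive: the println output is not lost, it goes to each executor's stdout file. On YARN, assuming log aggregation is enabled, the aggregated container logs can be retrieved after the application finishes with something like:

```shell
# Fetch the aggregated container logs (stdout/stderr of every executor).
# <application-id> is the YARN application id shown in the Spark/RM web UI.
yarn logs -applicationId <application-id>
```

While the application is running, the stdout/stderr links on the Spark web UI's Executors tab point at the same per-container logs.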
FW: Problem About Worker System.out
Thanks. Can we use an slf4j/log4j logger to transfer our messages from a worker to the driver? I saw some discussions saying that this code can be used:

    object Holder extends Serializable {
      @transient lazy val log = Logger.getLogger(getClass.getName)
    }

    val someRdd = spark.parallelize(List(1, 2, 3)).foreach { element =>
      Holder.log.info(element)
    }

ref: http://stackoverflow.com/questions/29208844/apache-spark-logging-within-scala

Is this the conventional way? Or does Spark have a SocketAppender for developers?

Date: Mon, 28 Dec 2015 17:52:17 +0800
Subject: Re: Problem About Worker System.out
From: sai.sai.s...@gmail.com
To: david_john_2...@outlook.com
CC: user@spark.apache.org

> [earlier messages trimmed]
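For later readers, here is a Java analog of that Holder pattern, sketched with java.util.logging so it runs stand-alone; a real Spark job would use log4j or slf4j the same way. Marking the logger transient keeps the non-serializable Logger out of the serialized closure, and each executor re-creates it lazily after deserialization. Note that the messages still end up in each executor's own log, not on the driver.

```java
import java.io.Serializable;
import java.util.logging.Logger;

// Java analog of the Scala "Holder" pattern from the linked answer.
public class LogHolder implements Serializable {
    private static final long serialVersionUID = 1L;

    // transient: the Logger itself is never serialized with the closure.
    private transient Logger log;

    // Lazily (re-)create the logger, e.g. after deserialization on an
    // executor. Not synchronized; fine for a sketch, since getLogger
    // returns the same named logger either way.
    public Logger log() {
        if (log == null) {
            log = Logger.getLogger(LogHolder.class.getName());
        }
        return log;
    }
}
```

An executor-side call would then look like rdd.foreach(e -> holder.log().info(String.valueOf(e))). As far as I know Spark itself does not ship worker log messages back to the driver; log4j 1.x does provide org.apache.log4j.net.SocketAppender, which the executors' log4j.properties could point at a listener on the driver host, but that is cluster configuration rather than a Spark API.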