Re: Using Log4j for logging messages inside lambda functions

2015-05-26 Thread Spico Florin
Hello! Thank you all for your answers. Akhil's proposed solution works fine. Thanks, Florin

On Tue, May 26, 2015 at 3:08 AM, Wesley Miao wesley.mi...@gmail.com wrote: The reason it didn't work for you is that the function you registered with someRdd.map will be running on the

Re: Using Log4j for logging messages inside lambda functions

2015-05-25 Thread Akhil Das
Try this way:

import org.apache.log4j.Logger

object Holder extends Serializable {
  @transient lazy val log = Logger.getLogger(getClass.getName)
}

val someRdd = spark.parallelize(List(1, 2, 3))
someRdd.map { element =>
  Holder.log.info(s"$element will be processed")
  element + 1
}
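[Editorial note, not part of the thread: the pattern works because the Logger is marked @transient, so it is never serialized with the task; the lazy val is then initialized independently on first use in each JVM, driver and executors alike. A minimal self-contained sketch of how this might look end to end, assuming Spark 1.x, Log4j 1.x, and an illustrative LoggingExample driver:]

import org.apache.log4j.Logger
import org.apache.spark.{SparkConf, SparkContext}

object Holder extends Serializable {
  // @transient: the Logger is never serialized; the lazy val is
  // re-initialized on first use inside each executor JVM.
  @transient lazy val log: Logger = Logger.getLogger(getClass.getName)
}

object LoggingExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("logging-example").setMaster("local[2]"))
    val someRdd = sc.parallelize(List(1, 2, 3))
    val result = someRdd.map { element =>
      Holder.log.info(s"$element will be processed") // logged on the executor side
      element + 1
    }.collect() // forces the map to run
    println(result.mkString(", "))
    sc.stop()
  }
}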

Using Log4j for logging messages inside lambda functions

2015-05-25 Thread Spico Florin
Hello! I would like to use the logging mechanism provided by Log4j, but I'm getting:

Exception in thread "main" org.apache.spark.SparkException: Task not serializable
Caused by: java.io.NotSerializableException: org.apache.log4j.Logger

The code (and the problem) I'm using resembles
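[Editorial note: the code is cut off in the archive. A hypothetical sketch that reproduces the same exception, with illustrative class and method names and assuming Log4j 1.x, would look roughly like this:]

import org.apache.log4j.Logger
import org.apache.spark.SparkContext

// The Logger is an instance field, so the map closure captures `this` and
// Spark tries to serialize the whole (Serializable) MyJob instance --
// including the non-serializable Logger.
class MyJob extends Serializable {
  val log = Logger.getLogger(getClass.getName)

  def run(sc: SparkContext): Array[Int] = {
    val someRdd = sc.parallelize(List(1, 2, 3))
    someRdd.map { element =>
      log.info(s"$element will be processed") // really this.log -> drags `this` into the task
      element + 1
    }.collect()
  }
}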

Re: Using Log4j for logging messages inside lambda functions

2015-05-25 Thread Wesley Miao
The reason it didn't work for you is that the function you registered with someRdd.map will be running on the worker/executor side, not in your driver program. So you need to be careful not to accidentally close over objects instantiated in your driver program, like the log object in
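[Editorial note, not part of Wesley's message: besides the Holder object above, another way to follow this advice is to instantiate the logger on the executor side, e.g. once per partition, so the closure captures no driver-side object. A hedged sketch, with processWithLogging and the logger name as illustrative assumptions:]

import org.apache.log4j.Logger
import org.apache.spark.SparkContext

// Create the Logger inside mapPartitions, i.e. on the worker, so nothing
// non-serializable is captured from the driver program.
def processWithLogging(sc: SparkContext): Array[Int] = {
  val someRdd = sc.parallelize(List(1, 2, 3))
  someRdd.mapPartitions { elements =>
    val log = Logger.getLogger("worker.side.logger") // instantiated on the executor
    elements.map { element =>
      log.info(s"$element will be processed")
      element + 1
    }
  }.collect()
}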