Hello!
Thank you all for your answers. Akhil's proposed solution works fine.
Thanks.
Florin
On Tue, May 26, 2015 at 3:08 AM, Wesley Miao <wesley.mi...@gmail.com> wrote:
Try this way:
import org.apache.log4j.Logger

// A serializable holder; the @transient lazy val means the Logger itself is
// never serialized, but is re-created lazily on each executor when first used.
object Holder extends Serializable {
  @transient lazy val log = Logger.getLogger(getClass.getName)
}

val someRdd = spark.parallelize(List(1, 2, 3))
someRdd.map { element =>
  Holder.log.info(s"$element will be processed")
  element + 1
}
Hello!
I would like to use the logging mechanism provided by log4j, but I'm
getting the following exception:

Exception in thread "main" org.apache.spark.SparkException: Task not
serializable - Caused by: java.io.NotSerializableException:
org.apache.log4j.Logger
The code (and the problem) that I'm using resembles
The reason it didn't work for you is that the function you registered with
someRdd.map will be running on the worker/executor side, not in your
driver program. So you need to be careful not to accidentally close
over objects instantiated in your driver program, like the log
object in your code.
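The failure itself is not Spark-specific: it is ordinary JVM serialization rejecting a captured, non-serializable field, and `transient` is what makes the holder pattern above work. Here is a minimal plain-Java sketch of that mechanism, with no Spark dependency; `FakeLogger`, `BadTask`, `GoodTask`, and `SerializationDemo` are hypothetical names for illustration, not from the thread:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Stand-in for org.apache.log4j.Logger: it does NOT implement Serializable.
class FakeLogger {
    void info(String msg) { /* no-op */ }
}

// Captures the logger in a regular field: serializing this object fails,
// which is what surfaces in Spark as "Task not serializable".
class BadTask implements Serializable {
    FakeLogger log = new FakeLogger();
}

// The fix from the thread: mark the logger transient so serialization
// skips it; it can then be re-created on the receiving (executor) side.
class GoodTask implements Serializable {
    transient FakeLogger log = new FakeLogger();
}

public class SerializationDemo {
    // Returns true if the object survives Java serialization.
    static boolean serializes(Object o) {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (NotSerializableException e) {
            return false;
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(serializes(new BadTask()));   // false
        System.out.println(serializes(new GoodTask()));  // true
    }
}
```

Spark serializes the closure you pass to someRdd.map exactly like writeObject does here, which is why a transient (and, in the Scala version above, lazy) logger field sidesteps the exception.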