It is too verbose, and would significantly increase the size of the event log. Here is the comment in the code:
// No-op because logging every update would be overkill
override def onBlockUpdated(event: SparkListenerBlockUpdated): Unit = {}

On Thu, Feb 23, 2017 at 11:42 AM, Parag Chaudhari <paragp...@gmail.com> wrote:
> Thanks a lot for the information!
>
> Is there any reason why EventLoggingListener ignores this event?
>
> *Thanks,*
>
> *Parag*
>
> On Wed, Feb 22, 2017 at 7:11 PM, Saisai Shao <sai.sai.s...@gmail.com>
> wrote:
>
>> AFAIK, Spark's EventLoggingListener ignores the BlockUpdate event, so it
>> will not be written into the event log. I think that's why you cannot get
>> such info in the history server.
>>
>> On Thu, Feb 23, 2017 at 9:51 AM, Parag Chaudhari <paragp...@gmail.com>
>> wrote:
>>
>>> Hi,
>>>
>>> I am running spark shell in Spark version 2.0.2. Here is my program:
>>>
>>> var myrdd = sc.parallelize(Array.range(1, 10))
>>> myrdd.setName("test")
>>> myrdd.cache
>>> myrdd.collect
>>>
>>> But I am not able to see any RDD info in the "storage" tab in the Spark
>>> history server.
>>>
>>> I looked at this
>>> <https://forums.databricks.com/questions/117/why-is-my-rdd-not-showing-up-in-the-storage-tab-of.html>
>>> but it is not helping, as I have the exact same program mentioned there.
>>> Can anyone help?
>>>
>>> *Thanks,*
>>>
>>> *Parag*
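
For anyone who needs block-update information despite the no-op above, one possible workaround is to register a custom listener on the live SparkContext and handle the event yourself. This is only a minimal sketch against the Spark 2.x SparkListener API; the class name BlockUpdateLogger is made up, and what you do with the event (here just printing) is up to you:

```scala
import org.apache.spark.scheduler.{SparkListener, SparkListenerBlockUpdated}

// Hypothetical listener that reports the block updates that
// EventLoggingListener deliberately drops from the event log.
class BlockUpdateLogger extends SparkListener {
  override def onBlockUpdated(event: SparkListenerBlockUpdated): Unit = {
    val info = event.blockUpdatedInfo
    println(s"Block ${info.blockId} updated on ${info.blockManagerId.host}: " +
      s"mem=${info.memSize} bytes, disk=${info.diskSize} bytes, " +
      s"level=${info.storageLevel}")
  }
}

// In the spark shell, attach it to the running context:
// sc.addSparkListener(new BlockUpdateLogger)
```

Note this only reports updates to the live application; it does not make RDD storage info appear in the history server, since that would require the events to be written to the event log in the first place.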