Thanks Terry for the quick answer.

I have not tried it yet. Let's say I increase the value to 20000: what side
effects should I expect? The documentation for the property says "How many
finished batches the Spark UI and status APIs remember before garbage
collecting." So the data is stored in memory, but in the memory of which
component? I imagine the driver?
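For what it's worth, a minimal sketch of how the setting could be raised at submit time. The 20000 value is just the figure discussed above, and bumping driver memory alongside it is my assumption, based on the docs saying the retained batch data is held by the UI/status APIs (which live in the driver):

```shell
# Sketch: retain more finished batches in the Streaming UI.
# 20000 is the example value from this thread, not a recommendation.
spark-submit \
  --conf spark.streaming.ui.retainedBatches=20000 \
  --driver-memory 4g \
  your-streaming-app.jar
```

The trade-off would be that the driver keeps per-batch UI metadata for 20x more batches than the default 1000, so driver heap usage grows accordingly.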

regards,

On Fri, Jan 29, 2016 at 10:52 AM, Terry Hoo <hujie.ea...@gmail.com> wrote:

> Hi Mehdi,
>
> Have you tried a larger value of "spark.streaming.ui.retainedBatches" (the
> default is 1000)?
>
> Regards,
> - Terry
>
> On Fri, Jan 29, 2016 at 5:45 PM, Mehdi Ben Haj Abbes <
> mehdi.ab...@gmail.com> wrote:
>
>> Hi folks,
>>
>> I have a streaming job that has been running for more than 24 hours. It
>> seems there is a limit on the number of batches displayed on the Streaming
>> Statistics visualization screen. For example, if I launch a job on Friday,
>> I will not be able to see the statistics for what happened on Saturday. I
>> used to see the batches that ran during the previous 24 hours, but today it
>> was only about the previous 3 hours.
>>
>> Any help will be very appreciated.
>> --
>> Mehdi BEN HAJ ABBES
>>
>>
>


-- 
Mehdi BEN HAJ ABBES
