You could use dstream.foreachRDD((rdd, time) => ???) to get access
to the time of each batch.
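In the Scala API, foreachRDD has an overload that passes the batch Time alongside the RDD; Time.milliseconds gives the batch timestamp in epoch millis. A minimal sketch of tagging each log line with its batch start time (the socket source, host/port, and 60-second interval are just illustrative assumptions, not part of the original question):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext, Time}

object BatchTimeExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("BatchTime")
    // 60-second batch interval, matching the 1-minute micro-batches in the question
    val ssc = new StreamingContext(conf, Seconds(60))

    // Hypothetical source for illustration; any DStream works the same way
    val lines = ssc.socketTextStream("localhost", 9999)

    // The (RDD, Time) overload of foreachRDD receives the batch's start Time
    lines.foreachRDD { (rdd, time: Time) =>
      val batchMillis = time.milliseconds  // batch timestamp as epoch millis
      // Tag every log line with the batch it belongs to
      val tagged = rdd.map(line => (batchMillis, line))
      tagged.take(5).foreach(println)
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Since batchMillis is captured in the closure, it is shipped to the executors with each batch, so there is no need to re-broadcast a timestamp from a StreamingListener.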

On Fri, Dec 19, 2014 at 12:59 PM, nelson <nelson.verd...@ysance.com> wrote:
>
> Hi all,
>
> I know the topic has been discussed before, but I couldn't find an answer
> that suits me.
>
> How do you retrieve the current batch timestamp in Spark Streaming? Maybe
> via BatchInfo, but it does not seem to be linked to the StreamingContext or
> anything else... I currently have 1-minute micro-batches, and I'd like to link
> every log I process with the start of the batch it belongs to.
>
> I also thought about broadcasting a new timestamp at the end of every batch
> via a StreamingListener, but I couldn't manage to overwrite the first
> broadcast value.
>
> Do you guys have any ideas?
> Thanks a lot,
>
> Nelson.
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Batch-timestamp-in-spark-streaming-tp20786.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>
