Hi Mukesh,

Once you create a streaming job, a DAG is created which contains your job
plan, i.e. all the transformations and all the output actions to be performed
on each batch of the streaming application.
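
For illustration, here is a minimal Scala sketch of such a job (the socket
source, the 5-second batch interval and the word-count logic are just my own
assumptions, not your setup). The chained transformations plus the output
operation are what make up that per-batch DAG:

  // Minimal sketch, Spark Streaming (Scala). Source/interval/logic are illustrative.
  import org.apache.spark.SparkConf
  import org.apache.spark.streaming.{Seconds, StreamingContext}

  object StreamingDagExample {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf().setAppName("StreamingDagExample")
      // The same DAG of transformations/actions is applied to every 5-second batch.
      val ssc = new StreamingContext(conf, Seconds(5))

      val lines  = ssc.socketTextStream("localhost", 9999)  // input DStream
      val counts = lines.flatMap(_.split(" "))              // transformation
                        .map(word => (word, 1))             // transformation
                        .reduceByKey(_ + _)                 // transformation
      counts.print()                                        // output operation

      ssc.start()             // from here on, the job plan runs once per batch
      ssc.awaitTermination()
    }
  }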

So, once your job is started, the input DStream takes data from the
specified source and all the transformations/actions are performed on each
batch according to that DAG. Once all the operations on a batch are done,
the RDDs generated for that batch are no longer needed and get evicted from
memory in LRU fashion.
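
If you need a batch's RDDs to stay around longer before that cleanup makes
them eligible for eviction, StreamingContext.remember() lets you extend the
window. A tiny sketch, reusing the ssc from the example above:

  // Ask Spark Streaming to keep each batch's RDDs for at least this long
  // before they become eligible for cleanup/eviction.
  import org.apache.spark.streaming.Minutes

  ssc.remember(Minutes(2))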




-----
Lalit Yadav
la...@sigmoidanalytics.com