Hi Team,

Could you please help me with the query below.

I'm using JavaStreamingContext to read streaming files from a shared HDFS
directory. When I start the Spark Streaming job, it reads the files from the
shared directory and does some processing. But when I stop and restart the
job, it reads the old files again. Is there a way to persist checkpoint
information in Spark so that already-processed files are not re-read after a
restart?
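For context, here is a stripped-down sketch of the kind of setup I mean, with checkpoint-based recovery via JavaStreamingContext.getOrCreate (the paths, app name, and batch interval below are just placeholders, not my real values). Is this the intended way to make a restart resume from the checkpoint instead of starting fresh?

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.Function0;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class CheckpointedFileStream {
    // Placeholder paths for the shared HDFS input and the checkpoint directory.
    private static final String INPUT_DIR = "hdfs:///shared/input";
    private static final String CHECKPOINT_DIR = "hdfs:///shared/checkpoint";

    public static void main(String[] args) throws Exception {
        // The factory is only invoked when no checkpoint exists yet. On a
        // restart, getOrCreate rebuilds the context (including its progress
        // metadata) from CHECKPOINT_DIR instead of calling the factory.
        Function0<JavaStreamingContext> factory = () -> {
            SparkConf conf = new SparkConf().setAppName("CheckpointedFileStream");
            JavaStreamingContext jssc =
                new JavaStreamingContext(conf, Durations.seconds(30));
            jssc.checkpoint(CHECKPOINT_DIR);
            JavaDStream<String> lines = jssc.textFileStream(INPUT_DIR);
            lines.foreachRDD(rdd -> {
                // Per-batch processing goes here.
            });
            return jssc;
        };

        JavaStreamingContext jssc =
            JavaStreamingContext.getOrCreate(CHECKPOINT_DIR, factory);
        jssc.start();
        jssc.awaitTermination();
    }
}
```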
