bq. streamingContext.remember("duration") did not help

Can you give a bit more detail on the above?
Did you mean the job encountered an OOME later on?

Which Spark release are you using?

I tried these two global settings (and restarted the app) after enabling
caching for stream1:
conf.set("spark.streaming.unpersist", "true")

streamingContext.remember(Seconds(batchDuration * 4))

The batch duration is 4 seconds.
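For context, here is a minimal sketch of where those two settings sit in a Spark Streaming app (the app name and skeleton are hypothetical; the two settings are the ones quoted above). Note that `remember` controls how long generated RDDs are retained, so setting it to 4x the batch interval keeps *more* data in memory, not less:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Hypothetical skeleton; only the two settings below come from the thread.
val batchDuration = 4 // seconds, as stated above

val conf = new SparkConf()
  .setAppName("StreamingApp") // hypothetical name
  // Eagerly unpersist RDDs generated by DStreams once they are no
  // longer needed (this is already the default in Spark 1.x).
  .set("spark.streaming.unpersist", "true")

val ssc = new StreamingContext(conf, Seconds(batchDuration))

// Keep RDDs from roughly the last 4 batches (16 s) before they become
// eligible for cleanup. This lengthens retention, which increases
// memory pressure rather than reducing it.
ssc.remember(Seconds(batchDuration * 4))
```

Since `spark.streaming.unpersist` defaults to `true`, setting it explicitly should be a no-op, and a larger `remember` window combined with caching stream1 would tend to increase memory usage.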

I am using spark-1.4.1. The application runs for about 4-5 hours and then
hits the out-of-memory error.





--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/adding-a-split-and-union-to-a-streaming-application-cause-big-performance-hit-tp26259p26269.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
