I am wondering if there is a way to update an RDD that is used in a transform 
operation of a DStream.
To use the example from the Spark Streaming programming guide, let's say I want 
to update my spamInfoRDD once an hour without having to restart the streaming 
app.

If an RDD used in a transform operation on a DStream can't be updated, is there 
an alternative strategy using custom receivers and DStreams?
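For what it's worth, since the function passed to `transform` is re-evaluated on every batch interval, one common workaround is to hold the reference RDD in a mutable variable and swap in a freshly loaded (and cached) RDD when a time-to-live expires, instead of restarting the app. Below is a minimal sketch of just that refresh logic in plain Python, outside of Spark; the `load_spam_info` loader and the one-hour TTL are stand-ins for however you query your SQL database and build the new RDD:

```python
import time


class RefreshableRef:
    """Hold a value and reload it once it is older than ttl_seconds.

    In a streaming job, get() would be called from inside the function
    passed to DStream.transform, so the reload happens at most once per
    TTL window rather than on every batch.
    """

    def __init__(self, loader, ttl_seconds):
        self._loader = loader
        self._ttl = ttl_seconds
        self._value = loader()
        self._loaded_at = time.monotonic()

    def get(self):
        # Reload only when the cached value has expired.
        if time.monotonic() - self._loaded_at >= self._ttl:
            self._value = self._loader()
            self._loaded_at = time.monotonic()
        return self._value


# Hypothetical loader standing in for the SQL query that rebuilds
# the spam-info data; the call log just makes the behavior visible.
calls = []


def load_spam_info():
    calls.append(1)
    return {"spam.example.com"}


ref = RefreshableRef(load_spam_info, ttl_seconds=3600)
ref.get()  # within the TTL, so this does not hit the loader again
```

The same idea applies if you distribute the reference data as a broadcast variable instead of an RDD: rebuild and re-broadcast it inside the refresh branch.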

I just want to be able to leave my streaming app up 24/7 but add and remove 
filtering objects on a rather infrequent basis. My data is coming from a SQL 
database and is large enough that I don't want to query it as frequently as my 
primary incoming data stream.

Thanks,
Sean