As per the solution, if we are stopping and restarting the query, then what
happens to the state which is maintained in memory? Will that be retained?
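For reference, Structured Streaming backs its in-memory state with the query's checkpoint directory, so a query that is restarted against the same checkpointLocation should recover its state rather than lose it. Below is a minimal sketch of that idea; the rate source, the aggregation, and the checkpoint path are illustrative assumptions, not code from this thread:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.streaming.StreamingQuery

val spark = SparkSession.builder().appName("stateful-restart").getOrCreate()
import spark.implicits._

// Hypothetical stateful aggregation; the running counts are the in-memory state.
val counts = spark.readStream
  .format("rate")                                 // assumption: any streaming source works here
  .load()
  .groupBy(($"value" % 10).as("bucket"))
  .count()

// Because the sink uses a fixed checkpointLocation, the aggregation state is
// written there and recovered when the query is started again after stop().
val query: StreamingQuery = counts.writeStream
  .outputMode("complete")
  .format("console")
  .option("checkpointLocation", "/tmp/state-checkpoint")   // hypothetical path
  .start()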
Appu,
I have run into the same problem.
Were you able to solve this issue? Could you please share a snippet of your
code if you were able to?
Thanks,
Naresh
On Wed, Feb 14, 2018 at 8:04 PM, Tathagata Das wrote:
1. Just loop like this.
def startQuery(): StreamingQuery = {
  // Define the dataframes and start the query
}

// call this on the main thread
while (notShutdown) {
  val query = startQuery()
  query.awaitTermination(refreshIntervalMs)
  query.stop()
  // refresh static data
}
2. Yes,
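To make the loop above concrete, here is a minimal, self-contained sketch in Scala. The Kafka source, Parquet sink, paths, join key, refresh interval, and the loadStaticData helper are all illustrative assumptions, not anything prescribed in this thread:

import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.streaming.StreamingQuery

val spark = SparkSession.builder().appName("periodic-static-refresh").getOrCreate()

// Hypothetical helper: reload the slowly changing reference data from storage.
def loadStaticData(): DataFrame =
  spark.read.parquet("/data/reference")          // assumption: reference data in Parquet

// Recreate the dataframes and start a fresh query against the current snapshot.
def startQuery(staticDf: DataFrame): StreamingQuery = {
  val stream = spark.readStream
    .format("kafka")                             // hypothetical source
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
    .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value")

  stream.join(staticDf, Seq("key"))              // assumption: both sides have a "key" column
    .writeStream
    .format("parquet")
    .option("path", "/data/out")
    .option("checkpointLocation", "/data/checkpoint")
    .start()
}

val refreshIntervalMs = 60 * 60 * 1000L          // e.g. refresh the static side hourly
var notShutdown = true                           // flip to false on shutdown

while (notShutdown) {
  val query = startQuery(loadStaticData())       // reload static data, restart the join
  query.awaitTermination(refreshIntervalMs)      // block on the main thread until timeout
  query.stop()                                   // stop so the next loop sees fresh data
}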
TD,
Thanks a lot for the quick reply :)
Did I understand it right that, in the main thread, I will not be able to use
outStream.awaitTermination() to wait for the termination of the context
[ since I'll be stopping it from inside another thread ]?
What would be a good approach to keep the main app running?
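For what it's worth, awaitTermination() (without a timeout) does return once stop() is called, even when stop() comes from a different thread, so the main thread can still block on it while another thread triggers the restart. A rough sketch of that pattern, reusing startQuery() from TD's reply and assuming a hypothetical scheduled executor and refresh interval:

import java.util.concurrent.{Executors, TimeUnit}

// Hypothetical scheduler that stops the running query from another thread
// whenever the static data needs to be refreshed.
val scheduler = Executors.newSingleThreadScheduledExecutor()
val refreshIntervalMs = 60 * 60 * 1000L

var notShutdown = true
while (notShutdown) {
  val query = startQuery()                       // same startQuery() as in TD's sketch above
  scheduler.schedule(new Runnable {
    override def run(): Unit = query.stop()      // stop from a separate thread
  }, refreshIntervalMs, TimeUnit.MILLISECONDS)

  // awaitTermination() returns once the scheduled stop() (or a failure)
  // terminates the query, so the main thread is kept alive here.
  query.awaitTermination()
  // refresh static data here, then loop around and start a new query
}
scheduler.shutdown()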
Let me fix my mistake :)
What I suggested in that earlier thread does not work. A streaming query that
joins a streaming dataset with a batch view does not correctly pick up the
changes when the view is updated. It works only when you restart the query.
That is:
- stop the query
- recreate the dataframes
- start the query again
More specifically, quoting TD from the previous thread:
"Any streaming query that joins a streaming dataframe with the view will
automatically start using the most updated data as soon as the view is
updated."
Wondering if I'm doing something wrong in
Hi,
I had followed the instructions from the thread
https://mail-archives.apache.org/mod_mbox/spark-user/201704.mbox/%3cd1315d33-41cd-4ba3-8b77-0879f3669...@qvantel.com%3E
while trying to reload a static data frame periodically that gets joined to a
structured streaming query.
However, the