Hello,

I am working on a use case where I need to join a streaming DataFrame 
with a static DataFrame.
The streaming DataFrame continuously gets data from Kafka topics, whereas 
the static DataFrame fetches data from a database table.
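Roughly, my setup looks like the following (Scala; the topic name, the 
connection details, and the join key "id" are placeholders, not my real 
names):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("stream-static-join").getOrCreate()

// Streaming side: continuous feed from a Kafka topic.
val streamingDf = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "broker:9092")
  .option("subscribe", "events")
  .load()
  .selectExpr("CAST(key AS STRING) AS id", "CAST(value AS STRING) AS payload")

// Static side: a snapshot of the database table, read over JDBC.
val staticDf = spark.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://dbhost:5432/mydb")
  .option("dbtable", "reference_table")
  .option("user", "user")
  .option("password", "password")
  .load()

// Stream-static equi-join on the (placeholder) "id" column.
val joinedDf = streamingDf.join(staticDf, Seq("id"))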

However, as the underlying database table is updated often, I must 
somehow refresh my static DataFrame periodically so that it reflects the 
latest contents of the table.
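To make the requirement concrete, the behaviour I am after is roughly the 
sketch below, building on the setup above (entirely hypothetical: 
loadReferenceTable, the 10-minute interval, and the output/checkpoint 
paths are names I made up). Each micro-batch would join against a cached 
snapshot that is re-read once it is older than the interval, but I do not 
know whether this is a reliable or recommended pattern:

import org.apache.spark.sql.DataFrame

// Hypothetical helper: one fresh snapshot of the database table.
def loadReferenceTable(): DataFrame =
  spark.read
    .format("jdbc")
    .option("url", "jdbc:postgresql://dbhost:5432/mydb")
    .option("dbtable", "reference_table")
    .option("user", "user")
    .option("password", "password")
    .load()

val refreshIntervalMs = 10 * 60 * 1000L  // placeholder: refresh every 10 minutes
var cachedStatic: DataFrame = loadReferenceTable().cache()
var lastLoadMs = System.currentTimeMillis()

val query = streamingDf.writeStream
  .option("checkpointLocation", "/tmp/checkpoints/join-refresh")
  .foreachBatch { (batch: DataFrame, batchId: Long) =>
    // Re-read the table when the cached snapshot is older than the interval.
    if (System.currentTimeMillis() - lastLoadMs > refreshIntervalMs) {
      cachedStatic.unpersist()
      cachedStatic = loadReferenceTable().cache()
      lastLoadMs = System.currentTimeMillis()
    }
    // Plain batch join inside the micro-batch, then write out.
    batch.join(cachedStatic, Seq("id"))
      .write.mode("append").parquet("/tmp/joined-output")
  }
  .start()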

My questions:

1. Is it possible to periodically refresh the static DataFrame?

2. If refreshing the static DataFrame is not possible, is there a 
mechanism to automatically stop and restart the Spark Structured Streaming 
job, so that every time the job restarts, the static DataFrame is rebuilt 
with the latest information from the underlying table? (I have included a 
rough sketch of this idea below the list.)

3. If 1) and 2) are not possible, please suggest alternatives to achieve 
the requirement described above.
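Regarding question 2, this is the kind of stop-and-restart loop I had in 
mind (again only a sketch; it reuses the loadReferenceTable helper and the 
placeholder paths from the sketch above). My understanding is that the 
checkpoint would let each restarted query resume from its last Kafka 
offsets, while the static side is re-read on every cycle:

// Hypothetical restart loop: run the query for a fixed interval,
// stop it, and start a new one with a freshly read static snapshot.
val restartIntervalMs = 30 * 60 * 1000L  // placeholder: 30 minutes

while (true) {
  val staticSnapshot = loadReferenceTable()  // fresh read each cycle

  val query = streamingDf.join(staticSnapshot, Seq("id"))
    .writeStream
    .format("parquet")
    .option("path", "/tmp/joined-output")
    .option("checkpointLocation", "/tmp/checkpoints/join-restart")
    .start()

  // Block until the interval elapses (or the query fails), then stop
  // the query; the next loop iteration rebuilds the static DataFrame.
  query.awaitTermination(restartIntervalMs)
  query.stop()
}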

Thanks,
Hemanth
