Hi, we have multiple Spark jobs running on a single EMR cluster. All jobs share the same business-related configuration, which is stored in Postgres. How can we refresh this configuration data on all executors dynamically whenever the Postgres data changes, without restarting the Spark jobs?
We are using Kinesis for streaming. We tried creating a new Kinesis stream called "cache", pushing a dummy event to it, and processing that event in every Spark job to trigger a refresh of the configuration data on all executors, but this has not worked well. Is there a better approach for this problem, or a correct way to implement it? Thanks in advance.
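One common alternative to signalling via a side stream is to let each executor refresh the configuration lazily with a time-to-live (TTL): keep a per-process singleton cache that re-reads from Postgres whenever its copy is older than the TTL, so no restart or broadcast event is needed. Below is a minimal, hedged sketch of that pattern; `loader` stands in for whatever function you use to query Postgres (e.g. via psycopg2), and the class/parameter names are illustrative, not from any Spark API.

```python
import time
import threading


class RefreshableConfig:
    """Per-process config cache that reloads when older than ttl_seconds.

    In a Spark job, create this as a module-level singleton so each
    executor process keeps one copy and refreshes it lazily on access,
    e.g. from inside mapPartitions/foreachPartition.
    """

    def __init__(self, loader, ttl_seconds=60):
        self._loader = loader      # e.g. a function that queries Postgres
        self._ttl = ttl_seconds
        self._lock = threading.Lock()
        self._config = None
        self._loaded_at = 0.0

    def get(self):
        now = time.monotonic()
        if self._config is None or now - self._loaded_at > self._ttl:
            with self._lock:
                # Re-check after acquiring the lock: another thread
                # may have refreshed the cache while we were waiting.
                if self._config is None or time.monotonic() - self._loaded_at > self._ttl:
                    self._config = self._loader()
                    self._loaded_at = time.monotonic()
        return self._config
```

Each task would then call `config.get()` on the singleton instead of capturing the config in a closure or broadcast variable. The trade-off is eventual consistency: executors may serve configuration up to `ttl_seconds` stale, which is often acceptable for business parameters and avoids coordinating a refresh event across jobs.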