DB Config data update across multiple Spark Streaming Jobs
Hi, we have multiple Spark jobs running on a single EMR cluster. All jobs share the same business-related configuration, which is stored in Postgres. How can we propagate this configuration data to all executors dynamically whenever the Postgres data changes, without restarting the Spark jobs? We are using PySpark.
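One common pattern for this (a sketch, not a definitive answer): keep a module-level, TTL-cached config loader on each executor process, so the config is re-fetched from Postgres only after the TTL expires and DB changes propagate within one TTL window, with no restart. The class and the `fetch_from_postgres` stub below are hypothetical names; in a real job the fetch function would run a `SELECT` via psycopg2, and executor code would call `config.get()` inside `mapPartitions`/`foreachBatch`.

```python
import time

# Hypothetical TTL-cached config loader: each executor process caches the
# config dict and re-fetches it only after `ttl_seconds` elapse, so updates
# in Postgres reach executors without restarting the Spark jobs.
class RefreshingConfig:
    def __init__(self, fetch_fn, ttl_seconds=60):
        self._fetch = fetch_fn      # e.g. a psycopg2 SELECT (assumption)
        self._ttl = ttl_seconds
        self._cached = None
        self._loaded_at = 0.0

    def get(self):
        now = time.time()
        if self._cached is None or now - self._loaded_at >= self._ttl:
            self._cached = self._fetch()
            self._loaded_at = now
        return self._cached

# Stub standing in for a query against the Postgres config table.
_store = {"rate_limit": 100}
def fetch_from_postgres():
    return dict(_store)

config = RefreshingConfig(fetch_from_postgres, ttl_seconds=0.1)
print(config.get()["rate_limit"])   # 100
_store["rate_limit"] = 200          # simulate an update in Postgres
time.sleep(0.2)                     # wait past the TTL
print(config.get()["rate_limit"])   # 200
```

An alternative is to re-read the config on the driver inside `foreachBatch` and re-broadcast it each micro-batch; the TTL cache avoids that per-batch driver round trip at the cost of bounded staleness.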
Hi, I watched your PySpark video. Do you have PDF manuals for PySpark SQL? Thanks a lot.