DB Config data update across multiple Spark Streaming Jobs


DB Config data update across multiple Spark Streaming Jobs

forece85
Hi,

We have multiple Spark jobs running on a single EMR cluster. All jobs use the
same business-related configuration, which is stored in Postgres. How can we
update this configuration data on all executors dynamically whenever the
Postgres data changes, without restarting the Spark jobs?

We are using Kinesis for streaming. We tried creating a new Kinesis stream
called "cache", pushing a dummy event to it, and having every Spark job
process that event to refresh the configuration data on all executors, but
this has not worked well. Is there a better approach to this problem, or a
correct way to implement it?

Thanks in Advance.
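(A common alternative to the dummy-event approach, sketched here as a suggestion rather than anything from the thread: keep an executor-local config cache with a time-to-live, so each executor transparently re-reads the config from Postgres once the TTL expires. No restart and no extra Kinesis stream is needed; changes propagate within one TTL window. The `loader` callable is an assumption standing in for whatever Postgres query the jobs actually run, e.g. via psycopg2.)

```python
import time
import threading

class RefreshableConfig:
    """Executor-local configuration cache with a TTL.

    Each executor process keeps one instance (e.g. a module-level
    singleton) and tasks call get() inside mapPartitions /
    foreachPartition. When the TTL expires, the next call invokes
    the loader again (e.g. a Postgres query), so config changes
    reach all executors without a Spark restart.
    """

    def __init__(self, loader, ttl_seconds=60, clock=time.monotonic):
        self._loader = loader        # assumption: callable returning the config dict
        self._ttl = ttl_seconds
        self._clock = clock          # injectable for testing
        self._lock = threading.Lock()
        self._config = None
        self._loaded_at = None

    def get(self):
        # Serialize refreshes so only one task per executor hits Postgres.
        with self._lock:
            now = self._clock()
            if self._config is None or now - self._loaded_at >= self._ttl:
                self._config = self._loader()
                self._loaded_at = now
            return self._config
```

Inside a streaming job this would be used as a lazily created singleton on each executor, so every partition of every micro-batch sees config that is at most `ttl_seconds` stale; choosing the TTL trades Postgres load against propagation delay.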



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: [hidden email]


Re: DB Config data update across multiple Spark Streaming Jobs

forece85
Any suggestions on this? How can we update configuration data on all
executors without downtime?

