In my use case I have to use a config file (which changes on a daily basis) in a Spark Streaming job. To achieve this I am considering two approaches:
1. Restart the job daily so the broadcast variable is rebuilt. (Restarting the job is not a desirable option.)
2. Since the config file is very small, I can load it into a Scala map and broadcast it so it is available on all the executors. But I am stuck on the refreshing part: can anyone tell me how to refresh and re-broadcast this config data without restarting the job?
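For what it's worth, since Spark broadcast variables are immutable, the usual workaround for option 2 is to unpersist the old broadcast on the driver and create a fresh one when the config is stale (e.g. inside `transform`/`foreachRDD`, which run on the driver each batch). Below is a minimal, Spark-free sketch of the staleness check: a singleton cache that reloads a `key=value` file once its TTL has expired. All names here (`ConfigCache`, the one-day TTL, the file format) are my own assumptions, not anything from the Spark API; the comment marks where the actual `unpersist`/`broadcast` calls would go.

```scala
import java.nio.file.{Files, Paths}
import scala.jdk.CollectionConverters._

// Hypothetical helper: caches a small key=value config file and
// reloads it once the TTL (one day here) has elapsed.
object ConfigCache {
  private var cached: Map[String, String] = Map.empty
  private var lastLoaded: Long = 0L
  val ttlMs: Long = 24L * 60 * 60 * 1000 // refresh once a day

  // `now` is injectable so the staleness logic is easy to test.
  def get(path: String, now: Long = System.currentTimeMillis()): Map[String, String] =
    synchronized {
      if (cached.isEmpty || now - lastLoaded > ttlMs) {
        cached = load(path)
        lastLoaded = now
        // In the Spark job this is the point where you would call
        // oldBroadcast.unpersist() and then sc.broadcast(cached),
        // storing the new Broadcast handle for the next batch.
      }
      cached
    }

  private def load(path: String): Map[String, String] =
    Files.readAllLines(Paths.get(path)).asScala
      .map(_.trim)
      .filter(line => line.nonEmpty && line.contains("="))
      .map { line =>
        val idx = line.indexOf('=')
        line.substring(0, idx) -> line.substring(idx + 1)
      }
      .toMap
}
```

In a streaming job you would call something like `ConfigCache.get(path)` at the top of each `foreachRDD` block, so the file is re-read (and re-broadcast) at most once per day rather than once per batch.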