Does Spark dynamic allocation work with more than one workers?

Does Spark dynamic allocation work with more than one workers?

Varun kumar
Hi,

I'm using Spark dynamic allocation on a standalone cluster with 1 master (2 cores & 4 GB RAM) and 1 worker node (14 cores & 30 GB RAM). It works fine with that setting; however, when the number of worker instances is increased to 2 (7 cores & 15 GB RAM each) via spark-env.sh (SPARK_WORKER_INSTANCES=2, etc.), the Spark UI shows no workers running and 0 cores, 0 memory in use.
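For reference, a sketch of the relevant part of my spark-env.sh on the worker host (SPARK_WORKER_CORES and SPARK_WORKER_MEMORY shown with the per-instance values described above):

    # spark-env.sh on the worker machine
    SPARK_WORKER_INSTANCES=2    # run two worker daemons on this host
    SPARK_WORKER_CORES=7        # cores per worker instance
    SPARK_WORKER_MEMORY=15g     # memory per worker instance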

Does Spark's dynamic allocation work with more than one worker? If it does, can anyone let me know how to make it work?

Thanks,
Varun D.

Re: Does Spark dynamic allocation work with more than one workers?

srowen
Yes, it does. Dynamic allocation controls how many executors are allocated on the workers; it isn't related to the number of workers. Something else is wrong with your setup. By the way, you would not typically run multiple workers per machine at that scale.
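In case it helps, here is a minimal sketch of the spark-defaults.conf settings dynamic allocation needs on a standalone cluster (the min/max executor bounds below are illustrative values, not something from your setup):

    # spark-defaults.conf
    spark.dynamicAllocation.enabled        true
    # on standalone, workers must also run the external shuffle service
    # so executors can be removed without losing shuffle data
    spark.shuffle.service.enabled          true
    # optional, illustrative bounds on the executor count
    spark.dynamicAllocation.minExecutors   1
    spark.dynamicAllocation.maxExecutors   4

Note that spark.shuffle.service.enabled has to be set in the workers' configuration before they are started, not just on the driver.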
