spark setting maximum available memory

spark setting maximum available memory – In my situation each slave has 8 GB of memory. I want to use the maximum memory that I can: .set("spark.executor.memory", "?g") ...
Hi Ibrahim, If your worker machines only have 8 GB of memory, then launching executors with all the memory will leave no room for system processes...
Ideally you should use less; 75% would be good, to leave enough for scratch space for shuffle writes and system processes. Mayur Rustagi P...
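
As a rough sketch of that advice (assuming 8 GB workers, so roughly 6 GB per executor; the app name and master URL below are placeholders, not values from the thread), the executor memory can be set on the SparkConf before the context is created:

    import org.apache.spark.{SparkConf, SparkContext}

    // Leave roughly 25% of an 8 GB worker for the OS and shuffle scratch
    // space, so each executor gets about 6 GB.
    val conf = new SparkConf()
      .setAppName("executor-memory-example")   // placeholder app name
      .setMaster("spark://master:7077")        // placeholder master URL
      .set("spark.executor.memory", "6g")

    val sc = new SparkContext(conf)

The same value can also be supplied at submit time with spark-submit --executor-memory 6g instead of hard-coding it in the application.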