spark setting maximum available memory


spark setting maximum available memory

proofmoore
In my situation, each slave has 8 GB of memory. I want to use the maximum memory that I can: .set("spark.executor.memory", "?g")
How can I determine how much memory I should set? It fails when I set it to 8 GB.

Re: spark setting maximum available memory

Andrew Or-2
Hi Ibrahim,

If your worker machines only have 8 GB of memory, then launching executors with all of it will leave no room for system processes. There is no hard guideline, but I usually leave around 1 GB just to be safe, so:

conf.set("spark.executor.memory", "7g")

Andrew
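
For reference, a minimal sketch of a full SparkConf set up along these lines (the app name and master URL below are placeholders, not taken from this thread):

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("example-app")           // placeholder app name
  .setMaster("spark://master:7077")    // placeholder master URL
  .set("spark.executor.memory", "7g")  // 8 GB node minus ~1 GB of headroom
val sc = new SparkContext(conf)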




Re: spark setting maximum available memory

Mayur Rustagi
Ideally you should use less: around 75% would be good, to leave enough scratch space for shuffle writes and system processes.

Mayur Rustagi
Ph: +1 (760) 203 3257
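
As a rough sketch of that rule of thumb for the 8 GB slaves in the question (assuming nothing else memory-hungry runs on the node):

import org.apache.spark.SparkConf

val nodeMemGb = 8                             // physical memory per slave
val executorMemGb = (nodeMemGb * 0.75).toInt  // 75% of 8 GB = 6 GB
val conf = new SparkConf()
  .set("spark.executor.memory", s"${executorMemGb}g")  // "6g"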

