Apache Spark User List
spark setting maximum available memory
In my situation, each slave has 8 GB of memory. I want to use the maximum memory that I can: .set("spark.executor.memory", "?g") ...
Hi Ibrahim, if your worker machines only have 8 GB of memory, then launching executors with all of that memory will leave no room for system processes...
Ideally you should use less: 75% would be good, to leave enough scratch space for shuffle writes and system processes. Mayur Rustagi P...
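The advice above can be sketched as a small helper that applies the ~75% heuristic to a node's total RAM. This is a minimal illustration, not part of the thread; the function name `executor_memory_setting` is hypothetical, and the exact fraction to reserve depends on your workload and what else runs on the node.

```python
def executor_memory_setting(node_ram_gb, fraction=0.75):
    """Compute a spark.executor.memory value from total node RAM,
    leaving headroom for the OS and shuffle scratch space
    (the ~75% heuristic suggested above)."""
    return f"{int(node_ram_gb * fraction)}g"

# For the 8 GB slaves from the question, this yields "6g",
# which would be passed as .set("spark.executor.memory", "6g")
print(executor_memory_setting(8))
```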