Understanding spark.executor.memoryOverhead


Akash Mishra
Hi *, 

I would like to know which part of the Spark codebase uses spark.executor.memoryOverhead. I have a job with spark.executor.memory=18g, but it also requires spark.executor.memoryOverhead=4g for the same process; otherwise I get a task error:

ExecutorLostFailure (executor 28 exited caused by one of the running tasks) Reason: Container killed by YARN for exceeding memory limits. 23.7 GB of 22 GB physical memory used. Consider boosting spark.yarn.executor.memoryOverhead.
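If my understanding is right, on YARN the container request is the executor memory plus the overhead, and the overhead defaults to max(384 MiB, 10% of executor memory) when it is not set explicitly, which is how the 22 GB limit above comes about. A rough sketch of that calculation as I understand it (the names here are illustrative, not the actual Spark internals):

object ContainerSizeSketch {
  // Defaults as I understand them from the Spark-on-YARN code path:
  val OverheadFactor = 0.10
  val OverheadMinMiB = 384L

  // Total container memory YARN is asked for, in MiB.
  def containerMemoryMiB(executorMemoryMiB: Long, overheadMiB: Option[Long]): Long = {
    val overhead = overheadMiB.getOrElse(
      math.max((executorMemoryMiB * OverheadFactor).toLong, OverheadMinMiB))
    executorMemoryMiB + overhead
  }

  def main(args: Array[String]): Unit = {
    // 18g heap + 4g explicit overhead = 22528 MiB, i.e. the
    // "22 GB physical memory" limit YARN reports in the error above.
    println(containerMemoryMiB(18 * 1024, Some(4 * 1024)))
  }
}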

There is only 1 task running on this executor, and JVM heap usage is around 14 GB. I am not able to understand what exactly is using the other 5.7 GB of memory outside the Java heap.

Is it Netty for block reads, or something else?
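One thing I plan to check is the JVM's NIO direct-buffer pools over JMX, to see how much is being held off-heap. A small snippet along these lines (standard JDK MBeans, nothing Spark-specific) should print the pool sizes, although I gather Netty can also allocate off-heap memory that bypasses this accounting:

import java.lang.management.{BufferPoolMXBean, ManagementFactory}

// Print the JVM's NIO buffer pools ("direct" and "mapped").
// Netty's direct buffers may show up under the "direct" pool,
// but allocations made via Unsafe are not tracked here.
object DirectBufferUsage {
  def main(args: Array[String]): Unit = {
    val pools = ManagementFactory.getPlatformMXBeans(classOf[BufferPoolMXBean])
    pools.forEach { p =>
      println(s"${p.getName}: used=${p.getMemoryUsed} bytes, capacity=${p.getTotalCapacity} bytes")
    }
  }
}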



--

Regards,
Akash Mishra.


"It's not our abilities that make us, but our decisions."--Albus Dumbledore