Spark Java Heap Size issue

Spark Java Heap Size issue

Jaggu
Hi Team,

I was trying to run a standalone app on a Spark cluster.
When I executed it, I got a Java heap size error.
I have two workers, each with 4 GB of RAM.

The error is pasted at http://pastebin.com/FCFj01UX

I have also set SPARK_WORKER_MEMORY and SPARK_DAEMON_MEMORY to 4g.

Any clue how to resolve this?

Best regards

Jaggu

Re: Spark Java Heap Size issue

mohankreddy
One way to fix the issue is to set spark.executor.memory in your SparkConf, e.g. ("spark.executor.memory", "8g").

ex code:

    import org.apache.spark.{SparkConf, SparkContext}

    // Configure executor memory before creating the SparkContext;
    // it cannot be changed after the context is started.
    val conf = new SparkConf()
      .set("spark.executor.memory", "8g")
      .set("spark.locality.wait", "10000")
    val sc = new SparkContext(master, "whatever", conf)
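
If you launch the app with spark-submit (available in newer Spark releases), the same setting can be passed on the command line instead of hard-coding it in the SparkConf. A sketch, assuming a hypothetical application jar and main class:

```shell
# Equivalent to conf.set("spark.executor.memory", "8g");
# MyApp and my-app.jar are placeholders for your own application.
spark-submit \
  --class MyApp \
  --master spark://master-host:7077 \
  --executor-memory 8g \
  my-app.jar
```

Note that the executor memory must fit within what each worker actually has available, or the executors will fail to launch.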