Error - Dropping SparkListenerEvent because no remaining room in event queue


Error - Dropping SparkListenerEvent because no remaining room in event queue

karan alang
Hello -
we are running a Spark job and getting the following error:

"LiveListenerBus: Dropping SparkListenerEvent because no remaining room in event queue"

As per the recommendation in the Spark docs, I've increased the value of the property
spark.scheduler.listenerbus.eventqueue.capacity to 90000 (from the default of 10000)
and also increased the driver memory.

That seems to have mitigated the issue.

The question is - is there any code optimization (or anything else) that can be done to resolve this problem?
Please note - we are primarily using functions like reduce(), collectAsList(), and persist() as part of the job.
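For reference, the two changes described above can both be applied at submit time. A sketch (the main class, jar name, and the 8g memory value are placeholders, not from the original post; pick values that fit your cluster):

```shell
# Raise the listener bus queue capacity (default 10000) and the
# driver memory when submitting the job. In Spark 2.2 the capacity
# property applies to the single shared listener event queue.
spark-submit \
  --conf spark.scheduler.listenerbus.eventqueue.capacity=90000 \
  --driver-memory 8g \
  --class com.example.MyJob \
  my-job.jar
```

The same properties can also be set via SparkConf in code or in spark-defaults.conf; a larger queue only buys headroom, it does not make the listeners drain events any faster.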
Re: Error - Dropping SparkListenerEvent because no remaining room in event queue

karan alang

Please note - the Spark version is 2.2.0.

On Wed, Oct 24, 2018 at 3:57 PM karan alang <[hidden email]> wrote:
Re: Error - Dropping SparkListenerEvent because no remaining room in event queue

Arun Mahadevan
Maybe you have Spark listeners that are not processing the events fast enough?
Do you have Spark event logging enabled?
You might have to profile the built-in and your custom listeners to see what's going on.

- Arun
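The dropping behavior being discussed comes from the listener bus using a bounded in-memory queue: the scheduler posts events without blocking, and when listeners drain the queue more slowly than events arrive, new events are discarded once the queue is full. A minimal sketch of that mechanism (plain Python, an illustration only - not Spark's actual implementation):

```python
import queue


class ToyListenerBus:
    """Bounded event queue that drops events when full,
    loosely mimicking LiveListenerBus behavior."""

    def __init__(self, capacity):
        self.q = queue.Queue(maxsize=capacity)
        self.dropped = 0

    def post(self, event):
        try:
            # Non-blocking: posting must never stall the scheduler.
            self.q.put_nowait(event)
        except queue.Full:
            # The event is simply lost; Spark logs a warning here.
            self.dropped += 1

    def drain(self, n):
        """Simulate the listener thread processing up to n events."""
        processed = 0
        for _ in range(n):
            try:
                self.q.get_nowait()
                processed += 1
            except queue.Empty:
                break
        return processed


# Producer (the scheduler) posts faster than the listener thread drains:
bus = ToyListenerBus(capacity=10)
for i in range(25):
    bus.post(f"SparkListenerEvent-{i}")
print(bus.dropped)  # 15: only 10 of the 25 events fit in the queue
```

Raising spark.scheduler.listenerbus.eventqueue.capacity corresponds to enlarging `capacity` above, while profiling the listeners (as suggested) attacks the slow-drain side of the same imbalance.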
