Spark executor pods not getting killed after task completion


manishgupta88
Hi

I am trying to run spark-submit on Kubernetes. I am able to achieve the desired result in that the driver and executor pods are launched according to the given configuration, and my job runs successfully.
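For reference, the submission looks roughly like this (a sketch: the API server address, container image, and class name stand in for my actual values):

    spark-submit \
      --master k8s://https://<api-server-host>:6443 \
      --deploy-mode cluster \
      --name my-job \
      --class com.example.MyJob \
      --conf spark.executor.instances=2 \
      --conf spark.kubernetes.container.image=<my-image> \
      local:///opt/spark/jars/my-job.jar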

But even after the job completes, the driver pod stays in the Running state and none of the executor pods are killed, whereas when I run the simple SparkPi application to test with the same image, the executors are killed and the driver shows the status Completed.

Can someone please guide me on this issue?

Regards
Manish Gupta

Re: Spark executor pods not getting killed after task completion

manishgupta88
The issue got resolved after closing the SparkContext at the end of the application.
https://stackoverflow.com/questions/57964848/spark-job-in-kubernetes-stuck-in-running-state
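
For anyone hitting the same thing, here is a minimal sketch of the fix (the object name, app name, and placeholder workload are illustrative, not from my actual job):

    import org.apache.spark.sql.SparkSession

    object MyJob {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("my-job")
          .getOrCreate()

        try {
          // ... actual job logic goes here ...
          spark.range(100).count()  // placeholder workload
        } finally {
          // Without this call, the driver pod stays in the Running state
          // on Kubernetes and the executor pods are never torn down.
          spark.stop()
        }
      }
    }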


