How to stop streaming before the application gets killed

How to stop streaming before the application gets killed

noppanit
I'm trying to write a deployment job for a Spark application. Basically the job sends yarn application --kill app_id to the cluster, but after the application receives the signal it dies without finishing whatever it is processing or stopping the stream.

I'm using Spark Streaming. What's the best way to stop a Spark application so that we don't lose any data?


Re: [E] How to stop streaming before the application gets killed

Rastogi, Pankaj
You can add a shutdown hook to your JVM and ask the Spark StreamingContext to stop gracefully.
  /**
   * Shutdown hook to stop the StreamingContext gracefully before the JVM exits.
   * @param ssCtx the streaming context to stop
   */
  def addShutdownHook(ssCtx: StreamingContext) = {
    Runtime.getRuntime.addShutdownHook(new Thread() {
      override def run() = {
        println("In shutdown hook")
        // stop gracefully: shut down the SparkContext too, and wait for all
        // received data to be processed before stopping
        ssCtx.stop(true, true)
      }
    })
  }
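
For completeness, here is a minimal sketch of how the hook might be wired into a streaming driver. The app name, batch interval and socket source are placeholders (not from the original post); the point is just that addShutdownHook is registered before ssc.start(), so the TERM signal from YARN triggers a graceful stop instead of an abrupt exit.

  import org.apache.spark.SparkConf
  import org.apache.spark.streaming.{Seconds, StreamingContext}

  object StreamingJob {

    def main(args: Array[String]): Unit = {
      // placeholder app name and batch interval
      val conf = new SparkConf().setAppName("graceful-stop-example")
      val ssc  = new StreamingContext(conf, Seconds(10))

      // register the graceful-stop hook before starting the streams
      addShutdownHook(ssc)

      // placeholder source and output, just so the context has work to do
      val lines = ssc.socketTextStream("localhost", 9999)
      lines.count().print()

      ssc.start()
      ssc.awaitTermination()
    }

    /** Same hook as above: stop the streaming context gracefully on JVM shutdown. */
    def addShutdownHook(ssCtx: StreamingContext) =
      Runtime.getRuntime.addShutdownHook(new Thread() {
        override def run() = ssCtx.stop(true, true)
      })
  }

Passing both flags as true stops the underlying SparkContext as well and waits for data that has already been received to finish processing, which is what prevents the kill from dropping in-flight batches. Note that YARN typically follows the TERM signal with a KILL after a short grace period, so a long graceful drain may require increasing that delay on the NodeManagers.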
Pankaj
