SparkAppHandle cannot stop application in yarn-client mode

张 帅
Hi all,

When I use SparkLauncher to start an application in yarn-client mode, SparkAppHandle#stop() cannot stop the application.

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

SparkLauncher launcher = new SparkLauncher()
        .setAppName("My Launcher")
        .setJavaHome("/usr/bin/hadoop/software/java")
        .setSparkHome("/usr/bin/hadoop/software/sparkonyarn")
        .setConf("spark.executor.instances", "1")
        .setConf("spark.executor.memory", "1G")
        .setConf(SparkLauncher.EXECUTOR_CORES, "1")
        .setAppResource(jarPath)
        .setMainClass(mainClass);

// Start the app as a child process and get a handle to monitor and control it.
SparkAppHandle sparkAppHandle = launcher.startApplication(new SparkAppHandle.Listener() {...});
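
For context, the listener I register basically just logs state changes; a minimal sketch (not my real implementation) looks like this:

import org.apache.spark.launcher.SparkAppHandle;

SparkAppHandle.Listener listener = new SparkAppHandle.Listener() {
    @Override
    public void stateChanged(SparkAppHandle handle) {
        // Fired on every state transition (CONNECTED, SUBMITTED, RUNNING, FINISHED, KILLED, ...).
        System.out.println("state=" + handle.getState() + ", appId=" + handle.getAppId());
    }

    @Override
    public void infoChanged(SparkAppHandle handle) {
        // Fired when other information, e.g. the YARN application id, becomes available.
    }
};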

After the application has started, the launcher process receives an external signal and calls stop on the handle:

sparkAppHandle.stop();

But the Spark application is still running.

In yarn-cluster mode, SparkAppHandle#stop() does stop the Spark app.

I found the following code in org/apache/spark/deploy/yarn/Client.scala:

private val launcherBackend = new LauncherBackend() {
  override protected def conf: SparkConf = sparkConf

  override def onStopRequest(): Unit = {
    if (isClusterMode && appId != null) {
      yarnClient.killApplication(appId)
    } else {
      setState(SparkAppHandle.State.KILLED)
      stop()
    }
  }
}

def stop(): Unit = {
  launcherBackend.close()
  yarnClient.stop()
}

It looks like in client mode onStopRequest() only closes the launcher backend and stops the YARN client, and never kills the Spark application itself.
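
As a workaround I am considering killing the YARN application directly from the launcher process, using the application id reported by the handle. This is only a rough sketch (it assumes Hadoop 2.8+ for ApplicationId.fromString and that yarn-site.xml is on the launcher's classpath; killOnYarn is just a name I made up):

import org.apache.hadoop.yarn.api.records.ApplicationId;
import org.apache.hadoop.yarn.client.api.YarnClient;
import org.apache.hadoop.yarn.conf.YarnConfiguration;
import org.apache.spark.launcher.SparkAppHandle;

// Kill the YARN application by id, since the handle's stop() does not do it in client mode.
static void killOnYarn(SparkAppHandle handle) throws Exception {
    String appId = handle.getAppId();  // may be null until the app has been submitted to YARN
    if (appId == null) {
        return;
    }
    YarnClient yarnClient = YarnClient.createYarnClient();
    yarnClient.init(new YarnConfiguration());  // reads yarn-site.xml from the classpath
    yarnClient.start();
    try {
        yarnClient.killApplication(ApplicationId.fromString(appId));
    } finally {
        yarnClient.stop();
    }
}

SparkAppHandle#kill() might be another option; as far as I understand it destroys the spark-submit child process, which in client mode is also the driver, so the YARN application should go down with it. But I would prefer a clean stop().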

My Spark version is 2.3.1; I also tried 3.0.0-preview, with the same result.

Can anyone help me? 

Thanks.