org.apache.spark.deploy.yarn.ExecutorLauncher not found when running Spark 3.0 on Hadoop


ArtemisDev

I've been trying to set up the latest stable version of Spark 3.0 on a Hadoop cluster using YARN.  When running spark-submit in client mode, I always get an org.apache.spark.deploy.yarn.ExecutorLauncher not found error.  This happens when I preload the Spark jar files onto HDFS and point the spark.yarn.jars property at the HDFS location (i.e. set spark.yarn.jars to hdfs:///spark-3/jars or hdfs://namenode:8020/spark-3/jars).  I've checked the /spark-3/jars directory on HDFS and all the jar files are accessible.  The exception messages are listed below.
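For reference, this is roughly how the jars were staged and checked on HDFS (namenode host and port as above; the local Spark install path is illustrative):

```shell
# Stage the Spark jars on HDFS (local path /opt/spark-3.0.0 is illustrative)
hdfs dfs -mkdir -p hdfs://namenode:8020/spark-3/jars
hdfs dfs -put /opt/spark-3.0.0/jars/* hdfs://namenode:8020/spark-3/jars/

# Confirm the jars are visible, including the spark-yarn jar that
# contains org.apache.spark.deploy.yarn.ExecutorLauncher
hdfs dfs -ls hdfs://namenode:8020/spark-3/jars/ | grep yarn
```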

The problem doesn't occur when I comment out the spark.yarn.jars line in the spark-defaults.conf file; spark-submit then finishes without any problems.
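For completeness, the offending setting in spark-defaults.conf looks like this (a config fragment; either URI form described above fails the same way):

```properties
# spark-defaults.conf -- the setting that triggers the failure
# (directory form, no glob)
spark.yarn.jars    hdfs://namenode:8020/spark-3/jars
```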

Any ideas what I have done wrong?  Thanks!

-- ND

======================================================================

Exception in thread "main" org.apache.spark.SparkException: Application application_1594664166056_0005 failed 2 times due to AM Container for appattempt_1594664166056_0005_000002 exited with  exitCode: 1
Failing this attempt.Diagnostics: [2020-07-13 20:07:20.882]Exception from container-launch.
Container id: container_1594664166056_0005_02_000001
Exit code: 1

[2020-07-13 20:07:20.886]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
Error: Could not find or load main class org.apache.spark.deploy.yarn.ExecutorLauncher


Re: org.apache.spark.deploy.yarn.ExecutorLauncher not found when running Spark 3.0 on Hadoop

godlumen

If you are sure that the YARN-related jars are in the jars directory, try pointing the property at the jar files themselves rather than at the bare directory, i.e. --conf spark.yarn.jars=hdfs://namenode:8020/spark-3/jars/*  -- spark.yarn.jars takes a comma-separated list of jar paths, and globs are allowed, but a directory URI on its own is not expanded, so the AM classpath ends up without ExecutorLauncher.
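Put together, the submission would look roughly like this (the example class and jar are illustrative, taken from the stock Spark distribution; only the spark.yarn.jars glob is the actual fix):

```shell
# Point spark.yarn.jars at the jar files via a glob, not the bare directory.
# The property expects a comma-separated list of jars/globs.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --conf spark.yarn.jars="hdfs://namenode:8020/spark-3/jars/*" \
  --class org.apache.spark.examples.SparkPi \
  "$SPARK_HOME/examples/jars/spark-examples_2.12-3.0.0.jar" 100
```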

--
Lumen


At 2020-07-14 04:31:38, "ArtemisDev" <[hidden email]> wrote:

> I've been trying to set up the latest stable version of Spark 3.0 on a
> Hadoop cluster using YARN. [...]