Spark on YARN uses only one node


Assaf
Hi,

I've installed Spark 0.8.1 on IDH 3.0.2, running on YARN.
My cluster has 3 servers: one is both NameNode (NN) and DataNode (DN); the other two are DataNodes only.
I managed to launch spark-shell and run the MLlib k-means example.
The problem is that it uses only one node (the NN) and never runs on the other two DNs.
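One way to confirm which nodes are actually doing work (a diagnostic sketch, not from the thread — it only assumes a working `sc` inside spark-shell) is to run a small job that reports the hostname of each executor:

```scala
// Paste into spark-shell: collects the hostnames on which tasks ran.
// If only the NN's hostname comes back, tasks are not reaching the DNs.
val hosts = sc.parallelize(1 to 1000, 12)                       // 12 partitions, so several tasks
  .map(_ => java.net.InetAddress.getLocalHost.getHostName)      // hostname of the executing node
  .distinct()
  .collect()
hosts.foreach(println)
```

If this prints only one hostname, the shell is likely running in local mode (or with a single worker) rather than distributing across the cluster.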

Please advise

My spark-env.sh file:

export SPARK_CLASSPATH=/usr/lib/hbase/hbase-0.94.7-Intel.jar:/usr/lib/hadoop/hadoop-auth-2.0.4-Intel.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/hadoop-common-2.0.4-Intel.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs.jar
export SPARK_LIBRARY_PATH=/usr/lib/hadoop/lib/native
export HADOOP_CONF_DIR=/etc/hadoop/conf:/etc/hbase/conf
export SPARK_PRINT_LAUNCH_COMMAND=1
export YARN_CONF_DIR=/etc/hadoop/conf
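One thing worth checking (an assumption on my part, not confirmed by the thread): in the Spark 0.8.x YARN docs, the number of YARN workers must be requested explicitly when submitting through `org.apache.spark.deploy.yarn.Client`; without that, the job may stay on a single container. A sketch, with placeholder paths and a hypothetical example jar:

```shell
# Sketch, assuming a Spark 0.8.x YARN deployment.
# SPARK_JAR must point at the Spark assembly built for YARN; the path
# below is a placeholder for this cluster.
export SPARK_JAR=/path/to/spark-assembly.jar

# Request 3 workers so all three servers can host tasks.
./spark-class org.apache.spark.deploy.yarn.Client \
  --jar /path/to/your-app.jar \
  --class your.app.Main \
  --num-workers 3 \
  --worker-memory 1g \
  --worker-cores 2
```

Note that `spark-shell` launched without an explicit YARN master may fall back to local mode, which would match the single-node behaviour described above.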

Thanks,
Assaf