cannot deploy job in to Spark Standalone cluster


prassee
hi all,
I have a Spark (0.9.0) cluster set up with 3 machines (1 master and 2 workers).
From my local machine I have created a sample job:

import org.apache.spark.{SparkConf, SparkContext}

object SimpleJob extends App {
    // Spark home directory; the same path exists on all nodes
    val sparkHome = "/home/hduser/sparkClusterSetup/spark-0.9.0-incubating-bin-hadoop1"
    val sconf = new SparkConf()
        .setMaster("spark://labscs1:7077")
        .setAppName("spark scala")
        .setSparkHome(sparkHome)
        // application jar, present at the same location on all nodes
        .setJars(Seq("/home/hduser/sparkscalaapp_2.10-1.0.jar"))

    val sctx = new SparkContext(sconf)
    println("starting parallelization")
    println(sctx.parallelize(1 to 100).count)
    println("stopping parallelization")
}

The above code runs fine locally (with the master set to "local"), but when I change the master URL to "spark://labscs1:7077" I get the error below:

14/02/13 10:31:17 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory
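In case it is relevant, here is a minimal variant of the same job that explicitly caps the resources it asks for. This is only a sketch, assuming the warning means the driver is requesting more memory or cores than the registered workers advertise; the values "512m" and "2" are placeholders I made up, not figures from my cluster, and would need to be adjusted to whatever the master's web UI reports as available.

import org.apache.spark.{SparkConf, SparkContext}

object SimpleJobWithLimits extends App {
    val sparkHome = "/home/hduser/sparkClusterSetup/spark-0.9.0-incubating-bin-hadoop1"

    val sconf = new SparkConf()
        .setMaster("spark://labscs1:7077")
        .setAppName("spark scala")
        .setSparkHome(sparkHome)
        .setJars(Seq("/home/hduser/sparkscalaapp_2.10-1.0.jar"))
        // Cap what the job requests so it fits inside what the workers offer.
        // Placeholder values: tune them against the cluster UI on the master.
        .set("spark.executor.memory", "512m")
        .set("spark.cores.max", "2")

    val sctx = new SparkContext(sconf)
    println(sctx.parallelize(1 to 100).count)
    sctx.stop()
}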


Please let me know what I am missing in this approach, or whether there is a better way to do this.