SparkContext startup time out


SparkContext startup time out

yaoxin
When I run the following code, I get a "Startup timed out" exception. The full stack trace is at the end of this message.
    val sc = new SparkContext("local", "test")
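For context, the stack trace below shows this line inside an object `com.xxx.spark.App`, with `main` at App.scala:15. A minimal self-contained sketch of such an app might look like the following (everything beyond the quoted SparkContext line is my own illustration, and it needs spark-core 0.9 on the classpath to compile):

```scala
package com.xxx.spark

import org.apache.spark.SparkContext

// Sketch only: the job after the SparkContext line is illustrative.
object App {
  def main(args: Array[String]): Unit = {
    // The "local" master runs the driver and a single worker thread
    // in one JVM, so no cluster is needed.
    val sc = new SparkContext("local", "test")
    println(sc.parallelize(1 to 10).count()) // trivial job to exercise the context
    sc.stop()
  }
}
```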

I am using Spark 0.9 with Scala 2.10 on macOS.
I googled it, but it seems no one else has encountered this.
Any ideas? Thanks.


This is the full stack trace:

[ERROR] [02/19/2014 20:09:32.309] [main] [Remoting] Remoting error: [Startup timed out] [
akka.remote.RemoteTransportException: Startup timed out
    at akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:129)
    at akka.remote.Remoting.start(Remoting.scala:191)
    at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
    at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
    at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
    at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:96)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:126)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:139)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:100)
    at com.xxx.spark.App$.main(App.scala:15)
    at com.xxx.spark.App.main(App.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
    at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
    at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
    at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
    at scala.concurrent.Await$.result(package.scala:107)
    at akka.remote.Remoting.start(Remoting.scala:173)
    ... 17 more
]
Exception in thread "main" java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
    at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
    at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
    at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
    at scala.concurrent.Await$.result(package.scala:107)
    at akka.remote.Remoting.start(Remoting.scala:173)
    at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
    at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
    at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
    at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:96)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:126)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:139)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:100)
    at com.xxx.spark.App$.main(App.scala:15)
    at com.xxx.spark.App.main(App.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
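Not something stated in the thread, but a frequently reported cause of this exact 10-second Akka startup timeout on macOS is slow resolution of the machine's own hostname. As a hedged workaround sketch under that assumption, the driver can be pinned to a concrete local address via Spark configuration (SparkConf and the spark.driver.host property exist in Spark 0.9; treat this as an illustration, not a confirmed fix for this report):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Assumption: the timeout stems from a slow local-hostname lookup during
// ActorSystem startup. Binding the driver explicitly to the loopback
// address sidesteps that lookup.
val conf = new SparkConf()
  .setMaster("local")
  .setAppName("test")
  .set("spark.driver.host", "127.0.0.1")
val sc = new SparkContext(conf)
```

Setting the SPARK_LOCAL_IP environment variable to 127.0.0.1 before launching, or mapping the hostname to 127.0.0.1 in /etc/hosts, should have a similar effect.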
Re: SparkContext startup time out

yaoxin
I set up a project in IDEA with libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "0.9.0-incubating".

This project contains only one object.

Should I run this on a Spark cluster, or am I missing some library?
Re: SparkContext startup time out

Mayur Rustagi
Yes, just sbt run.
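For reference, a minimal build.sbt matching the dependency quoted below might look like this (the project name and the exact Scala 2.10 patch version are assumptions; the dependency coordinates are the ones from the thread):

```scala
// build.sbt (sketch)
name := "spark-test"

version := "0.1"

// Any Scala 2.10.x release should work with the _2.10 artifact.
scalaVersion := "2.10.3"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "0.9.0-incubating"
```

With this in place, "sbt run" compiles the project and runs the single object's main method locally.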



On Thu, Feb 20, 2014 at 8:09 PM, yaoxin <[hidden email]> wrote:
I set up a project in IDEA with libraryDependencies += "org.apache.spark" %
"spark-core_2.10" % "0.9.0-incubating".

This project contains only one object.

Should I run this on a Spark cluster, or am I missing some library?



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/SparkContext-startup-time-out-tp1754p1868.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.