Spark not installed + no access to web UI

mrm
Hi,

I have been launching Spark the same way for the past few months, but I have only recently started having problems with it. I launch Spark using the spark-ec2 script, but then I cannot access the web UI when I type address:8080 into the browser (it doesn't work with lynx from the master node either), and I cannot find pyspark in the usual spark/bin location. Any hints as to what might be happening?
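For reference, a minimal check of both symptoms from the master node might look something like this (a sketch only, assuming the default spark-ec2 install under /root/spark; the master hostname below is a placeholder):

# Minimal sanity check, assuming the spark-ec2 default install path /root/spark
# and the standalone master web UI on port 8080. MASTER_HOST is a placeholder.
import os
import urllib.request

MASTER_HOST = "ec2-203-0-113-10.compute-1.amazonaws.com"  # placeholder master address
SPARK_HOME = "/root/spark"                                # default spark-ec2 install dir

# Is pyspark where we expect it?
pyspark_path = os.path.join(SPARK_HOME, "bin", "pyspark")
print("pyspark present:", os.path.exists(pyspark_path))

# Does the master web UI respond on port 8080?
try:
    with urllib.request.urlopen("http://%s:8080" % MASTER_HOST, timeout=5) as resp:
        print("web UI HTTP status:", resp.status)
except Exception as exc:
    print("web UI not reachable:", exc)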

Thanks in advance!

Re: Spark not installed + no access to web UI

Akhil
Which version of Spark are you using?

Thanks
Best Regards

Re: Spark not installed + no access to web UI

mrm
I tried 1.0.0, 1.0.1, and 1.0.2. I also tried the latest GitHub commit.

After several hours of trying to launch it, it now seems to be working. This is what I did (I'm not sure which of these steps actually helped), roughly as sketched below:
1/ clone the Spark repo onto the master node
2/ run sbt/sbt assembly
3/ copy the spark and spark-ec2 directories to my slaves
4/ launch the cluster again with "--resume"
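A rough sketch of those four steps (assuming a standard spark-ec2 layout under /root, git and sbt available on the master, passwordless SSH to the slaves, and AWS credentials set in the environment; the hostnames, key name, key path, and cluster name are placeholders):

# Rough sketch of the four recovery steps above. All hostnames and paths
# are placeholders; adjust them to your own cluster.
import subprocess

SLAVES = ["ec2-198-51-100-11.compute-1.amazonaws.com",
          "ec2-198-51-100-12.compute-1.amazonaws.com"]   # placeholder slave addresses

# 1/ clone the Spark repo onto the master node
subprocess.check_call(
    ["git", "clone", "https://github.com/apache/spark.git", "/root/spark"])

# 2/ build the assembly jar
subprocess.check_call(["/root/spark/sbt/sbt", "assembly"], cwd="/root/spark")

# 3/ copy the spark and spark-ec2 directories to each slave
for slave in SLAVES:
    for directory in ("/root/spark", "/root/spark-ec2"):
        subprocess.check_call(["rsync", "-az", directory, "root@%s:/root/" % slave])

# 4/ re-attach to the already-running cluster with --resume (normally run from
#    the machine the cluster was originally launched from)
subprocess.check_call(
    ["/root/spark/ec2/spark-ec2", "-k", "my-key", "-i", "/path/to/my-key.pem",
     "--resume", "launch", "my-cluster"],
    cwd="/root/spark/ec2")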

Now I can finally access the web UI, and Spark is properly installed!