WebUI's Application count doesn't get updated


WebUI's Application count doesn't get updated

MrAsanjar .
Hi all,
The application Running and Completed counts do not get updated; they are always zero. I have run the SparkPi application at least 10 times. Please help.

  • Workers: 3
  • Cores: 24 Total, 0 Used
  • Memory: 43.7 GB Total, 0.0 B Used
  • Applications: 0 Running, 0 Completed
  • Drivers: 0 Running, 0 Completed
  • Status: ALIVE

Re: WebUI's Application count doesn't get updated

Andrew Ash
Your applications are probably not connecting to your existing cluster and are instead running in local mode. Are you passing the master URL to the SparkPi application?

Andrew
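
For reference, a minimal sketch of passing the master URL when launching SparkPi through spark-submit (Spark 1.0+); the examples-jar path below is an assumption and depends on how your Spark distribution was built:

# Run the bundled SparkPi example against the standalone master
# (adjust the examples jar path to match your installation; 10 is the slice count)
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://sanjar-local-machine-1:7077 \
  lib/spark-examples-*.jar 10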



Re: WebUI's Application count doesn't get updated

MrAsanjar .
Thanks for your reply, Andrew. I am running the applications directly on the master node. My cluster also contains three worker nodes, all of which are visible in the WebUI.

Spark Master at spark://sanjar-local-machine-1:7077

  • URL: spark://sanjar-local-machine-1:7077
  • Workers: 3
  • Cores: 24 Total, 0 Used
  • Memory: 43.7 GB Total, 0.0 B Used
  • Applications: 0 Running, 0 Completed
  • Drivers: 0 Running, 0 Completed
  • Status: ALIVE

Workers

Id                                                    Address                        State   Cores        Memory
worker-20140603013834-sanjar-local-machine-2-43334    sanjar-local-machine-2:43334   ALIVE   8 (0 Used)   14.6 GB (0.0 B Used)
worker-20140603015921-sanjar-local-machine-3-51926    sanjar-local-machine-3:51926   ALIVE   8 (0 Used)   14.6 GB (0.0 B Used)
worker-20140603020250-sanjar-local-machine-4-43167    sanjar-local-machine-4:43167   ALIVE   8 (0 Used)   14.6 GB (0.0 B Used)

Running Applications

ID   Name   Cores   Memory per Node   Submitted Time   User   State   Duration

Completed Applications

ID   Name   Cores   Memory per Node   Submitted Time   User   State   Duration

Re: WebUI's Application count doesn't get updated

Akhil
As Andrew said, your application is running in local mode rather than against your standalone cluster. You need to pass

MASTER=spark://sanjar-local-machine-1:7077

before running the SparkPi example.


Thanks
Best Regards
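
A minimal sketch of Akhil's suggestion, assuming the run-example script bundled with the Spark distribution honours the MASTER environment variable (the trailing 10 is just the number of SparkPi tasks):

# Point the example at the standalone master instead of local mode,
# then run the bundled SparkPi example
MASTER=spark://sanjar-local-machine-1:7077 ./bin/run-example SparkPi 10

Once the job actually connects to the cluster, it should show up under Running Applications (and later Completed Applications) in the master WebUI.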



Re: WebUI's Application count doesn't get updated

MrAsanjar .
Thanks guys, that fixed my problem. As you might have noticed, I am very new to Spark. Building a Spark cluster using LXC has been a challenge.



Re: WebUI's Application count doesn't get updated

Mayur Rustagi
Did you use Docker or plain LXC, specifically?


Mayur Rustagi
Ph: +1 (760) 203 3257

