Master and worker nodes in standalone deployment


Manoj Samel
When Spark is deployed on a cluster in standalone deployment mode (v0.8.1), one of the nodes is started as the master and the others as workers.

What does the master node do? Can it participate in actual computations, or does it just act as a coordinator?

Thanks,

Manoj

Re: Master and worker nodes in standalone deployment

Nan Zhu
You can start a worker process on the master node as well, so that all nodes in your cluster, including the master, participate in the computation.
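A minimal sketch of this with the standalone launch scripts (host names here are hypothetical, and the paths assume the Spark 0.8.x layout, where the cluster scripts live under bin/):

```shell
# Hypothetical host names; paths assume the Spark 0.8.x standalone layout.
# Listing the master's own host name in conf/slaves makes start-all.sh
# launch a worker process there as well, so the master node also runs tasks.
cat > "$SPARK_HOME/conf/slaves" <<'EOF'
master-host
worker-host-1
worker-host-2
EOF

# Starts one master, plus one worker per line of conf/slaves:
"$SPARK_HOME/bin/start-all.sh"

# Alternatively, start a single worker by hand on the master node,
# pointing it at the master's URL (the exact start-slave.sh argument
# form varied across early versions, so check your release's scripts):
"$SPARK_HOME/bin/start-slave.sh" 1 spark://master-host:7077
```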

Best,

-- 
Nan Zhu

On Wednesday, January 15, 2014 at 11:32 PM, Manoj Samel wrote:


Re: Master and worker nodes in standalone deployment

Manoj Samel
Thanks.

Could you explain what the master process does?


On Wed, Jan 15, 2014 at 8:36 PM, Nan Zhu <[hidden email]> wrote:

Re: Master and worker nodes in standalone deployment

Nan Zhu
It keeps track of the running worker processes, asks workers to launch executors for tasks, communicates with the driver program, and so on.
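In other words, the driver contacts the master only to obtain resources; the tasks themselves run in executors on the worker nodes. A sketch of how a driver attaches to the master in the 0.8 era (host name hypothetical; at that time the REPL script sat at the top of the Spark directory and read the MASTER environment variable):

```shell
# Hypothetical master host name. The master at spark://host:7077 assigns
# executors on the workers; the shell below is the driver program.
MASTER=spark://master-host:7077 ./spark-shell
```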

-- 
Nan Zhu

On Wednesday, January 15, 2014 at 11:37 PM, Manoj Samel wrote:
