Processing multiple requests in a cluster

Processing multiple requests in a cluster

subacini Arunkumar
Hi all,

How can I run multiple requests concurrently on the same cluster?

I have a program that uses a Spark streaming context to read streaming data and write it to HBase. It works fine; the problem is that when multiple requests are submitted to the cluster, only the first request is processed, because the entire cluster is used for that request. The rest of the requests sit in waiting mode.

I have set spark.cores.max to 2 or less so that another request can be processed, but then if there is only one request the cluster is not utilized properly.
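For reference, that core cap is typically applied at submit time; a minimal sketch, assuming a standalone cluster (the class name and jar are hypothetical placeholders):

```shell
# Cap this application at 2 cores across the whole cluster
# (honored in standalone and coarse-grained Mesos modes),
# leaving the remaining cores free for other submitted applications.
spark-submit \
  --conf spark.cores.max=2 \
  --class com.example.StreamingToHBase \
  streaming-app.jar
```

As noted above, the cap is static: a lone application cannot grow into the idle cores beyond its limit.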

Is there any way for a Spark cluster to process streaming requests concurrently while effectively utilizing the cluster, something like SharkServer?

Thanks
Subacini

Re: Processing multiple requests in a cluster

Akhil
You can try Spark on Mesos or YARN, since they have much more support for scheduling.
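As a sketch of what that looks like, submitting to YARN lets the resource manager bound each application's footprint so several can run side by side (the class name and jar are hypothetical placeholders; the flags are standard spark-submit options):

```shell
# Give the streaming app a fixed, bounded set of executors on YARN,
# so other applications can be scheduled on the remaining capacity.
spark-submit \
  --master yarn-cluster \
  --num-executors 2 \
  --executor-cores 1 \
  --executor-memory 1g \
  --class com.example.StreamingToHBase \
  streaming-app.jar
```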

Thanks
Best Regards

On Thu, Sep 25, 2014 at 4:50 AM, Subacini B <[hidden email]> wrote:


Re: Processing multiple requests in a cluster

Mayur Rustagi
There are two problems you may be facing:
1. Your application is taking all the resources.
2. Inside your application, task submission is not being scheduled properly.

For 1, you can either configure your app to take fewer resources, or use a Mesos/YARN-type scheduler to dynamically juggle resources.
For 2, you can use the fair scheduler so that the application's tasks are scheduled more fairly.
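One caveat: the fair scheduler divides resources among jobs within a single application, while Mesos/YARN divide resources across applications. A minimal sketch of enabling it: submit with --conf spark.scheduler.mode=FAIR and point spark.scheduler.allocation.file at a pool definition (the pool name here is a placeholder):

```xml
<?xml version="1.0"?>
<!-- fairscheduler.xml: a pool whose jobs share the application's cores fairly -->
<allocations>
  <pool name="streaming">
    <schedulingMode>FAIR</schedulingMode>
    <weight>1</weight>
    <minShare>1</minShare>
  </pool>
</allocations>
```

Jobs are then routed to the pool from the driver with sc.setLocalProperty("spark.scheduler.pool", "streaming").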

Regards
Mayur

Mayur Rustagi
Ph: +1 (760) 203 3257

On Thu, Sep 25, 2014 at 12:32 PM, Akhil Das <[hidden email]> wrote: