Can I run Spark executors in a Hadoop cluster from a Kubernetes container

Can I run Spark executors in a Hadoop cluster from a Kubernetes container

mailfordebu
Hi,
I want to deploy the Spark client in a Kubernetes container. Further, I want to run the Spark job in a Hadoop cluster (meaning the resources of the Hadoop cluster will be leveraged) but submit it from the K8s container. My question is whether this mode of implementation is possible? Please do let me know.
Thanks,
Debu

Sent from my iPhone
---------------------------------------------------------------------
To unsubscribe e-mail: [hidden email]

Re: Can I run Spark executors in a Hadoop cluster from a Kubernetes container

ZHANG Wei
Looks like you'd like to submit a Spark job from outside the Spark cluster. Apache Livy [https://livy.incubator.apache.org/] is worth a try; it provides a REST service for Spark in a Hadoop cluster.
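For illustration, here is a minimal sketch of what a Livy batch submission could look like from inside the Kubernetes container. The endpoint URL, jar path, and class name below are placeholders (not from this thread); the payload fields (`file`, `className`, `args`, `conf`) follow Livy's documented POST /batches API:

```python
import json

# Placeholder address of a Livy server co-located with the Hadoop cluster.
LIVY_URL = "http://livy.example.com:8998"

def build_batch_payload(jar, class_name, args=None, conf=None):
    """Build the JSON body for Livy's POST /batches endpoint, which
    submits a Spark job that runs on the Hadoop/YARN cluster."""
    payload = {"file": jar, "className": class_name}
    if args:
        payload["args"] = args
    if conf:
        payload["conf"] = conf
    return payload

payload = build_batch_payload(
    jar="hdfs:///jobs/spark-examples.jar",          # placeholder jar on HDFS
    class_name="org.apache.spark.examples.SparkPi",  # example main class
    args=["100"],
)
body = json.dumps(payload)
# POST `body` to f"{LIVY_URL}/batches" with Content-Type: application/json
# (e.g. via urllib.request or the requests library) from the K8s container;
# Livy then launches the job on YARN, so the cluster's resources are used.
```

The container only needs network access to the Livy server, not a Hadoop client installation, which is what makes this a good fit for the setup you describe.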

Cheers,
-z

________________________________________
From: [hidden email] <[hidden email]>
Sent: Thursday, April 16, 2020 20:26
To: user
Subject: Can I run Spark executors in a Hadoop cluster from a Kubernetes container

[quoted message trimmed]

