Do we need to kill a spark job every time we change and deploy it?


Do we need to kill a spark job every time we change and deploy it?

Mina Aslani
Hi,

I have a question for you.
Do we need to kill a Spark job every time we change it and deploy it to the cluster? Or is there a way for Spark to automatically pick up the most recent jar version?

Best regards,
Mina

Re: Do we need to kill a spark job every time we change and deploy it?

Irving Duran
Are you referring to having Spark pick up a newly built jar? If so, you can probably script that in bash.
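
For example, something along these lines — just a rough sketch that assumes YARN as the cluster manager; the application name, jar path and main class are placeholders:

#!/usr/bin/env bash
# Rough sketch: kill the running application and resubmit it with the newly deployed jar.
# Assumes YARN as the cluster manager; APP_NAME, JAR_PATH and the main class are placeholders.

APP_NAME="my-spark-job"
JAR_PATH="/deploy/my-spark-job-latest.jar"

# Look up the currently running application by name and kill it, if any.
APP_ID=$(yarn application -list 2>/dev/null | awk -v name="$APP_NAME" '$2 == name {print $1}')
if [ -n "$APP_ID" ]; then
  yarn application -kill "$APP_ID"
fi

# Resubmit with the new jar.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --name "$APP_NAME" \
  --class com.example.MyJob \
  "$JAR_PATH"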

Thank You,

Irving Duran



Re: Do we need to kill a spark job every time we change and deploy it?

965
In reply to this post by Mina Aslani
I think if your job is already running and you deploy a new jar that is a newer version of it, Spark will treat the new jar as a different job, since jobs are distinguished by their Job ID. So if you want to replace the jar, you have to kill the running job every time.
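
For example (the ids and paths below are placeholders; which kill command applies depends on your cluster manager):

# YARN: find and kill the running application, then resubmit with the new jar.
yarn application -list
yarn application -kill application_<cluster-timestamp>_<id>

# Standalone/Mesos cluster deploy mode: kill by submission id instead.
spark-submit --master spark://<master-host>:6066 --kill driver-<timestamp>-<id>

# Finally, resubmit as usual, pointing at the newly built jar.
spark-submit --master yarn --deploy-mode cluster --class com.example.MyJob /deploy/my-spark-job-2.0.jar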
