Spark http://<masterip>:8080 not showing completed apps

Spark http://<masterip>:8080 not showing completed apps

purna pradeep
Hi,

I'm running Spark standalone on AWS EC2, and I'm calling the Spark master REST endpoint http://<masterip>:8080/json to get the completed apps, but the completed-apps list in the JSON is an empty array even though the job ran successfully.
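Fetching that endpoint directly shows what the master is actually reporting; a minimal sketch in Scala, assuming the standalone master web UI is reachable on port 8080 and serves its status as JSON under /json (the object name and argument handling here are illustrative, not from the original report):

    import scala.io.Source

    object MasterAppsCheck {
      def main(args: Array[String]): Unit = {
        // Master IP taken from the first CLI argument; replace as needed.
        val masterIp = args.headOption.getOrElse("localhost")
        // Fetch the master's status page as JSON and print it so the
        // active/completed application lists can be inspected by eye.
        val json = Source.fromURL(s"http://$masterIp:8080/json").mkString
        println(json)
      }
    }

If the completed-apps list is empty there as well, the master itself has no record of the finished application, which can happen, for example, if the job was submitted against a different master URL or the master has been restarted since the job finished.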



spark job paused (active stages finished)

bingli3@iflytek.com
Dear All,
    I have a simple Spark job, as below; all tasks in stage 2 (some failed and were retried) have already finished, but the next stage never runs.

    [screenshot attached]

        driver thread dump: attachment (thread.dump)
        driver last log: [screenshot attached]

     The driver does not receive the report for the 16 retried tasks. Thank you for any ideas.



Attachment: thread.dump (129K)

Re: spark job paused (active stages finished)

Margusja
You have to handle failed jobs, for example with a try/catch in your code.
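A minimal sketch of what that could look like, assuming a SparkSession-based Scala job and using scala.util.Try; the input path and transformations below are placeholders, not the code from this thread:

    import org.apache.spark.sql.SparkSession
    import scala.util.{Failure, Success, Try}

    object SafeJob {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("SafeJob").getOrCreate()

        // Wrap the action in Try so an error surfaced by the action
        // (e.g. a stage aborted after task retries are exhausted)
        // is caught and handled explicitly.
        val result = Try {
          spark.read.textFile("hdfs:///path/to/input") // placeholder input path
            .filter(_.nonEmpty)
            .count()
        }

        result match {
          case Success(n)  => println(s"Processed $n records")
          case Failure(ex) => println(s"Job failed: ${ex.getMessage}")
        }

        spark.stop()
      }
    }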

Br Margus Roo




Re: Re: spark job paused (active stages finished)

bingli3@iflytek.com
    Thank you for your reply.
    
    But sometimes it succeeds when I rerun the job,
    and the job processes the same data using the same code.

 