[pyspark 2.3.0] Task was denied committing errors

rishishah.star
Hi All,

I have two relatively big tables, and a join between them keeps throwing TaskCommitDenied ("task was denied committing") errors. The job eventually succeeds, but I was wondering what these errors are and whether there is any solution.
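
For reference, the job boils down to something like the following (a trimmed-down sketch rather than the real code; the paths, app name, and join key are placeholders):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("big_table_join").getOrCreate()

# Two relatively big tables (placeholder paths).
left = spark.read.parquet("/warehouse/table_a")
right = spark.read.parquet("/warehouse/table_b")

# Join on a shared key column (placeholder name).
joined = left.join(right, on="id", how="inner")

# The "task was denied committing" errors show up while this write runs;
# individual task attempts fail, but the job finishes successfully.
joined.write.mode("overwrite").parquet("/warehouse/joined_output")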

--
Regards,

Rishi Shah
Re: [pyspark 2.3.0] Task was denied committing errors

rishishah.star
Any suggestions?

--
Regards,

Rishi Shah
Re: [pyspark 2.3.0] Task was denied committing errors

rishishah.star
Hi Team,

I could really use your insight here; any help is appreciated!

Thanks,
Rishi

--
Regards,

Rishi Shah