How to get logging right for Spark applications in the YARN ecosystem

How to get logging right for Spark applications in the YARN ecosystem

raman gugnani
Hi,

I am looking for the right solution for logging the logs produced by the executors. In most places I have seen logging configured through log4j properties, but nowhere have I seen a solution where the logs are compressed.

Is there any way I can compress the logs, so that they can then be shipped to S3?

--
Raman Gugnani

Re: How to get logging right for Spark applications in the YARN ecosystem

srinath
Hi Raman,

Perhaps you could use the rolling file appender in log4j to compress the rotated log files?
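Something along these lines might work as a starting point. This is an untested log4j 2 sketch; the appender name, file names, and sizes are placeholders. The point is that a filePattern ending in .gz makes the RollingFile appender gzip each rotated file, and writing into spark.yarn.app.container.log.dir keeps the files where YARN expects container logs:

  # illustrative log4j2.properties for executors (names and sizes are placeholders)
  appender.rolling.type = RollingFile
  appender.rolling.name = rollingFile
  # write into the YARN container log dir so the NodeManager can still see the files
  appender.rolling.fileName = ${sys:spark.yarn.app.container.log.dir}/spark.log
  # the .gz suffix tells log4j 2 to gzip every rotated file
  appender.rolling.filePattern = ${sys:spark.yarn.app.container.log.dir}/spark-%d{yyyy-MM-dd-HH}-%i.log.gz
  appender.rolling.layout.type = PatternLayout
  appender.rolling.layout.pattern = %d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
  appender.rolling.policies.type = Policies
  appender.rolling.policies.size.type = SizeBasedTriggeringPolicy
  appender.rolling.policies.size.size = 128MB
  appender.rolling.strategy.type = DefaultRolloverStrategy
  appender.rolling.strategy.max = 10
  rootLogger.level = info
  rootLogger.appenderRef.rolling.ref = rollingFile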

Regards.



Re: How to get logging right for Spark applications in the YARN ecosystem

Girish bhat m
Hi Raman,

Since you are using YARN, you can collect the YARN logs (which also contain the application logs) by running the command below, and then move them to the desired location:

yarn logs -applicationId <your_yarn_app_id> 
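If the goal is to get them into S3 compressed, something along these lines after the application finishes might do it. This assumes YARN log aggregation is enabled; the application id, bucket, and paths are placeholders:

  # dump the aggregated logs for the finished application to a local file
  yarn logs -applicationId application_1564700000000_0001 > app.log
  # compress and ship to S3
  gzip app.log
  aws s3 cp app.log.gz s3://my-log-bucket/spark-logs/application_1564700000000_0001.log.gz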

Best
Girish 


Re: How to get logging right for Spark applications in the YARN ecosystem

raman gugnani
Hi Srinath,

I am not able to use log4j 2, and compressing rotated files with the rolling file appender is only supported in log4j 2.
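(For reference: log4j 1.x's built-in rolling appenders roll but do not compress, so with log4j 1.x the compression step has to happen outside log4j, for example the yarn logs plus gzip route above; the apache-log4j-extras companion can reportedly gzip rolled files via a fileNamePattern ending in .gz, though it is usually configured via XML. Whichever log4j version is used, the custom configuration still has to reach the executors. A rough spark-submit sketch for that, with placeholder file, class, and jar names, assuming the config file travels with --files:)

  # ship a custom log4j config to the driver and executors on YARN (names are placeholders)
  spark-submit \
    --master yarn \
    --deploy-mode cluster \
    --files my-log4j.properties \
    --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=my-log4j.properties" \
    --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=my-log4j.properties" \
    --class com.example.MyApp \
    my-app.jar
  # the bare file name resolves because --files places the file in the container's
  # working directory, which is on the container classpath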



--
Raman Gugnani