Spark Version 3.0.1 Gui Display Query

Spark Version 3.0.1 Gui Display Query

Ranju Jain

Hi,

I recently started using Spark 3.0.1 and noticed that the Executors tab in the Spark GUI appears blank.

Could you please suggest what the reason for this display might be?

Regards

Ranju


Re: Spark Version 3.0.1 Gui Display Query

Kapil Garg
Hi Ranju,
Is this happening just after you submit the Spark application, or are you unable to see executor info throughout the application's lifetime?

Because you won't be able to see any info there until executors have been added to the application and tasks have been submitted.


--
Regards
Kapil Garg



RE: Spark Version 3.0.1 Gui Display Query

Ranju Jain

Hi Kapil,

 

I am not able to see executor info throughout the application lifetime.

Attaching screenshots:

  1. Jobs tab during application start
  2. Executors tab during the application lifetime

I need to tune my application, and this executor info would be a great help for tuning the parameters, but currently the tab is shown blank.

Regards

Ranju

 




Re: Spark Version 3.0.1 Gui Display Query

Kapil Garg
Okay. Please share the browser console output and the network logs for the Executors tab.


--
Regards
Kapil Garg



RE: Spark Version 3.0.1 Gui Display Query

Ranju Jain

Hi Kapil,

Attaching driver and executor logs. Here I requested a minimum of 3 executors from the Kubernetes API server.

[screenshot: driver]

[screenshot: the 3 executors]

The attached sprklogs.txt contains the driver logs, and the attached exe.txt contains the logs of one of the 3 executors.

[screenshot: driver GUI]

Executors tab on the GUI: after a few seconds, the blank page appears as shown below.

[screenshot: blank Executors tab]

Regards

Ranju

 


Attachments: sprklogs.txt (166K), exe.txt (51K)

RE: Spark Version 3.0.1 Gui Display Query

Attila Zsolt Piros
Hi Ranju!

The UI is built up from events. This is why the history server is able to show the state of a finished app: those events are replayed to rebuild the state. For details, see the Web UI page and the following section of the monitoring documentation: https://spark.apache.org/docs/latest/monitoring.html#web-interfaces

So you should share, or look into, the event log.

Regards,
Attila






RE: Spark Version 3.0.1 Gui Display Query

Ranju Jain
Hi Attila,

I checked the section at https://spark.apache.org/docs/latest/monitoring.html#web-interfaces and the Web UI page.

What the document says is that if I want to view information only for the duration of the application, then I do not need to generate event logs, i.e. I do not need to set spark.eventLog.enabled=true and spark.eventLog.dir=<dir-name>.

But if I want to see this info after the application completes, then I should persist the logs.

My requirement is to monitor the Executors tab only during the job run, not after. How can I see it only during the application run?

Regards
Ranju
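
For reference, a minimal sketch of how the event-log settings mentioned above are typically passed at submit time (the directory is a placeholder and must be writable by the driver):

    spark-submit \
      --conf spark.eventLog.enabled=true \
      --conf spark.eventLog.dir=<dir-name> \
      ...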



RE: Spark Version 3.0.1 Gui Display Query

Attila Zsolt Piros
Hi Ranju!

I meant that the event log would be very helpful for analyzing the problem on your side.

The three logs together (driver, executor, event), from the same run, are of course best.

I know you want to check the Executors tab while the job is running, and for this you do not need the event log. But the event log is still useful for finding out what happened.

Regards,
Attila






Re: Spark Version 3.0.1 Gui Display Query

Mich Talebzadeh
In reply to this post by Ranju Jain
Well, I cannot recall this in Spark 3.0.1. However, it looks fine in Spark 3.1.1 (the recent release).

See the attached image of the Executors tab.




LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

Attachment: executors.PNG (94K)

RE: Spark Version 3.0.1 Gui Display Query

Ranju Jain
In reply to this post by Attila Zsolt Piros
Hi Attila,

OK, I understood. I will switch on event logs.

Regards
Ranju



Re: Spark Version 3.0.1 Gui Display Query

Kapil Garg
Hi Ranju,

The screenshots and logs you shared are from the Spark driver and executors. I meant for you to check the web page logs in the Chrome console. There might be error logs indicating why the UI is unable to fetch the information.

I faced a similar problem when I was accessing the Spark UI via a proxy: the proxy had problems resolving the backend URL, and the data was not visible in the Executors tab.

Just check the Chrome console logs, and if you find any error logs, do share them here for others to look at.


--
Regards
Kapil Garg



RE: Spark Version 3.0.1 Gui Display Query

Ranju Jain
In reply to this post by Attila Zsolt Piros
Hi Attila,

I checked the event logs and the driver and executor logs.
Here I have configured Spark with:
> minimum executors: 3
> each executor with:
        > spark.executor.cores=6
        > spark.kubernetes.executor.request.cores=5600m
        > spark.executor.memory=10 GB
So 6 tasks are started in parallel in each executor, and the event log shows a peak execution memory per task of 1024.

But I still cannot work out why the Spark UI in the browser shows a blank Executors tab during the application run.
Do you see any reason?

Regards
Ranju
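
For context, a sketch of how the configuration described above might be passed on the command line (hosts, image, and jar path are placeholders, and spark.executor.instances stands in here for the "minimum 3 executors" setting):

    spark-submit \
      --master k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port> \
      --deploy-mode cluster \
      --conf spark.executor.instances=3 \
      --conf spark.executor.cores=6 \
      --conf spark.kubernetes.executor.request.cores=5600m \
      --conf spark.executor.memory=10g \
      --conf spark.kubernetes.container.image=<spark-image> \
      local:///path/to/app.jar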


Attachments: eventlogs.txt (1M), driver_600000_25sec_3exec.txt (166K), exe_600000_25sec_3exec.txt (452K)

Re: Spark Version 3.0.1 Gui Display Query

Attila Zsolt Piros
Hi Ranju!

The event logs (the events) seem to be correct:

1) The added executors are in the logs; see the last 3:
$ grep "SparkListenerBlockManagerAdded" /tmp/spark-events/app-1615118161388-0000
{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor ID":"driver","Host":"spark-driver-example2","Port":33004},"Maximum Memory":18493184409,"Timestamp":1615116671171,"Maximum Onheap Memory":18493184409,"Maximum Offheap Memory":0}
{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor ID":"1","Host":"192.168.70.128","Port":33004},"Maximum Memory":5985376665,"Timestamp":1615116675644,"Maximum Onheap Memory":5985376665,"Maximum Offheap Memory":0}
{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor ID":"3","Host":"192.168.226.44","Port":33004},"Maximum Memory":5985376665,"Timestamp":1615116675802,"Maximum Onheap Memory":5985376665,"Maximum Offheap Memory":0}
{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor ID":"2","Host":"192.168.249.138","Port":33004},"Maximum Memory":5985376665,"Timestamp":1615116681156,"Maximum Onheap Memory":5985376665,"Maximum Offheap Memory":0}


2) Even the 3.0.1 SHS (Spark History Server) renders the Executors tab correctly:

[image: Executors tab rendered by the 3.0.1 history server]

This confirms that the driver gets the events, as the event log is written by the driver.

Could you please check the REST URL for executors?

In your browser you can open the developer tools and check the networking, like this:

[image: developer tools, Network tab, showing the "allexecutors" request and its response]

I drew red rectangles around the "Network" tab in the developer tools, "allexecutors", and "Response" to help you.


Best regards,
Attila

PS: Sorry for the repost; I managed to delete the pictures via the forum UI when I tried to fix some typos.
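
For reference, the same REST endpoints can also be checked outside the browser; a sketch assuming the default UI port 4040 on the driver (host and application id are placeholders):

    curl http://<driver-host>:4040/api/v1/applications
    curl http://<driver-host>:4040/api/v1/applications/<app-id>/allexecutors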



RE: Spark Version 3.0.1 Gui Display Query

Ranju Jain

Hi Attila,

 

I checked the REST URL for executors and found that the applications request is marked red, giving a 500 Server Error [GET http://seliics04773:31001/api/v1/applications]:

 

Problem accessing /api/v1/applications. Reason:

    Server Error

Caused by: java.lang.NoSuchMethodError: javax.ws.rs.core.Application.getProperties()Ljava/util/Map;
                at org.glassfish.jersey.server.ApplicationHandler.initialize(ApplicationHandler.java:307)
                at org.glassfish.jersey.server.ApplicationHandler.lambda$initialize$1(ApplicationHandler.java:293)
                at org.glassfish.jersey.internal.Errors.process(Errors.java:292)
                at org.glassfish.jersey.internal.Errors.process(Errors.java:274)
                at org.glassfish.jersey.internal.Errors.processWithException(Errors.java:232)
                at org.glassfish.jersey.server.ApplicationHandler.initialize(ApplicationHandler.java:292)
                at org.glassfish.jersey.server.ApplicationHandler.<init>(ApplicationHandler.java:259)
                at org.glassfish.jersey.servlet.WebComponent.<init>(WebComponent.java:311)
                at org.glassfish.jersey.servlet.ServletContainer.init(ServletContainer.java:154)
                at org.glassfish.jersey.servlet.ServletContainer.init(ServletContainer.java:347)
                at javax.servlet.GenericServlet.init(GenericServlet.java:244)
                at org.sparkproject.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:671)

 

Attaching the browser network logs and pom.xml. Am I missing some jar, or do I need to exclude some Jersey dependency?

 

 

 

Regards

Ranju

 


Attachment: pom.xml (10K)

Re: Spark Version 3.0.1 Gui Display Query

Attila Zsolt Piros
Hi Ranju!

You should not need to include the Spark packages in the jar-with-dependencies, so please add the provided scope for them:

   <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.12</artifactId>
      <version>3.0.1</version>
      <scope>provided</scope>
   </dependency>

Do this for spark-hive_2.12 and spark-sql_2.12 too.

Moreover, I do not see why you need "spark-kubernetes_2.12" at all.
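
A sketch of the assumption behind this remark, which is expanded on later in the thread: the application is launched with the distribution's own spark-submit, which already ships the Kubernetes cluster-manager jars, so the k8s:// master URL is resolved without bundling spark-kubernetes into the application jar (hosts, image, and jar path are placeholders):

    ./bin/spark-submit \
        --master k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port> \
        --deploy-mode cluster \
        --conf spark.kubernetes.container.image=<spark-image> \
        local:///path/to/your-app.jar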

Best Regards,
Attila

Attachments:
  image005.png (86K) <http://apache-spark-user-list.1001560.n3.nabble.com/attachment/39686/0/image005.png>
  image006.png (172K) <http://apache-spark-user-list.1001560.n3.nabble.com/attachment/39686/1/image006.png>
  image007.png (370K) <http://apache-spark-user-list.1001560.n3.nabble.com/attachment/39686/2/image007.png>

RE: Spark Version 3.0.1 Gui Display Query

Ranju Jain

Hi Attila,

 

  1. "Moreover I do not see why you need the spark-kubernetes_2.12 at all."

Do you mean that spark-core is sufficient? I get the error below if I do not include the spark-kubernetes jar:

 

21/03/09 11:00:31 ERROR SparkContext: Error initializing SparkContext.

org.apache.spark.SparkException: Could not parse Master URL: 'k8s://https://kubernetes.default.svc.cluster.local:443'

        at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2944)

        at org.apache.spark.SparkContext.<init>(SparkContext.scala:533)

        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2574)

        at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:934)

        at scala.Option.getOrElse(Option.scala:189)

        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:928)

        at database.SparkAppOnPodCassExecSSL2.loadData(SparkAppOnPodCassExecSSL2.java:438)

        at database.SparkAppOnPodCassExecSSL2.main(SparkAppOnPodCassExecSSL2.java:99)

  2. "You should not need to include spark packages into the uberjar. So please add the provided scope for them."

I understood your point that an uber jar [fat jar] has more chances of version conflicts. But if I add the provided scope for these jars, then when I run mvn clean package my final jar does not contain spark-core or the other jars; it contains only my class files, so its execution failed. Attached pom.xml.

 

  3. Good news: the Executors tab has started to display. 😊

As we can see in the browser screenshot below, the REST API was using an old version of javax.ws.rs.core.

So I noticed that in my fat jar, when spark-core is expanded, javax.ws.rs.core is the old version. I replaced it with the latest javax.ws.rs.core, which is part of the jakarta.ws.rs-api placed under /opt/spark/jars.

But I did all of this manually, by adding this folder to the fat jar. Attaching the Executors tab screenshot.

I need to check why <scope>provided</scope> results in this behavior. Also, how can I exclude the old version of javax.ws.rs.core from spark-core and add the new Jakarta artifact in pom.xml?

Also, why is the kubernetes jar not needed?
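
For reference, a Maven exclusion generally takes the shape sketched below; the javax.ws.rs / jakarta.ws.rs coordinates here are illustrative and would need to be verified against the actual dependency tree (e.g. with mvn dependency:tree) rather than taken as the confirmed fix:

    <dependency>
       <groupId>org.apache.spark</groupId>
       <artifactId>spark-core_2.12</artifactId>
       <version>3.0.1</version>
       <exclusions>
          <exclusion>
             <groupId>javax.ws.rs</groupId>
             <artifactId>javax.ws.rs-api</artifactId>
          </exclusion>
       </exclusions>
    </dependency>
    <!-- illustrative replacement; match the version shipped under /opt/spark/jars -->
    <dependency>
       <groupId>jakarta.ws.rs</groupId>
       <artifactId>jakarta.ws.rs-api</artifactId>
       <version>2.1.6</version>
    </dependency>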

 

 

A big thank you to you and the community 😊. Your approach of debugging and guiding me, starting from the event logs through to the browser, taught me many things.

 

Regards

Ranju

 

 

 

From: Attila Zsolt Piros <[hidden email]>
Sent: Tuesday, March 9, 2021 2:28 PM
To: Ranju Jain <[hidden email]>
Cc: [hidden email]
Subject: Re: Spark Version 3.0.1 Gui Display Query

 

Hi Ranju!

 

You should not need to include spark packages into the uberjar. So please add the provided scope for them:

   <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.12</artifactId>
      <version>3.0.1</version>
      <scope>provided</scope>
   </dependency>

Do this for spark-hive_2.12 and spark-sql_2.12 too.

Moreover I do not see why you need the "
spark-kubernetes_2.12" at all.

Best Regards,

Attila

 

On Tue, Mar 9, 2021 at 3:07 AM Ranju Jain <[hidden email]> wrote:

Hi Attila,

 

I checked Rest url for executors and found that applications marked as red giving 500 Server Error [GET http://seliics04773:31001/api/v1/applications]

 

<p>Problem accessing /api/v1/applications. Reason:

<pre>    Server Error</pre></p><h3>Caused by:</h3><pre>java.lang.NoSuchMethodError: javax.ws.rs.core.Application.getProperties()Ljava/util/Map;

                at org.glassfish.jersey.server.ApplicationHandler.initialize(ApplicationHandler.java:307)

                at org.glassfish.jersey.server.ApplicationHandler.lambda$initialize$1(ApplicationHandler.java:293)

                at org.glassfish.jersey.internal.Errors.process(Errors.java:292)

                at org.glassfish.jersey.internal.Errors.process(Errors.java:274)

                at org.glassfish.jersey.internal.Errors.processWithException(Errors.java:232)

                at org.glassfish.jersey.server.ApplicationHandler.initialize(ApplicationHandler.java:292)

                at org.glassfish.jersey.server.ApplicationHandler.&lt;init&gt;(ApplicationHandler.java:259)

                at org.glassfish.jersey.servlet.WebComponent.&lt;init&gt;(WebComponent.java:311)

                at org.glassfish.jersey.servlet.ServletContainer.init(ServletContainer.java:154)

                at org.glassfish.jersey.servlet.ServletContainer.init(ServletContainer.java:347)

                at javax.servlet.GenericServlet.init(GenericServlet.java:244)

                at org.sparkproject.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:671)

 

Attaching Network Browser logs and pom.xml. Am I missing some jar to add or I need to exclude some jersey.

 

 

 

Regards

Ranju

 

From: Attila Zsolt Piros <[hidden email]>
Sent: Monday, March 8, 2021 7:45 PM
To: Ranju Jain <[hidden email]>
Cc: [hidden email]
Subject: Re: Spark Version 3.0.1 Gui Display Query

 

Hi Ranju!

 

The event logs (the events) seems to be correct as 


1) as the added executors are in the logs, see the last 3:
$ grep "SparkListenerBlockManagerAdded" /tmp/spark-events/app-1615118161388-0000
{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor ID":"driver","Host":"spark-driver-example2","Port":33004},"Maximum Memory":18493184409,"Timestamp":1615116671171,"Maximum Onheap Memory":18493184409,"Maximum Offheap Memory":0}
{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor ID":"1","Host":"192.168.70.128","Port":33004},"Maximum Memory":5985376665,"Timestamp":1615116675644,"Maximum Onheap Memory":5985376665,"Maximum Offheap Memory":0}
{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor ID":"3","Host":"192.168.226.44","Port":33004},"Maximum Memory":5985376665,"Timestamp":1615116675802,"Maximum Onheap Memory":5985376665,"Maximum Offheap Memory":0}
{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor ID":"2","Host":"192.168.249.138","Port":33004},"Maximum Memory":5985376665,"Timestamp":1615116681156,"Maximum Onheap Memory":5985376665,"Maximum Offheap Memory":0}

 

2) even 3.0.1 SHS renders the executors tab correctly:
image.png

This confirms the driver gets the events as the event log is written by the driver.

Could you please check the REST url for executors?

In your browser you can start a developer tool and check the networking, like:
image.png

I draw a red rectangles around the "Network" tab in developer tools, "allexecutors" and "Response" to help you.


Best regards,
Attila

PS: Sorry for the repost I managed to delete the pictures via the forum UI when I tried to fix some typos 

 

On Sun, Mar 7, 2021 at 1:18 PM Ranju Jain <[hidden email]> wrote:

Hi Attila,

I checked Event logs, driver and executor logs.
Here I have configured spark
> Min executors as 3
> Each executor with :
                > spark.executor.cores=6 cores
                > spark.kubernetes.executor.request.cores=5600m
                            > spark.executor.memory= 10 GB.
So 6 Tasks are started parallelly in an executor and  Event log shows Peak Execution Memory for Task in an executor - 1024

But Still, cannot connect with why SPARK UI on Browser shows blank Executors Tab during app run.
Do you see any reason?

Regards
Ranju

-----Original Message-----
From: Attila Zsolt Piros <[hidden email]>
Sent: Thursday, March 4, 2021 11:38 PM
To: [hidden email]
Subject: RE: Spark Version 3.0.1 Gui Display Query

Hi Ranju!

I meant the event log would be very helpful for analyzing the problem on your side.

The three logs together (driver, executor, event) are best when they come from the same run, of course.

I know you want to check the Executors tab while the job is running, and for that you do not need the event log. But the event log is still useful for finding out what happened.
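
If event logging is not enabled yet, it can be switched on at submit time, e.g. (the directory is just an example path and must exist up front):

$ ./bin/spark-submit \
    --conf spark.eventLog.enabled=true \
    --conf spark.eventLog.dir=/tmp/spark-events \
    ...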

Regards,
Attila





Attachments: pom.xml (4K), Executor_Tab.docx (77K)
Reply | Threaded
Open this post in threaded view
|

Re: Spark Version 3.0.1 Gui Display Query

Attila Zsolt Piros
Hi Ranju!

I guess I know what causes the confusion regarding my advice: not using spark-kubernetes_2.12, and using the provided scope for spark-core and spark-sql.

So I assumed you had downloaded a Spark binary and were using the spark-submit of that binary distribution, as shown for the Spark examples on the Running Spark on Kubernetes page:
$ ./bin/spark-submit \
    --master k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port> \
    --deploy-mode cluster \
    --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=5 \
    --conf spark.kubernetes.container.image=<spark-image> \
    local:///path/to/examples.jar
Now check the content of examples.jar: it does not contain anything from Spark core:

$ unzip -l examples/jars/spark-examples_2.12-3.0.1.jar | grep core -c
0
Mostly it contains classes from the examples package and some test data:
$ unzip -l examples/jars/spark-examples_2.12-3.0.1.jar | grep -v examples
  Length      Date    Time    Name
---------  ---------- -----   ----
        0  08-28-2020 08:01   META-INF/
      339  08-28-2020 08:01   META-INF/MANIFEST.MF
      334  08-28-2020 07:58   users.avro
        0  08-28-2020 08:01   org/
        0  08-28-2020 08:01   org/apache/
        0  08-28-2020 08:01   org/apache/spark/
    11358  08-28-2020 07:58   META-INF/LICENSE
        0  08-28-2020 08:01   META-INF/maven/
        0  08-28-2020 08:01   META-INF/maven/org.apache.spark/
      240  08-28-2020 07:58   full_user.avsc
       32  08-28-2020 07:58   people.txt
        0  08-28-2020 08:01   dir1/
      520  08-28-2020 07:58   dir1/file1.parquet
       24  08-28-2020 07:58   dir1/file3.json
      483  08-28-2020 07:58   META-INF/DEPENDENCIES
       49  08-28-2020 07:58   people.csv
      130  08-28-2020 07:58   employees.json
       73  08-28-2020 07:58   people.json
        0  08-28-2020 08:01   dir1/dir2/
      520  08-28-2020 07:58   dir1/dir2/file2.parquet
      185  08-28-2020 07:58   user.avsc
      547  08-28-2020 07:58   users.orc
      174  08-28-2020 07:58   META-INF/NOTICE
     5812  08-28-2020 07:58   kv1.txt
      615  08-28-2020 07:58   users.parquet
---------                     -------
  3043874                     804 files
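
You can run the same kind of check against your own uber jar (the jar path below is just a placeholder) to see whether Spark classes got bundled into it:

$ unzip -l target/your-application-uber.jar | grep -c "org/apache/spark"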
Best Regards,
Attila


On Wed, Mar 10, 2021 at 9:50 AM Ranju Jain <[hidden email]> wrote:

Hi Attila,

 

  1. Moreover I do not see why you need the "spark-kubernetes_2.12" at all.

 

Do you mean that spark-core is sufficient? I get the error below if I do not include the spark-kubernetes jar.

 

21/03/09 11:00:31 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not parse Master URL: 'k8s://https://kubernetes.default.svc.cluster.local:443'
        at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2944)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:533)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2574)
        at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:934)
        at scala.Option.getOrElse(Option.scala:189)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:928)
        at database.SparkAppOnPodCassExecSSL2.loadData(SparkAppOnPodCassExecSSL2.java:438)
        at database.SparkAppOnPodCassExecSSL2.main(SparkAppOnPodCassExecSSL2.java:99)
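
So for now I keep the dependency in my pom along these lines (an illustrative sketch; version matching the cluster's Spark):

   <dependency>
      <!-- illustrative: this module provides the cluster manager that parses k8s:// master URLs -->
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-kubernetes_2.12</artifactId>
      <version>3.0.1</version>
   </dependency>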

  2. You should not need to include spark packages into the uberjar. So please add the provided scope for them:

 

I understood your point that an uber jar [fat jar] has more chances of version conflicts. But if I add the provided scope for these jars, then when I run mvn clean package,

my final jar does not contain spark-core or the other jars; it has only my class files, so its execution failed. Attached pom.xml.

 

  3. Good news is that the Executors tab started to display. 😊

As seen in the browser screenshot below, the REST API was using the old version of javax.ws.rs.core.

So I noticed that in my fat jar, when spark-core expands, javax.ws.rs.core is the old version. I replaced it with the latest javax.ws.rs.core, which is part of the jakarta.ws.rs-api placed under /opt/spark/jars.

But I did all of this manually, by adding that folder into the fat jar. Attaching the Executors tab screenshot. Perhaps a shade filter, as sketched below, would be cleaner.
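
An untested sketch, dropping the JAX-RS 1.x classes at shade time and relying on the jakarta.ws.rs-api that is already under /opt/spark/jars at runtime:

   <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <configuration>
         <filters>
            <filter>
               <!-- untested sketch: drop the bundled JAX-RS classes from every artifact; -->
               <!-- Spark's own jakarta.ws.rs-api supplies them on the cluster at runtime -->
               <artifact>*:*</artifact>
               <excludes>
                  <exclude>javax/ws/rs/**</exclude>
               </excludes>
            </filter>
         </filters>
      </configuration>
   </plugin>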

 

I need to check why <scope>provided</scope> results in this behavior. Also, how can I exclude the old version of javax.ws.rs.core from spark-core and add the new jakarta one in pom.xml (one untested idea is sketched below)?

Also, why is the kubernetes jar not needed?
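
The exclusion idea, untested (the exact artifact to exclude should first be confirmed with mvn dependency:tree -Dincludes=javax.ws.rs:*):

   <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.12</artifactId>
      <version>3.0.1</version>
      <exclusions>
         <exclusion>
            <!-- untested guess: the JAX-RS 1.x API often arrives transitively as jsr311-api -->
            <groupId>javax.ws.rs</groupId>
            <artifactId>jsr311-api</artifactId>
         </exclusion>
      </exclusions>
   </dependency>
   <dependency>
      <!-- assumed to match the jakarta.ws.rs-api version shipped under /opt/spark/jars -->
      <groupId>jakarta.ws.rs</groupId>
      <artifactId>jakarta.ws.rs-api</artifactId>
      <version>2.1.6</version>
   </dependency>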

 

 

  A big thank you to you and the community 😊. Your approach of debugging and guiding me, from the event logs all the way to the browser, taught me many things.

 

Regards

Ranju

 

 

 

From: Attila Zsolt Piros <[hidden email]>
Sent: Tuesday, March 9, 2021 2:28 PM
To: Ranju Jain <[hidden email]>
Cc: [hidden email]
Subject: Re: Spark Version 3.0.1 Gui Display Query

 

Hi Ranju!

 

You should not need to include spark packages into the uberjar. So please add the provided scope for them:

   <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.12</artifactId>
      <version>3.0.1</version>
      <scope>provided</scope>
   </dependency>

Do this for spark-hive_2.12 and spark-sql_2.12 too.

Moreover I do not see why you need the "spark-kubernetes_2.12" at all.

Best Regards,

Attila

 


Reply | Threaded
Open this post in threaded view
|

RE: Spark Version 3.0.1 Gui Display Query

Ranju Jain

Hi Attila,

 

OK! Actually, I am not using the spark-submit of the binary distribution. And if I do not include the kubernetes jar, the master URL does not parse.

 

Also, I am surprised where spark-core is picking up this old version of javax.ws.rs.core from.

 

Anyway, thanks Attila. I was in great need of this tab, as I need to do memory/CPU tuning, so it was a necessity for me. Thanks once again.

 

Regards

Ranju
