Can the JVisualVM monitoring tool be used to monitor Spark executor memory and CPU?

7 messages

Ranju Jain

Hi All,

A virtual machine is running an application, and this application has various third-party (3PP) components running in it, such as Spark, a database, etc.

My requirement is to monitor every component and isolate the resources consumed individually by each component.

I am thinking of using a common tool such as Java VisualVM, where I specify the JMX URL of every component and monitor each one.

For the other components I am able to view their resources.

Is it possible to view Spark executor CPU/memory via the Java VisualVM tool?

Please guide.

Regards
Ranju

Re: Can the JVisualVM monitoring tool be used to monitor Spark executor memory and CPU?

Mich Talebzadeh
Hi,

Have you considered the Spark GUI first?



Re: Can the JVisualVM monitoring tool be used to monitor Spark executor memory and CPU?

Attila Zsolt Piros
Hi Ranju!

You can configure Spark's metrics system.

Check the memoryMetrics.* fields of the executor metrics, and the CPU times in the component-instance-executor metrics.

For the details, I suggest checking Luca Canali's presentations about Spark's metrics system, and maybe his GitHub repo.
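
For a quick check, these metrics are also exposed over the Spark REST API (separately from the metrics sinks); a minimal sketch, assuming the UI runs on the default port 4040 and <app-id> is your application ID:

    curl http://<driver-host>:4040/api/v1/applications/<app-id>/executors

Per executor, the response includes the memoryMetrics block (used/total on-heap and off-heap storage memory) as well as totalCores.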

Best Regards,
Attila


RE: Can the JVisualVM monitoring tool be used to monitor Spark executor memory and CPU?

Ranju Jain

Hi Mich/Attila,

 

Mich: I considered the Spark GUI, but first I have some confusion at the memory level.

App configuration: spark.executor.memory=4g for the running Spark job.

In the Spark GUI I see that the running Spark job has a Peak Execution Memory of 1 KB, as highlighted below:

[screenshot of the Spark UI showing Peak Execution Memory]

I do not have a Storage Memory screenshot, so I calculated the total memory consumption at that point in time as:

Spark UI shows:

    spark.executor.memory = Peak Execution Memory + Storage Mem + Reserved Mem + User Memory
                          = 1 KB + Storage Mem + 300 MB + (4 GB * 0.25)
                          = 1 KB + Storage Mem + 300 MB + 1 GB
                          ≈ 1.5 GB

And if I look at the actual memory consumption of executors 0, 1 and 2 on the virtual server using the top command, it shows the readings below:

Executor 2: [top output screenshot]

Executor 0: [top output screenshot]

Please suggest: on the Spark GUI, can I go with the formula below to isolate how much memory the Spark component is consuming out of the several other components of the web application?

    spark.executor.memory = Peak Execution Memory + Storage Mem + Reserved Mem + User Memory
                          = 1 KB + Storage Mem + 300 MB + (4 GB * 0.25)
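
For reference, the same arithmetic under Spark's documented unified memory model, as a sketch assuming the defaults (reserved memory = 300 MB, spark.memory.fraction = 0.6; the 0.25 user-memory factor above would imply a non-default fraction):

    usable memory  = 4096 MB - 300 MB      = 3796 MB  (~3.7 GB)
    unified memory = 3796 MB * 0.6         = ~2278 MB (execution + storage, ~2.2 GB)
    user memory    = 3796 MB * (1 - 0.6)   = ~1518 MB (~1.5 GB)

The UI's Peak Execution Memory and Storage Memory only cover the unified region, so they will not add up to spark.executor.memory on their own.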

 

 

Attila: I checked the memoryMetrics.* of the executor metrics, but here I have some confusion about:

usedOnHeapStorageMemory
usedOffHeapStorageMemory
totalOnHeapStorageMemory
totalOffHeapStorageMemory

Why should only the used storage memory be checked?

To isolate spark.executor.memory, should I check memoryMetrics.*, where only storage memory is given, or should I check peakMemoryMetrics.*, where all the peaks are specified:

  1. Execution
  2. Storage
  3. JVM Heap

Also, I noticed that cpuTime gives the CPU time spent by an executor, but there is no metric by which I can calculate the number of cores.

As suggested, I checked Luca Canali's presentation, where I see JmxSink, which registers metrics for viewing in a JMX console. I think exposing these metrics via JmxSink would make it possible to visualize spark.executor.memory and the number of cores used by an executor in a Java monitoring tool.

I also see that Grafana is a very good visualization tool in which all the metrics can be viewed, but I have little idea of the steps to install it on a virtual server and integrate it. I need to go through Grafana in detail.

 

Kindly suggest your views.

 

Regards

Ranju

 


Re: Can the JVisualVM monitoring tool be used to monitor Spark executor memory and CPU?

Attila Zsolt Piros
Hi Ranju!

I am quite sure that for your requirement, "monitor every component and isolate the resources consumed individually by each component", Spark metrics is the right direction to go.

> Why should only the used storage memory be checked?

Right, for you storage memory alone won't be enough; you need the system and the execution memory too.
I expect ".JVMHeapMemory" and ".JVMOffHeapMemory" are what you are looking for.

> Also, I noticed that cpuTime gives the CPU time spent by an executor, but there is no metric by which I can calculate the number of cores.

The number of cores is specified at Spark submit. IIRC, if you pass 3, it means that each executor can run a maximum of 3 tasks at the same time.
So all those cores will be used if there are enough tasks. I know this is not a perfect solution, but I hope it helps.
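
For example, a minimal sketch of such a submit (the master URL, class and jar names are placeholders):

    spark-submit \
      --master spark://<master-host>:7077 \
      --executor-cores 3 \
      --executor-memory 4g \
      --class com.example.YourApp \
      your-app.jar

With --executor-cores 3, the cpuTime metric can be compared against at most 3 cores' worth of wall-clock time per executor.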

> I also see that Grafana is a very good visualization tool in which all the metrics can be viewed, but I have little idea of the steps to install it on a virtual server and integrate it.

I cannot help with specifics here, but a monitoring system is a good idea, whether Grafana or Prometheus.
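
For what it's worth, since Spark 3.0 there is also a built-in PrometheusServlet sink that exposes metrics on the existing UI port, so no separate exporter needs to be installed on the server; a minimal metrics.properties sketch, assuming Spark 3.x:

    *.sink.prometheusServlet.class=org.apache.spark.metrics.sink.PrometheusServlet
    *.sink.prometheusServlet.path=/metrics/prometheus

Prometheus can then scrape http://<driver-host>:4040/metrics/prometheus, and Grafana can chart the result.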

Best regards,
Attila


RE: Can the JVisualVM monitoring tool be used to monitor Spark executor memory and CPU?

Ranju Jain

Hi Attila,

 

I was configuring metrics.properties by following the steps below:

  1. In metrics.properties:

     *.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink
     master.source.jvm.class=org.apache.spark.metrics.source.JvmSource
     worker.source.jvm.class=org.apache.spark.metrics.source.JvmSource
     driver.source.jvm.class=org.apache.spark.metrics.source.JvmSource
     executor.source.jvm.class=org.apache.spark.metrics.source.JvmSource

  2. Restart the Spark master and workers.

  3. Connect to the monitoring tool using <driverhost>:<driverport>, e.g. <host_machine>:4040.

 

But it gives an error.

Any clue what is missing?

 

Regards

Ranju

 

 


RE: Can the JVisualVM monitoring tool be used to monitor Spark executor memory and CPU?

Ranju Jain

Hi,

 

On top of the earlier steps, I added a few more configurations in spark-env.sh and spark-defaults.conf:

 

  1. spark-env.sh:

export SPARK_DAEMON_JAVA_OPTS="-Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.local.only=false -Dcom.sun.management.jmxremote.port=10003 -Dcom.sun.management.jmxremote.rmi.port=10003 -Dcom.sun.management.jmxremote.ssl=false -Dcom.sun.management.jmxremote.authenticate=false -Djava.rmi.server.hostname=0.0.0.0 -Dlog.level=INFO"

 

  2. spark-defaults.conf:

spark.metrics.conf                 /opt/spark/conf/metrics.properties

 

spark.executor.extraJavaOptions    -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.local.only=false -Dcom.sun.management.jmxremote.port=10002 -Dcom.sun.management.jmxremote.rmi.port=10002 -Dcom.sun.management.jmxremote.ssl=false -Dcom.sun.management.jmxremote.authenticate=false -Djava.rmi.server.hostname=0.0.0.0

 

spark.driver.extraJavaOptions      -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.local.only=false -Dcom.sun.management.jmxremote.port=10001 -Dcom.sun.management.jmxremote.rmi.port=10001 -Dcom.sun.management.jmxremote.ssl=false -Dcom.sun.management.jmxremote.authenticate=false -Djava.rmi.server.hostname=0.0.0.0

 

What more is required to register the metrics on JMX?
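
Assuming the JVMs are restarted with the options above and the ports are reachable (not blocked by a firewall), VisualVM should be able to attach via File > Add JMX Connection, e.g.:

    <host>:10001                                        (driver)
    service:jmx:rmi:///jndi/rmi://<host>:10002/jmxrmi   (executor, full JMX service URL form)

Two things worth checking: port 4040 from the earlier attempt is the Spark web UI (HTTP), not a JMX endpoint, so a JMX client cannot attach to it; and java.rmi.server.hostname is the address the RMI stub advertises to clients, so 0.0.0.0 may need to be replaced with the VM's externally reachable hostname.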

 

Regards

Ranju

 

 
