Exact meaning of spark.memory.storageFraction in spark 2.3.x

Exact meaning of spark.memory.storageFraction in spark 2.3.x

msumbul
Hello,

I'm asking myself about the exact meaning of the setting
spark.memory.storageFraction.
The documentation mentions:

"Amount of storage memory immune to eviction, expressed as a fraction of the
size of the region set aside by spark.memory.fraction. The higher this is,
the less working memory may be available to execution and tasks may spill to
disk more often"
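
If I read the 2.3.x defaults correctly, the arithmetic behind that passage
would be roughly the following (just a sketch; the 4 GiB heap is an example
figure, and the 300 MB is the fixed amount Spark reserves out of the heap):

    // Rough sizing sketch for the 2.3.x defaults
    // (spark.memory.fraction = 0.6, spark.memory.storageFraction = 0.5).
    val heapBytes     = 4L * 1024 * 1024 * 1024       // example executor heap (4 GiB)
    val reservedBytes = 300L * 1024 * 1024            // Spark's fixed reservation
    val unifiedBytes  = ((heapBytes - reservedBytes) * 0.6).toLong  // execution + storage pool
    val immuneBytes   = (unifiedBytes * 0.5).toLong                 // storage immune to eviction

    println(f"unified region: ${unifiedBytes / 1e9}%.2f GB, immune storage: ${immuneBytes / 1e9}%.2f GB")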

Does that mean that if there is no caching, that part of the memory will not
be used at all?
In the Spark UI, in the "Executors" tab, I can see that the "storage memory"
is always zero. Does that mean that this part of the memory is never used at
all (so I can reduce it), or just never used for storage specifically?

Thanks in advance for your help,
Michel




Re: Exact meaning of spark.memory.storageFraction in spark 2.3.x

Jack Kolokasis
Hello Michel,

Spark separates executor memory with an adaptive boundary between
storage and execution memory. If there is no caching and execution
memory needs more space, it will borrow a portion of the storage memory.

If your program does not use caching, you can reduce the storage memory.
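
For example (just a sketch; the value 0.2 and the app name are illustrative,
and the same settings could be passed as --conf flags to spark-submit instead):

    import org.apache.spark.sql.SparkSession

    // Sketch: shrink the eviction-immune storage share for a job that never caches.
    val spark = SparkSession.builder()
      .appName("no-cache-job")                        // illustrative name
      .config("spark.memory.fraction", "0.6")         // size of the unified execution + storage pool
      .config("spark.memory.storageFraction", "0.2")  // smaller protected storage region
      .getOrCreate()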

Iacovos



Re: Exact meaning of spark.memory.storageFraction in spark 2.3.x

msumbul
Hi,

Thanks for the very quick reply!
If I see the "storage memory" metric always at 0, does that mean that the memory is used neither for caching nor for computing?

Thanks,
Michel




Re: Exact meaning of spark.memory.storageFraction in spark 2.3.x

Jack Kolokasis

This is just a counter that shows you the size of cached RDDs. If it is zero, that means no caching has occurred. Also, even if storage memory is being used for computation, the counter will still show zero.
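
For example (a sketch, assuming a SparkSession named spark is in scope; the
parquet path is made up), the counter only moves once something is cached and
then materialized by an action:

    val df = spark.read.parquet("/data/events.parquet")  // hypothetical input

    df.count()     // pure computation: uses execution memory, "Storage Memory" stays 0

    df.cache()     // mark the Dataset for caching
    df.count()     // the action materializes the cache; "Storage Memory" becomes non-zero

    df.unpersist() // drop the cached blocks; the counter falls back to zero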

Iacovos



Re: Exact meaning of spark.memory.storageFraction in spark 2.3.x

msumbul
Hi Iacovos,

Thanks for the reply, it's super clear.
Do you know if there is a way to know the maximum memory usage?
In the Spark UI 2.3.x, the "peak memory usage" metric is always at zero.

Thanks,
Michel

