Default spark.deploy.recoveryMode


Chitturi Padma
Hi Spark users/experts,

In the Spark source code (Master.scala and Worker.scala), I see a persistenceEngine being used when a worker registers with the master. When we don't set spark.deploy.recoveryMode explicitly, what default value is used? This recovery mode is what persists and restores the application and worker details.

It looks like BlackHolePersistenceEngine is used when the recovery mode is not specified explicitly. Am I right?


Thanks,
Padma Ch

Re: Default spark.deploy.recoveryMode

Prashant Sharma
[Removing dev lists]

You are absolutely correct about that.

Prashant Sharma
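For reference, the selection logic in Master.scala looks roughly like the following (a paraphrased sketch of the Spark 1.x source, not a verbatim excerpt; constructor arguments are elided):

```scala
// Master.scala (paraphrased): the default recovery mode is "NONE"
val recoveryMode = conf.get("spark.deploy.recoveryMode", "NONE")

val persistenceEngine = recoveryMode match {
  case "ZOOKEEPER"  => new ZooKeeperPersistenceEngine(/* ... */)
  case "FILESYSTEM" => new FileSystemPersistenceEngine(/* ... */)
  case _            => new BlackHolePersistenceEngine() // no-op: nothing is persisted
}
```

So with no setting at all, the match falls through to BlackHolePersistenceEngine, whose persist/unpersist methods do nothing.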




Re: Default spark.deploy.recoveryMode

Chitturi Padma
Which means the details are not persisted, and hence after a failure the master and workers would not restore the registered applications and workers when the daemons restart, right?



Re: Default spark.deploy.recoveryMode

Prashant Sharma
So if you need those features, you can go ahead and set up either the FILESYSTEM or ZOOKEEPER recovery mode. Please take a look at: http://spark.apache.org/docs/latest/spark-standalone.html.
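For example, per the standalone docs these options are passed to the daemons through SPARK_DAEMON_JAVA_OPTS in conf/spark-env.sh (the ZooKeeper hosts and directory paths below are placeholders; substitute your own):

```shell
# conf/spark-env.sh -- ZooKeeper-based recovery (HA with standby masters)
export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
  -Dspark.deploy.zookeeper.url=zk1:2181,zk2:2181 \
  -Dspark.deploy.zookeeper.dir=/spark"

# Or, single-master restart recovery backed by a local directory:
# export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=FILESYSTEM \
#   -Dspark.deploy.recoveryDirectory=/var/lib/spark/recovery"
```

With either setting in place, restart the master and it will read back the persisted application and worker state instead of starting empty.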

Prashant Sharma





Sent from the Apache Spark User List mailing list archive at Nabble.com.