spark.executor.extraJavaOptions inside application code


spark.executor.extraJavaOptions inside application code

Agostino Calamita
Hi all,
I wrote an application that needs an environment variable. I can set this variable with

--conf 'spark.executor.extraJavaOptions=-Dbasicauth=myuser:mypwd'

in spark-submit and it works well in standalone cluster mode.

But I want to set it inside the application code, because the variable contains a password.

How can I do that?

I tried with:

    SparkSession spark = SparkSession
        .builder()
        .appName("Java Spark Solr ETL")
        .getOrCreate();

    spark.sparkContext().conf().setExecutorEnv("spark.executor.extraJavaOptions", "-Dbasicauth=myuser:mypassword");
but it doesn't work.

Thanks.

Re: spark.executor.extraJavaOptions inside application code

Vadim Semenov-2
You need to set the config before creating the session; once the executors have been launched, changing the conf has no effect.

val conf = new SparkConf()
// Sets a JVM system property on the executors, read with System.getProperty("basicauth")
conf.set("spark.executor.extraJavaOptions", "-Dbasicauth=myuser:mypassword")
// The next two are equivalent to each other: both set an executor
// environment variable, read with System.getenv("basicauth")
conf.set("spark.executorEnv.basicauth", "myuser:mypassword")
conf.setExecutorEnv("basicauth", "myuser:mypassword")
val spark = SparkSession.builder().config(conf).appName("…").getOrCreate()
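Since the original code was Java, here is a minimal Java sketch of the same fix (the class name and the commented-out Solr call are illustrative, not from the original application):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public class SolrEtl {
    public static void main(String[] args) {
        // The executor JVM options must be on the SparkConf *before* the
        // session is created; executors pick them up at launch time.
        SparkConf conf = new SparkConf()
                .set("spark.executor.extraJavaOptions",
                     "-Dbasicauth=myuser:mypassword");

        SparkSession spark = SparkSession
                .builder()
                .config(conf)
                .appName("Java Spark Solr ETL")
                .getOrCreate();

        // In code running on the executors, the value is then readable
        // as an ordinary JVM system property:
        // String auth = System.getProperty("basicauth");

        spark.stop();
    }
}
```

Note that the password still appears in the Spark UI and event logs as part of `spark.executor.extraJavaOptions`, so moving it out of spark-submit does not by itself keep it secret.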

On Wed, May 2, 2018 at 6:59 AM, Agostino Calamita <[hidden email]> wrote:


