How to set environment variable for a spark job


santhoma
Hello

I have a requirement to set some environment variables for my Spark jobs.
Does anyone know how to set them? Specifically the following variables:

1) ORACLE_HOME
2) LD_LIBRARY_PATH

thanks

Re: How to set environment variable for a spark job
Re: How to set environment variable for a spark job

Sourav Chandra
You can pass them in the environment map used to create the SparkContext.
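A minimal sketch of this suggestion, assuming the Spark 0.9-era SparkContext constructor that accepts an executor environment map; the master URL, Spark home, jar name, and paths below are placeholders, not values from this thread:

```scala
import org.apache.spark.SparkContext

// Environment variables to be set on the executors (placeholder paths)
val env = Map(
  "ORACLE_HOME"     -> "/opt/oracle",
  "LD_LIBRARY_PATH" -> "/opt/oracle/lib"
)

val sc = new SparkContext(
  "spark://master:7077", // master URL (placeholder)
  "my-app",              // application name
  "/opt/spark",          // Spark home on the workers (placeholder)
  Seq("myjob.jar"),      // jars to ship to the workers (placeholder)
  env                    // per-executor environment variables
)
```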


--
Sourav Chandra
Senior Software Engineer
[hidden email]
o: +91 80 4121 8723 | m: +91 988 699 3746 | skype: sourav.chandra
Livestream
"Ajmera Summit", First Floor, #3/D, 68 Ward, 3rd Cross, 7th C Main, 3rd Block, Koramangala Industrial Area, Bangalore 560034
www.livestream.com


Re: How to set environment variable for a spark job
Re: How to set environment variable for a spark job

santhoma
I tried it, but it did not work:

     conf.setExecutorEnv("ORACLE_HOME", orahome)
     conf.setExecutorEnv("LD_LIBRARY_PATH", ldpath)

Any idea how to set it using java.library.path?

Re: How to set environment variable for a spark job

Re: How to set environment variable for a spark job

Sourav Chandra
Did you try to access the variables in the worker using System.getenv(...), and did that fail?
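One way to run that check, sketched out: execute a trivial job and collect what the executor JVMs actually see. The master URL, app name, and path are placeholders (note that in local mode the "executor" shares the driver JVM, so executor-env settings may not take effect the same way as on a real cluster):

```scala
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("spark://master:7077") // placeholder master URL
  .setAppName("env-check")
conf.setExecutorEnv("ORACLE_HOME", "/opt/oracle") // placeholder path
val sc = new SparkContext(conf)

// This closure runs on the executors, not the driver
val seen = sc.parallelize(1 to 2).map { _ =>
  String.valueOf(System.getenv("ORACLE_HOME"))
}.collect()

seen.foreach(println)
```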




Re: How to set environment variable for a spark job

Re: How to set environment variable for a spark job

santhoma
OK, it was working after all.
I printed System.getenv(...) for both environment variables and they returned the correct values.

However, it did not give me the intended result. My intention was to load a native library from LD_LIBRARY_PATH, but it looks like the library is loaded from the value of -Djava.library.path instead.

The value of this property is coming through as "-Djava.library.path=/opt/cloudera/parcels/CDH-5.0.0-0.cdh5b2.p0.27/lib/spark/lib:/opt/cloudera/parcels/CDH-5.0.0-0.cdh5b2.p0.27/lib/hadoop/lib/native"

Any idea how to append my custom path to it programmatically?

Re: How to set environment variable for a spark job

Re: How to set environment variable for a spark job

santhoma
Got it finally; pasting it here so that it will be useful for others:

 val conf = new SparkConf()
   .setJars(jarList)
 conf.setExecutorEnv("ORACLE_HOME", myOraHome)
 conf.setExecutorEnv("SPARK_JAVA_OPTS", "-Djava.library.path=/my/custom/path")
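Filled out as a self-contained sketch: jarList, myOraHome, and the library path below are placeholders for your own values, and SPARK_JAVA_OPTS is the mechanism honored by Spark of this era (in later releases the spark.executor.extraJavaOptions setting plays the equivalent role):

```scala
import org.apache.spark.{SparkConf, SparkContext}

val jarList   = Seq("myjob.jar") // placeholder jar to ship
val myOraHome = "/opt/oracle"    // placeholder ORACLE_HOME

val conf = new SparkConf()
  .setAppName("oracle-job")
  .setJars(jarList)
conf.setExecutorEnv("ORACLE_HOME", myOraHome)
// Point the executor JVMs at the directory holding the native library
conf.setExecutorEnv("SPARK_JAVA_OPTS", "-Djava.library.path=/my/custom/path")

val sc = new SparkContext(conf)
```

The key point is that java.library.path is read once at JVM startup, so it must be set before the executor JVM launches (via SPARK_JAVA_OPTS or the executor environment), not changed from inside the running job.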