How to specify external jars in a program with SparkConf


How to specify external jars in a program with SparkConf

mytramesh
Context: the classpath on EMR has an old version of a jar, and I want my code
to use a newer version of that jar.

Through a bootstrap action that runs while new nodes are spinning up, I copy
the necessary jars from S3 to a local folder.

In the spark-submit command, by using the extra-classpath parameters, my code
is able to pick up the new jar from the custom location:

--conf="spark.driver.extraClassPath=/usr/jars/*"
--conf="spark.executor.extraClassPath=/usr/jars/*"

I want to achieve the same thing programmatically, by setting these on the
SparkConf object, but with no luck. Could anyone help me with this?

sparkConf.set("spark.driver.extraClassPath", "/usr/jars/*");
sparkConf.set("spark.executor.extraClassPath", "/usr/jars/*");
//tried below options also
//sparkConf.set("spark.executor.userClassPathFirst", "true");
 //sparkConf.set("spark.driver.userClassPathFirst", "true");





Re: How to specify external jars in a program with SparkConf

Prem Sure
I think the JVM is already initialized with the available classpath by the time your conf code executes... I faced this earlier with Spark 1.6 and ended up moving to spark-submit with --jars, having found that the classpath is not part of the runtime config changes.
May I know what advantage you are trying to get by doing this programmatically?
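
If the requirement really is programmatic, one option not mentioned in this
thread is Spark's org.apache.spark.launcher.SparkLauncher API, which starts the
driver in a fresh JVM so the classpath flags are applied before it boots. A
hedged sketch; the jar paths and class names below are illustrative:

import org.apache.spark.launcher.SparkLauncher;

public class LaunchWithNewJars {  // illustrative class name
    public static void main(String[] args) throws Exception {
        Process spark = new SparkLauncher()
                .setAppResource("/path/to/my-app.jar")  // illustrative application jar
                .setMainClass("com.example.MyApp")      // illustrative main class
                .addJar("/usr/jars/new-version.jar")    // same effect as --jars
                .setConf("spark.driver.extraClassPath", "/usr/jars/*")
                .setConf("spark.executor.extraClassPath", "/usr/jars/*")
                .launch();  // the driver starts in a new JVM with these settings
        spark.waitFor();
    }
}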
