Need config params while doing rdd.foreach or map


Need config params while doing rdd.foreach or map

Kamalanathan Venkatesan

Hello All,

 

I have a custom parameter, for example a file name, added to the conf of the Spark context via SparkConf.set(INPUT_FILE_NAME, fileName).

I need this value inside a foreach performed on an RDD, but when I access the Spark context inside the foreach, the Spark context is null and an exception is thrown!

 

Code sample:

 

val conf = new SparkConf().setMaster(appConfig.envOrElseConfig("app.sparkconf.master"))
      .setAppName(appConfig.envOrElseConfig("app.appName"))
      .set("INPUT_FILE_NAME", fileName)

var sparkContext = new SparkContext(conf)

sparkContext.addJar(sparkContextParams.jarPath)

var sqlContext = new SQLContext(sparkContext)

var df = sqlContext.read.format("com.databricks.spark.csv")
      .option("header", "true")
      .load(<filePath>)

df.foreach(f => {
      f.split(",")
      println(sparkContext.getConf.get("INPUT_FILE_NAME"))
})

 

The sparkContext.getConf.get("INPUT_FILE_NAME") call above throws a null pointer exception!

 

Thanks,

Kamal.



Re: Need config params while doing rdd.foreach or map

ayan guha
The Spark context runs in the driver, whereas the function inside foreach runs on the executors. You can pass the parameter into the function so that it is available on the executors.
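
A minimal sketch of that approach, reusing sparkContext and df from the snippet above (inputFileName is just an illustrative name): read the value from the conf on the driver, bind it to a local val, and let the closure capture that val instead of the SparkContext.

// Driver side: read the value out of the conf before the action runs.
val inputFileName = sparkContext.getConf.get("INPUT_FILE_NAME")

df.foreach(row => {
      // Only the captured String is serialized and shipped to the executors,
      // so the SparkContext is never touched inside the closure.
      println(s"INPUT_FILE_NAME = $inputFileName, row = $row")
})

For larger read-only values the same idea can be expressed with a broadcast variable (sparkContext.broadcast(...)), but for a single file name capturing a local val in the closure is enough.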

On Thu, 22 Mar 2018 at 8:18 pm, Kamalanathan Venkatesan <[hidden email]> wrote:

--
Best Regards,
Ayan Guha