Hello All, I have a custom parameter (for example, the input file name) added to the SparkConf of my Spark context, e.g. SparkConf.set(INPUT_FILE_NAME, fileName). I need this value inside a foreach performed on an RDD, but when I access the Spark context inside the foreach, I get a NullPointerException.
Code sample:

    val conf = new SparkConf()
      .setMaster(appConfig.envOrElseConfig("app.sparkconf.master"))
      .setAppName(appConfig.envOrElseConfig("app.appName"))
      .set("INPUT_FILE_NAME", fileName)

    val sparkContext = new SparkContext(conf)
    sparkContext.addJar(sparkContextParams.jarPath)

    val sqlContext = new SQLContext(sparkContext)
    val df = sqlContext.read.format("com.databricks.spark.csv")
      .option("header", "true")
      .load(<filePath>)

    df.foreach { f =>
      f.split(",")
      println(sparkContext.getConf.get("INPUT_FILE_NAME"))
    }

The call to sparkContext.getConf.get("INPUT_FILE_NAME") above is what throws the NullPointerException.

Thanks,
Kamal.
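[Editor's note] For readers hitting the same issue: the usual cause is that SparkContext exists only on the driver and is not serializable, so when Spark ships the foreach closure to the executors, the captured context reference arrives as null. A minimal sketch of the common workaround, assuming the same conf setup as above, is to read the value into a plain local val on the driver before the foreach, so the closure captures only a serializable String:

    // Read the conf value on the driver, where sparkContext is valid.
    val inputFileName = sparkContext.getConf.get("INPUT_FILE_NAME")

    df.foreach { row =>
      // inputFileName is an ordinary String captured in the closure;
      // no SparkContext is referenced on the executor side.
      println(inputFileName)
    }

For larger read-only values shared across many tasks, a broadcast variable is the idiomatic alternative:

    val bcName = sparkContext.broadcast(inputFileName)
    df.foreach(row => println(bcName.value))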