The SparkConf doesn't allow you to set arbitrary variables. You can use
SparkContext's hadoopRDD and create a JobConf (with whatever variables you
want), and then grab them out of the JobConf in your RecordReader.
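A minimal sketch of that approach, assuming Spark and Hadoop are on the classpath (`MyInputFormat`, `MyKey`, and `MyValue` are placeholders for your custom classes, not names from this thread):

```scala
import org.apache.hadoop.mapred.JobConf
import org.apache.spark.{SparkConf, SparkContext}

object JobConfDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("jobconf-demo").setMaster("local[*]"))

    // Put any variables you want into the JobConf; it is shipped to the
    // tasks, so the InputFormat/RecordReader sees the same settings.
    val jobConf = new JobConf(sc.hadoopConfiguration)
    jobConf.set("developer", "MyName")

    // Hand the JobConf to hadoopRDD (old mapred API); inside the
    // RecordReader, read the value back with job.get("developer").
    // val rdd = sc.hadoopRDD(jobConf, classOf[MyInputFormat],
    //                        classOf[MyKey], classOf[MyValue])

    sc.stop()
  }
}
```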
On Sun, Feb 22, 2015 at 4:28 PM, hnahak harihar1...@gmail.com wrote:
Hi,
I have written custom InputFormat and RecordReader for Spark, I need to use
user variables from spark client program.
I added them in SparkConf:
val sparkConf = new
SparkConf().setAppName(args(0)).set("developer", "MyName")
*and in InputFormat class*
protected boolean
Thanks. I set my arbitrary variable in the Hadoop configuration and was
able to read it inside the InputFormat via JobContext.getConfiguration().
On Mon, Feb 23, 2015 at 12:04 PM, Tom Vacek minnesota...@gmail.com wrote:
Instead of setting it in SparkConf, set it via
SparkContext.hadoopConfiguration.set(key, value)
and extract the same key from the JobContext.
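A sketch of that variant using the new `mapreduce` API (assumptions: `MyInputFormat` is a hypothetical subclass standing in for the custom InputFormat; the driver-side lines are shown as comments because they need a live SparkContext):

```scala
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.{InputSplit, RecordReader, TaskAttemptContext}
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat

// Driver side: set the key on the SparkContext's shared Hadoop configuration.
//   sc.hadoopConfiguration.set("developer", "MyName")
//   val rdd = sc.newAPIHadoopFile(path, classOf[MyInputFormat],
//                                 classOf[LongWritable], classOf[Text],
//                                 sc.hadoopConfiguration)

// Task side: the value set above arrives through the TaskAttemptContext.
class MyInputFormat extends TextInputFormat {
  override def createRecordReader(split: InputSplit,
      context: TaskAttemptContext): RecordReader[LongWritable, Text] = {
    // Read the driver-set key back out of the job's configuration.
    val developer = context.getConfiguration.get("developer")
    // ...use `developer` to parameterize the reader as needed...
    super.createRecordReader(split, context)
  }
}
```

TaskAttemptContext extends JobContext, so `getConfiguration` here is the same accessor mentioned above.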
--Harihar