You should use sc.hadoopConfiguration to get the Hadoop configuration.
Making a new one just gets you the default values, which may work for
your purposes but probably isn't ideal. That configuration object should
be something you can send in the closure; a sketch of the pattern is below.
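
For example, something like this (rough sketch only: the RDD name "rdd"
and the output path are made up, and this assumes a Spark 1.x build where
SerializableWritable is public). Hadoop's Configuration isn't Serializable
itself, hence the wrapper, and broadcasting it avoids re-shipping it with
every task:

import java.util.UUID
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.spark.SerializableWritable

// Wrap the (non-Serializable) Configuration and broadcast it once.
val confBroadcast =
  sc.broadcast(new SerializableWritable(sc.hadoopConfiguration))

rdd.foreachPartition { partition =>
  val conf = confBroadcast.value.value    // unwrap the Configuration
  val fs = FileSystem.get(conf)           // filesystem handle on the executor
  // One file per partition; the UUID keeps the (made-up) paths from colliding.
  val out = fs.create(new Path("/tmp/executor-output/part-" + UUID.randomUUID))
  try {
    partition.foreach(record => out.writeBytes(record.toString + "\n"))
  } finally {
    out.close()
  }
}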

On Fri, Aug 1, 2014 at 2:16 AM, Sung Hwan Chung
<coded...@cs.stanford.edu> wrote:
> Is there any way to get the SparkContext object from an executor? Or the
> Hadoop configuration, etc.? The reason is that I would like to write to
> HDFS from executors.
