You can put them in Spark's own conf/spark-defaults.conf file.
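
For example, any property prefixed with spark.hadoop. is copied into the
underlying Hadoop Configuration, so the AWS credentials can live there
instead of core-site.xml. A minimal sketch, assuming the s3a connector
(the placeholder values are illustrative):

    # conf/spark-defaults.conf
    spark.hadoop.fs.s3a.access.key    YOUR_ACCESS_KEY
    spark.hadoop.fs.s3a.secret.key    YOUR_SECRET_KEY

spark-submit and spark-shell pick up conf/spark-defaults.conf
automatically, and the same keys can also be passed per job with --conf
on the command line. Arbitrary application settings work too, as long as
they start with "spark."; for example, a hypothetical line
spark.myapp.greeting hello in that file can be read inside the
application with spark.conf.get("spark.myapp.greeting").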

On Fri, Feb 10, 2017 at 10:35 PM, Sam Elamin <hussam.ela...@gmail.com>
wrote:

> Hi All,
>
>
> Really newbie question here, folks. I have properties like my AWS access
> and secret keys in core-site.xml in Hadoop, among other properties, but
> that's the only reason I have Hadoop installed, which seems a bit of
> overkill.
>
> Is there an equivalent of core-site.xml for Spark, so I don't have to
> reference HADOOP_CONF_DIR in my spark-env.sh?
>
> I know I can export environment variables for the AWS credentials, but
> what about other properties that my application might want to use?
>
> Regards
> Sam
>
