Re: [Newbie] spark conf

2017-02-10 Thread Sam Elamin
Yup, that worked. Thanks for the clarification!

On Fri, Feb 10, 2017 at 9:42 PM, Marcelo Vanzin wrote:
> If you place core-site.xml in $SPARK_HOME/conf, I'm pretty sure Spark
> will pick it up. (Sounds like you're not running YARN, which would
> require HADOOP_CONF_DIR.)
>
> Also this is more of a user@ question.

Re: [Newbie] spark conf

2017-02-10 Thread Marcelo Vanzin
If you place core-site.xml in $SPARK_HOME/conf, I'm pretty sure Spark will pick it up. (Sounds like you're not running YARN, which would require HADOOP_CONF_DIR.)

Also this is more of a user@ question.

On Fri, Feb 10, 2017 at 1:35 PM, Sam Elamin wrote:
> Hi All,
>
> Really newbie question here…
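A minimal sketch of that suggestion, assuming Spark is installed at /opt/spark and the Hadoop client configs live in /etc/hadoop/conf (both paths are placeholders, not from the thread):

    # Standalone/local: drop core-site.xml into Spark's conf dir
    cp /etc/hadoop/conf/core-site.xml /opt/spark/conf/

    # On YARN, point Spark at the Hadoop config directory instead
    export HADOOP_CONF_DIR=/etc/hadoop/conf
    /opt/spark/bin/spark-submit --master yarn my-app.jar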

Re: [Newbie] spark conf

2017-02-10 Thread Sam Elamin
Yeah, I thought of that, but the file made it seem that it's environment-specific rather than application-specific configuration. I'm more interested in the best practices: would you recommend using the default conf file for this and uploading them to where the application will be running (remote cluster…
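One way to keep such settings per-application rather than per-environment, as a sketch (the file and class names are made up for illustration), is spark-submit's --properties-file flag, which loads the same key/value format as spark-defaults.conf:

    # conf/myapp-prod.conf holds this application's overrides,
    # one "key value" pair per line, like spark-defaults.conf
    spark-submit \
      --properties-file conf/myapp-prod.conf \
      --class com.example.MyApp \
      my-app.jar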

Re: [Newbie] spark conf

2017-02-10 Thread Reynold Xin
You can put them in Spark's own conf/spark-defaults.conf file.

On Fri, Feb 10, 2017 at 10:35 PM, Sam Elamin wrote:
> Hi All,
>
> Really newbie question here, folks. I have properties like my AWS access
> and secret keys in core-site.xml in Hadoop, among other properties, but
> that's the only…
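For Hadoop properties specifically (like the S3 keys), entries prefixed with spark.hadoop. in spark-defaults.conf get copied into the Hadoop Configuration. A sketch, assuming the s3a connector and placeholder values:

    # $SPARK_HOME/conf/spark-defaults.conf
    spark.hadoop.fs.s3a.access.key   YOUR_ACCESS_KEY
    spark.hadoop.fs.s3a.secret.key   YOUR_SECRET_KEY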

[Newbie] spark conf

2017-02-10 Thread Sam Elamin
Hi All,

Really newbie question here, folks. I have properties like my AWS access and secret keys in core-site.xml in Hadoop, among other properties, but that's the only reason I have Hadoop installed, which seems a bit of overkill. Is there an equivalent of core-site.xml for Spark so I don't have to…
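For reference, the kind of core-site.xml entries being described (property names assume the s3a connector; the values are placeholders):

    <configuration>
      <property>
        <name>fs.s3a.access.key</name>
        <value>YOUR_ACCESS_KEY</value>
      </property>
      <property>
        <name>fs.s3a.secret.key</name>
        <value>YOUR_SECRET_KEY</value>
      </property>
    </configuration>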