Yup, that worked.
Thanks for the clarification!
On Fri, Feb 10, 2017 at 9:42 PM, Marcelo Vanzin wrote:
> If you place core-site.xml in $SPARK_HOME/conf, I'm pretty sure Spark
> will pick it up. (Sounds like you're not running YARN, which would
> require HADOOP_CONF_DIR.)
>
> Also this is more of a user@ question.
If you place core-site.xml in $SPARK_HOME/conf, I'm pretty sure Spark
will pick it up. (Sounds like you're not running YARN, which would
require HADOOP_CONF_DIR.)
Also this is more of a user@ question.
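For anyone finding this thread later, the placement Marcelo describes looks roughly like this (the layout is illustrative):

```
$SPARK_HOME/
└── conf/
    ├── core-site.xml         <- Hadoop client settings (e.g. fs.s3a.* credentials)
    └── spark-defaults.conf   <- Spark's own default properties
```

Spark's launch scripts put $SPARK_HOME/conf on the classpath, which is why a core-site.xml dropped there gets picked up; HADOOP_CONF_DIR matters mainly when running on YARN.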
On Fri, Feb 10, 2017 at 1:35 PM, Sam Elamin wrote:
> Hi All,
>
>
> Really newbie question here folks: I have properties like my AWS access
> and secret keys in core-site.xml in Hadoop, among other properties, but
> that's the only reason I have Hadoop installed, which seems a bit of
> overkill.
>
> Is there an equivalent of core-site.xml for Spark so I don't have to
> install Hadoop?
Yeah, I thought of that, but the file made it seem like it's for
environment-specific rather than application-specific configurations.
I'm more interested in the best practices: would you recommend using the
default conf file for this and uploading it to where the application will
be running (remote cluster)?
You can put them in spark's own conf/spark-defaults.conf file
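A sketch of what that can look like. Any property prefixed with `spark.hadoop.` is forwarded into the Hadoop Configuration, so entries that would otherwise live in core-site.xml can go in spark-defaults.conf instead (the key names below assume the s3a connector, with placeholder credentials):

```
# $SPARK_HOME/conf/spark-defaults.conf
# Properties prefixed with "spark.hadoop." are copied into the Hadoop
# Configuration at startup.
spark.hadoop.fs.s3a.access.key   YOUR_ACCESS_KEY
spark.hadoop.fs.s3a.secret.key   YOUR_SECRET_KEY
```

The same properties can also be passed per job on the command line, e.g. `spark-submit --conf spark.hadoop.fs.s3a.access.key=...`.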
On Fri, Feb 10, 2017 at 10:35 PM, Sam Elamin wrote:
> Hi All,
>
>
> Really newbie question here folks: I have properties like my AWS access
> and secret keys in core-site.xml in Hadoop, among other properties, but
> that's the only reason I have Hadoop installed, which seems a bit of
> overkill.
>
> Is there an equivalent of core-site.xml for Spark so I don't have to
> install Hadoop?
Hi All,
Really newbie question here folks: I have properties like my AWS access and
secret keys in core-site.xml in Hadoop, among other properties, but that's
the only reason I have Hadoop installed, which seems a bit of overkill.
Is there an equivalent of core-site.xml for Spark so I don't have to install
Hadoop?