Thanks Sean. I'll ask our Hadoop admin.

Actually, I didn't find hadoop.tmp.dir in the Hadoop settings... using the
user's home directory is what other users suggested.

Jianshi

On Wed, Mar 11, 2015 at 3:51 PM, Sean Owen <so...@cloudera.com> wrote:

> You shouldn't use /tmp, but that doesn't mean you should use user home
> directories either. Typically, as in YARN, you would have a number of
> directories (on different disks) mounted and configured as local
> storage for jobs.
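[Since spark.local.dir accepts a comma-separated list of directories, the multi-disk setup Sean describes can be expressed as one property value. A minimal sketch; the /data1../data3 mount points and the spark-local subdirectory name are assumptions for illustration, not paths from this thread:]

```shell
# Join several (hypothetical) disk mount points into the comma-separated
# form that spark.local.dir expects.
DISKS="/data1 /data2 /data3"
LOCAL_DIRS=$(printf '%s/spark-local,' $DISKS)  # word-splits DISKS on purpose
LOCAL_DIRS=${LOCAL_DIRS%,}                     # strip the trailing comma
echo "spark.local.dir  $LOCAL_DIRS"            # line for spark-defaults.conf
```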
>
> On Wed, Mar 11, 2015 at 7:42 AM, Jianshi Huang <jianshi.hu...@gmail.com>
> wrote:
> > Unfortunately, the /tmp mount is really small in our environment, so I
> > need to provide a per-user setting as the default value.
> >
> > I hacked bin/spark-class for a similar effect, and spark-defaults.conf
> > can still override it. :)
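[The kind of per-user default Jianshi describes could also be set through the SPARK_LOCAL_DIRS environment variable before launch, e.g. from a wrapper script; a hedged sketch, assuming the /x/home layout from the thread. Note that an explicit spark.local.dir in the application's configuration still takes precedence:]

```shell
# Derive a per-user scratch directory at launch time; the property itself
# cannot expand ${USER}, so the shell does the expansion instead.
USER=${USER:-$(id -un)}
export SPARK_LOCAL_DIRS="/x/home/${USER}/spark/tmp"
echo "$SPARK_LOCAL_DIRS"
```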
> >
> > Jianshi
> >
> > On Wed, Mar 11, 2015 at 3:28 PM, Patrick Wendell <pwend...@gmail.com>
> wrote:
> >>
> >> We don't support expressions or wildcards in that configuration. For
> >> each application, the local directories need to be constant. If you
> >> have users submitting different Spark applications, each of those can
> >> set spark.local.dirs.
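[Concretely, the per-application approach Patrick describes amounts to each user expanding the path themselves at submit time. A minimal sketch, assuming the /x/home layout from the thread; the class and jar names in the comment are placeholders, not from this discussion:]

```shell
# Build a constant, per-user value for spark.local.dir; the shell expands
# ${USER}, so Spark only ever sees a literal path.
USER=${USER:-$(id -un)}
CONF="spark.local.dir=/x/home/${USER}/spark/tmp"
echo "$CONF"
# spark-submit --conf "$CONF" --class com.example.MyApp app.jar
```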
> >>
> >> - Patrick
> >>
> >> On Wed, Mar 11, 2015 at 12:14 AM, Jianshi Huang <
> jianshi.hu...@gmail.com>
> >> wrote:
> >> > Hi,
> >> >
> >> > I need to set per-user spark.local.dir, how can I do that?
> >> >
> >> > I tried both
> >> >
> >> >   /x/home/${user.name}/spark/tmp
> >> > and
> >> >   /x/home/${USER}/spark/tmp
> >> >
> >> > And neither worked. Looks like it has to be a constant setting in
> >> > spark-defaults.conf. Right?
> >> >
> >> > Any ideas how to do that?
> >> >
> >> > Thanks,
> >> > --
> >> > Jianshi Huang
> >> >
> >> > LinkedIn: jianshi
> >> > Twitter: @jshuang
> >> > Github & Blog: http://huangjs.github.com/
> >
> >
> >
> >
> > --
> > Jianshi Huang
> >
> > LinkedIn: jianshi
> > Twitter: @jshuang
> > Github & Blog: http://huangjs.github.com/
>



-- 
Jianshi Huang

LinkedIn: jianshi
Twitter: @jshuang
Github & Blog: http://huangjs.github.com/
