SPARK-6470 was integrated into the 1.5.0 release.

Please use the 1.5.0 release or newer.

SPARK-7173 <https://issues.apache.org/jira/browse/SPARK-7173> adds support
for setting "spark.yarn.am.nodeLabelExpression"
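For reference, here is a minimal sketch of setting both properties on a SparkConf.
The application name, the yarn-client master, and the label "gpu" are only
placeholders; use a node label that is actually defined on your YARN cluster:

    import org.apache.spark.{SparkConf, SparkContext}

    // "gpu" is a hypothetical label; substitute one configured on your cluster.
    val conf = new SparkConf()
      .setAppName("node-label-example")
      .setMaster("yarn-client")
      // Executors are requested only on nodes matching this label (SPARK-6470, 1.5.0+).
      .set("spark.yarn.executor.nodeLabelExpression", "gpu")
      // The application master is constrained the same way (SPARK-7173).
      .set("spark.yarn.am.nodeLabelExpression", "gpu")

    val sc = new SparkContext(conf)

The same keys can also be passed to spark-submit with --conf instead of being
set in code.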

Cheers

On Tue, Dec 15, 2015 at 1:55 AM, 张志强(旺轩) <zzq98...@alibaba-inc.com> wrote:

> Hi Ted,
>
>
>
> Thanks for your quick response, but I think the link you gave me is a more
> advanced feature.
>
> Yes, I noticed SPARK-6470(https://issues.apache.org/jira/browse/SPARK-6470)
>
>
> And I just tried this feature with Spark 1.5.0; what happened was that I
> was blocked from getting the YARN containers after setting the
> spark.yarn.executor.nodeLabelExpression property. My question: will
> https://issues.apache.org/jira/browse/SPARK-7173 fix this?
>
>
>
> Thanks
>
> Allen
>
>
>
>
>
> *From:* Ted Yu [mailto:yuzhih...@gmail.com]
> *Sent:* December 15, 2015 17:39
> *To:* 张志强(旺轩)
> *Cc:* dev@spark.apache.org
> *Subject:* Re: spark with label nodes in yarn
>
>
>
> Please take a look at:
>
> https://issues.apache.org/jira/browse/SPARK-7173
>
>
>
> Cheers
>
>
> On Dec 15, 2015, at 1:23 AM, 张志强(旺轩) <zzq98...@alibaba-inc.com> wrote:
>
> Hi all,
>
>
>
> Has anyone tried label-based scheduling with Spark on YARN? I've tried it
> and it didn't work (Spark 1.4.1 + Apache Hadoop 2.6.0).
>
>
>
> Any feedbacks are welcome.
>
>
>
> Thanks
>
> Allen
>
>
