Re: Spark Cluster over yarn cluster monitoring

2019-10-29 Thread Chetan Khatri
Thanks, Jörn.

On Sun, Oct 27, 2019 at 8:01 AM Jörn Franke wrote:

> Use yarn queues:
>
> https://hadoop.apache.org/docs/current/hadoop-yarn/hadoop-yarn-site/FairScheduler.html


Re: Spark Cluster over yarn cluster monitoring

2019-10-27 Thread Jörn Franke
Use yarn queues:

https://hadoop.apache.org/docs/current/hadoop-yarn/hadoop-yarn-site/FairScheduler.html
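A minimal fair-scheduler.xml along these lines would give each customer its
own queue with a resource cap (the queue names and sizes below are only
illustrative):

<?xml version="1.0"?>
<allocations>
  <!-- Illustrative per-customer queues; size them for your cluster. -->
  <queue name="customerY">
    <maxResources>40960 mb,16 vcores</maxResources>
  </queue>
  <queue name="customerZ">
    <maxResources>40960 mb,16 vcores</maxResources>
  </queue>
</allocations>

Jobs can then be pinned to a queue at submit time, e.g. spark-submit
--queue customerZ, and the scheduler arbitrates resources between customers
instead of the workflow having to poll for free memory.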

> On 27.10.2019, at 06:41, Chetan Khatri wrote:
>
> Could someone please help me understand this better?


Re: Spark Cluster over yarn cluster monitoring

2019-10-26 Thread Chetan Khatri
Could someone please help me understand this better?



Spark Cluster over yarn cluster monitoring

2019-10-17 Thread Chetan Khatri
Hi Users,

I submit *X* jobs to YARN via Airflow as part of the workflow for customer
*Y*. I could potentially run the workflow for customer *Z* as well, but
first I need to check how much of the cluster's resources are free before
the next customer's jobs start.

Could you please tell me the best way to handle this? Currently, I just
check that availableMB > 100 and then trigger the next Airflow DAG on YARN.

GET http://rm-http-address:port/ws/v1/cluster/metrics

Thanks.
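
For reference, the check is roughly the following (a minimal sketch in
Python; the ResourceManager host/port and the 100 MB threshold are
placeholders for my setup):

import requests

# Placeholder ResourceManager address; the RM REST API exposes
# cluster-wide metrics at /ws/v1/cluster/metrics.
RM_METRICS_URL = "http://rm-http-address:8088/ws/v1/cluster/metrics"
MIN_AVAILABLE_MB = 100  # headroom required before the next customer's DAG

def cluster_has_headroom():
    # Response shape: {"clusterMetrics": {"availableMB": ..., ...}}
    metrics = requests.get(RM_METRICS_URL, timeout=10).json()["clusterMetrics"]
    return metrics["availableMB"] > MIN_AVAILABLE_MB

if cluster_has_headroom():
    print("Enough headroom; trigger the next customer's DAG.")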