Re: Scheduling jobs using FAIR pool

2024-04-02 Thread Varun Shah
Hi Hussein, Thanks for clarifying my doubts. It means that even if I configure 2 separate pools for 2 jobs or submit the 2 jobs in the same pool, the submission time only takes effect when both jobs are "running" in parallel (i.e. if job 1 gets all the resources, job 2 has to wait unless
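A minimal PySpark sketch of the scenario discussed above, assuming a FAIR scheduler and two threads that each tag their job with a pool; the app name, pool names and workloads are illustrative, not taken from the thread:

import threading
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("fair-pool-demo")                   # hypothetical app name
    .config("spark.scheduler.mode", "FAIR")      # enable the FAIR scheduler
    .getOrCreate()
)
sc = spark.sparkContext

def run_job(pool_name: str, label: str) -> None:
    # The pool is a thread-local property, so each thread can target a
    # different (or the same) pool before triggering an action.
    sc.setLocalProperty("spark.scheduler.pool", pool_name)
    result = spark.range(0, 10_000_000).selectExpr("sum(id)").collect()
    print(f"{label} finished in pool '{pool_name}': {result}")

# Two jobs in two pools; whether they actually run in parallel still
# depends on free task slots, as the thread points out.
t1 = threading.Thread(target=run_job, args=("pool_a", "job 1"))
t2 = threading.Thread(target=run_job, args=("pool_b", "job 2"))
t1.start(); t2.start()
t1.join(); t2.join()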

Re: Scheduling jobs using FAIR pool

2024-04-01 Thread Hussein Awala
IMO the questions are not limited to Databricks. > The Round-Robin distribution of executors only works when there are free executors (achievable by enabling dynamic allocation). If a job (part of the same pool) requires all executors, a second job will still need to wait. This feature in
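A hedged sketch of the dynamic-allocation setup this reply refers to; the property names are standard Spark configuration keys, but the values are illustrative only:

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("dynamic-allocation-demo")                      # hypothetical name
    .config("spark.scheduler.mode", "FAIR")
    .config("spark.dynamicAllocation.enabled", "true")
    .config("spark.dynamicAllocation.minExecutors", "1")
    .config("spark.dynamicAllocation.maxExecutors", "8")
    # Executors idle longer than this are released and become available
    # to other jobs sharing the cluster.
    .config("spark.dynamicAllocation.executorIdleTimeout", "60s")
    # Outside Databricks, dynamic allocation typically also needs shuffle
    # tracking (or an external shuffle service).
    .config("spark.dynamicAllocation.shuffleTracking.enabled", "true")
    .getOrCreate()
)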

Re: Scheduling jobs using FAIR pool

2024-04-01 Thread Varun Shah
Hi Mich, I did not post in the Databricks community, as most of the questions relate to Spark itself. But let me also post the question in the Databricks community. Thanks, Varun Shah On Mon, Apr 1, 2024, 16:28 Mich Talebzadeh wrote: > Hi, > > Have you put this question to the Databricks forum

Re: Scheduling jobs using FAIR pool

2024-04-01 Thread Mich Talebzadeh
Hi, Have you put this question to the Databricks forum (Data Engineering - Databricks)? Mich Talebzadeh, Technologist | Solutions Architect | Data Engineer | Generative AI, London, United Kingdom, view my Linkedin profile

Scheduling jobs using FAIR pool

2024-03-31 Thread Varun Shah
Hi Community, I am currently exploring the best use of "Scheduler Pools" for executing jobs in parallel, and require clarification and suggestions on a few points. The implementation consists of executing "Structured Streaming" jobs on Databricks using AutoLoader. Each stream is executed with
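A minimal sketch of the setup described in this question, assuming two Auto Loader ("cloudFiles") streams each pinned to its own FAIR pool; paths, schema locations, table names and pool names are placeholders, not from the original post:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # on Databricks, usually pre-created

def start_stream(source_path: str, checkpoint: str, pool: str):
    # The scheduler pool is a thread-local property; set it before the
    # query is started so the stream's jobs run in that pool.
    spark.sparkContext.setLocalProperty("spark.scheduler.pool", pool)
    return (
        spark.readStream
        .format("cloudFiles")                                # Auto Loader
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", f"{checkpoint}/schema")
        .load(source_path)
        .writeStream
        .option("checkpointLocation", checkpoint)
        .trigger(availableNow=True)
        .toTable(f"bronze_{pool}")                           # illustrative table name
    )

q1 = start_stream("/mnt/landing/events_a", "/mnt/chk/events_a", "pool_a")
q2 = start_stream("/mnt/landing/events_b", "/mnt/chk/events_b", "pool_b")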