Hello everyone,
My team and I need some input on configuring max_threads for the Airflow
scheduler.
*Deployment setup:*
We are running Airflow on Kubernetes with the CeleryExecutor.
We currently have thousands of DAGs and are running with a max_threads
value of 40.
Airflow version: 1.10.12
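For reference, this is where the setting lives in our deployment (shown with
our current value; in 1.10.x it is read from the [scheduler] section of
airflow.cfg, or from the corresponding environment variable):

    # airflow.cfg
    [scheduler]
    max_threads = 40

    # or, equivalently, as an environment variable on the scheduler pod
    AIRFLOW__SCHEDULER__MAX_THREADS=40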
@Kamil Thank you very much for working on this PR. This certainly helps our
case where we need to load the variables from a file. I will work on
testing this feature.
@Ash Thank you very much for your input. We are currently using
KubernetesExecutor along with airflow_local_settings.py (mounted as
Hello,
We are currently running Airflow v1.10.6 on Kubernetes, using a custom
Helm chart to sync DAGs and Airflow variables from GCS.
While Airflow supports simply copying DAGs into a folder, from which they
are loaded automatically, Airflow variables are not imported from a
file
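To make the gap concrete, here is a minimal sketch of the kind of import step
our chart has to run (the file path and JSON layout are our own convention,
not anything Airflow mandates; on 1.10 the CLI equivalent would be
airflow variables -i <file>):

    import json
    from airflow.models import Variable

    # Load key/value pairs from the JSON file synced from GCS and write
    # them into the metadata DB. Path and structure are illustrative.
    with open("/opt/airflow/variables/variables.json") as f:
        variables = json.load(f)

    for key, value in variables.items():
        # store nested structures as JSON strings
        Variable.set(key, value, serialize_json=isinstance(value, (dict, list)))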
> executor in prod? Could it solve a different use case that cannot be
> solved by the k8s executor?
>
> On Thu, Dec 12, 2019 at 11:13 PM Maulik Soneji
> wrote:
>
>> Hi Maxime,
>>
>> We have been using KubernetesExecutor for our ETL use cases.
>>
solution if you don't have a k8s
> cluster lying around.
>
> Max
>
> On Thu, Dec 12, 2019 at 10:27 PM Maulik Soneji
> wrote:
>
>> Hello,
>>
>> I realize that the scheduler is waiting for the tasks to be completed
>> before shutting down.
>>
>
, 2019 at 9:18 AM Maulik Soneji
wrote:
> Hello all,
>
> *TLDR*: We are using the LocalExecutor with KubernetesPodOperator for our
> Airflow DAGs.
> From the stack trace of the scheduler, we see that it is waiting on the
> queue to join.
>
> File:
> "/usr/local/lib/python3.7/si
Hello all,
*TLDR*: We are using the LocalExecutor with KubernetesPodOperator for our
Airflow DAGs.
From the stack trace of the scheduler, we see that it is waiting on the
queue to join.
File:
"/usr/local/lib/python3.7/site-packages/airflow/executors/local_executor.py",
line 212, in end
self.queue.join()
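For context on why this blocks: Queue.join() only returns once task_done()
has been called for every item that was put() on the queue, which is why the
scheduler appears to hang until all running tasks finish. A minimal
standalone illustration in plain Python (not Airflow code):

    import queue
    import threading
    import time

    q = queue.Queue()

    def worker():
        while True:
            q.get()
            time.sleep(1)      # simulate a long-running task
            q.task_done()      # join() keeps blocking until this is called

    threading.Thread(target=worker, daemon=True).start()

    for i in range(3):
        q.put(i)

    q.join()                   # returns only after all three items are done
    print("all queued work finished")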
> rsync -m 16 ... command. Then we will not have to write any code and
> it will be a solution in line with the Kubernetes philosophy, where one
> container contains only one tool.
>
> Best,
>
> On Fri, Oct 18, 2019 at 8:21 PM Maulik Soneji
> wrote:
> >
> > *[Proposal]
*[Proposal]*
Create a new *syncer* command to sync DAGs from any remote folder, which
will be used as the initContainer command with the KubernetesExecutor.
It is just like the initdb command, but it will copy DAGs from the remote
folder before the DAG is run.
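To make the proposal concrete, here is a rough sketch of what the syncer
entrypoint could look like (sync_dags and REMOTE_DAGS_URI are made-up names
for illustration, not an existing Airflow API, and gsutil rsync is just one
possible backend):

    import os
    import subprocess

    from airflow.configuration import conf

    def sync_dags(remote_uri):
        # Copy DAG files from a remote bucket into the configured dags_folder.
        dags_folder = conf.get("core", "dags_folder")
        os.makedirs(dags_folder, exist_ok=True)
        # -m parallelises the transfer, -r recurses into sub-folders
        subprocess.check_call(
            ["gsutil", "-m", "rsync", "-r", remote_uri, dags_folder])

    if __name__ == "__main__":
        sync_dags(os.environ["REMOTE_DAGS_URI"])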
*[Problem]*
Currently, there are only two ways
> > > > > > >> >
> > > > > > >> > On Fri, Aug 16, 2019 at 1:29 PM Sachin <parmar.sac...@gmail.com>
> > > > > > >> > wrote:
> > > > > > >> >
> > > > > > >> > > Interested. We use
Airflow as an orchestration engine to author and manage
> 1000(s) of multi-step workflows.
> Would be interested in an Airflow meetup in India.
>
> Thanks,
> Raman Gupta
>
--
Thanks and Regards,
Maulik Soneji
dag_id could not be found:
customer_summary_integration.customer_profile. Either the dag did not
exist or it failed to parse.
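One quick way to tell the two cases apart (just a sketch, run inside the
scheduler or worker image with the same configuration) is to build a DagBag
the way the scheduler does and inspect its import errors:

    from airflow.models import DagBag

    bag = DagBag()            # parses the configured dags_folder
    print(bag.import_errors)  # {file_path: parse error}; empty if all files parse
    print("customer_summary_integration.customer_profile" in bag.dags)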
I have tried to cover all details, let me know if anything is unclear.
--
Thanks and Regards,
Maulik Soneji