> register_YZ and register_ZY. I am
> using Celery Executor and running 12 workers executing register_XX tasks. I
> am using Airflow version 1.10.6. Any idea how to fix it?
>
--
Mehmet ERSOY
Hi Tomasz,
What do you mean when you say "The mentioned DAG is missing"? I did attach the
DAG, but Gmail may have rejected the .py file.
I have attached the file again as a .txt.
Thanks,
Mehmet.
Mehmet Ersoy wrote on Wed, 19 Feb 2020 at 15:20:
> I'm using 1.10.6
Hello friends,
Is there a reliable, consistent way to run the Airflow roles installed in a
virtual environment as Linux services? There are many methods on the blogs,
but most of them do not deal specifically with virtual environments.
Has anyone here done this? What do you use, and which
version of Airflow do you run?
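To make the question concrete, what I have in mind is a systemd unit that
points straight at the airflow binary inside the virtualenv, roughly like the
sketch below; the paths, user, and service dependencies are only placeholders,
and there would be a similar unit each for the webserver and the workers.

[Unit]
Description=Airflow scheduler (virtualenv install)
After=network.target postgresql.service redis.service

[Service]
User=airflow
Group=airflow
Environment="AIRFLOW_HOME=/home/airflow/airflow"
# ExecStart points at the airflow entrypoint inside the virtualenv,
# so no "activate" step is needed.
ExecStart=/home/airflow/venv/bin/airflow scheduler
Restart=on-failure
RestartSec=5s

[Install]
WantedBy=multi-user.target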
>
> The mentioned DAG is missing, but I'm curious about the "parallel" jobs you are
> running :) Does this problem occur with only one DAG?
>
> T.
>
> On Wed, Feb 19, 2020 at 12:58 PM Mehmet Ersoy
> wrote:
>
> > Hi Tomasz,
> Do you sync your DAGs / logs from the celery workers? I know one setup where
> I've seen an I/O error when writing to a log...
>
> T.
>
>
> On Wed, Feb 19, 2020 at 10:56 AM Mehmet Ersoy
> wrote:
> >
> > Hello Friends,
> >
> > I'm new to Air
Hello Friends,
I'm new to Airflow and I'm using the Airflow Celery executor with a Postgres
backend and Redis as the message queue service. For now, there are 4 workers, 1
scheduler and 1 web server.
I have been preparing parallel Sqoop jobs in my daily DAGs.
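For illustration, the kind of layout I mean looks roughly like the sketch
below; the dag_id, connection string, and Sqoop commands are just placeholders,
not the real jobs.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

with DAG(
    dag_id="daily_sqoop_imports",          # placeholder DAG id
    start_date=datetime(2020, 2, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    start = BashOperator(task_id="start", bash_command="echo starting imports")

    # Several independent Sqoop imports that run in parallel on the Celery workers.
    for table in ["register_XX", "register_YZ", "register_ZY"]:
        import_task = BashOperator(
            task_id="import_{}".format(table),
            # Placeholder Sqoop call; the real jobs use their own connect strings.
            bash_command=(
                "sqoop import --connect jdbc:oracle:thin:@db:1521/src "
                "--table {0} --target-dir /data/{0}".format(table)
            ),
        )
        start >> import_task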
When I scheduled a daily DAG, often some task instanc
Hello friends,
I am using the Airflow Celery Executor (with a Postgres backend), and when I
start a DAG the workers open a lot of connections; when they finish their
work, these connections remain in "idle" status.
Why don't they close the connections completely?
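If it helps, I believe the relevant knobs are the SQLAlchemy pool settings in
airflow.cfg (the [core] section in 1.10.x); the values below are only examples,
not my real settings.

[core]
# How many connections each Airflow process keeps open to the metadata DB.
sql_alchemy_pool_size = 5
# Extra connections allowed beyond the pool size under load.
sql_alchemy_max_overflow = 10
# Seconds after which an idle pooled connection is recycled.
sql_alchemy_pool_recycle = 1800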
Thanks,
Best regards,
Mehmet.
--
Mehmet ERSOY