On Thu, Aug 30, 2018 at 9:57 AM Veeranagouda Mukkanagoudar <
mukkanagou...@gmail.com> wrote:
> I see the following error related to this task in scheduler log,
>
> Setting a task to FAILED without callbacks or retries. Do you have enough
> resources?
I see the following error related to this task in scheduler log,
Setting a task to FAILED without callbacks or retries. Do you have enough
resources? Which resources is it complaining about?
On Thu, Aug 30, 2018 at 9:51 AM Veeranagouda Mukkanagoudar <
mukkanagou...@gmail.com> wrote:
Task Instance Details
Dependencies Blocking Task From Getting Scheduled
Dependency | Reason
Unknown All dependencies are met but the task instance is not running. In
most cases this just means that the task will probably be scheduled soon
unless:
- The scheduler is down or under heavy load
If this ta
Hi,
We have been using .conf/.ini files to store the credentials for accessing
endpoints. I am wondering if there is a better, more secure way to store
credentials, besides storing them as Airflow connections in the Admin console.
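One alternative to plaintext .conf/.ini files is keeping credentials in `AIRFLOW_CONN_<CONN_ID>` environment variables, which Airflow reads as connection URIs. A minimal stdlib sketch of parsing such a URI (the variable name, host, and credentials below are made-up examples, not from the original thread):

```python
import os
from urllib.parse import urlparse, unquote

# Airflow can read connections from AIRFLOW_CONN_<CONN_ID> environment
# variables instead of plaintext config files. This URI is a hypothetical
# example set inline so the sketch is self-contained.
os.environ["AIRFLOW_CONN_MY_ENDPOINT"] = "https://svc_user:s3cret@api.example.com:443"

def read_conn(conn_id: str) -> dict:
    """Parse an AIRFLOW_CONN_* style URI into its credential parts."""
    uri = urlparse(os.environ[f"AIRFLOW_CONN_{conn_id.upper()}"])
    return {
        "host": uri.hostname,
        "port": uri.port,
        "login": unquote(uri.username or ""),
        "password": unquote(uri.password or ""),
    }

creds = read_conn("my_endpoint")
print(creds["host"])  # api.example.com
```

Inside a DAG you would normally let Airflow do this parsing for you via `BaseHook.get_connection(conn_id)` rather than reading the environment directly.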
-Veera
Thanks Shah, that works.
On Wed, Jan 10, 2018 at 9:33 AM, Shah Altaf wrote:
> Hi you could have a look at the trigger_rule on your operator -
>
> See: https://airflow.apache.org/concepts.html#trigger-rules
>
> Hope that's what you meant
>
>
> On Wed, Jan 10,
I have a scenario where I need to move on to the next task even if the
upstream task failed, as long as the upstream task's execution is done.
set_upstream and set_downstream trigger the downstream task only on
success; is there a way to implement this scenario?
-Veera
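As the reply above notes, this is what the operator's `trigger_rule` argument is for: setting `trigger_rule="all_done"` runs a task once every upstream task has finished, regardless of success or failure. A toy pure-Python illustration of the difference between the default `all_success` and `all_done` (this is a simulation of the semantics, not Airflow's scheduler code):

```python
# Toy illustration of Airflow trigger-rule semantics (not Airflow's own
# scheduler code). In a real DAG you would pass trigger_rule="all_done"
# to the downstream operator.

def should_run(trigger_rule: str, upstream_states: list) -> bool:
    """Decide whether a task may run, given its upstream task states."""
    finished = all(s in ("success", "failed") for s in upstream_states)
    if trigger_rule == "all_success":     # Airflow's default rule
        return all(s == "success" for s in upstream_states)
    if trigger_rule == "all_done":        # run once all upstreams finish
        return finished
    raise ValueError(f"unknown trigger rule: {trigger_rule}")

states = ["success", "failed"]            # one upstream task failed
print(should_run("all_success", states))  # False: default blocks the task
print(should_run("all_done", states))     # True: downstream still runs
```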
nsport over port 8793 is not required. You should
> not experience any issues when ignoring this error.
>
> Please let me know if this answers your question.
>
> Cheers, Fokko
>
> 2017-12-07 4:24 GMT+01:00 Veeranagouda Mukkanagoudar <
> mukkanagou...@gmail.com>:
>
Hi,
I am trying to configure Celery (on the same machine as Airflow), but I am
seeing an "Address already in use" error in the worker logs. If I kill the
process listening on port 8793, the worker process gets stuck. Has anyone
experienced this issue during setup?
Starting flask
[2017-12-
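"Address already in use" on 8793 usually means another worker's log-serving process is already bound to that port (8793 is the default Airflow worker log-server port). A small stdlib sketch for checking whether the port is free before starting a worker; everything here beyond the port number is an assumption for illustration:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        return s.connect_ex((host, port)) == 0

# 8793 is the default Airflow worker log-server port.
if port_in_use(8793):
    print("port 8793 is taken; another worker/log server is likely running")
else:
    print("port 8793 is free")
```

If the port is taken, tools like `lsof -i :8793` can identify the owning process so you can shut it down cleanly instead of killing it mid-run.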
I am new to Airflow, and need help with configuring the ETLs that run
Redshift queries.
The approaches I know of:
*1. Use Postgres operator/hook:*
Parse the query from a file and run it via the hook. Use XCom to pass/set
variables across tasks.
*2. Bash Operator:*
Use this to inv
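Approach 1 above (read the query from a file, fill in variables, execute through a DB-API connection) can be sketched as follows. To keep the example runnable anywhere, `sqlite3` stands in for Redshift and the query text is inlined instead of read from a .sql file; in Airflow you would get the connection from `PostgresHook` instead. Table and column names are made up:

```python
import sqlite3
from string import Template

# Sketch of approach 1: template a query read from a file, then run it
# over a DB-API connection. sqlite3 stands in for Redshift here so the
# example is self-contained; with Airflow, PostgresHook would supply conn.
sql_text = Template("SELECT $col FROM events WHERE day = :day")  # e.g. loaded from a .sql file
query = sql_text.substitute(col="count(*)")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (day TEXT)")
conn.executemany("INSERT INTO events VALUES (?)", [("2018-01-10",), ("2018-01-10",)])
rows = conn.execute(query, {"day": "2018-01-10"}).fetchall()
print(rows)  # results could then be pushed downstream via XCom
```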
t; <https://issues.apache.org/jira/projects/AIRFLOW/issues/AIRFLOW-1695>, but
> it's awaiting PR#2532
> <https://github.com/apache/incubator-airflow/pull/2532> (moving s3 hook to
> boto3).
>
> Hope that is helpful!
>
> Best,
>
> Andy
>
> ---
> So
I am new to Airflow.
Can anyone point me to Redshift/Postgres operator or task implementation
examples?
-Thanks
Veera