[jira] [Assigned] (AIRFLOW-115) Migrate and Refactor AWS integration to use boto3 and better structured hooks
[ https://issues.apache.org/jira/browse/AIRFLOW-115?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Arthur Wiedmer reassigned AIRFLOW-115:
--------------------------------------

    Assignee: Arthur Wiedmer

> Migrate and Refactor AWS integration to use boto3 and better structured hooks
> -----------------------------------------------------------------------------
>
>                 Key: AIRFLOW-115
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-115
>             Project: Apache Airflow
>          Issue Type: Improvement
>          Components: AWS, boto3, hooks
>            Reporter: Arthur Wiedmer
>            Assignee: Arthur Wiedmer
>            Priority: Minor
>
> h2. Current State
> The current AWS integration is mostly done through the S3Hook, which uses
> non-standard credentials parsing and is built on boto rather than boto3,
> the currently supported AWS SDK for Python.
> h2. Proposal
> An AWSHook should be provided that maps Airflow connections to the boto3
> API. Hooks for S3, as well as for other AWS services, would then inherit
> from it and extend the functionality with service-specific methods such as
> get_key for S3, start_cluster for EMR, enqueue for SQS, send_email for SES,
> etc.
> * AWSHook
> ** S3Hook
> ** EMRHook
> ** SQSHook
> ** SESHook

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
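A minimal sketch of the proposed hierarchy might look as follows. The class names come from the ticket; the constructor signature and credential handling are assumptions, not the final Airflow API, and boto3 is imported lazily so the hierarchy can be defined without it installed.

```python
# Sketch of the proposed AWSHook hierarchy. Class names follow the ticket;
# the constructor signature and credential handling are assumptions.
class AWSHook:
    """Maps Airflow-connection-style credentials to boto3 clients."""

    def __init__(self, aws_access_key_id=None, aws_secret_access_key=None,
                 region_name=None):
        self.aws_access_key_id = aws_access_key_id
        self.aws_secret_access_key = aws_secret_access_key
        self.region_name = region_name

    def get_client(self, service_name):
        # Lazy import so the hook classes can be defined without boto3.
        import boto3
        return boto3.client(
            service_name,
            aws_access_key_id=self.aws_access_key_id,
            aws_secret_access_key=self.aws_secret_access_key,
            region_name=self.region_name,
        )


class S3Hook(AWSHook):
    def get_key(self, key, bucket_name):
        # Service-specific convenience wrapper over the generic client.
        return self.get_client("s3").get_object(Bucket=bucket_name, Key=key)


class EMRHook(AWSHook):
    def start_cluster(self, **cluster_kwargs):
        return self.get_client("emr").run_job_flow(**cluster_kwargs)
```

Each service hook stays a thin subclass: the base hook owns connection-to-session mapping, and subclasses only add service verbs, as the ticket proposes.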
[jira] [Assigned] (AIRFLOW-116) Surface Airflow Version On Webservers
[ https://issues.apache.org/jira/browse/AIRFLOW-116?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Siddharth Anand reassigned AIRFLOW-116:
---------------------------------------

    Assignee: Siddharth Anand

> Surface Airflow Version On Webservers
> -------------------------------------
>
>                 Key: AIRFLOW-116
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-116
>             Project: Apache Airflow
>          Issue Type: Task
>          Components: webserver
>            Reporter: Dan Davydov
>            Assignee: Siddharth Anand
>            Priority: Minor
>
> Surface Airflow Version On Webservers
> Why?
> Figuring out which version the webservers are running requires sshing into
> them, which isn't very sane (and not everyone has permission to do this).
> Success:
> Surface the current Airflow version in the webserver (bonus points if the
> git SHA of the Airflow code is shown too, although this could be hacky),
> either on every page as a static element or on a dedicated settings page.
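One way the ticket's success criterion might be met is exposing a version string (and, best-effort, the git SHA) for the webserver templates to render. This is an illustrative sketch only; the `AIRFLOW_VERSION` constant and the function names are made up, not the eventual implementation.

```python
import subprocess

# Hypothetical module-level version string; in a real implementation this
# would come from the airflow package itself, not be hard-coded.
AIRFLOW_VERSION = "1.7.1"


def get_git_sha():
    # Best-effort git SHA for the "bonus points" part of the ticket;
    # falls back to a placeholder when not running from a git checkout.
    try:
        return subprocess.check_output(
            ["git", "rev-parse", "--short", "HEAD"],
            stderr=subprocess.DEVNULL,
        ).decode().strip()
    except Exception:
        return "unknown"


def footer_text():
    # A static element like this could be rendered on every page,
    # or shown on a dedicated settings page instead.
    return "Airflow v{} ({})".format(AIRFLOW_VERSION, get_git_sha())
```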
[jira] [Closed] (AIRFLOW-113) DAG concurrency is not honored
[ https://issues.apache.org/jira/browse/AIRFLOW-113?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Shenghu Yang closed AIRFLOW-113.
--------------------------------
    Resolution: REMIND

> DAG concurrency is not honored
> ------------------------------
>
>                 Key: AIRFLOW-113
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-113
>             Project: Apache Airflow
>          Issue Type: Improvement
>          Components: celery, executor, scheduler
>    Affects Versions: Airflow 1.6.2
>         Environment: Version of Airflow: 1.6.2
> Airflow configuration: Running a Scheduler with LocalExecutor or CeleryExecutor
> Operating System: 3.13.0-74-generic #118-Ubuntu SMP Thu Dec 17 22:52:10 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
> Python Version: 2.7.6
>            Reporter: Shenghu Yang
>
> This is also reported on the Airflow GitHub:
> https://github.com/apache/incubator-airflow/issues/1424
> In our DAG, we set dag_args['concurrency'] = 8; however, when the scheduler
> starts to run, we can see this concurrency is not honored: the scheduler
> will run up to 'parallelism' (which we set to 25) jobs.
> What did you expect to happen?
> dag_args['concurrency'] = 8 is honored, i.e. at most 8 jobs run concurrently.
> What happened instead?
> When the DAG starts to run, the concurrency is not honored; the scheduler
> runs up to 'parallelism' (25) jobs.
> Here is how you can reproduce this issue on your machine:
> create a DAG which contains nothing but 25 parallelized jobs,
> set dag_args['concurrency'] = 8,
> set the Airflow parallelism to 25,
> then run: airflow scheduler.
> You will see all 25 jobs scheduled to run, not 8.
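The expectation in the report can be modeled with a toy admission loop (pure Python, not Airflow's scheduler logic): with 25 ready tasks, the DAG-level concurrency of 8 should cap the running set even though the global parallelism would allow 25.

```python
# Toy model of the reported expectation; not Airflow's scheduler code.
DAG_CONCURRENCY = 8   # dag_args['concurrency'] from the report
PARALLELISM = 25      # global executor slots from the report

ready_tasks = ["task_{}".format(i) for i in range(25)]
running = []

for task in ready_tasks:
    # Both limits should apply; the reported bug is that only
    # PARALLELISM is enforced.
    if len(running) >= PARALLELISM or len(running) >= DAG_CONCURRENCY:
        break
    running.append(task)

print(len(running))  # → 8, the behavior the reporter expected
```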
[jira] [Commented] (AIRFLOW-113) DAG concurrency is not honored
[ https://issues.apache.org/jira/browse/AIRFLOW-113?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15283287#comment-15283287 ]

Shenghu Yang commented on AIRFLOW-113:
--------------------------------------
I will wait and test this on Airflow 1.7.1; closing this for now.
[jira] [Commented] (AIRFLOW-113) DAG concurrency is not honored
[ https://issues.apache.org/jira/browse/AIRFLOW-113?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15283276#comment-15283276 ]

Chris Riccomini commented on AIRFLOW-113:
-----------------------------------------
See also: AIRFLOW-57
[jira] [Commented] (AIRFLOW-113) DAG concurrency is not honored
[ https://issues.apache.org/jira/browse/AIRFLOW-113?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15283273#comment-15283273 ]

Chris Riccomini commented on AIRFLOW-113:
-----------------------------------------
Also, I wonder if perhaps you want max_active_runs instead of concurrency?
From the [FAQ|https://pythonhosted.org/airflow/faq.html]:
{quote}
concurrency defines how many running task instances a DAG is allowed to have,
beyond which point things get queued
{quote}
Whereas:
{quote}
Is the max_active_runs parameter of your DAG reached? max_active_runs defines
how many running concurrent instances of a DAG there are allowed to be.
{quote}
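To make the FAQ distinction concrete, here is a hypothetical `dag_args` mirroring the reporter's setup with the two commonly-confused limits side by side. This is a plain-dict sketch for illustration, not a runnable DAG definition.

```python
# Hypothetical settings mirroring the discussion; plain-dict sketch,
# not a full DAG definition.
dag_args = {
    # Caps simultaneously *running task instances* within this DAG;
    # extra ready tasks should queue once 8 are running.
    "concurrency": 8,
    # Caps simultaneously *running DAG runs* of this DAG -- the knob to
    # use if the goal is to limit whole-DAG parallel runs instead.
    "max_active_runs": 1,
}

# The reporter's global setting, for contrast: an executor-wide cap on
# task instances across *all* DAGs.
parallelism = 25
```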
[jira] [Commented] (AIRFLOW-113) DAG concurrency is not honored
[ https://issues.apache.org/jira/browse/AIRFLOW-113?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15283268#comment-15283268 ]

Chris Riccomini commented on AIRFLOW-113:
-----------------------------------------
[~Shenghu], can you:
# Test this against master?
# Post your DAG (or some equivalent) Python code so we can try to replicate it?
[jira] [Commented] (AIRFLOW-112) Change default view from graph view to tree view
[ https://issues.apache.org/jira/browse/AIRFLOW-112?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15283218#comment-15283218 ]

ASF subversion and git services commented on AIRFLOW-112:
---------------------------------------------------------
Commit 17bcf10fe55dd50fc42ae858d67bd26ba08988fe in incubator-airflow's branch
refs/heads/master from [~aoen]
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=17bcf10 ]

[AIRFLOW-112] no-op README change to close this jira's PR

> Change default view from graph view to tree view
> ------------------------------------------------
>
>                 Key: AIRFLOW-112
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-112
>             Project: Apache Airflow
>          Issue Type: Task
>            Reporter: Dan Davydov
>            Assignee: Dan Davydov
>            Priority: Minor
>
> Most users use the tree view instead of the graph view to look at task
> instances (as the history shown is useful), so the default view should be
> the tree view.
[3/3] incubator-airflow git commit: Merge branch '1498'
Merge branch '1498'

Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/07fe7d7b
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/07fe7d7b
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/07fe7d7b

Branch: refs/heads/master
Commit: 07fe7d7b4a66ed55633c7e0258cbfe55c6b6d0a2
Parents: 1feac38 17bcf10
Author: Dan Davydov
Authored: Fri May 13 15:04:23 2016 -0700
Committer: Dan Davydov
Committed: Fri May 13 15:04:23 2016 -0700

----------------------------------------------------------------------
 TODO.md | 1 -
 1 file changed, 1 deletion(-)
----------------------------------------------------------------------
[2/3] incubator-airflow git commit: [AIRFLOW-112] no-op README change to close this jira's PR
[AIRFLOW-112] no-op README change to close this jira's PR

Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/17bcf10f
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/17bcf10f
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/17bcf10f

Branch: refs/heads/master
Commit: 17bcf10fe55dd50fc42ae858d67bd26ba08988fe
Parents: 30608b8
Author: Dan Davydov
Authored: Fri May 13 14:57:01 2016 -0700
Committer: Dan Davydov
Committed: Fri May 13 14:57:01 2016 -0700

----------------------------------------------------------------------
 TODO.md | 1 -
 1 file changed, 1 deletion(-)
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/17bcf10f/TODO.md
----------------------------------------------------------------------
diff --git a/TODO.md b/TODO.md
index 091df2b..cf19035 100644
--- a/TODO.md
+++ b/TODO.md
@@ -11,7 +11,6 @@
 * Test and migrate to use beeline instead of the Hive CLI
 * Run Hive / Hadoop / HDFS tests in Travis-CI
-
 UI
 * Backfill form
 * Better task filtering int duration and landing time charts (operator toggle, task regex, uncheck all button)
[jira] [Created] (AIRFLOW-114) Alphabetizing plugins dropdown
Varant Zanoyan created AIRFLOW-114:
--------------------------------------

             Summary: Alphabetizing plugins dropdown
                 Key: AIRFLOW-114
                 URL: https://issues.apache.org/jira/browse/AIRFLOW-114
             Project: Apache Airflow
          Issue Type: Improvement
            Reporter: Varant Zanoyan
            Priority: Trivial

Alphabetizing plugins dropdown
[jira] [Updated] (AIRFLOW-113) DAG concurrency is not honored
[ https://issues.apache.org/jira/browse/AIRFLOW-113?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Shenghu Yang updated AIRFLOW-113:
---------------------------------
    Environment:
Version of Airflow: 1.6.2
Airflow configuration: Running a Scheduler with LocalExecutor or CeleryExecutor
Operating System: 3.13.0-74-generic #118-Ubuntu SMP Thu Dec 17 22:52:10 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
Python Version: 2.7.6

        was:
Version of Airflow: 1.6.2
Airflow configuration: Running a Scheduler with LocalExecutor
Operating System: 3.13.0-74-generic #118-Ubuntu SMP Thu Dec 17 22:52:10 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
Python Version: 2.7.6
[jira] [Created] (AIRFLOW-113) DAG concurrency is not honored
Shenghu Yang created AIRFLOW-113:
------------------------------------

             Summary: DAG concurrency is not honored
                 Key: AIRFLOW-113
                 URL: https://issues.apache.org/jira/browse/AIRFLOW-113
             Project: Apache Airflow
          Issue Type: Improvement
          Components: celery, executor, scheduler
    Affects Versions: Airflow 1.6.2
         Environment: Version of Airflow: 1.6.2
Airflow configuration: Running a Scheduler with LocalExecutor
Operating System: 3.13.0-74-generic #118-Ubuntu SMP Thu Dec 17 22:52:10 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
Python Version: 2.7.6
            Reporter: Shenghu Yang

This is also reported on the Airflow GitHub:
https://github.com/apache/incubator-airflow/issues/1424
In our DAG, we set dag_args['concurrency'] = 8; however, when the scheduler
starts to run, we can see this concurrency is not honored: the scheduler will
run up to 'parallelism' (which we set to 25) jobs.

What did you expect to happen?
dag_args['concurrency'] = 8 is honored, i.e. at most 8 jobs run concurrently.

What happened instead?
When the DAG starts to run, the concurrency is not honored; the scheduler
runs up to 'parallelism' (25) jobs.

Here is how you can reproduce this issue on your machine:
create a DAG which contains nothing but 25 parallelized jobs,
set dag_args['concurrency'] = 8,
set the Airflow parallelism to 25,
then run: airflow scheduler.
You will see all 25 jobs scheduled to run, not 8.
[jira] [Closed] (AIRFLOW-111) DAG concurrency is not honored
[ https://issues.apache.org/jira/browse/AIRFLOW-111?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Shenghu Yang closed AIRFLOW-111.
--------------------------------
    Resolution: Duplicate

> DAG concurrency is not honored
> ------------------------------
>
>                 Key: AIRFLOW-111
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-111
>             Project: Apache Airflow
>          Issue Type: Sub-task
>          Components: celery, scheduler
>    Affects Versions: Airflow 1.6.2
>         Environment: Version of Airflow: 1.6.2
> Airflow configuration: Running a Scheduler with LocalExecutor
> Operating System: 3.13.0-74-generic #118-Ubuntu SMP Thu Dec 17 22:52:10 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
> Python Version: 2.7.6
> Screen shots of your DAG's status:
>            Reporter: Shenghu Yang
>             Fix For: Airflow 2.0
>
> Description of Issue
> In our DAG, we set dag_args['concurrency'] = 8; however, when the scheduler
> starts to run, we can see this concurrency is not honored: the scheduler
> will run up to 'parallelism' (which we set to 25) jobs.
> What did you expect to happen?
> dag_args['concurrency'] = 8 is honored, i.e. at most 8 jobs run concurrently.
> What happened instead?
> When the DAG starts to run, the concurrency is not honored; the
> scheduler/celery worker runs up to 'parallelism' (25) jobs.
> Here is how you can reproduce this issue on your machine:
> create a DAG which contains nothing but 25 parallelized jobs,
> set dag_args['concurrency'] = 8,
> set the Airflow parallelism to 25,
> then run: airflow scheduler.
> You will see all 25 jobs scheduled to run, not 8.
[incubator-airflow] Git Push Summary
Repository: incubator-airflow

Updated Tags:  refs/tags/airbnb_1.7.1rc6 [created] 563be1324
incubator-airflow git commit: Cherrypick bugfix ab5d445992617585a0ced1d81881a0728f49b13a
Repository: incubator-airflow

Updated Branches:
  refs/heads/airbnb_rb1.7.1 173b19313 -> ce86e0319

Cherrypick bugfix ab5d445992617585a0ced1d81881a0728f49b13a

Project: http://git-wip-us.apache.org/repos/asf/incubator-airflow/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-airflow/commit/ce86e031
Tree: http://git-wip-us.apache.org/repos/asf/incubator-airflow/tree/ce86e031
Diff: http://git-wip-us.apache.org/repos/asf/incubator-airflow/diff/ce86e031

Branch: refs/heads/airbnb_rb1.7.1
Commit: ce86e0319031a95bf14cb6b0b3edd4a4962adbdc
Parents: 173b193
Author: Siddharth Anand
Authored: Thu May 12 03:37:51 2016 +0000
Committer: Dan Davydov
Committed: Fri May 13 10:32:53 2016 -0700

----------------------------------------------------------------------
 airflow/jobs.py   |  5 +++++
 airflow/models.py | 11 ++++++++++-
 2 files changed, 15 insertions(+), 1 deletion(-)
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/ce86e031/airflow/jobs.py
----------------------------------------------------------------------
diff --git a/airflow/jobs.py b/airflow/jobs.py
index 34318f3..06436ef 100644
--- a/airflow/jobs.py
+++ b/airflow/jobs.py
@@ -513,6 +513,11 @@ class SchedulerJob(BaseJob):
             elif ti.is_runnable(flag_upstream_failed=True):
                 self.logger.debug('Firing task: {}'.format(ti))
                 executor.queue_task_instance(ti, pickle_id=pickle_id)
+            elif ti.is_premature():
+                continue
+            else:
+                self.logger.debug('Adding task: {} to the COULD_NOT_RUN set'.format(ti))
+                could_not_run.add(ti)

         # Releasing the lock
         self.logger.debug("Unlocking DAG (scheduler_lock)")

http://git-wip-us.apache.org/repos/asf/incubator-airflow/blob/ce86e031/airflow/models.py
----------------------------------------------------------------------
diff --git a/airflow/models.py b/airflow/models.py
index eeb1269..7754875 100644
--- a/airflow/models.py
+++ b/airflow/models.py
@@ -788,7 +788,7 @@ class TaskInstance(Base):
         if self.execution_date > datetime.now():
             return False
         # is the task still in the retry waiting period?
-        elif self.state == State.UP_FOR_RETRY and not self.ready_for_retry():
+        elif self.is_premature():
             return False
         # does the task have an end_date prior to the execution date?
         elif self.task.end_date and self.execution_date > self.task.end_date:
@@ -810,6 +810,15 @@ class TaskInstance(Base):
         else:
             return False

+    def is_premature(self):
+        """
+        Returns whether a task is in UP_FOR_RETRY state and its retry interval
+        has elapsed.
+        """
+        # is the task still in the retry waiting period?
+        return self.state == State.UP_FOR_RETRY and not self.ready_for_retry()
+
     def is_runnable(
             self,
             include_queued=False,
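The models.py refactor above can be exercised in isolation. In the sketch below, `State` and `ready_for_retry` are stubs standing in for Airflow's real implementations; only the `is_premature` logic mirrors the diff.

```python
# Standalone sketch of the is_premature() refactor from the commit above;
# State and ready_for_retry are stubs, not Airflow's real implementations.
class State:
    UP_FOR_RETRY = "up_for_retry"
    RUNNING = "running"


class TaskInstanceSketch:
    def __init__(self, state, retry_interval_elapsed):
        self.state = state
        self._retry_interval_elapsed = retry_interval_elapsed

    def ready_for_retry(self):
        # Stub: the real method checks whether enough time has passed
        # since the last failure.
        return self._retry_interval_elapsed

    def is_premature(self):
        # Premature == still inside the retry waiting period, so the
        # scheduler should skip the task instead of firing it.
        return self.state == State.UP_FOR_RETRY and not self.ready_for_retry()


waiting = TaskInstanceSketch(State.UP_FOR_RETRY, retry_interval_elapsed=False)
ready = TaskInstanceSketch(State.UP_FOR_RETRY, retry_interval_elapsed=True)
print(waiting.is_premature(), ready.is_premature())  # → True False
```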
[jira] [Commented] (AIRFLOW-91) Ssl gunicorn
[ https://issues.apache.org/jira/browse/AIRFLOW-91?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15282800#comment-15282800 ]

Chris Riccomini commented on AIRFLOW-91:
----------------------------------------
Your latest PR doesn't even include this line:
{code}
configuration.get('webserver', 'ssl_keyfile')
{code}
which is in your stack trace (cli.py, line 689). Line 689 in your PR contains:
{code}
("-r", "--report"), "Show DagBag loading report", "store_true"),
{code}
I think your PR is missing some commits or something.

> Ssl gunicorn
> ------------
>
>                 Key: AIRFLOW-91
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-91
>             Project: Apache Airflow
>          Issue Type: Improvement
>          Components: security
>            Reporter: Stanilovsky Evgeny
>            Assignee: Stanilovsky Evgeny
>
> old issue: https://github.com/apache/incubator-airflow/pull/1492
> Ssl gunicorn support
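For context, a sketch of the config lookup implied by the stack trace: the `[webserver]` `ssl_certfile`/`ssl_keyfile` option names come from the quoted traceback line and should be treated as assumptions, not a confirmed Airflow config schema.

```python
# Hypothetical sketch of wiring airflow.cfg SSL options through to
# gunicorn; the option names are assumptions taken from the traceback.
from configparser import ConfigParser
from io import StringIO

cfg_text = """
[webserver]
ssl_certfile = /path/to/cert.pem
ssl_keyfile = /path/to/key.pem
"""

conf = ConfigParser()
conf.read_file(StringIO(cfg_text))

# gunicorn itself accepts --certfile/--keyfile, so a wrapper might build:
ssl_args = [
    "--certfile", conf.get("webserver", "ssl_certfile"),
    "--keyfile", conf.get("webserver", "ssl_keyfile"),
]
print(ssl_args)  # → ['--certfile', '/path/to/cert.pem', '--keyfile', '/path/to/key.pem']
```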