[jira] [Updated] (AIRFLOW-6726) Document using airflow_local_settings.py
[ https://issues.apache.org/jira/browse/AIRFLOW-6726?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6726: Fix Version/s: (was: 2.0.0) > Document using airflow_local_settings.py > > > Key: AIRFLOW-6726 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6726 > Project: Apache Airflow > Issue Type: Improvement > Components: documentation >Affects Versions: 2.0.0, 1.10.7 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 1.10.8, 1.10.9 > > > Document how and when to use airflow_local_settings.py -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (AIRFLOW-6726) Document using airflow_local_settings.py
Kaxil Naik created AIRFLOW-6726: --- Summary: Document using airflow_local_settings.py Key: AIRFLOW-6726 URL: https://issues.apache.org/jira/browse/AIRFLOW-6726 Project: Apache Airflow Issue Type: Improvement Components: documentation Affects Versions: 1.10.7, 2.0.0 Reporter: Kaxil Naik Assignee: Kaxil Naik Document how and when to use airflow_local_settings.py -- This message was sent by Atlassian Jira (v8.3.4#803005)
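For context on what the requested documentation would cover: a minimal sketch of an `airflow_local_settings.py` is below. Hedged: the exact hook names vary by Airflow version; 1.10 looks for a module-level `policy(task)` callable that is applied to every task, and the retry default shown here is purely illustrative.

```python
# airflow_local_settings.py -- a minimal sketch of a cluster policy hook.
# Airflow imports this module (if present on the Python path) and applies
# `policy` to each task as DAGs are parsed.

def policy(task):
    """Force a conservative retry default on every task cluster-wide.

    `task` is the operator instance being parsed; attributes such as
    `retries` can be inspected and mutated here.
    """
    if getattr(task, "retries", 0) == 0:
        task.retries = 1  # illustrative default, not an Airflow-mandated value
```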
[jira] [Resolved] (AIRFLOW-6725) Simplify chaining operation in DagFileProcessorManager
[ https://issues.apache.org/jira/browse/AIRFLOW-6725?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6725. - Resolution: Fixed > Simplify chaining operation in DagFileProcessorManager > -- > > Key: AIRFLOW-6725 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6725 > Project: Apache Airflow > Issue Type: Improvement > Components: core >Affects Versions: 2.0.0 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 2.0.0 > > > Simplify chaining operation in an IF condition -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-5167) Dependency conflict with grpc-google-iam-v1
[ https://issues.apache.org/jira/browse/AIRFLOW-5167?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-5167: Fix Version/s: (was: 1.10.8) > Dependency conflict with grpc-google-iam-v1 > --- > > Key: AIRFLOW-5167 > URL: https://issues.apache.org/jira/browse/AIRFLOW-5167 > Project: Apache Airflow > Issue Type: Bug > Components: gcp >Affects Versions: 1.10.4 >Reporter: Sujay Mansingh >Priority: Trivial > Fix For: 2.0.0 > > > If we use a tool that does strict requirements checking (e.g. > {{pip-compile}}) then an error is thrown when handling the dependencies for > {{apache-airflow[gcp_api]}} > {code:java} > $ pip-compile --version > pip-compile, version 3.7.0 > $ cat requirements.in > apache-airflow[gcp_api] > $ pip-compile -o requirements.txt requirements.in > Could not find a version that matches > grpc-google-iam-v1<0.12dev,<0.13dev,>=0.11.4,>=0.12.3 > Tried: 0.9.0, 0.10.0, 0.10.1, 0.11.1, 0.11.3, 0.11.4, 0.12.0, 0.12.1, 0.12.2, > 0.12.3 > There are incompatible versions in the resolved dependencies. 
> {code} > Looks like a set of inconsistent dependencies in the google libs :/ > {code:java} > $ pip-compile -v -o requirements.txt requirements.in | grep -i grpc-google-iam > google-cloud-container==0.3.0 requires > google-api-core[grpc]<2.0.0dev,>=1.14.0, grpc-google-iam-v1<0.13dev,>=0.12.3 > google-cloud-bigtable==0.33.0 requires > google-api-core[grpc]<2.0.0dev,>=1.6.0, google-cloud-core<2.0dev,>=1.0.0, > grpc-google-iam-v1<0.12dev,>=0.11.4 > google-cloud-spanner==1.9.0 requires > google-api-core[grpc,grpcgcp]<2.0.0dev,>=1.4.1, > google-cloud-core<2.0dev,>=1.0.0, grpc-google-iam-v1<0.12dev,>=0.11.4 > adding ['grpc-google-iam-v1', '<0.12dev,<0.13dev,>=0.11.4,>=0.12.3', '[]'] > grpc-google-iam-v1<0.12dev,<0.13dev,>=0.11.4,>=0.12.3 > Could not find a version that matches > grpc-google-iam-v1<0.12dev,<0.13dev,>=0.11.4,>=0.12.3 > Tried: 0.9.0, 0.10.0, 0.10.1, 0.11.1, 0.11.3, 0.11.4, 0.12.0, 0.12.1, 0.12.2, > 0.12.3 > There are incompatible versions in the resolved dependencies. > {code} > So it looks like {{google-cloud-bigtable==0.33.0}} and > {{google-cloud-spanner==1.9.0}} require a version that is {{>=0.11.4}} and > {{<0.12dev}} > {{google-cloud-container==0.3.0}} upsets everything though by specifying > version {{>=0.12.3}} and {{<0.13dev}} > For now, we can work around the issue by not checking any conflicts when > installing {{apache_airflow[gcp]}}. > That isn't ideal: > - All it means is that the version of {{grpc-google-iam-v1}} that is > specified last will be installed > - I'm assuming that there are no major api changes in {{grpc-google-iam}} > between {{<0.12dev}} and {{<0.13dev}}. If there are then I imagine tests > would fail, but it still feels a little wrong to allow use of a lib that > doesn't match the version required > - Relaxing the check just means that we could have a conflict elsewhere that > isn't reported > I understand this is not an issue with airflow, but more to do with the > google python libraries. 
> However it does affect us when we try to use airflow and be strict with > python requirements. > Any ideas on what to do? -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6245) Allow custom waiters for AWS batch jobs
[ https://issues.apache.org/jira/browse/AIRFLOW-6245?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6245: Fix Version/s: (was: 1.10.8) > Allow custom waiters for AWS batch jobs > --- > > Key: AIRFLOW-6245 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6245 > Project: Apache Airflow > Issue Type: Improvement > Components: aws >Affects Versions: 1.10.6 >Reporter: Darren Weber >Assignee: Darren Weber >Priority: Minor > Labels: AWS, aws-batch > Fix For: 2.0.0 > > > The botocore waiter for AWS batch jobs has not been merged and released for > several years, i.e. > - [https://github.com/boto/botocore/pull/1307] > While working on this Airflow issue, I've also pushed up a PR on botocore to > use a default waiter with exponential backoff and add an option to use a > custom function for the delays between polling status, see > - [https://github.com/boto/botocore/issues/1915] > - [https://github.com/boto/botocore/pull/1921] > > For Airflow, adopt something from botocore PR-1307 as an example to create a > default batch job waiter in Airflow. As a guide to creating a custom waiter, > see > - [https://www.2ndwatch.com/blog/use-waiters-boto3-write/] > - > [https://boto3.amazonaws.com/v1/documentation/api/latest/guide/clients.html#waiters] > -- This message was sent by Atlassian Jira (v8.3.4#803005)
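Until a batch waiter ships in botocore, the exponential-backoff polling the issue describes can be sketched in plain Python. Hedged: `wait_for_job` and its `get_status` callable are hypothetical names; a real implementation would call `batch_client.describe_jobs` inside the callable.

```python
import time

def wait_for_job(get_status, target="SUCCEEDED", failure="FAILED",
                 base_delay=1.0, max_delay=30.0, max_attempts=10):
    """Poll get_status() with exponential backoff until a terminal state.

    get_status is a caller-supplied zero-argument callable returning the
    job's status string (hypothetical; stands in for an AWS Batch call).
    """
    delay = base_delay
    for _ in range(max_attempts):
        status = get_status()
        if status == target:
            return status
        if status == failure:
            raise RuntimeError("job failed")
        time.sleep(min(delay, max_delay))
        delay *= 2  # exponential backoff between polls, capped at max_delay
    raise TimeoutError("job did not reach a terminal state")
```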
[jira] [Updated] (AIRFLOW-1467) allow tasks to use more than one pool slot
[ https://issues.apache.org/jira/browse/AIRFLOW-1467?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-1467: Fix Version/s: (was: 1.10.8) 1.10.9 2.0.0 > allow tasks to use more than one pool slot > -- > > Key: AIRFLOW-1467 > URL: https://issues.apache.org/jira/browse/AIRFLOW-1467 > Project: Apache Airflow > Issue Type: Improvement > Components: scheduler >Reporter: Adrian Bridgett >Assignee: Lokesh Lal >Priority: Trivial > Labels: pool > Fix For: 2.0.0, 1.10.9 > > > It would be useful to have tasks use more than a single pool slot. > Our use case is actually to limit how many tasks run on a head node (due to > memory constraints); currently we have to set a pool limit capping how many > tasks run at once. > Ideally we could set the pool size to, e.g., the amount of memory, and then set > those tasks' pool_usage (or whatever the option would be called) to the amount > of memory we think they'll use. This way the pool would let lots of small tasks > run or just a few large tasks. -- This message was sent by Atlassian Jira (v8.3.4#803005)
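The proposal above amounts to weighted pool accounting: a pool sized in arbitrary units (e.g. memory) where each task declares how many units it consumes. A toy sketch, assuming hypothetical names (`Pool`, `acquire`; not Airflow API):

```python
class Pool:
    """Weighted pool: tasks may consume more than one slot each."""

    def __init__(self, slots):
        self.slots = slots  # total capacity, e.g. GB of memory on the head node
        self.used = 0

    def can_run(self, task_slots):
        # A task fits only if its declared weight fits in the remaining capacity.
        return self.used + task_slots <= self.slots

    def acquire(self, task_slots):
        if not self.can_run(task_slots):
            return False
        self.used += task_slots
        return True

    def release(self, task_slots):
        self.used -= task_slots
```

With a pool of 8 units, one 6-unit task leaves room for a 2-unit task but not a 4-unit one, which is exactly the "lots of small tasks or a few large ones" behaviour the reporter wants.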
[jira] [Updated] (AIRFLOW-3607) Decreasing scheduler delay between tasks
[ https://issues.apache.org/jira/browse/AIRFLOW-3607?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-3607: Fix Version/s: (was: 1.10.8) 2.0.0 1.10.9 > Decreasing scheduler delay between tasks > > > Key: AIRFLOW-3607 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3607 > Project: Apache Airflow > Issue Type: Improvement > Components: scheduler >Affects Versions: 1.10.0, 1.10.1, 1.10.2 > Environment: ubuntu 14.04 >Reporter: Amichai Horvitz >Assignee: Amichai Horvitz >Priority: Major > Fix For: 2.0.0, 1.10.9 > > Original Estimate: 336h > Remaining Estimate: 336h > > I came across the TODO in airflow/ti_deps/deps/trigger_rule_dep (line 52) > that says instead of checking the query for every task, let the tasks report > to the dagrun. I have a dag with many tasks and the delay between tasks can > rise to 10 seconds or more. I have already changed the configuration, added > processes and memory, checked the code, and done research, profiling and other > experiments. I hope that this change will drastically reduce the delay. > I would be happy to discuss this solution, the research and other solutions > for this issue. > Thanks -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (AIRFLOW-6725) Simplify chaining operation in DagFileProcessorManager
Kaxil Naik created AIRFLOW-6725: --- Summary: Simplify chaining operation in DagFileProcessorManager Key: AIRFLOW-6725 URL: https://issues.apache.org/jira/browse/AIRFLOW-6725 Project: Apache Airflow Issue Type: Improvement Components: core Affects Versions: 2.0.0 Reporter: Kaxil Naik Assignee: Kaxil Naik Fix For: 2.0.0 Simplify chaining operation in an IF condition -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6322) Add pytest marker for slow tests
[ https://issues.apache.org/jira/browse/AIRFLOW-6322?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6322: Fix Version/s: (was: 1.10.8) 2.0.0 > Add pytest marker for slow tests > > > Key: AIRFLOW-6322 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6322 > Project: Apache Airflow > Issue Type: Improvement > Components: tests >Affects Versions: 1.10.6 >Reporter: Darren Weber >Assignee: Darren Weber >Priority: Minor > Fix For: 2.0.0 > > > Add pytest marker for slow tests; see also broader context of markers > discussed at > - > [https://lists.apache.org/thread.html/4538437c96f599766005ba7829d0bee1511debb4f53599e0d300a56f%40%3Cdev.airflow.apache.org%3E] -- This message was sent by Atlassian Jira (v8.3.4#803005)
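A slow-test marker along the lines proposed might look like the sketch below. The marker name `slow` comes from the issue title; the test function name is hypothetical, and registration (shown in the comment) keeps `--strict-markers` runs happy.

```python
import pytest

# Registration, e.g. in pytest.ini, so the custom marker is declared:
#   [pytest]
#   markers =
#       slow: marks tests as slow (deselect with '-m "not slow"')

@pytest.mark.slow
def test_full_backfill_roundtrip():
    # Hypothetical long-running test; select/deselect via: pytest -m slow
    assert True
```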
[jira] [Updated] (AIRFLOW-6452) scheduler_job.py - remove excess sleep/log/duration calls
[ https://issues.apache.org/jira/browse/AIRFLOW-6452?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6452: Fix Version/s: (was: 1.10.8) 2.0.0 > scheduler_job.py - remove excess sleep/log/duration calls > - > > Key: AIRFLOW-6452 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6452 > Project: Apache Airflow > Issue Type: Improvement > Components: scheduler >Affects Versions: 1.10.7 >Reporter: t oo >Assignee: t oo >Priority: Minor > Fix For: 2.0.0 > > > remove a lot of these debug calls, wrap some in boolean of loglevel, remove > the 2nd sleep and stuff about getting duration/start/end.etc: > self.log.debug("Starting Loop...") > loop_start_time = time.time() > if self.using_sqlite: > self.processor_agent.heartbeat() > # For the sqlite case w/ 1 thread, wait until the processor > # is finished to avoid concurrent access to the DB. > self.log.debug( > "Waiting for processors to finish since we're using > sqlite") > self.processor_agent.wait_until_finished() > self.log.debug("Harvesting DAG parsing results") > simple_dags = self._get_simple_dags() > self.log.debug("Harvested {} SimpleDAGs".format(len(simple_dags))) > # Send tasks for execution if available > simple_dag_bag = SimpleDagBag(simple_dags) > if not > self._validate_and_run_task_instances(simple_dag_bag=simple_dag_bag): > continue > # Heartbeat the scheduler periodically > time_since_last_heartbeat = (timezone.utcnow() - > > last_self_heartbeat_time).total_seconds() > if time_since_last_heartbeat > self.heartrate: > self.log.debug("Heartbeating the scheduler") > self.heartbeat() > last_self_heartbeat_time = timezone.utcnow() > loop_end_time = time.time() > loop_duration = loop_end_time - loop_start_time > self.log.debug( > "Ran scheduling loop in %.2f seconds", > loop_duration) > if not is_unit_test: > self.log.debug("Sleeping for %.2f seconds", > self._processor_poll_interval) > time.sleep(self._processor_poll_interval) > if self.processor_agent.done: > 
self.log.info("Exiting scheduler loop as all files" > " have been processed {} > times".format(self.num_runs)) > break > if loop_duration < 1 and not is_unit_test: > sleep_length = 1 - loop_duration > self.log.debug( > "Sleeping for {0:.2f} seconds to prevent excessive > logging" > .format(sleep_length)) > sleep(sleep_length) -- This message was sent by Atlassian Jira (v8.3.4#803005)
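One way to "wrap some in boolean of loglevel", as the issue puts it, is a `logging.isEnabledFor` guard so that work done purely to build a debug message is skipped at INFO and above. A sketch, assuming hypothetical names (`harvest_count`, the logger name); this is not the actual scheduler code:

```python
import logging

log = logging.getLogger("scheduler_example")
log.setLevel(logging.INFO)

def harvest_count(simple_dags):
    # Guard the debug-only work behind a level check, as the issue suggests,
    # so the len()/formatting cost is skipped entirely at INFO and above.
    if log.isEnabledFor(logging.DEBUG):
        log.debug("Harvested %d SimpleDAGs", len(simple_dags))
    return len(simple_dags)
```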
[jira] [Updated] (AIRFLOW-5912) Expose lineage through REST
[ https://issues.apache.org/jira/browse/AIRFLOW-5912?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-5912: Fix Version/s: (was: 1.10.8) > Expose lineage through REST > --- > > Key: AIRFLOW-5912 > URL: https://issues.apache.org/jira/browse/AIRFLOW-5912 > Project: Apache Airflow > Issue Type: Sub-task > Components: lineage >Affects Versions: 1.10.6 >Reporter: Bolke de Bruin >Priority: Major > Fix For: 2.0.0 > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-5912) Expose lineage through REST
[ https://issues.apache.org/jira/browse/AIRFLOW-5912?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-5912: Fix Version/s: 2.0.0 > Expose lineage through REST > --- > > Key: AIRFLOW-5912 > URL: https://issues.apache.org/jira/browse/AIRFLOW-5912 > Project: Apache Airflow > Issue Type: Sub-task > Components: lineage >Affects Versions: 1.10.6 >Reporter: Bolke de Bruin >Priority: Major > Fix For: 2.0.0, 1.10.8 > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6530) Allow for custom Statsd client
[ https://issues.apache.org/jira/browse/AIRFLOW-6530?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6530: Fix Version/s: (was: 1.10.8) 2.0.0 > Allow for custom Statsd client > -- > > Key: AIRFLOW-6530 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6530 > Project: Apache Airflow > Issue Type: New Feature > Components: scheduler, webserver >Affects Versions: 1.10.7 >Reporter: Usman Arshad >Assignee: Usman Arshad >Priority: Major > Labels: features > Fix For: 2.0.0 > > > We are currently using Airflow at Skyscanner and we have a custom > implementation of Statsd which offers features that wire nicely into our > metrics platform/tooling. > I'm quite sure that other companies who are using Airflow would also find > great benefit in being able to utilise their own custom Statsd client, > therefore I am proposing this addition. > > The proposed solution looks something along the lines of changing this: > {code:java} > statsd = StatsClient( > host=conf.get('scheduler', 'statsd_host'), > port=conf.getint('scheduler', 'statsd_port'), > prefix=conf.get('scheduler', 'statsd_prefix')) > {code} > Into > {code:java} > statsd = conf.get('STATSD_CLIENT') or StatsClient( > host=conf.get('scheduler', 'statsd_host'), > port=conf.getint('scheduler', 'statsd_port'), > prefix=conf.get('scheduler', 'statsd_prefix')) > {code} > Note: Pseudocode, not actual code > -- This message was sent by Atlassian Jira (v8.3.4#803005)
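A hedged sketch of the extension point the reporter describes: instantiating a user-supplied client class from a dotted import path. The helper name `load_statsd_client` and the idea of a dotted-path config value are assumptions for illustration, not Airflow's actual API.

```python
import importlib

def load_statsd_client(dotted_path, **kwargs):
    """Instantiate a user-provided Statsd-like client from a dotted path.

    `dotted_path` is e.g. "my_pkg.metrics.MyStatsClient" (hypothetical);
    any keyword arguments are forwarded to the class constructor, so the
    host/port/prefix config values could be passed through unchanged.
    """
    module_path, _, class_name = dotted_path.rpartition(".")
    cls = getattr(importlib.import_module(module_path), class_name)
    return cls(**kwargs)
```

Usage would then fall back to the stock `StatsClient` only when no custom path is configured, matching the reporter's pseudocode.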
[jira] [Updated] (AIRFLOW-3915) Scheduler fails for dags with datetime start_date
[ https://issues.apache.org/jira/browse/AIRFLOW-3915?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-3915: Fix Version/s: (was: 1.10.8) > Scheduler fails for dags with datetime start_date > - > > Key: AIRFLOW-3915 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3915 > Project: Apache Airflow > Issue Type: Bug > Components: models >Affects Versions: 1.10.2 >Reporter: David Stuck >Priority: Major > Fix For: 2.0.0 > > > When start_date is passed in to a dag as a datetime object, it does not get > converted to pendulum and thus the scheduler fails at > [https://github.com/apache/airflow/blob/4083a8f5217e9ca7a5c83a3eaaaf403dd367a90c/airflow/models.py#L3487] > when trying to access `self.timezone.name`. > My guess is that the fix is as simple as setting `self.timezone = > pendulum.instance(self.timezone)` in `__init__`. If that sounds right I can > create a PR. -- This message was sent by Atlassian Jira (v8.3.4#803005)
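The normalization the reporter proposes, sketched without a pendulum dependency (the report suggests `pendulum.instance`; the stdlib `timezone` stands in here, and the helper name `ensure_aware` is hypothetical):

```python
from datetime import datetime, timezone

def ensure_aware(start_date, default_tz=timezone.utc):
    """Coerce a naive datetime start_date to an aware one.

    This is the shape of the fix: if the user passed a plain datetime,
    attach a timezone so later attribute access (e.g. the scheduler
    reading the timezone name) cannot fail.
    """
    if start_date.tzinfo is None:
        return start_date.replace(tzinfo=default_tz)
    return start_date
```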
[jira] [Updated] (AIRFLOW-6544) Log_id is missing when writing logs to elastic search
[ https://issues.apache.org/jira/browse/AIRFLOW-6544?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6544: Fix Version/s: (was: 1.10.8) 2.0.0 > Log_id is missing when writing logs to elastic search > - > > Key: AIRFLOW-6544 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6544 > Project: Apache Airflow > Issue Type: Bug > Components: logging >Affects Versions: 1.10.7 >Reporter: Larry Zhu >Assignee: Larry Zhu >Priority: Major > Labels: pull-request-available > Fix For: 2.0.0 > > Original Estimate: 1h > Remaining Estimate: 1h > > The “end of log” marker does not include the aforementioned log_id. The issue > is that airflow-web does not know when to stop tailing the logs. > Also, it would be better to include an elasticsearch configuration for index > name so that the search is more efficient in big clusters with a lot of > indices -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-5686) Deleting DAG can leave scheduled/queued tasks consuming pool slots
[ https://issues.apache.org/jira/browse/AIRFLOW-5686?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-5686: Fix Version/s: (was: 1.10.8) 2.0.0 > Deleting DAG can leave scheduled/queued tasks consuming pool slots > -- > > Key: AIRFLOW-5686 > URL: https://issues.apache.org/jira/browse/AIRFLOW-5686 > Project: Apache Airflow > Issue Type: Bug > Components: scheduler >Affects Versions: 1.10.5 >Reporter: Ash Berlin-Taylor >Priority: Critical > Fix For: 2.0.0 > > > If you delete a dag file while it has tasks in the scheduled or queued state, > those task instances are never touched again. > With the slight tweak in Pool (making the default pool an explicit one) this > now matters, and this ends up with the scheduler being "blocked" from running > new tasks with this message: > {noformat} > Figuring out tasks to run in Pool(name=default_pool) with -9022 open slots > and 45 task instances ready to be queued {noformat} > The fix should be to set any task instance in a non-terminal state (None, > queued, scheduled, running, up_for_retry etc.) to "removed" inside > DAG.deactivate_stale_dags (which is already called on scheduler shutdown). -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-6319) Add support for workgroups in AWS Athena
[ https://issues.apache.org/jira/browse/AIRFLOW-6319?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6319. - Resolution: Fixed > Add support for workgroups in AWS Athena > - > > Key: AIRFLOW-6319 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6319 > Project: Apache Airflow > Issue Type: Improvement > Components: aws, contrib >Affects Versions: 1.10.6 >Reporter: Bhavika >Assignee: Bhavika >Priority: Minor > Labels: pull-request-available > Fix For: 1.10.8 > > > The AWS Athena hooks and operators currently do not provide the ability to > use workgroups for running queries. More on the feature here - > [https://docs.aws.amazon.com/athena/latest/ug/workgroups.html] & > [https://docs.aws.amazon.com/athena/latest/ug/user-created-workgroups.html] > > I propose we add the ability to pass in workgroup names while creating the > Athena connection. The default workgroup name in Athena is `default` - so we > can pass that to define a connection when a workgroup is not passed in by the > user. > > I can submit a PR but would like some suggestions on how I can expand the > test suite to involve a "user created workgroup" that the unit tests can use. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6404) Add ANSI color support
[ https://issues.apache.org/jira/browse/AIRFLOW-6404?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6404: Fix Version/s: (was: 1.10.8) 2.0.0 > Add ANSI color support > -- > > Key: AIRFLOW-6404 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6404 > Project: Apache Airflow > Issue Type: New Feature > Components: logging, ui >Affects Versions: 1.10.7 >Reporter: ohad >Assignee: ohad >Priority: Trivial > Labels: Color, logger > Fix For: 2.0.0 > > > Add ANSI color support in order to present colored logs in the airflow webUI > {code:java} > ESC [ 0 m # reset all (colors and brightness) > # FOREGROUND: > ESC [ 30 m # black > ESC [ 31 m # red > ESC [ 32 m # green > ESC [ 33 m # yellow > ESC [ 34 m # blue > ESC [ 35 m # magenta > ESC [ 36 m # cyan > ESC [ 37 m # white > # BACKGROUND > ESC [ 40 m # black > ESC [ 41 m # red > ESC [ 42 m # green > ESC [ 43 m # yellow > ESC [ 44 m # blue > ESC [ 45 m # magenta > ESC [ 46 m # cyan > ESC [ 47 m # white > {code} -- This message was sent by Atlassian Jira (v8.3.4#803005)
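A minimal sketch of applying the foreground codes listed above to log lines. The level-to-colour mapping is illustrative only; the escape sequences themselves are the standard ANSI SGR codes from the issue.

```python
RESET = "\033[0m"          # ESC [ 0 m -- reset all colours
COLORS = {
    "INFO": "\033[32m",    # green
    "WARNING": "\033[33m", # yellow
    "ERROR": "\033[31m",   # red
}

def colorize(level, message):
    """Wrap a log line in ANSI colour codes based on its level."""
    return f"{COLORS.get(level, '')}{message}{RESET}"
```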
[jira] [Updated] (AIRFLOW-6698) Allow shorthand notation
[ https://issues.apache.org/jira/browse/AIRFLOW-6698?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6698: Fix Version/s: (was: 1.10.8) 2.0.0 > Allow shorthand notation > > > Key: AIRFLOW-6698 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6698 > Project: Apache Airflow > Issue Type: Sub-task > Components: lineage >Affects Versions: 1.10.7 >Reporter: Bolke de Bruin >Assignee: Bolke de Bruin >Priority: Major > Fix For: 2.0.0 > > > Shorthand notation makes it a lot easier to define dags with lineage support -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6706) Lazy load operator extra links
[ https://issues.apache.org/jira/browse/AIRFLOW-6706?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6706: Fix Version/s: (was: 1.10.8) > Lazy load operator extra links > -- > > Key: AIRFLOW-6706 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6706 > Project: Apache Airflow > Issue Type: Improvement > Components: serialization >Affects Versions: 1.10.7 >Reporter: Kamil Bregula >Priority: Major > Fix For: 2.0.0 > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6626) Add email on failure or retry to default config
[ https://issues.apache.org/jira/browse/AIRFLOW-6626?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6626: Fix Version/s: (was: 1.10.8) > Add email on failure or retry to default config > --- > > Key: AIRFLOW-6626 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6626 > Project: Apache Airflow > Issue Type: Task > Components: configuration >Affects Versions: 1.10.7 >Reporter: Xinbin Huang >Assignee: Xinbin Huang >Priority: Major > Fix For: 2.0.0 > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6672) AWS DataSync - better logging of error message
[ https://issues.apache.org/jira/browse/AIRFLOW-6672?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6672: Fix Version/s: (was: 1.10.8) > AWS DataSync - better logging of error message > -- > > Key: AIRFLOW-6672 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6672 > Project: Apache Airflow > Issue Type: Improvement > Components: aws >Affects Versions: 1.10.7 >Reporter: Bjorn Olsen >Assignee: Bjorn Olsen >Priority: Minor > Fix For: 2.0.0 > > > When the AWS DataSync operator fails, it dumps a TaskDescription to the log. > The TaskDescription is in JSON format and contains several elements. This is > hard to read to try and see what exactly went wrong. > Example 1: > [2020-01-28 17:44:39,495] \{datasync.py:354} INFO - > task_execution_description=\{"TaskExecutionArn": > "arn:aws:datasync:***:***:task/task-***/execution/exec-***", "Status": > "ERROR", "Options": {"VerifyMode": "ONLY_FILES_TRANSFERRED", "OverwriteMode": > "ALWAYS", "Atime": "BEST_EFFORT", "Mtime": "PRESERVE", "Uid": "INT_VALUE", > "Gid": "INT_VALUE", "PreserveDeletedFiles": "PRESERVE", "PreserveDevices": > "NONE", "PosixPermissions": "PRESERVE", "BytesPerSecond": -1, "TaskQueueing": > "ENABLED"}, "Excludes": [], "Includes": [\{"FilterType": "SIMPLE_PATTERN", > "Value": "***"}], "StartTime": datetime.datetime(2020, 1, 28, 17, 36, 2, > 816000, tzinfo=tzlocal()), "EstimatedFilesToTransfer": 7, > "EstimatedBytesToTransfer": 4534925, "FilesTransferred": 7, "BytesWritten": > 4534925, "BytesTransferred": 4534925, "Result": \{"PrepareDuration": 9795, > "PrepareStatus": "SUCCESS", "TotalDuration": 351660, "TransferDuration": > 338568, "TransferStatus": "SUCCESS", "VerifyDuration": 7006, "VerifyStatus": > "ERROR", "ErrorCode": "OpNotSupp", "ErrorDetail": "Operation not supported"}, > "ResponseMetadata": \{"RequestId": "***", "HTTPStatusCode": 200, > "HTTPHeaders": {"date": "Tue, 28 Jan 2020 15:44:39 GMT", "content-type": > "application/x-amz-json-1.1", 
"content-length": "994", "connection": > "keep-alive", "x-amzn-requestid": "***"}, "RetryAttempts": 0}} > Example 2: > [2020-01-28 18:23:23,322] \{datasync.py:354} INFO - > task_execution_description=\{"TaskExecutionArn": > "arn:aws:datasync:***:***:task/task-***/execution/exec-***", "Status": > "ERROR", "Options": {"VerifyMode": "ONLY_FILES_TRANSFERRED", "OverwriteMode": > "ALWAYS", "Atime": "BEST_EFFORT", "Mtime": "PRESERVE", "Uid": "INT_VALUE", > "Gid": "INT_VALUE", "PreserveDeletedFiles": "PRESERVE", "PreserveDevices": > "NONE", "PosixPermissions": "PRESERVE", "BytesPerSecond": -1, "TaskQueueing": > "ENABLED"}, "Excludes": [], "Includes": [\{"FilterType": "SIMPLE_PATTERN", > "Value": "***"}], "StartTime": datetime.datetime(2020, 1, 28, 17, 45, 57, > 212000, tzinfo=tzlocal()), "EstimatedFilesToTransfer": 0, > "EstimatedBytesToTransfer": 0, "FilesTransferred": 0, "BytesWritten": 0, > "BytesTransferred": 0, "Result": \{"PrepareDuration": 16687, "PrepareStatus": > "SUCCESS", "TotalDuration": 2083467, "TransferDuration": 2065744, > "TransferStatus": "ERROR", "VerifyDuration": 5251, "VerifyStatus": "SUCCESS", > "ErrorCode": "SockTlsHandshakeFailure", "ErrorDetail": "DataSync agent ran > into an error connecting to AWS.Please review the DataSync network > requirements and ensure required endpoints are accessible from the agent. > Please contact AWS support if the error persists."}, "ResponseMetadata": > \{"RequestId": "***", "HTTPStatusCode": 200, "HTTPHeaders": {"date": "Tue, 28 > Jan 2020 16:23:23 GMT", "content-type": "application/x-amz-json-1.1", > "content-length": "1179", "connection": "keep-alive", "x-amzn-requestid": > "***"}, "RetryAttempts": 0}} > > Note that the 'Result' element contains the statuses and errors that are of > interest, however these are hard to see in the log at the moment. 
> Example of a successful one: > 'Result': \{'PrepareDuration': 9663, 'PrepareStatus': 'SUCCESS', > 'TotalDuration': 352095, 'TransferDuration': 338358, 'TransferStatus': > 'SUCCESS', 'VerifyDuration': 7171, 'VerifyStatus': 'SUCCESS'}, > Suggested output is to include the previous line/s but also add: > [2020-01-28 17:44:39,495] \{datasync.py:354} INFO/ERROR - Status=SUCCESS/ERROR > [2020-01-28 17:44:39,495] \{datasync.py:354} INFO/ERROR - > PrepareStatus=SUCCESS/ERROR PrepareDuration=9795 > [2020-01-28 17:44:39,495] \{datasync.py:354} INFO/ERROR - > TransferStatus=SUCCESS/ERROR TransferDuration=9795 > [2020-01-28 17:44:39,495] \{datasync.py:354} INFO/ERROR - > VerifyStatus=SUCCESS/ERROR VerifyDuration=9795 > [2020-01-28 17:44:39,495] \{datasync.py:354} ERROR - ErrorCode=OpNotSupp, > ErrorDetail=Operation not supported > > This should make it much clearer what the job status and errors are. -- This message was sent by Atlassian Jira (v8.3.4#803005)
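The per-phase summary the reporter suggests could be produced from the TaskDescription dict along these lines. A sketch: the function name `summarize_result` is hypothetical, and a real version would emit these lines through the operator's logger rather than return them.

```python
def summarize_result(description):
    """Flatten the nested 'Result' element of a DataSync TaskDescription
    into one log-friendly line per phase, as the issue proposes."""
    result = description.get("Result", {})
    lines = [f"Status={description.get('Status')}"]
    for phase in ("Prepare", "Transfer", "Verify"):
        lines.append(f"{phase}Status={result.get(phase + 'Status')} "
                     f"{phase}Duration={result.get(phase + 'Duration')}")
    if "ErrorCode" in result:
        # Surface the error fields prominently instead of burying them in JSON.
        lines.append(f"ErrorCode={result['ErrorCode']}, "
                     f"ErrorDetail={result.get('ErrorDetail')}")
    return lines
```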
[jira] [Updated] (AIRFLOW-6672) AWS DataSync - better logging of error message
[ https://issues.apache.org/jira/browse/AIRFLOW-6672?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6672: Affects Version/s: (was: 1.10.7) 2.0.0 > AWS DataSync - better logging of error message > -- > > Key: AIRFLOW-6672 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6672 > Project: Apache Airflow > Issue Type: Improvement > Components: aws >Affects Versions: 2.0.0 >Reporter: Bjorn Olsen >Assignee: Bjorn Olsen >Priority: Minor > Fix For: 2.0.0 > > > When the AWS DataSync operator fails, it dumps a TaskDescription to the log. > The TaskDescription is in JSON format and contains several elements. This is > hard to read to try and see what exactly went wrong. > Example 1: > [2020-01-28 17:44:39,495] \{datasync.py:354} INFO - > task_execution_description=\{"TaskExecutionArn": > "arn:aws:datasync:***:***:task/task-***/execution/exec-***", "Status": > "ERROR", "Options": {"VerifyMode": "ONLY_FILES_TRANSFERRED", "OverwriteMode": > "ALWAYS", "Atime": "BEST_EFFORT", "Mtime": "PRESERVE", "Uid": "INT_VALUE", > "Gid": "INT_VALUE", "PreserveDeletedFiles": "PRESERVE", "PreserveDevices": > "NONE", "PosixPermissions": "PRESERVE", "BytesPerSecond": -1, "TaskQueueing": > "ENABLED"}, "Excludes": [], "Includes": [\{"FilterType": "SIMPLE_PATTERN", > "Value": "***"}], "StartTime": datetime.datetime(2020, 1, 28, 17, 36, 2, > 816000, tzinfo=tzlocal()), "EstimatedFilesToTransfer": 7, > "EstimatedBytesToTransfer": 4534925, "FilesTransferred": 7, "BytesWritten": > 4534925, "BytesTransferred": 4534925, "Result": \{"PrepareDuration": 9795, > "PrepareStatus": "SUCCESS", "TotalDuration": 351660, "TransferDuration": > 338568, "TransferStatus": "SUCCESS", "VerifyDuration": 7006, "VerifyStatus": > "ERROR", "ErrorCode": "OpNotSupp", "ErrorDetail": "Operation not supported"}, > "ResponseMetadata": \{"RequestId": "***", "HTTPStatusCode": 200, > "HTTPHeaders": {"date": "Tue, 28 Jan 2020 15:44:39 GMT", "content-type": > "application/x-amz-json-1.1", 
"content-length": "994", "connection": > "keep-alive", "x-amzn-requestid": "***"}, "RetryAttempts": 0}} > Example 2: > [2020-01-28 18:23:23,322] \{datasync.py:354} INFO - > task_execution_description=\{"TaskExecutionArn": > "arn:aws:datasync:***:***:task/task-***/execution/exec-***", "Status": > "ERROR", "Options": {"VerifyMode": "ONLY_FILES_TRANSFERRED", "OverwriteMode": > "ALWAYS", "Atime": "BEST_EFFORT", "Mtime": "PRESERVE", "Uid": "INT_VALUE", > "Gid": "INT_VALUE", "PreserveDeletedFiles": "PRESERVE", "PreserveDevices": > "NONE", "PosixPermissions": "PRESERVE", "BytesPerSecond": -1, "TaskQueueing": > "ENABLED"}, "Excludes": [], "Includes": [\{"FilterType": "SIMPLE_PATTERN", > "Value": "***"}], "StartTime": datetime.datetime(2020, 1, 28, 17, 45, 57, > 212000, tzinfo=tzlocal()), "EstimatedFilesToTransfer": 0, > "EstimatedBytesToTransfer": 0, "FilesTransferred": 0, "BytesWritten": 0, > "BytesTransferred": 0, "Result": \{"PrepareDuration": 16687, "PrepareStatus": > "SUCCESS", "TotalDuration": 2083467, "TransferDuration": 2065744, > "TransferStatus": "ERROR", "VerifyDuration": 5251, "VerifyStatus": "SUCCESS", > "ErrorCode": "SockTlsHandshakeFailure", "ErrorDetail": "DataSync agent ran > into an error connecting to AWS.Please review the DataSync network > requirements and ensure required endpoints are accessible from the agent. > Please contact AWS support if the error persists."}, "ResponseMetadata": > \{"RequestId": "***", "HTTPStatusCode": 200, "HTTPHeaders": {"date": "Tue, 28 > Jan 2020 16:23:23 GMT", "content-type": "application/x-amz-json-1.1", > "content-length": "1179", "connection": "keep-alive", "x-amzn-requestid": > "***"}, "RetryAttempts": 0}} > > Note that the 'Result' element contains the statuses and errors that are of > interest, however these are hard to see in the log at the moment. 
> Example of a successful one: > 'Result': \{'PrepareDuration': 9663, 'PrepareStatus': 'SUCCESS', > 'TotalDuration': 352095, 'TransferDuration': 338358, 'TransferStatus': > 'SUCCESS', 'VerifyDuration': 7171, 'VerifyStatus': 'SUCCESS'}, > Suggested output is to include the previous line/s but also add: > [2020-01-28 17:44:39,495] \{datasync.py:354} INFO/ERROR - Status=SUCCESS/ERROR > [2020-01-28 17:44:39,495] \{datasync.py:354} INFO/ERROR - > PrepareStatus=SUCCESS/ERROR PrepareDuration=9795 > [2020-01-28 17:44:39,495] \{datasync.py:354} INFO/ERROR - > TransferStatus=SUCCESS/ERROR TransferDuration=9795 > [2020-01-28 17:44:39,495] \{datasync.py:354} INFO/ERROR - > VerifyStatus=SUCCESS/ERROR VerifyDuration=9795 > [2020-01-28 17:44:39,495] \{datasync.py:354} ERROR - ErrorCode=OpNotSupp, > ErrorDetail=Operation not supported > > This should make it much clearer what the job
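The suggested output could be produced by a small helper that pulls the 'Result' element out of the task execution description and emits one line per phase. This is an illustrative sketch, not the operator's actual code; the function name and line format are assumptions based on the suggestion above:

```python
def summarize_task_execution(description):
    """Return one human-readable summary line per status in a DataSync
    task execution description (the JSON dump shown in the examples)."""
    lines = ["Status=%s" % description.get("Status")]
    result = description.get("Result", {})
    for phase in ("Prepare", "Transfer", "Verify"):
        lines.append("%sStatus=%s %sDuration=%s" % (
            phase, result.get("%sStatus" % phase),
            phase, result.get("%sDuration" % phase)))
    if "ErrorCode" in result:
        lines.append("ErrorCode=%s, ErrorDetail=%s" % (
            result["ErrorCode"], result["ErrorDetail"]))
    return lines

# Fed the 'Result' from Example 1, this surfaces the error on its own line:
for line in summarize_task_execution({
    "Status": "ERROR",
    "Result": {"PrepareDuration": 9795, "PrepareStatus": "SUCCESS",
               "TransferDuration": 338568, "TransferStatus": "SUCCESS",
               "VerifyDuration": 7006, "VerifyStatus": "ERROR",
               "ErrorCode": "OpNotSupp",
               "ErrorDetail": "Operation not supported"}}):
    print(line)
```

The operator would then log each returned line at INFO or ERROR depending on the overall status, instead of dumping the whole JSON blob.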
[jira] [Updated] (AIRFLOW-6707) Simplify Connection.get_hook method
[ https://issues.apache.org/jira/browse/AIRFLOW-6707?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6707: Fix Version/s: (was: 1.10.8) 2.0.0 > Simplify Connection.get_hook method > --- > > Key: AIRFLOW-6707 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6707 > Project: Apache Airflow > Issue Type: Improvement > Components: core >Affects Versions: 1.10.7 >Reporter: Kamil Bregula >Priority: Major > Fix For: 2.0.0 > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6710) Lazy create custom executor modules
[ https://issues.apache.org/jira/browse/AIRFLOW-6710?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6710: Fix Version/s: (was: 1.10.8) 2.0.0 > Lazy create custom executor modules > --- > > Key: AIRFLOW-6710 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6710 > Project: Apache Airflow > Issue Type: Improvement > Components: executors, plugins >Affects Versions: 1.10.7 >Reporter: Kamil Bregula >Priority: Major > Fix For: 2.0.0 > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6699) Parameterize weekday sensor tests
[ https://issues.apache.org/jira/browse/AIRFLOW-6699?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6699: Fix Version/s: (was: 1.10.8) 2.0.0 > Parameterize weekday sensor tests > - > > Key: AIRFLOW-6699 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6699 > Project: Apache Airflow > Issue Type: Improvement > Components: tests >Affects Versions: 1.10.7 >Reporter: Cooper Gillan >Assignee: Cooper Gillan >Priority: Minor > Fix For: 2.0.0 > > > Many of the arguments for the true tests in > tests/sensors/test_weekday_sensor.py are repeated. The tests are also running > with unittest.TestCase rather than pytest. > Make use of pytest's parametrize functionality to use the same base tests for > true tests. -- This message was sent by Atlassian Jira (v8.3.4#803005)
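The repeated arguments described in this ticket can be collapsed with pytest's parametrize marker. The sketch below uses a stand-in predicate rather than the real DayOfWeekSensor, so the helper name and sample dates are illustrative; only the parametrization pattern is the point:

```python
import pytest
from datetime import datetime

# Stand-in for the sensor's weekday check; the real tests exercise
# DayOfWeekSensor, but the parametrization pattern is the same.
def falls_on(execution_date, week_day):
    return execution_date.strftime("%A") == week_day

@pytest.mark.parametrize(
    "week_day",
    ["Saturday", {"Saturday"}, {"Friday", "Saturday"}],
    ids=["string", "set", "multi-day-set"],
)
def test_weekday_sensor_true(week_day):
    # One test body covers every "true" case instead of one copy per case.
    execution_date = datetime(2020, 2, 1)  # a Saturday
    days = {week_day} if isinstance(week_day, str) else week_day
    assert any(falls_on(execution_date, day) for day in days)
```

Each parametrized case shows up as a separate test in pytest's report, so failures still point at the exact input that broke.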
[jira] [Resolved] (AIRFLOW-6716) Fix AWS Datasync Example DAG
[ https://issues.apache.org/jira/browse/AIRFLOW-6716?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6716. - Resolution: Fixed > Fix AWS Datasync Example DAG > > > Key: AIRFLOW-6716 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6716 > Project: Apache Airflow > Issue Type: Bug > Components: examples >Affects Versions: 2.0.0 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 2.0.0 > > > Currently, the DAG fails with JSON Decode error > {noformat} > ERROR [airflow.models.dagbag.DagBag] Failed to import: > /opt/airflow/airflow/providers/amazon/aws/example_dags/example_datasync_2.py > Traceback (most recent call last): > File "/opt/airflow/airflow/models/dagbag.py", line 255, in process_file > loader.exec_module(m) > File "", line 678, in exec_module > File "", line 219, in _call_with_frames_removed > File > "/opt/airflow/airflow/providers/amazon/aws/example_dags/example_datasync_2.py", > line 70, in > default_destination_location_kwargs) > File "/usr/local/lib/python3.6/json/__init__.py", line 354, in loads > return _default_decoder.decode(s) > File "/usr/local/lib/python3.6/json/decoder.py", line 339, in decode > obj, end = self.raw_decode(s, idx=_w(s, 0).end()) > File "/usr/local/lib/python3.6/json/decoder.py", line 357, in raw_decode > raise JSONDecodeError("Expecting value", s, err.value) from None > json.decoder.JSONDecodeError: Expecting value: line 2 column 41 (char 81) > {noformat} -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-6718) Fix more occurrences of utils.dates.days_ago
[ https://issues.apache.org/jira/browse/AIRFLOW-6718?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6718. - Resolution: Fixed > Fix more occurrences of utils.dates.days_ago > > > Key: AIRFLOW-6718 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6718 > Project: Apache Airflow > Issue Type: Bug > Components: examples >Affects Versions: 2.0.0, 1.10.7 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 2.0.0 > > > There are 3 more occurrences in a different manner in the following files: > * dags/test_dag.py > * example_datasync_1.py > * example_datasync_2.py > These files used "utils.dates.days_ago(1)" instead of > "airflow.utils.dates.days_ago(1)" and hence they were not detected in > https://github.com/apache/airflow/pull/7007 -- This message was sent by Atlassian Jira (v8.3.4#803005)
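The reason the earlier sweep missed these files can be shown in a couple of lines: a search for the fully-qualified call does not match the short form. The regex here is illustrative, not necessarily the exact pattern used in the original pull request:

```python
import re

# A sweep for the fully-qualified form misses the short form, because the
# short form lacks the "airflow." prefix the pattern anchors on.
pattern = re.compile(r"airflow\.utils\.dates\.days_ago")

detected = "start_date = airflow.utils.dates.days_ago(1)"
missed = "start_date = utils.dates.days_ago(1)"  # the form used in dags/test_dag.py

print(pattern.search(detected) is not None)  # True
print(pattern.search(missed) is not None)    # False
```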
[jira] [Resolved] (AIRFLOW-6717) Remove non-existent field from templated_fields
[ https://issues.apache.org/jira/browse/AIRFLOW-6717?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6717. - Resolution: Fixed > Remove non-existent field from templated_fields > --- > > Key: AIRFLOW-6717 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6717 > Project: Apache Airflow > Issue Type: Bug > Components: operators >Affects Versions: 2.0.0 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 2.0.0 > > > *CloudMemorystoreScaleInstanceOperator* does not have an *instance* field but > the field exists in templated_fields. We should remove it -- This message was sent by Atlassian Jira (v8.3.4#803005)
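The bug class behind this ticket can be illustrated without an Airflow install. The base class below is a stand-in for airflow.models.BaseOperator (whose real attribute is spelled template_fields), and the operator is a stub, not the actual CloudMemorystoreScaleInstanceOperator:

```python
# A name listed for templating with no attribute backing it fails only at
# render time, which is why this slips through until a DAG actually runs.
class StubBaseOperator:
    template_fields = ()

    def __init__(self, **kwargs):
        for key, value in kwargs.items():
            setattr(self, key, value)

    def missing_template_fields(self):
        # Jinja rendering resolves each listed name via getattr, so any
        # listed name without a matching attribute is a latent error.
        return [f for f in self.template_fields if not hasattr(self, f)]

class StubScaleInstanceOperator(StubBaseOperator):
    # 'instance' is declared but never set by __init__ — the mismatch
    # this ticket removes.
    template_fields = ("location", "instance_id", "size", "instance")

op = StubScaleInstanceOperator(location="us-east1", instance_id="mem-1", size=2)
print(op.missing_template_fields())  # ['instance']
```

A check like this could run in tests to catch the whole class of mismatch rather than this one operator.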
[jira] [Created] (AIRFLOW-6718) Fix more occurrences of utils.dates.days_ago
Kaxil Naik created AIRFLOW-6718: --- Summary: Fix more occurrences of utils.dates.days_ago Key: AIRFLOW-6718 URL: https://issues.apache.org/jira/browse/AIRFLOW-6718 Project: Apache Airflow Issue Type: Bug Components: examples Affects Versions: 1.10.7, 2.0.0 Reporter: Kaxil Naik Assignee: Kaxil Naik Fix For: 2.0.0 There are 3 more occurrences in a different manner in the following files: * dags/test_dag.py * example_datasync_1.py * example_datasync_2.py These files used "utils.dates.days_ago(1)" instead of "airflow.utils.dates.days_ago(1)" and hence they were not detected in https://github.com/apache/airflow/pull/7007 -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6717) Remove non-existent field from templated_fields
[ https://issues.apache.org/jira/browse/AIRFLOW-6717?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6717: Fix Version/s: 2.0.0 > Remove non-existent field from templated_fields > --- > > Key: AIRFLOW-6717 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6717 > Project: Apache Airflow > Issue Type: Bug > Components: operators >Affects Versions: 2.0.0 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 2.0.0 > > > *CloudMemorystoreScaleInstanceOperator* does not have an *instance* field but > the field exists in templated_fields. We should remove it -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (AIRFLOW-6717) Remove non-existent field from templated_fields
Kaxil Naik created AIRFLOW-6717: --- Summary: Remove non-existent field from templated_fields Key: AIRFLOW-6717 URL: https://issues.apache.org/jira/browse/AIRFLOW-6717 Project: Apache Airflow Issue Type: Bug Components: operators Affects Versions: 2.0.0 Reporter: Kaxil Naik Assignee: Kaxil Naik *CloudMemorystoreScaleInstanceOperator* does not have an *instance* field but the field exists in templated_fields. We should remove it -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6717) Remove non-existent field from templated_fields
[ https://issues.apache.org/jira/browse/AIRFLOW-6717?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6717: Priority: Minor (was: Major) > Remove non-existent field from templated_fields > --- > > Key: AIRFLOW-6717 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6717 > Project: Apache Airflow > Issue Type: Bug > Components: operators >Affects Versions: 2.0.0 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > > *CloudMemorystoreScaleInstanceOperator* does not have an *instance* field but > the field exists in templated_fields. We should remove it -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (AIRFLOW-6716) Fix AWS Datasync Example DAG
Kaxil Naik created AIRFLOW-6716: --- Summary: Fix AWS Datasync Example DAG Key: AIRFLOW-6716 URL: https://issues.apache.org/jira/browse/AIRFLOW-6716 Project: Apache Airflow Issue Type: Bug Components: examples Affects Versions: 2.0.0 Reporter: Kaxil Naik Assignee: Kaxil Naik Fix For: 2.0.0 Currently, the DAG fails with JSON Decode error {noformat} ERROR [airflow.models.dagbag.DagBag] Failed to import: /opt/airflow/airflow/providers/amazon/aws/example_dags/example_datasync_2.py Traceback (most recent call last): File "/opt/airflow/airflow/models/dagbag.py", line 255, in process_file loader.exec_module(m) File "", line 678, in exec_module File "", line 219, in _call_with_frames_removed File "/opt/airflow/airflow/providers/amazon/aws/example_dags/example_datasync_2.py", line 70, in default_destination_location_kwargs) File "/usr/local/lib/python3.6/json/__init__.py", line 354, in loads return _default_decoder.decode(s) File "/usr/local/lib/python3.6/json/decoder.py", line 339, in decode obj, end = self.raw_decode(s, idx=_w(s, 0).end()) File "/usr/local/lib/python3.6/json/decoder.py", line 357, in raw_decode raise JSONDecodeError("Expecting value", s, err.value) from None json.decoder.JSONDecodeError: Expecting value: line 2 column 41 (char 81) {noformat} -- This message was sent by Atlassian Jira (v8.3.4#803005)
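The traceback above is the usual failure mode of json.loads fed a Python-style literal instead of strict JSON. A hypothetical reconstruction, since the actual string at example_datasync_2.py line 70 is not shown; the sample kwargs below are made up:

```python
import json

# JSON requires double-quoted keys and strings; a single-quoted Python
# literal raises JSONDecodeError, just as in the DagBag import error above.
good = '{"Subdirectory": "/path"}'
bad = "{'Subdirectory': '/path'}"

print(json.loads(good))  # {'Subdirectory': '/path'}
try:
    json.loads(bad)
except json.JSONDecodeError as exc:
    print("JSONDecodeError:", exc.msg)  # the decoder's complaint
```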
[jira] [Resolved] (AIRFLOW-6715) Fix Google Cloud DLP Example DAG
[ https://issues.apache.org/jira/browse/AIRFLOW-6715?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6715. - Resolution: Fixed > Fix Google Cloud DLP Example DAG > > > Key: AIRFLOW-6715 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6715 > Project: Apache Airflow > Issue Type: Bug > Components: examples >Affects Versions: 1.10.7 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 2.0.0 > > > The example DAG for DLP operator has a wrong way of setting task dependencies -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6715) Fix Google Cloud DLP Example DAG
[ https://issues.apache.org/jira/browse/AIRFLOW-6715?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6715: Fix Version/s: 2.0.0 > Fix Google Cloud DLP Example DAG > > > Key: AIRFLOW-6715 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6715 > Project: Apache Airflow > Issue Type: Bug > Components: examples >Affects Versions: 1.10.7 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 2.0.0 > > > The example DAG for DLP operator has a wrong way of setting task dependencies -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6715) Fix Google Cloud DLP Example DAG
[ https://issues.apache.org/jira/browse/AIRFLOW-6715?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6715: Summary: Fix Google Cloud DLP Example DAG (was: Fix DLP Example DAG) > Fix Google Cloud DLP Example DAG > > > Key: AIRFLOW-6715 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6715 > Project: Apache Airflow > Issue Type: Bug > Components: examples >Affects Versions: 1.10.7 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > > The example DAG for DLP operator has a wrong way of setting task dependencies -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (AIRFLOW-6715) Fix DLP Example DAG
Kaxil Naik created AIRFLOW-6715: --- Summary: Fix DLP Example DAG Key: AIRFLOW-6715 URL: https://issues.apache.org/jira/browse/AIRFLOW-6715 Project: Apache Airflow Issue Type: Improvement Components: examples Affects Versions: 1.10.7 Reporter: Kaxil Naik Assignee: Kaxil Naik The example DAG for DLP operator has a wrong way of setting task dependencies -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6715) Fix DLP Example DAG
[ https://issues.apache.org/jira/browse/AIRFLOW-6715?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6715: Issue Type: Bug (was: Improvement) > Fix DLP Example DAG > --- > > Key: AIRFLOW-6715 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6715 > Project: Apache Airflow > Issue Type: Bug > Components: examples >Affects Versions: 1.10.7 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > > The example DAG for DLP operator has a wrong way of setting task dependencies -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-6694) print_stuff cannot be imported in kubernetes tests
[ https://issues.apache.org/jira/browse/AIRFLOW-6694?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6694. - Resolution: Fixed > print_stuff cannot be imported in kubernetes tests > -- > > Key: AIRFLOW-6694 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6694 > Project: Apache Airflow > Issue Type: Bug > Components: ci >Affects Versions: 2.0.0 >Reporter: Jarek Potiuk >Priority: Major > Fix For: 2.0.0 > > > Kubernetes tests fail because print_stuff cannot be imported. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-6694) print_stuff cannot be imported in kubernetes tests
[ https://issues.apache.org/jira/browse/AIRFLOW-6694?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17028399#comment-17028399 ] Kaxil Naik commented on AIRFLOW-6694: - [~potiuk] - I have already created a PR to fix it :) - https://github.com/apache/airflow/pull/7334 > print_stuff cannot be imported in kubernetes tests > -- > > Key: AIRFLOW-6694 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6694 > Project: Apache Airflow > Issue Type: Bug > Components: ci >Affects Versions: 2.0.0 >Reporter: Jarek Potiuk >Priority: Major > Fix For: 2.0.0 > > > Kubernetes tests fail because print_stuff cannot be imported. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-6694) print_stuff cannot be imported in kubernetes tests
[ https://issues.apache.org/jira/browse/AIRFLOW-6694?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17028379#comment-17028379 ] Kaxil Naik commented on AIRFLOW-6694: - Reopened the issue as I get following error when running Airflow (Master): {noformat} ERROR [airflow.models.dagbag.DagBag] Failed to import: /Users/kaxilnaik/airflow_master/dags/example_bigquery_1.py Traceback (most recent call last): File "/Users/kaxilnaik/Documents/GitHub/incubator-airflow/airflow/models/dagbag.py", line 248, in process_file loader.exec_module(m) File "", line 728, in exec_module File "", line 219, in _call_with_frames_removed File "/Users/kaxilnaik/airflow_master/dags/example_bigquery_1.py", line 28, in from airflow.gcp.operators.bigquery import ( ModuleNotFoundError: No module named 'airflow.gcp' ERROR [airflow.models.dagbag.DagBag] Failed to import: /Users/kaxilnaik/Documents/GitHub/incubator-airflow/airflow/example_dags/example_kubernetes_executor_config.py Traceback (most recent call last): File "/Users/kaxilnaik/Documents/GitHub/incubator-airflow/airflow/models/dagbag.py", line 248, in process_file loader.exec_module(m) File "", line 728, in exec_module File "", line 219, in _call_with_frames_removed File "/Users/kaxilnaik/Documents/GitHub/incubator-airflow/airflow/example_dags/example_kubernetes_executor_config.py", line 24, in from libs.helper import print_stuff ModuleNotFoundError: No module named 'libs' {noformat} > print_stuff cannot be imported in kubernetes tests > -- > > Key: AIRFLOW-6694 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6694 > Project: Apache Airflow > Issue Type: Bug > Components: ci >Affects Versions: 2.0.0 >Reporter: Jarek Potiuk >Priority: Major > Fix For: 2.0.0 > > > Kubernetes tests fail because print_stuff cannot be imported. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Reopened] (AIRFLOW-6694) print_stuff cannot be imported in kubernetes tests
[ https://issues.apache.org/jira/browse/AIRFLOW-6694?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik reopened AIRFLOW-6694: - > print_stuff cannot be imported in kubernetes tests > -- > > Key: AIRFLOW-6694 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6694 > Project: Apache Airflow > Issue Type: Bug > Components: ci >Affects Versions: 2.0.0 >Reporter: Jarek Potiuk >Priority: Major > Fix For: 2.0.0 > > > Kubernetes tests fail because print_stuff cannot be imported. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6527) Error sending Celery task:Timeout in send_task_to_executor
[ https://issues.apache.org/jira/browse/AIRFLOW-6527?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6527: Fix Version/s: 1.10.8 > Error sending Celery task:Timeout in send_task_to_executor > -- > > Key: AIRFLOW-6527 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6527 > Project: Apache Airflow > Issue Type: Bug > Components: scheduler >Affects Versions: 1.10.7 >Reporter: Qian Yu >Priority: Major > Fix For: 2.0.0, 1.10.8 > > > We use Airflow with CeleryExecutor and redis broker. Our airflow scheduler > often encounters this \{{AirflowTaskTimeout}} error. > - This happens in \{{send_task_to_executor()}}. > - It only happens occasionally. > - Retrying the failed task a few times always works. > - This affects at least 1.10.6 and 1.10.7 and possibly other versions too. > Possible cause: > Our airflow venv and dags_folder are on an NFS mount because we want to keep > the various pieces of Airflow services in sync. > The NFS mount can be slow sometimes. This causes the import to be slow and > causes \{{send_task_to_executor()}} to take more than 2 seconds. > Other people with similar looking problems: > The following issue is now closed. It's not clear to me whether or how the > user resolved this issue. > https://github.com/bitnami/bitnami-docker-airflow-scheduler/issues/1 > Another user asked a question in the mailing list. It's not answered. > https://www.mail-archive.com/dev@airflow.apache.org/msg01093.html > Proposed workaround: > - Make this `timeout(seconds=2)` configurable. E.g adding a > [celery]send_task_timeout to airflow.cfg. Since 2 seconds seems too short, we > can configure it to something like 15 seconds to make it much less likely to > happen. > - Move airflow venv to the local disk. This makes it inconvenient to sync > airflow installation across multiple hosts though. 
> {code} > Jan 09 22:46:59 scheduler_host airflow[18882]: [2020-01-09 22:46:59,763] > \{celery_executor.py:224} ERROR - Error sending Celery task:Timeout, PID: > 27724 > Jan 09 22:46:59 scheduler_host airflow[18882]: Celery Task ID: > ('example_daily', 'example_sensor1', datetime.datetime(2020, 1, 9, 0, 0, > tzinfo=), 1) > Jan 09 22:46:59 scheduler_host airflow[18882]: Traceback (most recent call > last): > Jan 09 22:46:59 scheduler_host airflow[18882]: File > "/mnt/nfs1/airflow_venv/lib/python3.6/site-packages/kombu/utils/objects.py", > line 42, in __get__ > Jan 09 22:46:59 scheduler_host airflow[18882]: return > obj.__dict__[self.__name__] > Jan 09 22:46:59 scheduler_host airflow[18882]: KeyError: 'amqp' > Jan 09 22:46:59 scheduler_host airflow[18882]: During handling of the above > exception, another exception occurred: > Jan 09 22:46:59 scheduler_host airflow[18882]: Traceback (most recent call > last): > Jan 09 22:46:59 scheduler_host airflow[18882]: File > "/mnt/nfs1/airflow_venv/lib/python3.6/site-packages/airflow/executors/celery_executor.py", > line 118, in send_task_to_executor > Jan 09 22:46:59 scheduler_host airflow[18882]: result = > task.apply_async(args=[command], queue=queue) > Jan 09 22:46:59 scheduler_host airflow[18882]: File > "/mnt/nfs1/airflow_venv/lib/python3.6/site-packages/celery/app/task.py", line > 570, in apply_async > Jan 09 22:46:59 scheduler_host airflow[18882]: **options > Jan 09 22:46:59 scheduler_host airflow[18882]: File > "/mnt/nfs1/airflow_venv/lib/python3.6/site-packages/celery/app/base.py", line > 712, in send_task > Jan 09 22:46:59 scheduler_host airflow[18882]: amqp = self.amqp > Jan 09 22:46:59 scheduler_host airflow[18882]: File > "/mnt/nfs1/airflow_venv/lib/python3.6/site-packages/kombu/utils/objects.py", > line 44, in __get__ > Jan 09 22:46:59 scheduler_host airflow[18882]: value = > obj.__dict__[self.__name__] = self.__get(obj) > Jan 09 22:46:59 scheduler_host airflow[18882]: File > 
"/mnt/nfs1/airflow_venv/lib/python3.6/site-packages/celery/app/base.py", line > 1202, in amqp > Jan 09 22:46:59 scheduler_host airflow[18882]: return > instantiate(self.amqp_cls, app=self) > Jan 09 22:46:59 scheduler_host airflow[18882]: File > "/mnt/nfs1/airflow_venv/lib/python3.6/site-packages/celery/utils/imports.py", > line 55, in instantiate > Jan 09 22:46:59 scheduler_host airflow[18882]: return > symbol_by_name(name)(*args, **kwargs) > Jan 09 22:46:59 scheduler_host airflow[18882]: File > "/mnt/nfs1/airflow_venv/lib/python3.6/site-packages/kombu/utils/imports.py", > line 57, in symbol_by_name > Jan 09 22:46:59 scheduler_host airflow[18882]: module = imp(module_name, > package=package, **kwargs) > Jan 09 22:46:59 scheduler_host airflow[18882]: File > "/usr/lib/python3.6/importlib/__init__.py", line 126, in import_module > Jan 09 22:46:59 scheduler_host airflow[18882]: return >
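The first proposed workaround amounts to replacing the hard-coded timeout(seconds=2) with a configurable value. A rough sketch, assuming a signal-based timeout as in airflow.utils.timeout; the [celery] send_task_timeout option name is the ticket's proposal, and the function below is a simplification, not the real send_task_to_executor:

```python
import signal
from contextlib import contextmanager

@contextmanager
def timeout(seconds):
    """Raise TimeoutError if the block runs longer than `seconds` (Unix only)."""
    def handler(signum, frame):
        raise TimeoutError("Timed out after %s seconds" % seconds)
    old = signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)
    try:
        yield
    finally:
        signal.alarm(0)
        signal.signal(signal.SIGALRM, old)

def send_task_to_executor(send, command, queue, send_timeout=15):
    # In the real patch, send_timeout would come from something like
    # conf.getint('celery', 'send_task_timeout', fallback=2); it is a plain
    # parameter here so the sketch stays self-contained.
    try:
        with timeout(seconds=send_timeout):
            return send(command, queue)
    except TimeoutError:
        return None  # the real code logs the error and records a failure
```

With the limit raised to something like 15 seconds, a slow NFS-backed import no longer trips the timeout on the first attempt.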
[jira] [Resolved] (AIRFLOW-6527) Error sending Celery task:Timeout in send_task_to_executor
[ https://issues.apache.org/jira/browse/AIRFLOW-6527?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6527. - Fix Version/s: 2.0.0 Resolution: Fixed > Error sending Celery task:Timeout in send_task_to_executor > -- > > Key: AIRFLOW-6527 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6527 > Project: Apache Airflow > Issue Type: Bug > Components: scheduler >Affects Versions: 1.10.7 >Reporter: Qian Yu >Priority: Major > Fix For: 2.0.0 > > > We use Airflow with CeleryExecutor and redis broker. Our airflow scheduler > often encounters this \{{AirflowTaskTimeout}} error. > - This happens in \{{send_task_to_executor()}}. > - It only happens occasionally. > - Retrying the failed task a few times always works. > - This affects at least 1.10.6 and 1.10.7 and possibly other versions too. > Possible cause: > Our airflow venv and dags_folder are on an NFS mount because we want to keep > the various pieces of Airflow services in sync. > The NFS mount can be slow sometimes. This causes the import to be slow and > causes \{{send_task_to_executor()}} to take more than 2 seconds. > Other people with similar looking problems: > The following issue is now closed. It's not clear to me whether or how the > user resolved this issue. > https://github.com/bitnami/bitnami-docker-airflow-scheduler/issues/1 > Another user asked a question in the mailing list. It's not answered. > https://www.mail-archive.com/dev@airflow.apache.org/msg01093.html > Proposed workaround: > - Make this `timeout(seconds=2)` configurable. E.g adding a > [celery]send_task_timeout to airflow.cfg. Since 2 seconds seems too short, we > can configure it to something like 15 seconds to make it much less likely to > happen. > - Move airflow venv to the local disk. This makes it inconvenient to sync > airflow installation across multiple hosts though. 
> {code} > Jan 09 22:46:59 scheduler_host airflow[18882]: [2020-01-09 22:46:59,763] > \{celery_executor.py:224} ERROR - Error sending Celery task:Timeout, PID: > 27724 > Jan 09 22:46:59 scheduler_host airflow[18882]: Celery Task ID: > ('example_daily', 'example_sensor1', datetime.datetime(2020, 1, 9, 0, 0, > tzinfo=), 1) > Jan 09 22:46:59 scheduler_host airflow[18882]: Traceback (most recent call > last): > Jan 09 22:46:59 scheduler_host airflow[18882]: File > "/mnt/nfs1/airflow_venv/lib/python3.6/site-packages/kombu/utils/objects.py", > line 42, in __get__ > Jan 09 22:46:59 scheduler_host airflow[18882]: return > obj.__dict__[self.__name__] > Jan 09 22:46:59 scheduler_host airflow[18882]: KeyError: 'amqp' > Jan 09 22:46:59 scheduler_host airflow[18882]: During handling of the above > exception, another exception occurred: > Jan 09 22:46:59 scheduler_host airflow[18882]: Traceback (most recent call > last): > Jan 09 22:46:59 scheduler_host airflow[18882]: File > "/mnt/nfs1/airflow_venv/lib/python3.6/site-packages/airflow/executors/celery_executor.py", > line 118, in send_task_to_executor > Jan 09 22:46:59 scheduler_host airflow[18882]: result = > task.apply_async(args=[command], queue=queue) > Jan 09 22:46:59 scheduler_host airflow[18882]: File > "/mnt/nfs1/airflow_venv/lib/python3.6/site-packages/celery/app/task.py", line > 570, in apply_async > Jan 09 22:46:59 scheduler_host airflow[18882]: **options > Jan 09 22:46:59 scheduler_host airflow[18882]: File > "/mnt/nfs1/airflow_venv/lib/python3.6/site-packages/celery/app/base.py", line > 712, in send_task > Jan 09 22:46:59 scheduler_host airflow[18882]: amqp = self.amqp > Jan 09 22:46:59 scheduler_host airflow[18882]: File > "/mnt/nfs1/airflow_venv/lib/python3.6/site-packages/kombu/utils/objects.py", > line 44, in __get__ > Jan 09 22:46:59 scheduler_host airflow[18882]: value = > obj.__dict__[self.__name__] = self.__get(obj) > Jan 09 22:46:59 scheduler_host airflow[18882]: File > 
"/mnt/nfs1/airflow_venv/lib/python3.6/site-packages/celery/app/base.py", line > 1202, in amqp > Jan 09 22:46:59 scheduler_host airflow[18882]: return > instantiate(self.amqp_cls, app=self) > Jan 09 22:46:59 scheduler_host airflow[18882]: File > "/mnt/nfs1/airflow_venv/lib/python3.6/site-packages/celery/utils/imports.py", > line 55, in instantiate > Jan 09 22:46:59 scheduler_host airflow[18882]: return > symbol_by_name(name)(*args, **kwargs) > Jan 09 22:46:59 scheduler_host airflow[18882]: File > "/mnt/nfs1/airflow_venv/lib/python3.6/site-packages/kombu/utils/imports.py", > line 57, in symbol_by_name > Jan 09 22:46:59 scheduler_host airflow[18882]: module = imp(module_name, > package=package, **kwargs) > Jan 09 22:46:59 scheduler_host airflow[18882]: File > "/usr/lib/python3.6/importlib/__init__.py", line 126, in import_module > Jan 09 22:46:59 scheduler_host airflow[18882]: return >
[jira] [Resolved] (AIRFLOW-6258) CloudFormation create_stack and delete_stack operators
[ https://issues.apache.org/jira/browse/AIRFLOW-6258?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6258. - Fix Version/s: 2.0.0 Resolution: Fixed > CloudFormation create_stack and delete_stack operators > -- > > Key: AIRFLOW-6258 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6258 > Project: Apache Airflow > Issue Type: New Feature > Components: contrib >Affects Versions: 1.10.6 >Reporter: Aviem Zur >Assignee: Aviem Zur >Priority: Major > Fix For: 2.0.0 > > > Add CloudFormation create_stack and delete_stack operators. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-5843) Add conf form when trigger DAG from the WEB.
[ https://issues.apache.org/jira/browse/AIRFLOW-5843?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-5843. - Fix Version/s: 1.10.8 2.0.0 Resolution: Fixed > Add conf form when trigger DAG from the WEB. > - > > Key: AIRFLOW-5843 > URL: https://issues.apache.org/jira/browse/AIRFLOW-5843 > Project: Apache Airflow > Issue Type: Wish > Components: ui >Affects Versions: 1.10.6 >Reporter: jihun.no >Assignee: James Coder >Priority: Minor > Fix For: 2.0.0, 1.10.8 > > > When we trigger a DAG via the Airflow CLI, it is possible to pass conf like this. > {code:java} > airflow trigger_dag --conf '{"file_variable": "/path/to/file"}' dag_id > {code} > > But sometimes, access to the webserver's shell is not easy or convenient. > So it would be very helpful if we could specify conf when triggering a DAG > from Airflow's web UI. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-6666) Resolve js-yaml advisories
[ https://issues.apache.org/jira/browse/AIRFLOW-6666?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6666. - Resolution: Fixed > Resolve js-yaml advisories > -- > > Key: AIRFLOW-6666 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6666 > Project: Apache Airflow > Issue Type: Improvement > Components: webserver >Affects Versions: 1.10.7 >Reporter: Ry Walker >Assignee: Ry Walker >Priority: Major > Fix For: 1.10.8 > > > Discovered via `npm audit` > # https://npmjs.com/advisories/788 > # https://npmjs.com/advisories/813 -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-6626) Add email on failure or retry to default config
[ https://issues.apache.org/jira/browse/AIRFLOW-6626?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6626. - Fix Version/s: 1.10.8 2.0.0 Resolution: Fixed > Add email on failure or retry to default config > --- > > Key: AIRFLOW-6626 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6626 > Project: Apache Airflow > Issue Type: Task > Components: configuration >Affects Versions: 1.10.7 >Reporter: Xinbin Huang >Assignee: Xinbin Huang >Priority: Major > Fix For: 2.0.0, 1.10.8 > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-6706) Lazy load operator extra links
[ https://issues.apache.org/jira/browse/AIRFLOW-6706?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6706. - Fix Version/s: 1.10.8 2.0.0 Resolution: Fixed > Lazy load operator extra links > -- > > Key: AIRFLOW-6706 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6706 > Project: Apache Airflow > Issue Type: Improvement > Components: serialization >Affects Versions: 1.10.7 >Reporter: Kamil Bregula >Priority: Major > Fix For: 2.0.0, 1.10.8 > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-3007) Scheduler docs use deprecated use of `schedule_interval`
[ https://issues.apache.org/jira/browse/AIRFLOW-3007?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17027734#comment-17027734 ] Kaxil Naik commented on AIRFLOW-3007: - [~matjazmav] You might be passing *schedule_interval* in *default_args* or passing it to a *task*. You should only set it at the DAG level > Scheduler docs use deprecated use of `schedule_interval` > - > > Key: AIRFLOW-3007 > URL: https://issues.apache.org/jira/browse/AIRFLOW-3007 > Project: Apache Airflow > Issue Type: Improvement > Components: documentation >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 2.0.0 > > Attachments: screenshot-1.png > > > The scheduler docs at > https://airflow.apache.org/scheduler.html#backfill-and-catchup use a deprecated > way of passing {{schedule_interval}}. {{schedule_interval}} should be passed > to the DAG as a separate parameter and not as a default arg. > !screenshot-1.png! -- This message was sent by Atlassian Jira (v8.3.4#803005)
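The comment above can be shown concretely. To keep the sketch free of an Airflow install, the DAG construction itself is only indicated in a comment; the dag_id and interval are illustrative:

```python
from datetime import datetime, timedelta

# Deprecated placement: schedule_interval buried in default_args, which are
# meant for task-level defaults, not the DAG's schedule.
default_args = {
    "owner": "airflow",
    "start_date": datetime(2020, 1, 1),
    # "schedule_interval": "@daily",   # <- the placement the docs warn against
}

# Correct placement: schedule_interval passed to the DAG itself, e.g.
#   dag = DAG("example_dag", default_args=default_args, **dag_kwargs)
dag_kwargs = {"schedule_interval": timedelta(days=1)}
```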
[jira] [Resolved] (AIRFLOW-2923) LatestOnlyOperator cascade skip through all_done and dummy
[ https://issues.apache.org/jira/browse/AIRFLOW-2923?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-2923. - Fix Version/s: 2.0.0 Resolution: Fixed > LatestOnlyOperator cascade skip through all_done and dummy > -- > > Key: AIRFLOW-2923 > URL: https://issues.apache.org/jira/browse/AIRFLOW-2923 > Project: Apache Airflow > Issue Type: Bug > Components: scheduler >Affects Versions: 1.9.0, 1.10.5, 1.10.6, 1.10.7 > Environment: CeleryExecutor, 2-nodes cluster, RMQ, PostgreSQL >Reporter: Dmytro Kulyk >Assignee: Cedrik Neumann >Priority: Major > Labels: cascade, latestonly, skip > Fix For: 2.0.0 > > Attachments: cube_update.py, screenshot-1.png > > > DAG with a consolidating point (calc_ready : dummy) > as per [https://airflow.apache.org/concepts.html#latest-run-only] the given task > should be run even when catching up execution DagRuns for previous periods. > However, LatestOnlyOperator cascades the skip through calc_ready even though it is a > dummy with trigger_rule=all_done or none_failed. > Same behavior with trigger_rule=all_success > {code} > t_ready = DummyOperator( > task_id = 'calc_ready', > trigger_rule = TriggerRule.ALL_DONE, > dag=dag) > {code} > !screenshot-1.png! -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-1784) SKIPPED status is being cascading wrongly
[ https://issues.apache.org/jira/browse/AIRFLOW-1784?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-1784. - Fix Version/s: 2.0.0 Resolution: Fixed > SKIPPED status is being cascading wrongly > - > > Key: AIRFLOW-1784 > URL: https://issues.apache.org/jira/browse/AIRFLOW-1784 > Project: Apache Airflow > Issue Type: Bug > Components: operators >Affects Versions: 1.8.2 > Environment: Ubuntu 16.04.3 LTS > Python 2.7.12 > CeleryExecutor: 2-nodes cluster >Reporter: Dmytro Kulyk >Assignee: Cedrik Neumann >Priority: Critical > Labels: documentation, latestonly, operators > Fix For: 2.0.0 > > Attachments: Capture_graph.JPG, Capture_tree2.JPG, cube_update.py > > > After the implementation of AIRFLOW-1296 in 1.8.2 there is wrong behavior > of LatestOnlyOperator, which forces the SKIPPED status to cascade despite > TriggerRule='all_done' being set. > This is the opposite of what is documented > [here|https://airflow.incubator.apache.org/concepts.html#latest-run-only] > *Expected Behavior:* > the dummy task and all downstreams (update_*) should not be skipped > Full listings are attached > 1.8.1 did not have this issue -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6428) Fix import path for airflow.utils.dates.days_ago in Example DAGs
[ https://issues.apache.org/jira/browse/AIRFLOW-6428?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6428: Fix Version/s: (was: 2.0.0) 1.10.8 > Fix import path for airflow.utils.dates.days_ago in Example DAGs > > > Key: AIRFLOW-6428 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6428 > Project: Apache Airflow > Issue Type: Improvement > Components: utils >Affects Versions: 1.10.7 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Major > Fix For: 1.10.8 > > Attachments: image-2020-01-02-15-33-55-601.png > > > Currently, without the entry in __init__.py, IDEs show that they cannot find > the reference to *dates*, and hence if you try to find the reference for the > *days_ago* function or the *dates* module it can't be found. > See the attachment -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6479) Update celery command calls
[ https://issues.apache.org/jira/browse/AIRFLOW-6479?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6479: Fix Version/s: (was: 1.10.8) 2.0.0 > Update celery command calls > --- > > Key: AIRFLOW-6479 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6479 > Project: Apache Airflow > Issue Type: Improvement > Components: core, documentation >Affects Versions: 1.10.7 >Reporter: Kamil Bregula >Priority: Major > Fix For: 2.0.0 > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Closed] (AIRFLOW-6567) Configuration broken when using proxy_fix
[ https://issues.apache.org/jira/browse/AIRFLOW-6567?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik closed AIRFLOW-6567. --- Resolution: Fixed This issue looks like a duplicate of https://issues.apache.org/jira/browse/AIRFLOW-6345 > Configuration broken when using proxy_fix > - > > Key: AIRFLOW-6567 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6567 > Project: Apache Airflow > Issue Type: Bug > Components: webserver >Affects Versions: 1.10.7 >Reporter: Tobias Edvardsson >Priority: Blocker > > We tried to upgrade from 1.10.6 to 1.10.7 but encountered an issue when > trying to access the webserver. We are running the solution in Kubernetes at > AWS behind Kong as a proxy/gateway. > 1.10.6 works nicely but after the upgrade we get the following exception: > {code:java}
> [2020-01-15 08:12:02 +] [111] [ERROR] Error handling request /
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.7/site-packages/gunicorn/workers/sync.py", line 135, in handle
>     self.handle_request(listener, req, client, addr)
>   File "/usr/local/lib/python3.7/site-packages/gunicorn/workers/sync.py", line 176, in handle_request
>     respiter = self.wsgi(environ, resp.start_response)
>   File "/usr/local/lib/python3.7/site-packages/werkzeug/middleware/dispatcher.py", line 66, in __call__
>     return app(environ, start_response)
>   File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 2463, in __call__
>     return self.wsgi_app(environ, start_response)
>   File "/usr/local/lib/python3.7/site-packages/werkzeug/middleware/proxy_fix.py", line 195, in __call__
>     x_for = self._get_trusted_comma(self.x_for, environ_get("HTTP_X_FORWARDED_FOR"))
>   File "/usr/local/lib/python3.7/site-packages/werkzeug/middleware/proxy_fix.py", line 166, in _get_trusted_comma
>     if len(values) >= trusted:
> TypeError: '>=' not supported between instances of 'int' and 'str'
> {code}
> From my own investigation the issue seems to be the new configuration options > for the proxy: the configuration values fetched are returned as strings > rather than integers as they should be. > [https://github.com/apache/airflow/pull/6723] > {code:java} > x_for=conf.get("webserver", "PROXY_FIX_X_FOR", fallback=1), > x_proto=conf.get("webserver", "PROXY_FIX_X_PROTO", fallback=1), > x_host=conf.get("webserver", "PROXY_FIX_X_HOST", fallback=1), > x_port=conf.get("webserver", "PROXY_FIX_X_PORT", fallback=1), > x_prefix=conf.get("webserver", "PROXY_FIX_X_PREFIX", fallback=1) > {code} > (rows 51-56 in airflow/www/app.py) > These values are expected to be integers, not strings. > -- This message was sent by Atlassian Jira (v8.3.4#803005)
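The type mismatch generalizes beyond Airflow: Python's configparser (which Airflow's conf object wraps) always returns strings from get(). A minimal stand-alone sketch of the failure and the fix — the section and option names here simply mirror the report, and getint() is configparser's standard coercing accessor:

```python
import configparser

# Hypothetical config mirroring the [webserver] proxy options in the report.
cfg = configparser.ConfigParser()
cfg.read_string("[webserver]\nproxy_fix_x_for = 1\n")

trusted = cfg.get("webserver", "proxy_fix_x_for")  # "1" -- get() always returns str
values = ["10.0.0.1"]  # stand-in for the parsed X-Forwarded-For entries

try:
    len(values) >= trusted  # int vs. str, as in ProxyFix._get_trusted_comma
except TypeError as exc:
    print(exc)  # '>=' not supported between instances of 'int' and 'str'

# The fix amounts to coercing to int; with configparser that is getint()
# (Airflow's conf exposes the same accessor).
trusted = cfg.getint("webserver", "proxy_fix_x_for")
print(len(values) >= trusted)  # True
```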
[jira] [Updated] (AIRFLOW-6595) Use TaskNotFound exception instead of AirflowException
[ https://issues.apache.org/jira/browse/AIRFLOW-6595?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6595: Fix Version/s: (was: 2.0.0) > Use TaskNotFound exception instead of AirflowException > -- > > Key: AIRFLOW-6595 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6595 > Project: Apache Airflow > Issue Type: Improvement > Components: core >Affects Versions: 1.10.7 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 1.10.8 > > > We should use *TaskNotFound* exception when the Task no longer exists instead > of the general AirflowException -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-6639) Remove duplicate Output format choices from CLI docs
[ https://issues.apache.org/jira/browse/AIRFLOW-6639?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6639. - Resolution: Fixed > Remove duplicate Output format choices from CLI docs > > > Key: AIRFLOW-6639 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6639 > Project: Apache Airflow > Issue Type: Improvement > Components: documentation >Affects Versions: 2.0.0 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 2.0.0 > > > {noformat} > Output table format. The specified value is passed to the tabulate module > (https://pypi.org/project/tabulate/). Valid values are: > (fancy_grid|github|grid|html|jira|latex|latex_booktabs|latex_raw|mediawiki|moinmoin|orgtbl|pipe|plain|presto|psql|rst|simple|textile|tsv|youtrack) > {noformat} > The above is repeated twice. > https://airflow.readthedocs.io/en/latest/cli-ref.html#tasks -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6639) Remove duplicate Output format choices from CLI docs
[ https://issues.apache.org/jira/browse/AIRFLOW-6639?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6639: Fix Version/s: 2.0.0 > Remove duplicate Output format choices from CLI docs > > > Key: AIRFLOW-6639 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6639 > Project: Apache Airflow > Issue Type: Improvement > Components: documentation >Affects Versions: 2.0.0 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 2.0.0 > > > {noformat} > Output table format. The specified value is passed to the tabulate module > (https://pypi.org/project/tabulate/). Valid values are: > (fancy_grid|github|grid|html|jira|latex|latex_booktabs|latex_raw|mediawiki|moinmoin|orgtbl|pipe|plain|presto|psql|rst|simple|textile|tsv|youtrack) > {noformat} > The above is repeated twice. > https://airflow.readthedocs.io/en/latest/cli-ref.html#tasks -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (AIRFLOW-6639) Remove duplicate Output format choices from CLI docs
Kaxil Naik created AIRFLOW-6639: --- Summary: Remove duplicate Output format choices from CLI docs Key: AIRFLOW-6639 URL: https://issues.apache.org/jira/browse/AIRFLOW-6639 Project: Apache Airflow Issue Type: Improvement Components: documentation Affects Versions: 2.0.0 Reporter: Kaxil Naik Assignee: Kaxil Naik {noformat} Output table format. The specified value is passed to the tabulate module (https://pypi.org/project/tabulate/). Valid values are: (fancy_grid|github|grid|html|jira|latex|latex_booktabs|latex_raw|mediawiki|moinmoin|orgtbl|pipe|plain|presto|psql|rst|simple|textile|tsv|youtrack) {noformat} The above is repeated twice. https://airflow.readthedocs.io/en/latest/cli-ref.html#tasks -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-6620) Mock celery in worker cli test
[ https://issues.apache.org/jira/browse/AIRFLOW-6620?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6620. - Fix Version/s: 1.10.8 Resolution: Fixed > Mock celery in worker cli test > -- > > Key: AIRFLOW-6620 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6620 > Project: Apache Airflow > Issue Type: Improvement > Components: celery, tests >Affects Versions: 1.10.8 >Reporter: Tomasz Urbaszek >Priority: Major > Fix For: 1.10.8 > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Closed] (AIRFLOW-6460) Reduce timeout in pytest
[ https://issues.apache.org/jira/browse/AIRFLOW-6460?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik closed AIRFLOW-6460. --- Resolution: Fixed > Reduce timeout in pytest > > > Key: AIRFLOW-6460 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6460 > Project: Apache Airflow > Issue Type: Improvement > Components: tests >Affects Versions: 1.10.7 >Reporter: Kamil Bregula >Assignee: Kamil Bregula >Priority: Major > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6342) EmrAddStepsOperator broken reference and faulty test
[ https://issues.apache.org/jira/browse/AIRFLOW-6342?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6342: Fix Version/s: (was: 1.10.8) > EmrAddStepsOperator broken reference and faulty test > > > Key: AIRFLOW-6342 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6342 > Project: Apache Airflow > Issue Type: New Feature > Components: contrib >Affects Versions: 1.10.7 >Reporter: Aviem Zur >Assignee: Aviem Zur >Priority: Major > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6460) Reduce timeout in pytest
[ https://issues.apache.org/jira/browse/AIRFLOW-6460?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6460: Fix Version/s: (was: 1.10.8) > Reduce timeout in pytest > > > Key: AIRFLOW-6460 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6460 > Project: Apache Airflow > Issue Type: Improvement > Components: tests >Affects Versions: 1.10.7 >Reporter: Kamil Bregula >Assignee: Kamil Bregula >Priority: Major > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Reopened] (AIRFLOW-6460) Reduce timeout in pytest
[ https://issues.apache.org/jira/browse/AIRFLOW-6460?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik reopened AIRFLOW-6460: - > Reduce timeout in pytest > > > Key: AIRFLOW-6460 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6460 > Project: Apache Airflow > Issue Type: Improvement > Components: tests >Affects Versions: 1.10.7 >Reporter: Kamil Bregula >Assignee: Kamil Bregula >Priority: Major > Fix For: 1.10.8 > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-6514) AIRFLOW-5695 didn't make it into release
[ https://issues.apache.org/jira/browse/AIRFLOW-6514?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6514. - Resolution: Fixed Backported to old UI too https://github.com/apache/airflow/commit/4bd2f574cd527c634e0ddeb607a35e363d975524 > AIRFLOW-5695 didn't make it into release > > > Key: AIRFLOW-6514 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6514 > Project: Apache Airflow > Issue Type: Bug > Components: webserver >Affects Versions: 1.10.7 >Reporter: david liu >Priority: Minor > Fix For: 1.10.8 > > > Ticket AIRFLOW-5695 is listed as being included in the 1.10.7 release in the > changelogs, but appears to have been omitted. There's no reference to this fix in > the commits > [https://github.com/apache/airflow/commits/1.10.7/airflow/www/views.py] > And an inspection of the file shows that master is updated but the > release is not > [https://github.com/apache/airflow/blob/1.10.7/airflow/www/views.py#L1134] > [https://github.com/apache/airflow/blob/master/airflow/www/views.py#L897] > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6414) Create a configurations page in airflow documentation
[ https://issues.apache.org/jira/browse/AIRFLOW-6414?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6414: Fix Version/s: (was: 1.10.8) > Create a configurations page in airflow documentation > - > > Key: AIRFLOW-6414 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6414 > Project: Apache Airflow > Issue Type: Improvement > Components: documentation >Affects Versions: 1.10.7 >Reporter: Daniel Imberman >Assignee: Daniel Imberman >Priority: Major > > Many questions on the airflow slack and the airflow mailing list come down to > people not knowing which configurations can best help their airflow clusters > run optimally. There appears to be no single hub of all configuration options, > and users sometimes find themselves going into airflow source code to find > configurations. > This will place all configuration options in > airflow/docs/configurations.rst. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Reopened] (AIRFLOW-4242) Missing description of statsd metric keys
[ https://issues.apache.org/jira/browse/AIRFLOW-4242?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik reopened AIRFLOW-4242: - > Missing description of statsd metric keys > - > > Key: AIRFLOW-4242 > URL: https://issues.apache.org/jira/browse/AIRFLOW-4242 > Project: Apache Airflow > Issue Type: New Feature > Components: documentation >Reporter: Kamil Bregula >Priority: Trivial > Fix For: 1.10.8 > > > Missing keys: `local_task_job_heartbeat_failure`, > `local_task_job_prolonged_heartbeat_failure`, > `task_removed_from_dag.`, `task_restored_to_dag.`, > `task_instance_created-`, `dag_file_refresh_error` -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Closed] (AIRFLOW-4242) Missing description of statsd metric keys
[ https://issues.apache.org/jira/browse/AIRFLOW-4242?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik closed AIRFLOW-4242. --- Resolution: Fixed > Missing description of statsd metric keys > - > > Key: AIRFLOW-4242 > URL: https://issues.apache.org/jira/browse/AIRFLOW-4242 > Project: Apache Airflow > Issue Type: New Feature > Components: documentation >Reporter: Kamil Bregula >Priority: Trivial > > Missing keys: `local_task_job_heartbeat_failure`, > `local_task_job_prolonged_heartbeat_failure`, > `task_removed_from_dag.`, `task_restored_to_dag.`, > `task_instance_created-`, `dag_file_refresh_error` -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6428) Fix import path for airflow.utils.dates.days_ago in Example DAGs
[ https://issues.apache.org/jira/browse/AIRFLOW-6428?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6428: Fix Version/s: (was: 1.10.8) > Fix import path for airflow.utils.dates.days_ago in Example DAGs > > > Key: AIRFLOW-6428 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6428 > Project: Apache Airflow > Issue Type: Improvement > Components: utils >Affects Versions: 1.10.7 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Major > Fix For: 2.0.0 > > Attachments: image-2020-01-02-15-33-55-601.png > > > Currently, without the entry in __init__.py, IDEs show that they cannot find > the reference to *dates*, and hence if you try to find the reference for the > *days_ago* function or the *dates* module it can't be found. > See the attachment -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-2279) Clearing Tasks Across DAGs
[ https://issues.apache.org/jira/browse/AIRFLOW-2279?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-2279: Fix Version/s: (was: 2.0.0) 1.10.8 > Clearing Tasks Across DAGs > -- > > Key: AIRFLOW-2279 > URL: https://issues.apache.org/jira/browse/AIRFLOW-2279 > Project: Apache Airflow > Issue Type: Improvement >Reporter: Achal Soni >Assignee: Qian Yu >Priority: Major > Fix For: 1.10.8 > > Attachments: cross_dag_ui_screenshot.png > > > At Stripe, we commonly have discrete dags that depend on each other by > leveraging ExternalTaskSensors. We also find ourselves routinely wanting to > not only clear tasks and their downstream tasks in a particular dag, but also > their downstream tasks in their dependent dags (linked by > ExternalTaskSensors). > We currently have extended Airflow to handle this by modifying the webapp and > cli tool to optionally clear dependent tasks across multiple dags (see > attached screenshot). > We want to open the floor for discussion with the larger Airflow community > about the usage of ExternalTaskSensors and specifically how to handle > clearing across dags. We are interested in learning more about the accepted > practices in this regard, and are very open/willing to contribute in this > area if there is interest! -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-6434) Docker Operator No Longer XComs Result in 1.10.7
[ https://issues.apache.org/jira/browse/AIRFLOW-6434?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6434. - Resolution: Fixed > Docker Operator No Longer XComs Result in 1.10.7 > > > Key: AIRFLOW-6434 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6434 > Project: Apache Airflow > Issue Type: Bug > Components: operators, xcom >Affects Versions: 1.10.7 >Reporter: Brian Phillips >Priority: Trivial > Fix For: 1.10.8 > > > This change > ([https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212)|https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212] > introduced a slight (and I believe unintended) change to the Docker Operator > xcom behavior. > Even if xcom_push is True, DockerOperator.execute will not return a value and > thus will not push an xcom value. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Reopened] (AIRFLOW-6434) Docker Operator No Longer XComs Result in 1.10.7
[ https://issues.apache.org/jira/browse/AIRFLOW-6434?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik reopened AIRFLOW-6434: - > Docker Operator No Longer XComs Result in 1.10.7 > > > Key: AIRFLOW-6434 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6434 > Project: Apache Airflow > Issue Type: Bug > Components: operators, xcom >Affects Versions: 1.10.7 >Reporter: Brian Phillips >Priority: Trivial > Fix For: 1.10.8 > > > This change > ([https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212)|https://github.com/apache/airflow/commit/8a6dc6657d2a32ac3979e6478512d518ad5a5212] > introduced a slight (and I believe unintended) change to the Docker Operator > xcom behavior. > Even if xcom_push is True, DockerOperator.execute will not return a value and > thus will not push an xcom value. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6376) Remove code duplicate -order_queued_tasks_by_priority
[ https://issues.apache.org/jira/browse/AIRFLOW-6376?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6376: Fix Version/s: (was: 1.10.8) 2.0.0 > Remove code duplicate -order_queued_tasks_by_priority > - > > Key: AIRFLOW-6376 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6376 > Project: Apache Airflow > Issue Type: Improvement > Components: executors >Affects Versions: 1.10.7 >Reporter: Kamil Bregula >Priority: Trivial > Fix For: 2.0.0 > > > We have identical code in two places. I want to extract it into the base > classes. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6238) Dags stats endpoint is slow and returns a large payload
[ https://issues.apache.org/jira/browse/AIRFLOW-6238?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6238: Fix Version/s: (was: 2.0.0) 1.10.8 > Dags stats endpoint is slow and returns a large payload > --- > > Key: AIRFLOW-6238 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6238 > Project: Apache Airflow > Issue Type: Improvement > Components: api, ui >Affects Versions: 2.0.0, 1.10.6 >Reporter: Robin Edwards >Priority: Minor > Fix For: 1.10.8 > > > The dag_stats endpoint returns all dags by default. This can result in an > extremely large payload ~ 3mb and slow response time when you have a lot of > dags (In our case 1500+). > The accompanying pull request adds a dag_ids get parameter to the dag_stats > end point which is populated by the dags present on the page. > Please see previous issue for task_stats AIRFLOW-6095 which has already been > merged with a similar solution -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6446) Add Github actions to Welcome First time contributors
[ https://issues.apache.org/jira/browse/AIRFLOW-6446?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6446: Fix Version/s: (was: 1.10.8) 2.0.0 > Add Github actions to Welcome First time contributors > - > > Key: AIRFLOW-6446 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6446 > Project: Apache Airflow > Issue Type: New Feature > Components: project-management >Affects Versions: 1.10.7 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 2.0.0 > > > Add Github actions to Welcome First time contributors -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6447) Add GitHub Action to add Labels on Pull Requests
[ https://issues.apache.org/jira/browse/AIRFLOW-6447?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6447: Fix Version/s: (was: 1.10.8) 2.0.0 > Add GitHub Action to add Labels on Pull Requests > > > Key: AIRFLOW-6447 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6447 > Project: Apache Airflow > Issue Type: New Feature > Components: project-management >Affects Versions: 1.10.7 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 2.0.0 > > > Add GitHub Action to add Labels on Pull Requests -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6421) Remove ci-reporter Probot app
[ https://issues.apache.org/jira/browse/AIRFLOW-6421?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6421: Fix Version/s: (was: 1.10.8) 2.0.0 > Remove ci-reporter Probot app > - > > Key: AIRFLOW-6421 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6421 > Project: Apache Airflow > Issue Type: New Feature > Components: PR tool, project-management >Affects Versions: 2.0.0, 1.10.8 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 2.0.0 > > > ci-reporter did not work for us because of > https://github.com/JasonEtco/ci-reporter/issues/40 -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6417) Disable approval requirements from mergeable bot
[ https://issues.apache.org/jira/browse/AIRFLOW-6417?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6417: Fix Version/s: (was: 1.10.8) 2.0.0 > Disable approval requirements from mergeable bot > > > Key: AIRFLOW-6417 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6417 > Project: Apache Airflow > Issue Type: Bug > Components: core >Affects Versions: 1.10.7 >Reporter: Kamil Bregula >Priority: Major > Fix For: 2.0.0 > > Attachments: Screenshot 2020-01-01 at 12.32.07.png, > image-2020-01-01-12-32-24-188.png, image-2020-01-01-12-32-25-491.png > > > I review PRs that are in good condition first - they have a green > label next to the title. Unfortunately, now all PRs are always red before > they are accepted. When accepted, they are often merged immediately, so all > open PRs are always red. This useful indicator now makes no sense. > !Screenshot 2020-01-01 at 12.32.07.png! > In my opinion, we do not have a big problem with merging PRs that are not > accepted, so this option does not solve any of our problems. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-5814) Implementing Presto hook tests
[ https://issues.apache.org/jira/browse/AIRFLOW-5814?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-5814: Fix Version/s: (was: 2.0.0) 1.10.8 > Implementing Presto hook tests > -- > > Key: AIRFLOW-5814 > URL: https://issues.apache.org/jira/browse/AIRFLOW-5814 > Project: Apache Airflow > Issue Type: Test > Components: hooks >Affects Versions: 1.10.5 >Reporter: Sayed Mohammad Hossein Torabi >Assignee: Sayed Mohammad Hossein Torabi >Priority: Minor > Fix For: 1.10.8 > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-6611) Add missing ProxyFix configs to default_airflow.cfg with docs
[ https://issues.apache.org/jira/browse/AIRFLOW-6611?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6611. - Resolution: Fixed > Add missing ProxyFix configs to default_airflow.cfg with docs > - > > Key: AIRFLOW-6611 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6611 > Project: Apache Airflow > Issue Type: Improvement > Components: configuration, documentation >Affects Versions: 1.10.7 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 1.10.8 > > > https://github.com/apache/airflow/commit/d90ddbd189e4ef99d45af44abf8580b29896a4e0 > added some new configs which weren't documented and weren't added in > default_airflow.cfg -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-4495) allow externally triggered dags to run for future 'Execution date'
[ https://issues.apache.org/jira/browse/AIRFLOW-4495?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-4495. - Fix Version/s: 1.10.8 2.0.0 Resolution: Fixed > allow externally triggered dags to run for future 'Execution date' > -- > > Key: AIRFLOW-4495 > URL: https://issues.apache.org/jira/browse/AIRFLOW-4495 > Project: Apache Airflow > Issue Type: Improvement >Reporter: t oo >Assignee: t oo >Priority: Minor > Fix For: 2.0.0, 1.10.8 > > > 1. > useful for handling a future date for an externally triggered batch process > ingesting 'forecast' data where the filename date is in the future > 2. > this error is just in the scheduler log and not propagated up, so the dag > stays in 'running' state forever (or for 1 year waiting for the time to pass > :) ) > ERROR - Execution date is in future: 2020-01-01 00:00:00+00:00 > > > the fix below works if you only have externally triggered DAGs: > > commenting below ti_deps\deps\runnable_exec_date_dep.py > #if ti.execution_date > cur_date: > # yield self._failing_status( > # reason="Execution date \{0} is in the future (the current " > # "date is \{1}).".format(ti.execution_date.isoformat(), > # cur_date.isoformat())) > > commenting below jobs.py > # don't consider runs that are executed in the future > #if run.execution_date > timezone.utcnow(): > # self.log.error( > # "Execution date is in future: %s", > # run.execution_date > # ) > # continue > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (AIRFLOW-6615) Remove double sorted in task_list CLI
[ https://issues.apache.org/jira/browse/AIRFLOW-6615?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6615: Description: tasks list is sorted twice in task_list CLI. > Remove double sorted in task_list CLI > - > > Key: AIRFLOW-6615 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6615 > Project: Apache Airflow > Issue Type: Bug > Components: cli >Affects Versions: 1.10.7 >Reporter: Xinbin Huang >Assignee: Xinbin Huang >Priority: Major > Fix For: 2.0.0 > > > tasks list is sorted twice in task_list CLI. -- This message was sent by Atlassian Jira (v8.3.4#803005)
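A plain-Python note on why this change is safe: sorting an already-sorted list is a no-op, so dropping one of the two sorted() calls cannot change the CLI output. The task names below are made up for illustration:

```python
# Hypothetical task IDs; the CLI code was effectively doing sorted(sorted(...)).
tasks = ["transform", "load", "extract"]

once = sorted(tasks)
twice = sorted(sorted(tasks))

# The inner sort already produces the final order, so one call suffices.
print(once == twice)  # True
```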
[jira] [Resolved] (AIRFLOW-6615) Remove double sorted in task_list CLI
[ https://issues.apache.org/jira/browse/AIRFLOW-6615?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6615. - Fix Version/s: 2.0.0 Resolution: Fixed > Remove double sorted in task_list CLI > - > > Key: AIRFLOW-6615 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6615 > Project: Apache Airflow > Issue Type: Bug > Components: cli >Affects Versions: 1.10.7 >Reporter: Xinbin Huang >Assignee: Xinbin Huang >Priority: Major > Fix For: 2.0.0 > > -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Work started] (AIRFLOW-6611) Add missing ProxyFix configs to default_airflow.cfg with docs
[ https://issues.apache.org/jira/browse/AIRFLOW-6611?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Work on AIRFLOW-6611 started by Kaxil Naik. --- > Add missing ProxyFix configs to default_airflow.cfg with docs > - > > Key: AIRFLOW-6611 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6611 > Project: Apache Airflow > Issue Type: Improvement > Components: configuration, documentation >Affects Versions: 1.10.7 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 1.10.8 > > > https://github.com/apache/airflow/commit/d90ddbd189e4ef99d45af44abf8580b29896a4e0 > added some new configs which weren't documented and weren't added in > default_airflow.cfg -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (AIRFLOW-6611) Add missing ProxyFix configs to default_airflow.cfg with docs
Kaxil Naik created AIRFLOW-6611: --- Summary: Add missing ProxyFix configs to default_airflow.cfg with docs Key: AIRFLOW-6611 URL: https://issues.apache.org/jira/browse/AIRFLOW-6611 Project: Apache Airflow Issue Type: Improvement Components: configuration, documentation Affects Versions: 1.10.7 Reporter: Kaxil Naik Assignee: Kaxil Naik Fix For: 1.10.8 https://github.com/apache/airflow/commit/d90ddbd189e4ef99d45af44abf8580b29896a4e0 added some new configs which weren't documented and weren't added in default_airflow.cfg -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-6549) Configuration related to reverse proxy is loaded using wrong data type.
[ https://issues.apache.org/jira/browse/AIRFLOW-6549?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6549. - Resolution: Fixed > Configuration related to reverse proxy is loaded using wrong data type. > --- > > Key: AIRFLOW-6549 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6549 > Project: Apache Airflow > Issue Type: Bug > Components: configuration >Affects Versions: 1.10.7 >Reporter: Mathieu Poussin >Priority: Major > > The configuration management system is loading the variables related to > reverse proxy fix handling with the wrong format, for example with the following > configuration: > > {code:java} > enable_proxy_fix = True > proxy_fix_x_for = 1 > proxy_fix_x_proto = 1 > proxy_fix_x_host = 1 > proxy_fix_x_port = 1 > proxy_fix_x_prefix = 1 > {code} > When running the webserver, a 500 is returned with the following stacktrace: > > {code:java} > TypeError: '>=' not supported between instances of 'int' and 'str' > File "flask/app.py", line 2463, in __call__ > return self.wsgi_app(environ, start_response) > File "werkzeug/middleware/proxy_fix.py", line 195, in __call__ > x_for = self._get_trusted_comma(self.x_for, > environ_get("HTTP_X_FORWARDED_FOR")) > File "werkzeug/middleware/proxy_fix.py", line 166, in _get_trusted_comma > if len(values) >= trusted: > {code} > When analysing the local variables (thanks Sentry), I can see that trusted is equal > to '1' (a string) and not 1 (an int) as it should be. > > This makes using a reverse proxy impossible. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (AIRFLOW-6549) Configuration related to reverse proxy is loaded using wrong data type.
[ https://issues.apache.org/jira/browse/AIRFLOW-6549?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17020551#comment-17020551 ] Kaxil Naik commented on AIRFLOW-6549: - Duplicated by https://issues.apache.org/jira/browse/AIRFLOW-6345 > Configuration related to reverse proxy is loaded using wrong data type. > --- > > Key: AIRFLOW-6549 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6549 > Project: Apache Airflow > Issue Type: Bug > Components: configuration >Affects Versions: 1.10.7 >Reporter: Mathieu Poussin >Priority: Major > > The configuration management system is loading the variables related to > reverse proxy fix handling with the wrong format, for example with the following > configuration: > > {code:java} > enable_proxy_fix = True > proxy_fix_x_for = 1 > proxy_fix_x_proto = 1 > proxy_fix_x_host = 1 > proxy_fix_x_port = 1 > proxy_fix_x_prefix = 1 > {code} > When running the webserver, a 500 is returned with the following stacktrace: > > {code:java} > TypeError: '>=' not supported between instances of 'int' and 'str' > File "flask/app.py", line 2463, in __call__ > return self.wsgi_app(environ, start_response) > File "werkzeug/middleware/proxy_fix.py", line 195, in __call__ > x_for = self._get_trusted_comma(self.x_for, > environ_get("HTTP_X_FORWARDED_FOR")) > File "werkzeug/middleware/proxy_fix.py", line 166, in _get_trusted_comma > if len(values) >= trusted: > {code} > When analysing the local variables (thanks Sentry), I can see that trusted is equal > to '1' (a string) and not 1 (an int) as it should be. > > This makes using a reverse proxy impossible. -- This message was sent by Atlassian Jira (v8.3.4#803005)
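The TypeError in the AIRFLOW-6549 stacktrace comes from comparing an int (`len(values)`) against a value read straight out of the config file as a string. A minimal sketch of the bug and the fix, assuming the config names from the report (the explicit cast mirrors what reading the option as an integer, e.g. via `conf.getint`, achieves):

```python
# ConfigParser-style configs return every value as a string.
conf = {"proxy_fix_x_for": "1"}

values = ["203.0.113.7"]  # forwarded-for entries seen by ProxyFix

# Before: werkzeug evaluates len(values) >= trusted with trusted = '1'
trusted = conf["proxy_fix_x_for"]
try:
    len(values) >= trusted
except TypeError:
    # '>=' not supported between instances of 'int' and 'str'
    pass

# After: cast to int when loading the option
trusted = int(conf["proxy_fix_x_for"])
assert len(values) >= trusted
```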
[jira] [Resolved] (AIRFLOW-6608) Change logging level to DEBUG for PythonOperator Env exports
[ https://issues.apache.org/jira/browse/AIRFLOW-6608?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6608. - Resolution: Fixed > Change logging level to DEBUG for PythonOperator Env exports > > > Key: AIRFLOW-6608 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6608 > Project: Apache Airflow > Issue Type: Improvement > Components: operators >Affects Versions: 1.10.7 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 1.10.8 > > > The following logs just create noise and add nothing for the end-users. We > should change this to DEBUG > {noformat} > [2020-01-20 17:58:27,364] {python_operator.py:105} INFO - Exporting the > following env vars: > AIRFLOW_CTX_DAG_EMAIL=a...@mail.com > AIRFLOW_CTX_DAG_OWNER=me > AIRFLOW_CTX_DAG_ID=my_dag_id > AIRFLOW_CTX_TASK_ID=my_task_id > AIRFLOW_CTX_EXECUTION_DATE=2020-01-20T12:27:48.408593+00:00 > AIRFLOW_CTX_DAG_RUN_ID=manual__2020-01-20T12:27:48.408593+00:00 > {noformat} -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Work started] (AIRFLOW-6608) Change logging level to DEBUG for PythonOperator Env exports
[ https://issues.apache.org/jira/browse/AIRFLOW-6608?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Work on AIRFLOW-6608 started by Kaxil Naik. --- > Change logging level to DEBUG for PythonOperator Env exports > > > Key: AIRFLOW-6608 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6608 > Project: Apache Airflow > Issue Type: Improvement > Components: operators >Affects Versions: 1.10.7 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 1.10.8 > > > The following logs just create noise and add nothing for the end-users. We > should change this to DEBUG > {noformat} > [2020-01-20 17:58:27,364] {python_operator.py:105} INFO - Exporting the > following env vars: > AIRFLOW_CTX_DAG_EMAIL=a...@mail.com > AIRFLOW_CTX_DAG_OWNER=me > AIRFLOW_CTX_DAG_ID=my_dag_id > AIRFLOW_CTX_TASK_ID=my_task_id > AIRFLOW_CTX_EXECUTION_DATE=2020-01-20T12:27:48.408593+00:00 > AIRFLOW_CTX_DAG_RUN_ID=manual__2020-01-20T12:27:48.408593+00:00 > {noformat} -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (AIRFLOW-6608) Change logging level to DEBUG for PythonOperator Env exports
Kaxil Naik created AIRFLOW-6608: --- Summary: Change logging level to DEBUG for PythonOperator Env exports Key: AIRFLOW-6608 URL: https://issues.apache.org/jira/browse/AIRFLOW-6608 Project: Apache Airflow Issue Type: Improvement Components: operators Affects Versions: 1.10.7 Reporter: Kaxil Naik Assignee: Kaxil Naik Fix For: 1.10.8 The following logs just create noise and add nothing for the end-users. We should change this to DEBUG {noformat} [2020-01-20 17:58:27,364] {python_operator.py:105} INFO - Exporting the following env vars: AIRFLOW_CTX_DAG_EMAIL=a...@mail.com AIRFLOW_CTX_DAG_OWNER=me AIRFLOW_CTX_DAG_ID=my_dag_id AIRFLOW_CTX_TASK_ID=my_task_id AIRFLOW_CTX_EXECUTION_DATE=2020-01-20T12:27:48.408593+00:00 AIRFLOW_CTX_DAG_RUN_ID=manual__2020-01-20T12:27:48.408593+00:00 {noformat} -- This message was sent by Atlassian Jira (v8.3.4#803005)
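The change proposed in AIRFLOW-6608 is a one-line severity switch; a minimal sketch with the standard library logger (logger name and message contents are illustrative, not the exact Airflow code):

```python
import logging

log = logging.getLogger("airflow.task")

env_vars = {
    "AIRFLOW_CTX_DAG_ID": "my_dag_id",
    "AIRFLOW_CTX_TASK_ID": "my_task_id",
}
msg = "Exporting the following env vars:\n" + "\n".join(
    f"{k}={v}" for k, v in env_vars.items()
)

# Before: log.info(msg) — emitted into every task log at the default level
# After: only emitted when debug logging is explicitly enabled
log.debug(msg)
```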
[jira] [Work started] (AIRFLOW-5946) Store & Read code from DB for Code View
[ https://issues.apache.org/jira/browse/AIRFLOW-5946?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Work on AIRFLOW-5946 started by Kaxil Naik. --- > Store & Read code from DB for Code View > --- > > Key: AIRFLOW-5946 > URL: https://issues.apache.org/jira/browse/AIRFLOW-5946 > Project: Apache Airflow > Issue Type: Improvement > Components: webserver >Affects Versions: 2.0.0, 1.10.7 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Major > > To make the Webserver not need DAG files, we need to find a way to get the code to > display in *Code View*. > - Store in a lazy-loaded column in the SerializedDag table > - Save in a new table with DAG_id and store versions as well, limited to the > last 10 versions. This is only needed by Code View, so it is not a problem if we > store it in a new table > OR - Just keep reading from the file? -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-1467) allow tasks to use more than one pool slot
[ https://issues.apache.org/jira/browse/AIRFLOW-1467?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-1467. - Resolution: Fixed > allow tasks to use more than one pool slot > -- > > Key: AIRFLOW-1467 > URL: https://issues.apache.org/jira/browse/AIRFLOW-1467 > Project: Apache Airflow > Issue Type: Improvement > Components: scheduler >Reporter: Adrian Bridgett >Assignee: Lokesh Lal >Priority: Trivial > Labels: pool > Fix For: 1.10.8 > > > It would be useful to have tasks use more than a single pool slot. > Our use case is actually to limit how many tasks run on a head node (due to > memory constraints); currently we have to set a pool limit on the number of > tasks. > Ideally we could set the pool size to e.g. the amount of memory and then set those > tasks' pool_usage (or whatever the option would be called) to the amount of > memory we think they'll use. This way the pool would let lots of small tasks > run, or just a few large tasks. -- This message was sent by Atlassian Jira (v8.3.4#803005)
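The weighted-slot idea in AIRFLOW-1467 can be sketched in plain Python, with the pool size acting as a memory budget and each task declaring how many slots it consumes (the class and numbers below are illustrative, not Airflow's actual pool implementation):

```python
# Sketch of weighted pool accounting: many small tasks or a few large
# ones can run concurrently within the same slot budget.
class Pool:
    def __init__(self, slots):
        self.slots = slots  # total budget, e.g. MB of head-node memory
        self.used = 0

    def try_acquire(self, task_slots):
        """Admit a task only if its slot weight fits the remaining budget."""
        if self.used + task_slots <= self.slots:
            self.used += task_slots
            return True
        return False

pool = Pool(slots=8192)            # e.g. an 8 GB head-node budget
assert pool.try_acquire(512)       # small task fits
assert pool.try_acquire(6144)      # large task also fits
assert not pool.try_acquire(2048)  # over budget — must wait
```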
[jira] [Updated] (AIRFLOW-6576) Deleting a task with SLA crashes the scheduler
[ https://issues.apache.org/jira/browse/AIRFLOW-6576?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik updated AIRFLOW-6576: Fix Version/s: 1.10.8 > Deleting a task with SLA crashes the scheduler > -- > > Key: AIRFLOW-6576 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6576 > Project: Apache Airflow > Issue Type: New Feature > Components: scheduler >Affects Versions: 2.0.0 >Reporter: QP Hou >Assignee: QP Hou >Priority: Major > Fix For: 2.0.0, 1.10.8 > > > When a task with an SLA is deleted from a DAG after the SLA miss is logged but > before the notification is sent, the scheduler will crash with an > AirflowException -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-6595) Use TaskNotFound exception instead of AirflowException
[ https://issues.apache.org/jira/browse/AIRFLOW-6595?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6595. - Fix Version/s: 1.10.8 2.0.0 Resolution: Fixed > Use TaskNotFound exception instead of AirflowException > -- > > Key: AIRFLOW-6595 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6595 > Project: Apache Airflow > Issue Type: Improvement > Components: core >Affects Versions: 1.10.7 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 2.0.0, 1.10.8 > > > We should use *TaskNotFound* exception when the Task no longer exists instead > of the general AirflowException -- This message was sent by Atlassian Jira (v8.3.4#803005)
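The change in AIRFLOW-6595, and how it helps callers such as the SLA-miss path in AIRFLOW-6576, can be sketched with stand-in exception classes (the real ones live in `airflow.exceptions`; the lookup function below is illustrative):

```python
# Stand-ins for airflow.exceptions.AirflowException / TaskNotFound.
class AirflowException(Exception):
    pass

class TaskNotFound(AirflowException):
    pass

def get_task(dag_tasks, task_id):
    """Raise the specific TaskNotFound when a task no longer exists."""
    try:
        return dag_tasks[task_id]
    except KeyError:
        raise TaskNotFound(f"Task {task_id!r} not found") from None

dag_tasks = {"extract": "ExtractOperator"}

# Callers (e.g. an SLA-miss handler) can now skip deleted tasks
# instead of crashing on a generic AirflowException.
handled = False
try:
    get_task(dag_tasks, "deleted_task")
except TaskNotFound:
    handled = True
assert handled
```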
[jira] [Resolved] (AIRFLOW-6576) Deleting a task with SLA crashes the scheduler
[ https://issues.apache.org/jira/browse/AIRFLOW-6576?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6576. - Fix Version/s: 2.0.0 Resolution: Fixed > Deleting a task with SLA crashes the scheduler > -- > > Key: AIRFLOW-6576 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6576 > Project: Apache Airflow > Issue Type: New Feature > Components: scheduler >Affects Versions: 2.0.0 >Reporter: QP Hou >Assignee: QP Hou >Priority: Major > Fix For: 2.0.0 > > > When a task with an SLA is deleted from a DAG after the SLA miss is logged but > before the notification is sent, the scheduler will crash with an > AirflowException -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-6596) Enforce PR description should not be empty
[ https://issues.apache.org/jira/browse/AIRFLOW-6596?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6596. - Resolution: Fixed > Enforce PR description should not be empty > -- > > Key: AIRFLOW-6596 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6596 > Project: Apache Airflow > Issue Type: Improvement > Components: PR tool >Affects Versions: 2.0.0 >Reporter: Kaxil Naik >Assignee: Kaxil Naik >Priority: Minor > Fix For: 2.0.0 > > > Enforce that the PR description is not empty. This is so that contributors do not > just tick "Description above provides context of the change" but also add > the actual details -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (AIRFLOW-6596) Enforce PR description should not be empty
Kaxil Naik created AIRFLOW-6596: --- Summary: Enforce PR description should not be empty Key: AIRFLOW-6596 URL: https://issues.apache.org/jira/browse/AIRFLOW-6596 Project: Apache Airflow Issue Type: Improvement Components: PR tool Affects Versions: 2.0.0 Reporter: Kaxil Naik Assignee: Kaxil Naik Fix For: 2.0.0 Enforce that the PR description is not empty. This is so that contributors do not just tick "Description above provides context of the change" but also add the actual details -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (AIRFLOW-6595) Use TaskNotFound exception instead of AirflowException
Kaxil Naik created AIRFLOW-6595: --- Summary: Use TaskNotFound exception instead of AirflowException Key: AIRFLOW-6595 URL: https://issues.apache.org/jira/browse/AIRFLOW-6595 Project: Apache Airflow Issue Type: Improvement Components: core Affects Versions: 1.10.7 Reporter: Kaxil Naik Assignee: Kaxil Naik We should use *TaskNotFound* exception when the Task no longer exists instead of the general AirflowException -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Resolved] (AIRFLOW-6568) Add some more entries (Emacs related files) to .gitignore
[ https://issues.apache.org/jira/browse/AIRFLOW-6568?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Kaxil Naik resolved AIRFLOW-6568. - Fix Version/s: 2.0.0 Resolution: Fixed > Add some more entries (Emacs related files) to .gitignore > -- > > Key: AIRFLOW-6568 > URL: https://issues.apache.org/jira/browse/AIRFLOW-6568 > Project: Apache Airflow > Issue Type: Improvement > Components: build >Affects Versions: 1.10.8 >Reporter: Kousuke Saruta >Assignee: Kousuke Saruta >Priority: Minor > Fix For: 2.0.0 > > > Emacs generates some types of backup files. > They should be ignored by the repository. -- This message was sent by Atlassian Jira (v8.3.4#803005)