+1 (binding).

- Ran an example DAG with the LocalExecutor under Python 3.7 (a rough sketch
  of those steps is included after the signature checks below).
- Checked the SHA-512 checksums:

MacBook-Pro-van-Fokko:Downloads fokkodriesprong$ shasum -a 512 apache-airflow-1.10.6rc1-bin.tar.gz
a4bde283b3f32fbc7a603da6deb3f015bda470b3dba1934a40b72212eba5391a9365400b8595d61afd27b2f8b6d9be220caac73c411d2172357625d0050cd449  apache-airflow-1.10.6rc1-bin.tar.gz
MacBook-Pro-van-Fokko:Downloads fokkodriesprong$ cat apache-airflow-1.10.6rc1-bin.tar.gz.sha512
apache-airflow-1.10.6rc1-bin.tar.gz: A4BDE283 B3F32FBC 7A603DA6 DEB3F015
                                     BDA470B3 DBA1934A 40B72212 EBA5391A
                                     9365400B 8595D61A FD27B2F8 B6D9BE22
                                     0CAAC73C 411D2172 357625D0 050CD449

MacBook-Pro-van-Fokko:Downloads fokkodriesprong$ shasum -a 512 apache-airflow-1.10.6rc1-source.tar.gz
a5d72b1cd7af4c8a883ba0ff38b7d602a7d192184ce270cb9d8ba809e633c3f9e7fdf8b48ccdafedc52879808245eee71c0c161e6d16fbe8ce4399aa03b85907  apache-airflow-1.10.6rc1-source.tar.gz
MacBook-Pro-van-Fokko:Downloads fokkodriesprong$ cat apache-airflow-1.10.6rc1-source.tar.gz.sha512
apache-airflow-1.10.6rc1-source.tar.gz: A5D72B1C D7AF4C8A 883BA0FF 38B7D602
                                        A7D19218 4CE270CB 9D8BA809 E633C3F9
                                        E7FDF8B4 8CCDAFED C5287980 8245EEE7
                                        1C0C161E 6D16FBE8 CE4399AA 03B85907

MacBook-Pro-van-Fokko:Downloads fokkodriesprong$ shasum -a 512 apache_airflow-1.10.6rc1-py2.py3-none-any.whl
0b981c63ec478bfdb7d1f286f6d69b4d189f64d5c98c16e42246281d793a08845b976fc8e6b192656b547d36ad71e8483dee84a99869d94efff19c0dd9ebb988  apache_airflow-1.10.6rc1-py2.py3-none-any.whl
apache_airflow-1.10.6rc1-py2.py3-none-any.whl: 0B981C63 EC478BFD B7D1F286 F6D69B4D
                                               189F64D5 C98C16E4 2246281D 793A0884
                                               5B976FC8 E6B19265 6B547D36 AD71E848
                                               3DEE84A9 9869D94E FFF19C0D D9EBB988
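
(For reference, one way to script the hash comparison above instead of
eyeballing it - a rough bash sketch that recomputes each sha512 and compares
it against the formatted .sha512 file after stripping whitespace and
lower-casing; the filenames are the ones from this RC, nothing else is
assumed:)

for f in apache-airflow-1.10.6rc1-source.tar.gz \
         apache-airflow-1.10.6rc1-bin.tar.gz \
         apache_airflow-1.10.6rc1-py2.py3-none-any.whl; do
  # recompute the digest locally
  computed=$(shasum -a 512 "$f" | awk '{print $1}')
  # flatten the published, space/newline-separated uppercase digest
  published=$(cut -d: -f2 "$f.sha512" | tr -d ' \n' | tr 'A-F' 'a-f')
  [ "$computed" = "$published" ] && echo "$f: OK" || echo "$f: MISMATCH"
done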

- Checked the signatures:

MacBook-Pro-van-Fokko:Downloads fokkodriesprong$ gpg --import ~/Desktop/KEYS
gpg: key 764129647BEC5C4B: public key "Chris Riccomini <criccom...@apache.org>" imported
gpg: key 35190B83D905A0BA: public key "Bolke de Bruin (CODE SIGNING KEY) <bo...@apache.org>" imported
gpg: key E6F0505CC7BC7E0D: public key "Maxime Beauchemin <maximebeauche...@apache.org>" imported
gpg: key 807C731A8C82A095: 1 signature not checked due to a missing key
gpg: key 807C731A8C82A095: public key "Ash Berlin-Taylor <a...@apache.org>" imported
gpg: key DD7484A025F17494: public key "Kaxil Naik <kaxiln...@apache.org>" imported
gpg: key 75FCCD0A25FA0E4B: public key "Kaxil Naik <kaxiln...@gmail.com>" imported
gpg: Total number processed: 6
gpg:               imported: 6

MacBook-Pro-van-Fokko:Downloads fokkodriesprong$ gpg --verify apache-airflow-1.10.6rc1-bin.tar.gz.asc
gpg: assuming signed data in 'apache-airflow-1.10.6rc1-bin.tar.gz'
gpg: Signature made vr 18 okt 15:58:25 2019 CEST
gpg:                using RSA key 5CCAEAC758ED64CA323F053B807C731A8C82A095
gpg:                issuer "a...@apache.org"
gpg: Good signature from "Ash Berlin-Taylor <a...@apache.org>" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg:          There is no indication that the signature belongs to the owner.
Primary key fingerprint: 5CCA EAC7 58ED 64CA 323F  053B 807C 731A 8C82 A095

MacBook-Pro-van-Fokko:Downloads fokkodriesprong$ gpg --verify apache-airflow-1.10.6rc1-source.tar.gz.asc
gpg: assuming signed data in 'apache-airflow-1.10.6rc1-source.tar.gz'
gpg: Signature made vr 18 okt 15:58:06 2019 CEST
gpg:                using RSA key 5CCAEAC758ED64CA323F053B807C731A8C82A095
gpg:                issuer "a...@apache.org"
gpg: Good signature from "Ash Berlin-Taylor <a...@apache.org>" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg:          There is no indication that the signature belongs to the owner.
Primary key fingerprint: 5CCA EAC7 58ED 64CA 323F  053B 807C 731A 8C82 A095

MacBook-Pro-van-Fokko:Downloads fokkodriesprong$ gpg --verify apache_airflow-1.10.6rc1-py2.py3-none-any.whl.asc
gpg: assuming signed data in 'apache_airflow-1.10.6rc1-py2.py3-none-any.whl'
gpg: Signature made vr 18 okt 15:58:25 2019 CEST
gpg:                using RSA key 5CCAEAC758ED64CA323F053B807C731A8C82A095
gpg:                issuer "a...@apache.org"
gpg: Good signature from "Ash Berlin-Taylor <a...@apache.org>" [unknown]
gpg: WARNING: This key is not certified with a trusted signature!
gpg:          There is no indication that the signature belongs to the owner.
Primary key fingerprint: 5CCA EAC7 58ED 64CA 323F  053B 807C 731A 8C82 A095
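
(The same goes for the signature checks - a small bash sketch that simply
loops over the three artifacts with the commands shown above; gpg exits
non-zero on a bad signature, so the loop stops at the first failure:)

for f in apache-airflow-1.10.6rc1-source.tar.gz \
         apache-airflow-1.10.6rc1-bin.tar.gz \
         apache_airflow-1.10.6rc1-py2.py3-none-any.whl; do
  # verify the detached .asc signature against the artifact
  gpg --verify "$f.asc" "$f" || { echo "BAD signature on $f"; exit 1; }
done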
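
(And for the example-DAG run mentioned at the top, roughly the steps below.
This is only a sketch: the Postgres connection string, DAG id and dates are
placeholder values, not taken from this thread. The LocalExecutor refuses to
run on SQLite, hence the explicit sql_alchemy_conn:)

# install the RC from PyPI into a Python 3.7 virtualenv
pip install 'apache-airflow==1.10.6rc1'
# LocalExecutor needs a real metadata DB backend (placeholder connection string)
export AIRFLOW__CORE__EXECUTOR=LocalExecutor
export AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@localhost/airflow
airflow initdb
# run one of the bundled example DAGs for a single day
airflow backfill example_bash_operator -s 2019-10-18 -e 2019-10-18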

Thanks for releasing 1.10.6 :-)

Cheers, Fokko


On Tue, 22 Oct 2019 at 00:34, Kaxil Naik <kaxiln...@gmail.com> wrote:

> +1 (binding) - tested on Py 3.7.3 with example DAGs
>
> On Sat, Oct 19, 2019 at 11:30 AM Ash Berlin-Taylor <a...@apache.org> wrote:
>
> > I did all my testing of the RC on 3.7.3, so the bits I tested (core
> > mostly, didn't test many operators) work.
> >
> > On 19 October 2019 07:56:40 BST, Jarek Potiuk <jarek.pot...@polidea.com> wrote:
> > >It is stated in CONTRIBUTING.md (for v1-10) and CONTRIBUTING.rst (in
> > >master) - we are working on improving those docs. But indeed it should
> > >be stated in the user documentation :). I will make sure to include
> > >that in our Google Season of Docs initiative.
> > >
> > >Yes. It is quite possible to add 3.7 support. We are in the process of
> > >migrating to another CI system (GitLab CI likely) and there we should
> > >be able to run tests on more Python versions. It's not a big effort at
> > >all.
> > >
> > >J.
> > >
> > >
> > >On Sat, Oct 19, 2019 at 5:16 AM Kevin Yang <yrql...@gmail.com> wrote:
> > >
> > >> Thank you Jarek for the clarification, that makes sense! I might be
> > >> ignorant, but do we have an official place stating the Python versions
> > >> we support? People might not keep track of the Python versions in CI
> > >> and may use the breeze doc as a reference. Do you think it makes sense
> > >> to update the breeze doc to reflect what we intend to support, just to
> > >> avoid a bit of confusion?
> > >>
> > >> As a side topic: since we've been talking in other threads about the
> > >> not-so-short timeline for the 2.0 release, and the current code base
> > >> may already support 3.7, do you think it makes sense to try adding
> > >> that support in 1.10.*? (Optimally we just update a few docs and build
> > >> a CI test for it, and if it doesn't work out of the box or requires a
> > >> lot of work we can give up earlier :D.)
> > >>
> > >> Cheers,
> > >> Kevin Y
> > >>
> > >> On Fri, Oct 18, 2019 at 7:26 PM Jarek Potiuk <jarek.pot...@polidea.com> wrote:
> > >>
> > >> > This is really what's intended: 1.10 officially supports
> > >> > 2.7/3.5/3.6. See for example here, where we do not test against 3.7:
> > >> > https://travis-ci.org/apache/airflow/builds/599638353
> > >> >
> > >> > On the other hand, 2.0.*/master should support 3.5, 3.6 and 3.7 (and
> > >> > we test against all those versions): for example here -
> > >> > https://travis-ci.org/apache/airflow/builds/599628309 (never mind the
> > >> > failing Kubernetes builds). This is also quite deliberate.
> > >> >
> > >> > Then 1.10.* might simply just work on 3.7, and if you need it, it
> > >> > should be as easy as changing ./breeze-complete to add it. But we do
> > >> > not have pre-built images for 3.7, so it will take a long time to
> > >> > build one for the first time from scratch.
> > >> >
> > >> > J.
> > >> >
> > >> >
> > >> >
> > >> > On Sat, Oct 19, 2019 at 12:43 AM Kevin Yang <yrql...@gmail.com> wrote:
> > >> >
> > >> > > Just started to play with it for a bit, and it seems we have a
> > >> > > small inconsistency in which Python versions breeze supports. The
> > >> > > code allows only `2.7, 3.5, 3.6`
> > >> > > <https://github.com/apache/airflow/blob/v1-10-test/breeze-complete#L3>
> > >> > > but we claim to support `3.5, 3.6, 3.7`
> > >> > > <https://github.com/apache/airflow/blame/v1-10-test/BREEZE.rst#L532>.
> > >> > > Thus `./breeze` will complain about me running in a Python 3.7 env.
> > >> > > My understanding is that we want to support 2.7, 3.5, 3.6 and 3.7
> > >> > > in 1.10.6 until we drop 2.7 support in 2.0, am I right?
> > >> > >
> > >> > > On Fri, Oct 18, 2019 at 7:12 AM Ash Berlin-Taylor <a...@apache.org> wrote:
> > >> > >
> > >> > > > Hey all,
> > >> > > >
> > >> > > > I have cut Airflow 1.10.6 RC1. This email is calling a vote on
> > >> > > > the release, which will last for 96 hours, until Tuesday, October
> > >> > > > 22nd at 14:30 UTC. (Sorry this is mostly over the weekend again,
> > >> > > > I've extended the vote by one day to give two working days to
> > >> > > > test.)
> > >> > > >
> > >> > > > Consider this my (binding) +1.
> > >> > > >
> > >> > > > Airflow 1.10.6 RC1 is available at:
> > >> > > > <https://dist.apache.org/repos/dist/dev/airflow/1.10.6rc1/>
> > >> > > >
> > >> > > > *apache-airflow-1.10.6rc1-source.tar.gz* is a source release that
> > >> > > > comes with INSTALL instructions.
> > >> > > > *apache-airflow-1.10.6rc1-bin.tar.gz* is the binary Python "sdist"
> > >> > > > release.
> > >> > > > *apache_airflow-1.10.6rc1-py2.py3-none-any.whl* is the binary
> > >> > > > Python "wheel" release.
> > >> > > >
> > >> > > > Public keys are available at:
> > >> > > > <https://dist.apache.org/repos/dist/release/airflow/KEYS>
> > >> > > >
> > >> > > > As per normal, the RC1 is available for testing from PyPI.
> > >> > > >
> > >> > > > Only votes from PMC members are binding, but members of the
> > >> > > > community are encouraged to test the release and vote with
> > >> > > > "(non-binding)".
> > >> > > >
> > >> > > > Please note that the version number excludes the `rcX` string, so
> > >> > > > it's now simply 1.10.6. This will allow us to rename the artifact
> > >> > > > without modifying the artifact checksums when we actually release.
> > >> > > >
> > >> > > > Changelog since 1.10.5:
> > >> > > >
> > >> > > > Airflow 1.10.6, 2019-10-22
> > >> > > > --------------------------
> > >> > > >
> > >> > > > New Features
> > >> > > > """"""""""""
> > >> > > > - [AIRFLOW-4908] Implement BigQuery Hooks/Operators for
> > >> update_dataset,
> > >> > > > patch_dataset and get_dataset (#5546)
> > >> > > > - [AIRFLOW-4741] Optionally report task errors to Sentry
> > >(#5407)
> > >> > > > - [AIRFLOW-4939] Add default_task_retries config (#5570)
> > >> > > > - [AIRFLOW-5508] Add config setting to limit which StatsD
> > >metrics are
> > >> > > > emitted (#6130)
> > >> > > > - [AIRFLOW-4222] Add cli autocomplete for bash & zsh (#5789)
> > >> > > > - [AIRFLOW-3871] Operators template fields can now render
> > >fields
> > >> inside
> > >> > > > objects (#4743)
> > >> > > >
> > >> > > > Improvements
> > >> > > > """"""""""""
> > >> > > > - [AIRFLOW-5127] Gzip support for
> > >> CassandraToGoogleCloudStorageOperator
> > >> > > > (#5738)
> > >> > > > - [AIRFLOW-5125] Add gzip support for
> > >> AdlsToGoogleCloudStorageOperator
> > >> > > > (#5737)
> > >> > > > - [AIRFLOW-5124] Add gzip support for
> > >S3ToGoogleCloudStorageOperator
> > >> > > > (#5736)
> > >> > > > - [AIRFLOW-5653] Log AirflowSkipException in task instance log
> > >to
> > >> make
> > >> > it
> > >> > > > clearer why tasks might be skipped (#6330)
> > >> > > > - [AIRFLOW-5343] Remove legacy SQLAlchmey pessimistic pool
> > >disconnect
> > >> > > > handling (#6034)
> > >> > > > - [AIRFLOW-5561] Relax httplib2 version required for gcp extra
> > >> (#6194)
> > >> > > > - [AIRFLOW-5657] Update the upper bound for dill dependency
> > >(#6334)
> > >> > > > - [AIRFLOW-5292] Allow ECSOperator to tag tasks (#5891)
> > >> > > > - [AIRFLOW-4939] Simplify Code for Default Task Retries (#6233)
> > >> > > > - [AIRFLOW-5126] Read ``aws_session_token`` in extra_config of
> > >the
> > >> aws
> > >> > > > hook (#6303)
> > >> > > > - [AIRFLOW-5636] Allow adding or overriding existing Operator
> > >Links
> > >> > > (#6302)
> > >> > > > - [AIRFLOW-4965] Handle quote exceptions in GCP AI operators
> > >(v1.10)
> > >> > > > (#6304)
> > >> > > > - [AIRFLOW-3783] Speed up Redshift to S3 UNload with HEADERs
> > >(#6309)
> > >> > > > - [AIRFLOW-3388] Add support to Array Jobs for AWS Batch
> > >Operator
> > >> > (#6153)
> > >> > > > - [AIRFLOW-4574] add option to provide private_key in SSHHook
> > >(#6104)
> > >> > > > (#6163)
> > >> > > > - [AIRFLOW-5530] Fix typo in AWS SQS sensors (#6012)
> > >> > > > - [AIRFLOW-5445] Reduce the required resources for the
> > >Kubernetes's
> > >> > > > sidecar (#6062)
> > >> > > > - [AIRFLOW-5443] Use alpine image in Kubernetes's sidecar
> > >(#6059)
> > >> > > > - [AIRFLOW-5344] Add --proxy-user parameter to
> > >SparkSubmitOperator
> > >> > > (#5948)
> > >> > > > - [AIRFLOW-3888] HA for Hive metastore connection (#4708)
> > >> > > > - [AIRFLOW-5269] Reuse session in Scheduler Job from health
> > >endpoint
> > >> > > > (#5873)
> > >> > > > - [AIRFLOW-5153] Option to force delete non-empty BQ datasets
> > >(#5768)
> > >> > > > - [AIRFLOW-4443] Document LatestOnly behavior for external
> > >trigger
> > >> > > (#5214)
> > >> > > > - [AIRFLOW-2891] Make DockerOperator container_name be
> > >templateable
> > >> > > (#5696)
> > >> > > > - [AIRFLOW-2891] allow configurable docker_operator container
> > >name
> > >> > > (#5689)
> > >> > > > - [AIRFLOW-4285] Update task dependency context definition and
> > >usage
> > >> > > > (#5079)
> > >> > > > - [AIRFLOW-5142] Fixed flaky Cassandra test (#5758)
> > >> > > > - [AIRFLOW-5218] Less polling of AWS Batch job status (#5825)
> > >> > > > - [AIRFLOW-4956] Fix LocalTaskJob heartbeat log spamming
> > >(#5589)
> > >> > > > - [AIRFLOW-3160] Load latest_dagruns asynchronously on home
> > >page
> > >> > (#5339)
> > >> > > > - [AIRFLOW-5560] Allow no confirmation on reset dags in
> > >`airflow
> > >> > > backfill`
> > >> > > > command (#6195)
> > >> > > > - [AIRFLOW-5280] conn: Remove aws_default's default region name
> > >> (#5879)
> > >> > > > - [AIRFLOW-5528] end_of_log_mark should not be a log record
> > >(#6159)
> > >> > > > - [AIRFLOW-5526] Update docs configuration due to migration of
> > >GCP
> > >> docs
> > >> > > > (#6154)
> > >> > > > - [AIRFLOW-4835] Refactor operator render_template (#5461)
> > >> > > >
> > >> > > > Bug Fixes
> > >> > > > """""""""
> > >> > > > - [AIRFLOW-5459] Use a dynamic tmp location in Dataflow
> > >operator
> > >> > (#6078)
> > >> > > > - [Airflow 4923] Fix Databricks hook leaks API secret in logs
> > >(#5635)
> > >> > > > - [AIRFLOW-5133] Keep original env state in
> > >> provide_gcp_credential_file
> > >> > > > (#5747)
> > >> > > > - [AIRFLOW-5497] Update docstring in
> > >> > ``airflow/utils/dag_processing.py``
> > >> > > > (#6314)
> > >> > > > - Revert/and then rework "[AIRFLOW-4797] Improve performance
> > >and
> > >> > > behaviour
> > >> > > > of zombie detection (#5511)" to improve performance (#5908)
> > >> > > > - [AIRFLOW-5634] Don't allow editing of DagModelView (#6308)
> > >> > > > - [AIRFLOW-4309] Remove Broken Dag error after Dag is deleted
> > >(#6102)
> > >> > > > - [AIRFLOW-5387] Fix "show paused" pagination bug (#6100)
> > >> > > > - [AIRFLOW-5489] Remove unneeded assignment of variable (#6106)
> > >> > > > - [AIRFLOW-5491] mark_tasks pydoc is incorrect (#6108)
> > >> > > > - [AIRFLOW-5492] added missing docstrings (#6107)
> > >> > > > - [AIRFLOW-5503] Fix tree view layout on HDPI screen (#6125)
> > >> > > > - [AIRFLOW-5481] Allow Deleting Renamed DAGs (#6101)
> > >> > > > - [AIRFLOW-3857] spark_submit_hook cannot kill driver pod in
> > >> Kubernetes
> > >> > > > (#4678)
> > >> > > > - [AIRFLOW-4391] Fix tooltip for None-State Tasks in 'Recent
> > >Tasks'
> > >> > > (#5909)
> > >> > > > - [AIRFLOW-5554] Require statsd 3.3.0 minimum (#6185)
> > >> > > > - [AIRFLOW-5306] Fix the display of links when they contain
> > >special
> > >> > > > characters (#5904)
> > >> > > > - [AIRFLOW-3705] Fix PostgresHook get_conn to use
> > >conn_name_attr
> > >> > (#5841)
> > >> > > > - [AIRFLOW-5581] Cleanly shutdown KubernetesJobWatcher for safe
> > >> > Scheduler
> > >> > > > shutdown on SIGTERM (#6237)
> > >> > > > - [AIRFLOW-5634] Don't allow disabled fields to be edited in
> > >> > DagModelView
> > >> > > > (#6307)
> > >> > > > - [AIRFLOW-4833] Allow to set Jinja env options in DAG
> > >declaration
> > >> > > (#5943)
> > >> > > > - [AIRFLOW-5408] Fix env variable name in Kubernetes template
> > >(#6016)
> > >> > > > - [AIRFLOW-5102] Worker jobs should terminate themselves if
> > >they
> > >> can't
> > >> > > > heartbeat (#6284)
> > >> > > > - [AIRFLOW-5572] Clear task reschedules when clearing task
> > >instances
> > >> > > > (#6217)
> > >> > > > - [AIRFLOW-5543] Fix tooltip disappears in tree and graph view
> > >(RBAC
> > >> > UI)
> > >> > > > (#6174)
> > >> > > > - [AIRFLOW-5444] Fix action_logging so that request.form for
> > >POST is
> > >> > > > logged (#6064)
> > >> > > > - [AIRFLOW-5484] fix PigCliHook has incorrect named parameter
> > >(#6112)
> > >> > > > - [AIRFLOW-5342] Fix MSSQL breaking task_instance db migration
> > >> (#6014)
> > >> > > > - [AIRFLOW-5556] Add separate config for timeout from scheduler
> > >dag
> > >> > > > processing (#6186)
> > >> > > > - [AIRFLOW-4858] Deprecate "Historical convenience functions"
> > >in
> > >> > > > airflow.configuration (#5495) (#6144)
> > >> > > > - [AIRFLOW-774] Fix long-broken DAG parsing Statsd metrics
> > >(#6157)
> > >> > > > - [AIRFLOW-5419] Use ``sudo`` to kill cleared tasks when
> > >running with
> > >> > > > impersonation (#6026) (#6176)
> > >> > > > - [AIRFLOW-5537] Yamllint is not needed as dependency on host
> > >> > > > - [AIRFLOW-5536] Better handling of temporary output files
> > >> > > > - [AIRFLOW-5535] Fix name of VERBOSE parameter
> > >> > > > - [AIRFLOW-5519] Fix sql_to_gcs operator missing multi-level
> > >default
> > >> > args
> > >> > > > by adding apply_defaults decorator  (#6146)
> > >> > > > - [AIRFLOW-5210] Make finding template files more efficient
> > >(#5815)
> > >> > > > - [AIRFLOW-5447] Scheduler stalls because second watcher thread
> > >in
> > >> > > default
> > >> > > > args (#6129)
> > >> > > >
> > >> > > > Doc-only changes
> > >> > > > """"""""""""""""
> > >> > > > - [AIRFLOW-5574] Fix Google Analytics script loading (#6218)
> > >> > > > - [AIRFLOW-5588] Add Celery's architecture diagram (#6247)
> > >> > > > - [AIRFLOW-5521] Fix link to GCP documentation (#6150)
> > >> > > > - [AIRFLOW-5398] Update contrib example DAGs to context manager
> > >> (#5998)
> > >> > > > - [AIRFLOW-5268] Apply same DAG naming conventions as in
> > >literature
> > >> > > (#5874)
> > >> > > > - [AIRFLOW-5101] Fix inconsistent owner value in examples
> > >(#5712)
> > >> > > > - [AIRFLOW-XXX] Fix typo - AWS DynamoDB Hook (#6319)
> > >> > > > - [AIRFLOW-XXX] Fix Documentation for adding extra Operator
> > >Links
> > >> > (#6301)
> > >> > > > - [AIRFLOW-XXX] Add section on task lifecycle & correct casing
> > >in
> > >> docs
> > >> > > > (#4681)
> > >> > > > - [AIRFLOW-XXX] Make it clear that 1.10.5 wasn't accidentally
> > >omitted
> > >> > > from
> > >> > > > UPDATING.md (#6240)
> > >> > > > - [AIRFLOW-XXX] Improve format in code-block directives (#6242)
> > >> > > > - [AIRFLOW-XXX] Format Sendgrid docs (#6245)
> > >> > > > - [AIRFLOW-XXX] Update to new logo (#6066)
> > >> > > > - [AIRFLOW-XXX] Typo in FAQ - schedule_interval (#6291)
> > >> > > > - [AIRFLOW-XXX] Add message about breaking change in
> > >> > > > DAG#get_task_instances in 1.10.4 (#6226)
> > >> > > > - [AIRFLOW-XXX] Fix incorrect units in docs for metrics using
> > >Timers
> > >> > > > (#6152)
> > >> > > > - [AIRFLOW-XXX] Fix backtick issues in .rst files & Add
> > >Precommit
> > >> hook
> > >> > > > (#6162)
> > >> > > > - [AIRFLOW-XXX] Update documentation about variables forcing
> > >answer
> > >> > > (#6158)
> > >> > > > - [AIRFLOW-XXX] Add a third way to configure authorization
> > >(#6134)
> > >> > > > - [AIRFLOW-XXX] Add example of running pre-commit hooks on
> > >single
> > >> file
> > >> > > > (#6143)
> > >> > > > - [AIRFLOW-XXX] Add information about default pool to docs
> > >(#6019)
> > >> > > > - [AIRFLOW-XXX] Make Breeze The default integration test
> > >environment
> > >> > > > (#6001)
> > >> > > >
> > >> > > > Misc/Internal
> > >> > > > """""""""""""
> > >> > > > - [AIRFLOW-5687] Upgrade pip to 19.0.2 in CI build pipeline
> > >(#6358)
> > >> > > (#6361)
> > >> > > > - [AIRFLOW-5533] Fixed failing CRON build (#6167)
> > >> > > > - [AIRFLOW-5130] Use GOOGLE_APPLICATION_CREDENTIALS constant
> > >from
> > >> > library
> > >> > > > (#5744)
> > >> > > > - [AIRFLOW-5369] Adds interactivity to pre-commits (#5976)
> > >> > > > - [AIRFLOW-5531] Replace deprecated log.warn() with
> > >log.warning()
> > >> > (#6165)
> > >> > > > - [AIRFLOW-4686] Make dags Pylint compatible (#5753)
> > >> > > > - [AIRFLOW-4864] Remove calls to load_test_config (#5502)
> > >> > > > - [AIRFLOW-XXX] Pin version of mypy so we are stable over time
> > >> (#6198)
> > >> > > > - [AIRFLOW-XXX] Add tests that got missed from #5127
> > >> > > > - [AIRFLOW-4928] Move config parses to class properties inside
> > >DagBag
> > >> > > > (#5557)
> > >> > > > - [AIRFLOW-5003] Making AWS Hooks pylint compatible (#5627)
> > >> > > > - [AIRFLOW-5580] Add base class for system test (#6229)
> > >> > > >
> > >> > > >
> > >> > >
> > >> >
> > >> >
> > >> > --
> > >> >
> > >> > Jarek Potiuk
> > >> > Polidea <https://www.polidea.com/> | Principal Software Engineer
> > >> >
> > >> > M: +48 660 796 129 <+48660796129>
> > >> >
> > >>
> > >
> > >
> > >--
> > >
> > >Jarek Potiuk
> > >Polidea <https://www.polidea.com/> | Principal Software Engineer
> > >
> > >M: +48 660 796 129 <+48660796129>
> >
>
