[GitHub] [airflow] RickRen7575 commented on pull request #18615: fix template string passed to env_vars when using render_template_as_native_obj for Kubernetes Pod Operator and add Tests

2021-10-06 Thread GitBox


RickRen7575 commented on pull request #18615:
URL: https://github.com/apache/airflow/pull/18615#issuecomment-936168221


   > My main concern here is that `env_vars` intentionally takes 
`List[V1EnvVar]` - taking a `dict` is currently for backward compatibility only 
at this point. It's not that we couldn't support this, but there was a 
conscious move to using the models from the k8s client.
   
   Passing in a `List[V1EnvVar]` also doesn't work for the same reason. If we 
just set it up so that `List[V1EnvVar]` could work with this, that would be 
great as well.
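For readers following along, the dict-to-model conversion being discussed can be sketched as follows. `EnvVar` here is a simplified stand-in for `kubernetes.client.models.V1EnvVar` (name/value only), and `convert_env_vars` is an illustrative helper, not the operator's actual method:

```python
from dataclasses import dataclass
from typing import Dict, List, Union


@dataclass
class EnvVar:
    """Stand-in for kubernetes.client.models.V1EnvVar (name/value only)."""
    name: str
    value: str


def convert_env_vars(env_vars: Union[Dict[str, str], List[EnvVar]]) -> List[EnvVar]:
    """Normalize env_vars the way the operator is expected to: accept either
    a dict (legacy form) or a list of model objects, return a list of models."""
    if isinstance(env_vars, dict):
        return [EnvVar(name=k, value=v) for k, v in env_vars.items()]
    if isinstance(env_vars, list):
        return env_vars
    raise TypeError(f"Expected dict or list of EnvVar, got {type(env_vars)!r}")
```

The relevance to `render_template_as_native_obj` is that a templated string may render to a native dict (or list) only at runtime, so the conversion has to cope with both forms after rendering, not just at construction time.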


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] RickRen7575 edited a comment on pull request #18615: fix template string passed to env_vars when using render_template_as_native_obj for Kubernetes Pod Operator and add Tests

2021-10-06 Thread GitBox


RickRen7575 edited a comment on pull request #18615:
URL: https://github.com/apache/airflow/pull/18615#issuecomment-936168221


   > My main concern here is that `env_vars` intentionally takes 
`List[V1EnvVar]` - taking a `dict` is currently for backward compatibility only 
at this point. It's not that we couldn't support this, but there was a 
conscious move to using the models from the k8s client.
   
   @jedcunningham Passing in a `List[V1EnvVar]` also doesn't work for the same 
reason. If we just set it up so that `List[V1EnvVar]` could work with this, 
that would be great as well.






[GitHub] [airflow] ashb commented on pull request #18772: Don't bake ENV and _cmd into tmp config for non-sudo

2021-10-06 Thread GitBox


ashb commented on pull request #18772:
URL: https://github.com/apache/airflow/pull/18772#issuecomment-936153337


   /cc @khalidmammadov 






[GitHub] [airflow] ashb opened a new pull request #18772: Don't bake ENV and _cmd into tmp config for non-sudo

2021-10-06 Thread GitBox


ashb opened a new pull request #18772:
URL: https://github.com/apache/airflow/pull/18772


   If we are running tasks via sudo then the AIRFLOW__ config env vars won't be 
visible anymore (without them showing up in `ps`), and we might not have 
permission to run the `_cmd`s specified to find the passwords.
   
   But if we are running as the same user then there is no need to "bake" those 
options into the temporary config file -- if the operator decided they didn't 
want those values appearing in a config file on disk, then let's do our best to 
respect that.
   
   Note: this change originally appeared in 2019 (#4050), but a critical piece 
was missing, meaning that the secrets/envs were still actually appearing.
   
   Closes #18723
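For context, the `_cmd` mechanism being discussed works roughly like this. This is a simplified sketch of Airflow's env-var/`_cmd` precedence, not the actual code in `airflow/configuration.py`; `get_config_value` is a hypothetical helper:

```python
import shlex
import subprocess
from typing import Mapping, Optional


def get_config_value(section: str, key: str, env: Mapping[str, str]) -> Optional[str]:
    """Simplified sketch of Airflow's lookup order for sensitive config:
    a plain AIRFLOW__{SECTION}__{KEY} env var wins; otherwise an
    AIRFLOW__{SECTION}__{KEY}_CMD env var names a command whose stdout is
    the value. Running the _cmd requires permission to execute it, which a
    sudo'd task process may not have -- hence the need to bake values in
    for sudo, but not for same-user runs."""
    name = f"AIRFLOW__{section.upper()}__{key.upper()}"
    if name in env:
        return env[name]
    cmd = env.get(name + "_CMD")
    if cmd is not None:
        return subprocess.run(
            shlex.split(cmd), capture_output=True, text=True, check=True
        ).stdout.strip()
    return None
```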
   
   
   






[GitHub] [airflow] ambeshsingh commented on issue #12103: Tasks can be stuck in running state indefinitely

2021-10-06 Thread GitBox


ambeshsingh commented on issue #12103:
URL: https://github.com/apache/airflow/issues/12103#issuecomment-936138592


   > This issue is reported against old version of Airflow that reached end of 
life. If the issue is reproducible on latest Airflow version please let us know
   
   @eladkal yes, I am on composer-1.17.2-airflow-1.10.15 version and 
intermittently facing this issue







[GitHub] [airflow] JavierLopezT commented on a change in pull request #17937: Enable FTPToS3Operator to transfer several files

2021-10-06 Thread GitBox


JavierLopezT commented on a change in pull request #17937:
URL: https://github.com/apache/airflow/pull/17937#discussion_r723149385



##
File path: airflow/providers/amazon/aws/transfers/ftp_to_s3.py
##
@@ -53,52 +64,89 @@ class FTPToS3Operator(BaseOperator):
 :type acl_policy: str
 """
 
-template_fields = (
-'s3_bucket',
-'s3_key',
-'ftp_path',
-)
+template_fields = ('ftp_path', 's3_bucket', 's3_key', 'ftp_filenames', 
's3_filenames')
 
 def __init__(
 self,
-s3_bucket,
-s3_key,
-ftp_path,
-ftp_conn_id='ftp_default',
-aws_conn_id='aws_default',
-replace=False,
-encrypt=False,
-gzip=False,
-acl_policy=None,
-*args,
+*,
+ftp_path: str,
+s3_bucket: str,
+s3_key: str,
+ftp_filenames: Optional[Union[str, List[str]]] = None,
+s3_filenames: Optional[Union[str, List[str]]] = None,
+ftp_conn_id: str = 'ftp_default',
+aws_conn_id: str = 'aws_default',
+replace: bool = False,
+encrypt: bool = False,
+gzip: bool = False,
+acl_policy: str = None,
 **kwargs,
 ):
-super().__init__(*args, **kwargs)
+super().__init__(**kwargs)
+self.ftp_path = ftp_path
 self.s3_bucket = s3_bucket
 self.s3_key = s3_key
-self.ftp_path = ftp_path
+self.ftp_filenames = ftp_filenames
+self.s3_filenames = s3_filenames
 self.aws_conn_id = aws_conn_id
 self.ftp_conn_id = ftp_conn_id
 self.replace = replace
 self.encrypt = encrypt
 self.gzip = gzip
 self.acl_policy = acl_policy
 
-def execute(self, context):
-s3_hook = S3Hook(self.aws_conn_id)
-ftp_hook = FTPHook(ftp_conn_id=self.ftp_conn_id)
+self.ftp_hook = FTPHook(ftp_conn_id=self.ftp_conn_id)

Review comment:
   Sure, I'll modify it. Why is that, though? 
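The likely rationale behind the request (not stated explicitly in this thread) is the usual Airflow convention: operators are instantiated every time the DAG file is parsed, so constructing a hook in `__init__` performs connection lookups at parse time. A minimal sketch of the lazy alternative, using stand-in classes rather than the real `FTPHook`:

```python
from functools import cached_property


class FakeHook:
    """Stand-in for FTPHook; pretend construction is expensive
    (it resolves a connection from the metadata DB)."""
    constructions = 0

    def __init__(self, conn_id: str) -> None:
        FakeHook.constructions += 1
        self.conn_id = conn_id


class SketchOperator:
    """Store only the connection id at parse time; build the hook
    lazily so the cost is paid only when the task actually executes."""

    def __init__(self, ftp_conn_id: str = "ftp_default") -> None:
        self.ftp_conn_id = ftp_conn_id  # cheap: no hook yet

    @cached_property
    def ftp_hook(self) -> FakeHook:
        return FakeHook(conn_id=self.ftp_conn_id)

    def execute(self) -> str:
        return self.ftp_hook.conn_id  # hook is created on first use
```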








[GitHub] [airflow] uranusjr commented on a change in pull request #18757: Make REST API patch user endpoint work the same way as the UI

2021-10-06 Thread GitBox


uranusjr commented on a change in pull request #18757:
URL: https://github.com/apache/airflow/pull/18757#discussion_r723183980



##
File path: airflow/api_connexion/endpoints/user_endpoint.py
##
@@ -124,9 +124,21 @@ def patch_user(username, update_mask=None):
 if user is None:
 detail = f"The User with username `{username}` was not found"
 raise NotFound(title="User not found", detail=detail)
-
-# Get fields to update. 'username' is always excluded (and it's an error to
-# include it in update_maek).
+# Check unique username
+new_username = data.get('username')
+if new_username:
+usr = security_manager.find_user(username=new_username)
+if usr and usr != user:
+raise AlreadyExists(detail=f"The username `{new_username}` already 
exists")
+
+# Check unique email
+email = data.get('email')
+if email:
+usr = security_manager.find_user(email=email)
+if usr and usr != user:
+raise AlreadyExists(detail=f"The email `{email}` already exists")

Review comment:
   Hmm, that's not what the JSON Schema specification says (OpenAPI takes 
both `required` and `minLength` directly from it), so this is more likely a bug 
in the implementation. In any case, maybe we should leave both fields out from 
the specification, but mention that both must not be empty if supplied, and 
validate them (being non-empty) manually instead.
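The manual validation proposed here could look roughly like this; `validate_patch_payload` is a hypothetical helper, not an existing endpoint function:

```python
def validate_patch_payload(data: dict) -> None:
    """Manual check for a PATCH payload: 'username' and 'email' may be
    omitted entirely, but if supplied they must be non-empty. A sketch of
    validating outside the OpenAPI schema, as suggested above."""
    for field in ("username", "email"):
        if field in data and not data[field]:
            raise ValueError(f"{field!r} must not be empty if supplied")
```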












[GitHub] [airflow] ashb commented on a change in pull request #18764: Enable AWS Secrets Manager backend to retrieve conns using different fields

2021-10-06 Thread GitBox


ashb commented on a change in pull request #18764:
URL: https://github.com/apache/airflow/pull/18764#discussion_r723175547



##
File path: airflow/providers/amazon/aws/secrets/secrets_manager.py
##
@@ -74,22 +74,26 @@ class SecretsManagerBackend(BaseSecretsBackend, 
LoggingMixin):
 However, these lists can be extended using the configuration parameter 
``extra_conn_words``.
 
 :param connections_prefix: Specifies the prefix of the secret to read to 
get Connections.
-If set to None (null), requests for connections will not be sent to 
AWS Secrets Manager
+If set to None (null value in the configuration), requests for 
connections will not be
+sent to AWS Secrets Manager. If you don't want a connections_prefix, 
set it as an empty string
 :type connections_prefix: str
 :param variables_prefix: Specifies the prefix of the secret to read to get 
Variables.
-If set to None (null), requests for variables will not be sent to AWS 
Secrets Manager
+If set to None (null value in the configuration), requests for 
variables will not be sent to
+AWS Secrets Manager. If you don't want a variables_prefix, set it as 
an empty string
 :type variables_prefix: str
 :param config_prefix: Specifies the prefix of the secret to read to get 
Configurations.
-If set to None (null), requests for configurations will not be sent to 
AWS Secrets Manager
+If set to None (null value in the configuration), requests for 
configurations will not be sent to
+AWS Secrets Manager. If you don't want a config_prefix, set it as an 
empty string

Review comment:
   Oh ugh. Any idea _why_ we have that? It seems odd to be able to 
configure it to use the Secrets Manager, but then not make requests...?








[GitHub] [airflow] JavierLopezT commented on a change in pull request #18764: Enable AWS Secrets Manager backend to retrieve conns using different fields

2021-10-06 Thread GitBox


JavierLopezT commented on a change in pull request #18764:
URL: https://github.com/apache/airflow/pull/18764#discussion_r723174073



##
File path: airflow/providers/amazon/aws/secrets/secrets_manager.py
##
@@ -74,22 +74,26 @@ class SecretsManagerBackend(BaseSecretsBackend, 
LoggingMixin):
 However, these lists can be extended using the configuration parameter 
``extra_conn_words``.
 
 :param connections_prefix: Specifies the prefix of the secret to read to 
get Connections.
-If set to None (null), requests for connections will not be sent to 
AWS Secrets Manager
+If set to None (null value in the configuration), requests for 
connections will not be
+sent to AWS Secrets Manager. If you don't want a connections_prefix, 
set it as an empty string
 :type connections_prefix: str
 :param variables_prefix: Specifies the prefix of the secret to read to get 
Variables.
-If set to None (null), requests for variables will not be sent to AWS 
Secrets Manager
+If set to None (null value in the configuration), requests for 
variables will not be sent to
+AWS Secrets Manager. If you don't want a variables_prefix, set it as 
an empty string
 :type variables_prefix: str
 :param config_prefix: Specifies the prefix of the secret to read to get 
Configurations.
-If set to None (null), requests for configurations will not be sent to 
AWS Secrets Manager
+If set to None (null value in the configuration), requests for 
configurations will not be sent to
+AWS Secrets Manager. If you don't want a config_prefix, set it as an 
empty string

Review comment:
   If it's null, then it shouldn't get connections. And this must be 
maintained because that is what the previous behavior was:
   
https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/secrets-backends/aws-secrets-manager.html#optional-lookup
   
https://github.com/apache/airflow/blob/eda538f56cb2dc2728d303acddb42841fe419c36/airflow/providers/amazon/aws/secrets/secrets_manager.py#L57
   
https://github.com/apache/airflow/blob/eda538f56cb2dc2728d303acddb42841fe419c36/airflow/providers/amazon/aws/secrets/secrets_manager.py#L112-L113
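The distinction under discussion -- a null prefix disables the lookup, while an empty-string prefix just means no prefix on the secret name -- can be sketched like this; `build_secret_id` is an illustrative helper, not the provider's actual method:

```python
from typing import Optional


def build_secret_id(prefix: Optional[str], sep: str, conn_id: str) -> Optional[str]:
    """Sketch of the documented behavior: None means "don't call AWS
    Secrets Manager at all" (optional lookup), the empty string means the
    secret id is just the connection id with no leading prefix."""
    if prefix is None:
        return None  # skip Secrets Manager entirely
    if prefix == "":
        return conn_id  # bare secret name, no prefix
    return f"{prefix}{sep}{conn_id}"
```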
   
   












[GitHub] [airflow] kaxil edited a comment on pull request #18759: Movin trigger dag to operations folder

2021-10-06 Thread GitBox


kaxil edited a comment on pull request #18759:
URL: https://github.com/apache/airflow/pull/18759#issuecomment-935745429


   Let's wait on merging this -- we can include it in 2.2.1






[airflow] branch main updated (c9bf5f3 -> cf27419)

2021-10-06 Thread ash
This is an automated email from the ASF dual-hosted git repository.

ash pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from c9bf5f3  Coerce datetime to pendulum for timetable (#18522)
 add cf27419  Fix Pendulum 1.x references in documentation (#18766)

No new revisions were added by this update.

Summary of changes:
 docs/apache-airflow/templates-ref.rst | 16 
 docs/apache-airflow/timezone.rst  |  2 +-
 2 files changed, 9 insertions(+), 9 deletions(-)


[GitHub] [airflow] ashb closed issue #17853: Macros reference point to outdated pendulum information

2021-10-06 Thread GitBox


ashb closed issue #17853:
URL: https://github.com/apache/airflow/issues/17853


   






[GitHub] [airflow] ashb merged pull request #18766: Fix Pendulum 1.x references in documentation

2021-10-06 Thread GitBox


ashb merged pull request #18766:
URL: https://github.com/apache/airflow/pull/18766


   






[GitHub] [airflow] ashb commented on a change in pull request #18764: Enable AWS Secrets Manager backend to retrieve conns using different fields

2021-10-06 Thread GitBox


ashb commented on a change in pull request #18764:
URL: https://github.com/apache/airflow/pull/18764#discussion_r723171088



##
File path: 
docs/apache-airflow-providers-amazon/secrets-backends/aws-secrets-manager.rst
##
@@ -78,6 +79,7 @@ Verify that you can get the secret:
 "CreatedDate": "2020-04-08T02:10:35.132000+01:00"
 }
 
+If you don't want to use any ``connections_prefix`` for retrieving 
connections, set it as an empty string ``""`` in the configuration.

Review comment:
   ```suggestion
   If you don't want to use any ``connections_prefix`` for retrieving 
connections, set it as ``null`` in the configuration.
   ```








[GitHub] [airflow] ashb commented on a change in pull request #18764: Enable AWS Secrets Manager backend to retrieve conns using different fields

2021-10-06 Thread GitBox


ashb commented on a change in pull request #18764:
URL: https://github.com/apache/airflow/pull/18764#discussion_r723170620



##
File path: airflow/providers/amazon/aws/secrets/secrets_manager.py
##
@@ -74,22 +74,26 @@ class SecretsManagerBackend(BaseSecretsBackend, 
LoggingMixin):
 However, these lists can be extended using the configuration parameter 
``extra_conn_words``.
 
 :param connections_prefix: Specifies the prefix of the secret to read to 
get Connections.
-If set to None (null), requests for connections will not be sent to 
AWS Secrets Manager
+If set to None (null value in the configuration), requests for 
connections will not be
+sent to AWS Secrets Manager. If you don't want a connections_prefix, 
set it as an empty string
 :type connections_prefix: str
 :param variables_prefix: Specifies the prefix of the secret to read to get 
Variables.
-If set to None (null), requests for variables will not be sent to AWS 
Secrets Manager
+If set to None (null value in the configuration), requests for 
variables will not be sent to
+AWS Secrets Manager. If you don't want a variables_prefix, set it as 
an empty string
 :type variables_prefix: str
 :param config_prefix: Specifies the prefix of the secret to read to get 
Configurations.
-If set to None (null), requests for configurations will not be sent to 
AWS Secrets Manager
+If set to None (null value in the configuration), requests for 
configurations will not be sent to
+AWS Secrets Manager. If you don't want a config_prefix, set it as an 
empty string

Review comment:
   We don't need an empty string -- we can set it as `null` in the JSON config








[GitHub] [airflow] GuidoTournois opened a new issue #18771: Current implementation of PagerdutyHook requires integration key to be present twice in connection

2021-10-06 Thread GitBox


GuidoTournois opened a new issue #18771:
URL: https://github.com/apache/airflow/issues/18771


   ### Apache Airflow Provider(s)
   
   pagerduty
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-pagerduty==2.0.1
   
   ### Apache Airflow version
   
   2.1.4 (latest released)
   
   ### Operating System
   
   macOS catalina
   
   ### Deployment
   
   Virtualenv installation
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   Currently, the PagerdutyHook has two attributes referring to the same 
integration key: `routing_key` and `token`.
   
   - `routing_key`: This attribute is retrieved from the `Extra` field and is 
used to perform the actual POST request to the PagerDuty endpoint. In 
`create_event`, the value is asserted to be not `None`.
   - `token`: This attribute is retrieved from the `Password` field, or can be 
set when instantiating the class by passing it into `__init__`, where its value 
is asserted to be not `None`. The token is **not used** for sending alerts to 
PagerDuty.
   
   As a result, the integration key needs to be set in both the password and 
the extra field. This makes no sense to me; I would expect to set the token 
only once in the Add connection form.
   
   Also, it makes the code unnecessarily complex.
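A single-source resolution of the integration key, along the lines the issue asks for, might look like this sketch; `resolve_integration_key` is hypothetical and the parameter names simply mirror the issue's description, not any released provider API:

```python
from typing import Optional


def resolve_integration_key(
    password: Optional[str],
    extra_routing_key: Optional[str],
    override: Optional[str] = None,
) -> str:
    """Take the key from an explicit argument, else the connection's
    password, else the legacy 'routing_key' in Extra -- so the same key
    never has to be entered in two connection fields."""
    key = override or password or extra_routing_key
    if key is None:
        raise ValueError("PagerDuty integration key not found in connection")
    return key
```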
   
   ### What you expected to happen
   
   _No response_
   
   ### How to reproduce
   
   _No response_
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   






[GitHub] [airflow] ephraimbuddy commented on a change in pull request #18757: Make REST API patch user endpoint work the same way as the UI

2021-10-06 Thread GitBox


ephraimbuddy commented on a change in pull request #18757:
URL: https://github.com/apache/airflow/pull/18757#discussion_r723153989



##
File path: airflow/api_connexion/endpoints/user_endpoint.py
##
@@ -124,9 +124,21 @@ def patch_user(username, update_mask=None):
 if user is None:
 detail = f"The User with username `{username}` was not found"
 raise NotFound(title="User not found", detail=detail)
-
-# Get fields to update. 'username' is always excluded (and it's an error to
-# include it in update_maek).
+# Check unique username
+new_username = data.get('username')
+if new_username:
+usr = security_manager.find_user(username=new_username)
+if usr and usr != user:
+raise AlreadyExists(detail=f"The username `{new_username}` already 
exists")
+
+# Check unique email
+email = data.get('email')
+if email:
+usr = security_manager.find_user(email=email)
+if usr and usr != user:
+raise AlreadyExists(detail=f"The email `{email}` already exists")

Review comment:
   Probably we should have a different form for update? Post form should 
have some fields as required
   








[GitHub] [airflow] JavierLopezT commented on pull request #18764: Enable AWS Secrets Manager backend to retrieve conns using different fields

2021-10-06 Thread GitBox


JavierLopezT commented on pull request #18764:
URL: https://github.com/apache/airflow/pull/18764#issuecomment-936060522


   > I'm not convinced this is needed.
   > 
   > I think the problem is that example had invalid JSON, (`{"x": True}` is 
not valid JSON) so as a result of this code
   > 
   > 
https://github.com/apache/airflow/blob/c9bf5f33e5d5bcbf7d31663a8571628434d7073f/airflow/configuration.py#L1031-L1036
   > 
   > it ended up silently ignoring the config you tried.
   
   Ok, I didn't know that. I searched for this but I couldn't find it. Thank 
you very much. I'll make changes accordingly
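The invalid-JSON pitfall referenced above is easy to demonstrate: `{"x": True}` is a valid Python literal but not valid JSON (JSON booleans are lowercase), so `json.loads` rejects it:

```python
import json

valid = '{"x": true}'    # JSON booleans are lowercase
invalid = '{"x": True}'  # Python-literal True is not valid JSON

parsed = json.loads(valid)

try:
    json.loads(invalid)
    bad_json_raised = False
except json.JSONDecodeError:
    bad_json_raised = True
```

This is why a config value written with Python-style booleans can end up being silently ignored rather than parsed.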






[GitHub] [airflow] JavierLopezT commented on a change in pull request #17937: Enable FTPToS3Operator to transfer several files

2021-10-06 Thread GitBox


JavierLopezT commented on a change in pull request #17937:
URL: https://github.com/apache/airflow/pull/17937#discussion_r723149385



##
File path: airflow/providers/amazon/aws/transfers/ftp_to_s3.py
##
@@ -53,52 +64,89 @@ class FTPToS3Operator(BaseOperator):
 :type acl_policy: str
 """
 
-template_fields = (
-'s3_bucket',
-'s3_key',
-'ftp_path',
-)
+template_fields = ('ftp_path', 's3_bucket', 's3_key', 'ftp_filenames', 
's3_filenames')
 
 def __init__(
 self,
-s3_bucket,
-s3_key,
-ftp_path,
-ftp_conn_id='ftp_default',
-aws_conn_id='aws_default',
-replace=False,
-encrypt=False,
-gzip=False,
-acl_policy=None,
-*args,
+*,
+ftp_path: str,
+s3_bucket: str,
+s3_key: str,
+ftp_filenames: Optional[Union[str, List[str]]] = None,
+s3_filenames: Optional[Union[str, List[str]]] = None,
+ftp_conn_id: str = 'ftp_default',
+aws_conn_id: str = 'aws_default',
+replace: bool = False,
+encrypt: bool = False,
+gzip: bool = False,
+acl_policy: str = None,
 **kwargs,
 ):
-super().__init__(*args, **kwargs)
+super().__init__(**kwargs)
+self.ftp_path = ftp_path
 self.s3_bucket = s3_bucket
 self.s3_key = s3_key
-self.ftp_path = ftp_path
+self.ftp_filenames = ftp_filenames
+self.s3_filenames = s3_filenames
 self.aws_conn_id = aws_conn_id
 self.ftp_conn_id = ftp_conn_id
 self.replace = replace
 self.encrypt = encrypt
 self.gzip = gzip
 self.acl_policy = acl_policy
 
-def execute(self, context):
-s3_hook = S3Hook(self.aws_conn_id)
-ftp_hook = FTPHook(ftp_conn_id=self.ftp_conn_id)
+self.ftp_hook = FTPHook(ftp_conn_id=self.ftp_conn_id)

Review comment:
   Sure, I'll modify it. Why is that, though? 








[GitHub] [airflow] josh-fell commented on a change in pull request #18760: Amazon SQS Example

2021-10-06 Thread GitBox


josh-fell commented on a change in pull request #18760:
URL: https://github.com/apache/airflow/pull/18760#discussion_r723145605



##
File path: airflow/providers/amazon/aws/example_dags/example_sqs.py
##
@@ -0,0 +1,83 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow import DAG
+from airflow.operators.python import PythonOperator
+from airflow.providers.amazon.aws.hooks.sqs import SQSHook
+from airflow.providers.amazon.aws.operators.sqs import SQSPublishOperator
+from airflow.providers.amazon.aws.sensors.sqs import SQSSensor
+from airflow.utils.dates import days_ago
+
+QUEUE_NAME = 'Airflow-Example-Queue'
+AWS_CONN_ID = 'aws_default'
+
+
+def create_queue_fn(**kwargs):

Review comment:
   There is also an ongoing effort to transition away from using 
`days_ago(n)` for `start_date` to a static value, as well as cleaning up 
`default_args` usage so they are pertinent for the example. #18597 is an 
example of these updates for reference. 
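
   A minimal sketch of that convention, with hypothetical DAG kwargs (nothing here is the example DAG's actual code, and `DAG` itself is not imported so the snippet stays standalone):

   ```python
   from datetime import datetime

   # days_ago(2) is re-evaluated on every DAG parse, so the start date drifts;
   # a pinned datetime keeps scheduling deterministic.
   dag_kwargs = dict(
       dag_id="example_sqs",             # hypothetical id
       start_date=datetime(2021, 1, 1),  # static, instead of days_ago(2)
       schedule_interval=None,
       catchup=False,
   )
   print(dag_kwargs["start_date"].isoformat())  # 2021-01-01T00:00:00
   ```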








[GitHub] [airflow] eladkal opened a new issue #18770: Properly handle verify parameter in TrinoHook

2021-10-06 Thread GitBox


eladkal opened a new issue #18770:
URL: https://github.com/apache/airflow/issues/18770


   ### Body
   
   This task refers to comment left in the Hook:
   
   
https://github.com/apache/airflow/blob/c9bf5f33e5d5bcbf7d31663a8571628434d7073f/airflow/providers/trino/hooks/trino.py#L96-L100
   
   The change of https://github.com/trinodb/trino-python-client/pull/31  was 
released in trino version 0.301.0
   
   **The task:**
   Change the code + 
[tests](https://github.com/apache/airflow/blob/main/tests/providers/trino/hooks/test_trino.py)
 to work with the verify parameter directly. It will also require setting min 
version for trino in `setup.py`
   
   
   ### Committer
   
   - [X] I acknowledge that I am a maintainer/committer of the Apache Airflow 
project.






[GitHub] [airflow] josh-fell commented on a change in pull request #18760: Amazon SQS Example

2021-10-06 Thread GitBox


josh-fell commented on a change in pull request #18760:
URL: https://github.com/apache/airflow/pull/18760#discussion_r723141935



##
File path: airflow/providers/amazon/aws/example_dags/example_sqs.py
##
@@ -0,0 +1,83 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+from airflow import DAG
+from airflow.operators.python import PythonOperator
+from airflow.providers.amazon.aws.hooks.sqs import SQSHook
+from airflow.providers.amazon.aws.operators.sqs import SQSPublishOperator
+from airflow.providers.amazon.aws.sensors.sqs import SQSSensor
+from airflow.utils.dates import days_ago
+
+QUEUE_NAME = 'Airflow-Example-Queue'
+AWS_CONN_ID = 'aws_default'
+
+
+def create_queue_fn(**kwargs):

Review comment:
   @john-jac There weren't a lot of provider example DAGs that warranted an 
update to use the TaskFlow API but #18278 converted the applicable ones. 








[GitHub] [airflow] ashb commented on a change in pull request #18767: Update ftp.py

2021-10-06 Thread GitBox


ashb commented on a change in pull request #18767:
URL: https://github.com/apache/airflow/pull/18767#discussion_r723140771



##
File path: airflow/providers/ftp/hooks/ftp.py
##
@@ -60,6 +60,10 @@ def get_conn(self) -> ftplib.FTP:
 if self.conn is None:
 params = self.get_connection(self.ftp_conn_id)
 pasv = params.extra_dejson.get("passive", True)
+
+if params.port:
+ftplib.FTP.port = params.port

Review comment:
   Changing the global like this isn't right. Please find another way.
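
   To see why, note that `ftplib.FTP.port` is a class attribute, so assigning to it changes the default for every `FTP` instance in the process. The per-connection alternative via `connect()` at the end is an assumption about how the hook could be restructured, not the PR's code:

   ```python
   import ftplib

   assert ftplib.FTP.port == 21  # module-wide default

   # Mutating the class attribute leaks into every instance...
   ftplib.FTP.port = 2121
   leaked = ftplib.FTP()
   print(leaked.port)            # 2121, even for unrelated connections
   ftplib.FTP.port = 21          # restore the default

   # ...whereas connect() takes the port per connection:
   ftp = ftplib.FTP()
   # ftp.connect("ftp.example.com", 2121)  # illustrative host; not executed
   ```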








[GitHub] [airflow] ashb commented on pull request #18764: Enable AWS Secrets Manager backend to retrieve conns using different fields

2021-10-06 Thread GitBox


ashb commented on pull request #18764:
URL: https://github.com/apache/airflow/pull/18764#issuecomment-935999358


   I see there is one tiny change in here, but the rest of the changes need 
reverting.






[GitHub] [airflow] ashb closed pull request #18764: Enable AWS Secrets Manager backend to retrieve conns using different fields

2021-10-06 Thread GitBox


ashb closed pull request #18764:
URL: https://github.com/apache/airflow/pull/18764


   






[GitHub] [airflow] ashb commented on a change in pull request #17937: Enable FTPToS3Operator to transfer several files

2021-10-06 Thread GitBox


ashb commented on a change in pull request #17937:
URL: https://github.com/apache/airflow/pull/17937#discussion_r723132242



##
File path: airflow/providers/amazon/aws/transfers/ftp_to_s3.py
##
@@ -53,52 +64,89 @@ class FTPToS3Operator(BaseOperator):
 :type acl_policy: str
 """
 
-template_fields = (
-'s3_bucket',
-'s3_key',
-'ftp_path',
-)
+template_fields = ('ftp_path', 's3_bucket', 's3_key', 'ftp_filenames', 's3_filenames')
 
 def __init__(
 self,
-s3_bucket,
-s3_key,
-ftp_path,
-ftp_conn_id='ftp_default',
-aws_conn_id='aws_default',
-replace=False,
-encrypt=False,
-gzip=False,
-acl_policy=None,
-*args,
+*,
+ftp_path: str,
+s3_bucket: str,
+s3_key: str,
+ftp_filenames: Optional[Union[str, List[str]]] = None,
+s3_filenames: Optional[Union[str, List[str]]] = None,
+ftp_conn_id: str = 'ftp_default',
+aws_conn_id: str = 'aws_default',
+replace: bool = False,
+encrypt: bool = False,
+gzip: bool = False,
+acl_policy: str = None,
 **kwargs,
 ):
-super().__init__(*args, **kwargs)
+super().__init__(**kwargs)
+self.ftp_path = ftp_path
 self.s3_bucket = s3_bucket
 self.s3_key = s3_key
-self.ftp_path = ftp_path
+self.ftp_filenames = ftp_filenames
+self.s3_filenames = s3_filenames
 self.aws_conn_id = aws_conn_id
 self.ftp_conn_id = ftp_conn_id
 self.replace = replace
 self.encrypt = encrypt
 self.gzip = gzip
 self.acl_policy = acl_policy
 
-def execute(self, context):
-s3_hook = S3Hook(self.aws_conn_id)
-ftp_hook = FTPHook(ftp_conn_id=self.ftp_conn_id)
+self.ftp_hook = FTPHook(ftp_conn_id=self.ftp_conn_id)
+
+def __upload_to_s3_from_ftp(self, remote_filename, s3_file_key):
+s3 = S3Hook(self.aws_conn_id)
 
 with NamedTemporaryFile() as local_tmp_file:
-ftp_hook.retrieve_file(
-remote_full_path=self.ftp_path, local_full_path_or_buffer=local_tmp_file.name
+self.ftp_hook.retrieve_file(
+remote_full_path=remote_filename, local_full_path_or_buffer=local_tmp_file.name
 )
 
-s3_hook.load_file(
+s3.load_file(
 filename=local_tmp_file.name,
-key=self.s3_key,
+key=s3_file_key,
 bucket_name=self.s3_bucket,
 replace=self.replace,
 encrypt=self.encrypt,
 gzip=self.gzip,
 acl_policy=self.acl_policy,
 )
+self.log.info(f'File upload to {s3_file_key}')
+
+def execute(self, context):
+if self.ftp_filenames:
+if isinstance(self.ftp_filenames, str):
+self.log.info(f'Getting files in {self.ftp_path}')
+
+list_dir = self.ftp_hook.list_directory(
+path=self.ftp_path,
+)
+
+if self.ftp_filenames == 'all':

Review comment:
   ```suggestion
   if self.ftp_filenames == '*':
   ```
   
   Using a string of `all` here feels odd/error prone.
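
   If wildcard semantics are wanted anyway, stdlib `fnmatch` would let `'*'` (or any glob) select files from the directory listing without a magic string; a sketch with made-up filenames standing in for `ftp_hook.list_directory(...)`:

   ```python
   import fnmatch

   # Stand-in for ftp_hook.list_directory(path=self.ftp_path)
   list_dir = ["a.csv", "b.csv", "notes.txt"]

   print(fnmatch.filter(list_dir, "*"))      # ['a.csv', 'b.csv', 'notes.txt']
   print(fnmatch.filter(list_dir, "*.csv"))  # ['a.csv', 'b.csv']
   ```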








[GitHub] [airflow] ashb commented on a change in pull request #17937: Enable FTPToS3Operator to transfer several files

2021-10-06 Thread GitBox


ashb commented on a change in pull request #17937:
URL: https://github.com/apache/airflow/pull/17937#discussion_r723131359



##
File path: airflow/providers/amazon/aws/transfers/ftp_to_s3.py
##
@@ -53,52 +64,89 @@ class FTPToS3Operator(BaseOperator):
 :type acl_policy: str
 """
 
-template_fields = (
-'s3_bucket',
-'s3_key',
-'ftp_path',
-)
+template_fields = ('ftp_path', 's3_bucket', 's3_key', 'ftp_filenames', 's3_filenames')
 
 def __init__(
 self,
-s3_bucket,
-s3_key,
-ftp_path,
-ftp_conn_id='ftp_default',
-aws_conn_id='aws_default',
-replace=False,
-encrypt=False,
-gzip=False,
-acl_policy=None,
-*args,
+*,
+ftp_path: str,
+s3_bucket: str,
+s3_key: str,
+ftp_filenames: Optional[Union[str, List[str]]] = None,
+s3_filenames: Optional[Union[str, List[str]]] = None,
+ftp_conn_id: str = 'ftp_default',
+aws_conn_id: str = 'aws_default',
+replace: bool = False,
+encrypt: bool = False,
+gzip: bool = False,
+acl_policy: str = None,
 **kwargs,
 ):
-super().__init__(*args, **kwargs)
+super().__init__(**kwargs)
+self.ftp_path = ftp_path
 self.s3_bucket = s3_bucket
 self.s3_key = s3_key
-self.ftp_path = ftp_path
+self.ftp_filenames = ftp_filenames
+self.s3_filenames = s3_filenames
 self.aws_conn_id = aws_conn_id
 self.ftp_conn_id = ftp_conn_id
 self.replace = replace
 self.encrypt = encrypt
 self.gzip = gzip
 self.acl_policy = acl_policy
 
-def execute(self, context):
-s3_hook = S3Hook(self.aws_conn_id)
-ftp_hook = FTPHook(ftp_conn_id=self.ftp_conn_id)
+self.ftp_hook = FTPHook(ftp_conn_id=self.ftp_conn_id)

Review comment:
   Do not create hooks in the constructor of operators please.
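
   One common way providers satisfy this rule is to build the hook lazily behind `functools.cached_property`, so nothing touches connections at DAG-parse time. A simplified, self-contained sketch — both classes below are stand-ins, not the real Airflow ones:

   ```python
   from functools import cached_property


   class FTPHook:  # stand-in for airflow.providers.ftp.hooks.ftp.FTPHook
       def __init__(self, ftp_conn_id):
           self.ftp_conn_id = ftp_conn_id


   class FTPToS3OperatorSketch:
       def __init__(self, *, ftp_conn_id="ftp_default"):
           # Constructor only stores plain config; no hook, no connection.
           self.ftp_conn_id = ftp_conn_id

       @cached_property
       def ftp_hook(self):
           # Created on first access (i.e. inside execute()), then reused.
           return FTPHook(ftp_conn_id=self.ftp_conn_id)

       def execute(self, context):
           return self.ftp_hook.ftp_conn_id


   op = FTPToS3OperatorSketch(ftp_conn_id="my_ftp")
   print(op.execute({}))  # my_ftp
   ```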








[GitHub] [airflow] ashb commented on a change in pull request #18764: Enable AWS Secrets Manager backend to retrieve conns using different fields

2021-10-06 Thread GitBox


ashb commented on a change in pull request #18764:
URL: https://github.com/apache/airflow/pull/18764#discussion_r723130861



##
File path: 
docs/apache-airflow-providers-amazon/secrets-backends/aws-secrets-manager.rst
##
@@ -27,7 +27,7 @@ Here is a sample configuration:
 
 [secrets]
backend = airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend
-backend_kwargs = {"connections_prefix": "airflow/connections", "variables_prefix": "airflow/variables", "profile_name": "default", "full_url_mode": False}
+backend_kwargs = {"connections_prefix": "airflow/connections", "variables_prefix": "airflow/variables", "profile_name": "default", "full_url_mode": "False"}

Review comment:
   Yeah, this should work. I don't get what the problem is here.








[GitHub] [airflow] ashb commented on a change in pull request #18764: Enable AWS Secrets Manager backend to retrieve conns using different fields

2021-10-06 Thread GitBox


ashb commented on a change in pull request #18764:
URL: https://github.com/apache/airflow/pull/18764#discussion_r723129199



##
File path: 
docs/apache-airflow-providers-amazon/secrets-backends/aws-secrets-manager.rst
##
@@ -27,7 +27,7 @@ Here is a sample configuration:
 
 [secrets]
backend = airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend
-backend_kwargs = {"connections_prefix": "airflow/connections", "variables_prefix": "airflow/variables", "profile_name": "default", "full_url_mode": False}
+backend_kwargs = {"connections_prefix": "airflow/connections", "variables_prefix": "airflow/variables", "profile_name": "default", "full_url_mode": "False"}

Review comment:
   Isn't backend kwargs parsed as json? So the correct fix for this would be
   
   
   ```suggestion
    backend_kwargs = {"connections_prefix": "airflow/connections", "variables_prefix": "airflow/variables", "profile_name": "default", "full_url_mode": false}
   ```








[GitHub] [airflow] ashb commented on a change in pull request #18742: Always run at least one backfill from BackfillJob

2021-10-06 Thread GitBox


ashb commented on a change in pull request #18742:
URL: https://github.com/apache/airflow/pull/18742#discussion_r723122515



##
File path: tests/cli/commands/test_dag_command.py
##
@@ -543,6 +544,7 @@ def test_dag_test_show_dag(self, mock_get_dag, mock_executor, mock_render_dag):
 executor=mock_executor.return_value,
 start_date=cli_args.execution_date,
 end_date=cli_args.execution_date,
+run_at_least_once=True

Review comment:
   ```suggestion
   run_at_least_once=True,
   ```








[GitHub] [airflow] ashb edited a comment on pull request #18745: Don't install SQLAlchemy/Pendulum adapters for other DBs

2021-10-06 Thread GitBox


ashb edited a comment on pull request #18745:
URL: https://github.com/apache/airflow/pull/18745#issuecomment-935958530


   The same problem people are reporting in that thread is happening in this CI 
build now, and also for me locally
   
   ```
 _ 
TestMovingCoreToContrib.test_is_subclass_393_airflow_providers_google_cloud_transfers_mysql_to_gcs_MySQLToGCSOperator
 _
 
 thing = 
 comp = 'mysql_to_gcs'
 import_path = 'airflow.providers.google.cloud.transfers.mysql_to_gcs'
 
 def _dot_lookup(thing, comp, import_path):
 try:
 >   return getattr(thing, comp)
 E   AttributeError: module 
'airflow.providers.google.cloud.transfers' has no attribute 'mysql_to_gcs'
 
 /usr/local/lib/python3.6/unittest/mock.py:1075: AttributeError
 
 During handling of the above exception, another exception occurred:
 
 """
 
 try:
 from MySQLdb.release import version_info
 >   from . import _mysql
 E   ImportError: /usr/lib/x86_64-linux-gnu/libstdc++.so.6: cannot 
allocate memory in static TLS block
 
 /usr/local/lib/python3.6/site-packages/MySQLdb/__init__.py:18: ImportError
 
 During handling of the above exception, another exception occurred:
 
 a = (,)
 
 @wraps(func)
 def standalone_func(*a):
 >   return func(*(a + p.args), **p.kwargs)
   ```







[GitHub] [airflow] uranusjr commented on a change in pull request #18757: Make REST API patch user endpoint work the same way as the UI

2021-10-06 Thread GitBox


uranusjr commented on a change in pull request #18757:
URL: https://github.com/apache/airflow/pull/18757#discussion_r723114533



##
File path: tests/api_connexion/endpoints/test_user_endpoint.py
##
@@ -472,6 +472,45 @@ def test_change_with_update_maek(self, autoclean_username, autoclean_user_payloa
 assert data["first_name"] == "Changed"
 assert data["last_name"] == "User"
 
+@pytest.mark.parametrize(
+"payload, error_message",
+[
+({"username": "another_user"}, "The username `another_user` already exists"),
+({"email": "another_u...@example.com"}, "The email `another_u...@example.com` already exists"),
+],
+ids=["username", "email"],
+)
+@pytest.mark.usefixtures("user_different")
+@pytest.mark.usefixtures("autoclean_admin_user")
+def test_patch_already_exists(
+self,
+payload,
+error_message,
+autoclean_user_payload,
+autoclean_username,
+):
+autoclean_user_payload.update(payload)
+response = self.client.patch(
+f"/api/v1/users/{autoclean_username}",
+json=autoclean_user_payload,
+environ_overrides={"REMOTE_USER": "test"},
+)
+assert response.status_code == 409, response.json
+
+assert response.json["detail"] == error_message
+
+@pytest.mark.usefixtures('autoclean_admin_user')
+def test_username_can_be_updated(self, autoclean_user_payload, 
autoclean_username):
+testusername = 'testusername'
+autoclean_user_payload.update({"username": testusername})
+response = self.client.patch(
+f"/api/v1/users/{autoclean_username}",
+json=autoclean_user_payload,
+environ_overrides={"REMOTE_USER": "test"},
+)
+assert response.json['username'] == testusername
+_delete_user(username=testusername)

Review comment:
   This would cause the db to not be cleaned up correctly if the assertion 
above fails. The cleanup needs to be placed in a fixture instead.
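
   The pattern being asked for can be sketched with a yield-style fixture; `_delete_user` and the username below are stand-ins for the test module's helpers, and the generator is shown undecorated (it would carry `@pytest.fixture` in the real module) so it runs standalone:

   ```python
   created_users = []


   def _delete_user(username):  # stand-in for the module-level helper
       created_users.remove(username)


   def temp_username():  # decorate with @pytest.fixture in the test module
       username = "testusername"
       created_users.append(username)
       try:
           yield username
       finally:
           # Teardown runs even when the test body's assertion fails,
           # unlike cleanup code placed after the assert.
           _delete_user(username)


   # Simulate a failing test body: cleanup still happens.
   gen = temp_username()
   name = next(gen)
   try:
       gen.throw(AssertionError("response mismatch"))
   except AssertionError:
       pass
   print(created_users)  # []
   ```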









[GitHub] [airflow] ephraimbuddy commented on a change in pull request #18757: Make REST API patch user endpoint work the same way as the UI

2021-10-06 Thread GitBox


ephraimbuddy commented on a change in pull request #18757:
URL: https://github.com/apache/airflow/pull/18757#discussion_r723108726



##
File path: airflow/api_connexion/schemas/user_schema.py
##
@@ -33,11 +33,11 @@ class Meta:
 model = User
 dateformat = "iso"
 
-first_name = auto_field()
-last_name = auto_field()
-username = auto_field()
+first_name = auto_field(required=True)

Review comment:
   ```suggestion
   first_name = auto_field()
   ```

##
File path: airflow/api_connexion/endpoints/user_endpoint.py
##
@@ -124,9 +124,21 @@ def patch_user(username, update_mask=None):
 if user is None:
 detail = f"The User with username `{username}` was not found"
 raise NotFound(title="User not found", detail=detail)
-
-# Get fields to update. 'username' is always excluded (and it's an error to
-# include it in update_maek).
+# Check unique username
+new_username = data.get('username')
+if new_username:
+usr = security_manager.find_user(username=new_username)
+if usr and usr != user:
+raise AlreadyExists(detail=f"The username `{new_username}` already exists")
+
+# Check unique email
+email = data.get('email')
+if email:
+usr = security_manager.find_user(email=email)
+if usr and usr != user:
+raise AlreadyExists(detail=f"The email `{email}` already exists")

Review comment:
   Unfortunately, min length means required in the OpenAPI doc








[GitHub] [airflow] uranusjr commented on a change in pull request #18757: Make REST API patch user endpoint work the same way as the UI

2021-10-06 Thread GitBox


uranusjr commented on a change in pull request #18757:
URL: https://github.com/apache/airflow/pull/18757#discussion_r723113723



##
File path: airflow/api_connexion/endpoints/user_endpoint.py
##
@@ -124,9 +124,21 @@ def patch_user(username, update_mask=None):
 if user is None:
 detail = f"The User with username `{username}` was not found"
 raise NotFound(title="User not found", detail=detail)
-
-# Get fields to update. 'username' is always excluded (and it's an error to
-# include it in update_maek).
+# Check unique username
+new_username = data.get('username')
+if new_username:

Review comment:
   Maybe?
   
   ```suggestion
   if new_username is not None and new_username != user.username:
   ```
   
   This saves the `find_user` call (which incurs a round-trip to the backend) 
if the username is supplied but not changing. Same for the email logic.








[GitHub] [airflow] boring-cyborg[bot] commented on pull request #18767: Update ftp.py

2021-10-06 Thread GitBox


boring-cyborg[bot] commented on pull request #18767:
URL: https://github.com/apache/airflow/pull/18767#issuecomment-935932214


   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything, please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, mypy and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst)
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for 
testing locally, it’s a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   






[GitHub] [airflow] rentom opened a new pull request #18767: Update ftp.py

2021-10-06 Thread GitBox


rentom opened a new pull request #18767:
URL: https://github.com/apache/airflow/pull/18767


   FTPHook did not use the specified FTP connection port. It looks like FTPSHook 
does, so I've added pretty much the same line here.
   
   First time for me trying to contribute something here, so I might not be 
following every guideline or rule there is. Sorry for that. My suggestion here 
is, however, pretty simple.
   
   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code change, Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/main/UPDATING.md).
   






[GitHub] [airflow] ephraimbuddy commented on a change in pull request #18757: Make REST API patch user endpoint work the same way as the UI

2021-10-06 Thread GitBox


ephraimbuddy commented on a change in pull request #18757:
URL: https://github.com/apache/airflow/pull/18757#discussion_r723102202



##
File path: airflow/api_connexion/endpoints/user_endpoint.py
##
@@ -124,9 +124,21 @@ def patch_user(username, update_mask=None):
 if user is None:
 detail = f"The User with username `{username}` was not found"
 raise NotFound(title="User not found", detail=detail)
-
-# Get fields to update. 'username' is always excluded (and it's an error to
-# include it in update_maek).
+# Check unique username
+new_username = data.get('username')
+if new_username:
+usr = security_manager.find_user(username=new_username)
+if usr and usr != user:
+raise AlreadyExists(detail=f"The username `{new_username}` already 
exists")
+
+# Check unique email
+email = data.get('email')
+if email:
+usr = security_manager.find_user(email=email)
+if usr and usr != user:
+raise AlreadyExists(detail=f"The email `{email}` already exists")

Review comment:
   You're correct!




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] uranusjr commented on a change in pull request #18757: Make REST API patch user endpoint work the same way as the UI

2021-10-06 Thread GitBox


uranusjr commented on a change in pull request #18757:
URL: https://github.com/apache/airflow/pull/18757#discussion_r723091557



##
File path: airflow/api_connexion/endpoints/user_endpoint.py
##
@@ -124,9 +124,21 @@ def patch_user(username, update_mask=None):
 if user is None:
 detail = f"The User with username `{username}` was not found"
 raise NotFound(title="User not found", detail=detail)
-
-# Get fields to update. 'username' is always excluded (and it's an error to
-# include it in update_maek).
+# Check unique username
+new_username = data.get('username')
+if new_username:
+usr = security_manager.find_user(username=new_username)
+if usr and usr != user:
+raise AlreadyExists(detail=f"The username `{new_username}` already 
exists")
+
+# Check unique email
+email = data.get('email')
+if email:
+usr = security_manager.find_user(email=email)
+if usr and usr != user:
+raise AlreadyExists(detail=f"The email `{email}` already exists")

Review comment:
   I think one important consideration is that those fields are pre-filled 
in the web UI (I think by Flask-Appbuilder?) so setting them as required 
imposes no cost to the user, but that's not the case for the API. The PATCH 
verb is designed to allow the user to not send fields they don't want to 
modify, so requiring fields is kind of against the spirit IMO, especially for 
`email` (`username` is already needed to access the endpoint anyway so 
requiring it in the payload is only unnecessary duplication and slightly less 
problematic).
   
   But even if we decide to make these fields required, they still need to have 
`minLength: 1` IMO. It's very non-obvious that setting those fields to an empty 
string implies not changing them. The expected behaviour for providing an empty 
string in a field is setting that field to an empty string.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] eladkal edited a comment on issue #18664: [Oracle] Oracle Hook - make it possible to define a schema in the connection parameters

2021-10-06 Thread GitBox


eladkal edited a comment on issue #18664:
URL: https://github.com/apache/airflow/issues/18664#issuecomment-935698467


   We probably also need to customize the connection fields in the UI and add a 
Service Name field with an explanation, to avoid confusion. Similar to what was 
done for Salesforce in https://github.com/apache/airflow/pull/17162
   
   @mehmax  do you want to work on this?
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] eladkal edited a comment on issue #18664: [Oracle] Oracle Hook - make it possible to define a schema in the connection parameters

2021-10-06 Thread GitBox


eladkal edited a comment on issue #18664:
URL: https://github.com/apache/airflow/issues/18664#issuecomment-935698467


   I can understand the source of the confusion.
   We probably need to customize the connection fields in the UI and add 
Service Name & DB Schema fields with explanations, to avoid confusion. Similar 
to what was done for Salesforce in https://github.com/apache/airflow/pull/17162


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] eladkal closed issue #17180: dag example doesn't work for DataflowJobStatusSensor

2021-10-06 Thread GitBox


eladkal closed issue #17180:
URL: https://github.com/apache/airflow/issues/17180


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] khalidmammadov commented on pull request #18690: Fixing tests that leave traces (users)

2021-10-06 Thread GitBox


khalidmammadov commented on pull request #18690:
URL: https://github.com/apache/airflow/pull/18690#issuecomment-935846752


   @potiuk @uranusjr Can you please review?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] lwyszomi commented on a change in pull request #18699: Dataflow Operators - use project and location from job in on_kill method.

2021-10-06 Thread GitBox


lwyszomi commented on a change in pull request #18699:
URL: https://github.com/apache/airflow/pull/18699#discussion_r723062024



##
File path: airflow/providers/google/cloud/hooks/dataflow.py
##
@@ -761,15 +775,23 @@ def start_flex_template(
 .launch(projectId=project_id, body=body, location=location)
 )
 response = request.execute(num_retries=self.num_retries)
-job_id = response["job"]["id"]
+job = response["job"]

Review comment:
   @eladkal I verified, and we don't need to change anything in the example 
DAGs.
   
   I also added a comment to #17180 because it is already fixed. Thanks for 
pointing out that issue, because now we can mark it as fixed :)
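
   The shape of the change in the diff above can be illustrated with a small 
sketch (the `extract_job_ref` helper is hypothetical, for illustration only): 
keeping the whole `job` object from the API response lets `on_kill` cancel the 
job using the job's own project and location rather than the operator's 
defaults.

```python
def extract_job_ref(response):
    # Keep the whole job object from the Dataflow launch response, not just
    # its id, so on_kill can use the job's own projectId and location
    # (hypothetical helper sketching the diff's intent)
    job = response["job"]
    return {
        "id": job["id"],
        "project": job.get("projectId"),
        "location": job.get("location"),
    }
```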




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ephraimbuddy commented on pull request #18739: Move docker decorator example dag to docker provider

2021-10-06 Thread GitBox


ephraimbuddy commented on pull request #18739:
URL: https://github.com/apache/airflow/pull/18739#issuecomment-935838384


   Hi @potiuk, this failure seems related to DockerDecorator not being usable 
in Airflow versions below 2.2.
   It is failing at `import_all_provider_classes` on Airflow 2.1 for the 
install from `wheel`, but passing for `sdist`.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] lwyszomi commented on issue #17180: dag example doesn't work for DataflowJobStatusSensor

2021-10-06 Thread GitBox


lwyszomi commented on issue #17180:
URL: https://github.com/apache/airflow/issues/17180#issuecomment-935836076


   This was fixed when @mnojek worked on the system tests for the Google 
provider:
   https://github.com/apache/airflow/pull/18494
   
   The documentation now has a new example DAG with the adjustments.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil edited a comment on issue #18541: Error when running dag & something to do with parsing the start date

2021-10-06 Thread GitBox


kaxil edited a comment on issue #18541:
URL: https://github.com/apache/airflow/issues/18541#issuecomment-935831242


   That will be only available in soon to be released 2.2.0 -- until then you 
will have to manually apply patch in 
https://github.com/apache/airflow/pull/18315


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil commented on issue #18541: Error when running dag & something to do with parsing the start date

2021-10-06 Thread GitBox


kaxil commented on issue #18541:
URL: https://github.com/apache/airflow/issues/18541#issuecomment-935831242


   That will be only available in soon to be released 2.2.0 -- until then you 
will have to manually apply patch in 
https://github.com/apache/airflow/pull/17732


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil closed issue #18541: Error when running dag & something to do with parsing the start date

2021-10-06 Thread GitBox


kaxil closed issue #18541:
URL: https://github.com/apache/airflow/issues/18541


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] commented on pull request #18766: Fix Pendulum 1.x references in documentation

2021-10-06 Thread GitBox


github-actions[bot] commented on pull request #18766:
URL: https://github.com/apache/airflow/pull/18766#issuecomment-935829555


   The PR is likely ready to be merged. No tests are needed as no important 
environment files, nor python files were modified by it. However, committers 
might decide that full test matrix is needed and add the 'full tests needed' 
label. Then you should rebase it to the latest main or amend the last commit of 
the PR, and push it with --force-with-lease.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] Mavtti commented on issue #18746: dagrun_time doesn't trigger on_failure_callback

2021-10-06 Thread GitBox


Mavtti commented on issue #18746:
URL: https://github.com/apache/airflow/issues/18746#issuecomment-935823102


   > I thin you should pass it directly as constructor parameter of DAG. See 
also #18721
   
   Already tried but it doesn't change the outcome.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] uranusjr opened a new pull request #18766: Fix Pendulum 1.x references in documentation

2021-10-06 Thread GitBox


uranusjr opened a new pull request #18766:
URL: https://github.com/apache/airflow/pull/18766


   Because no-one else seems to care enough.
   
   Close #17853.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] eladkal commented on a change in pull request #18699: Dataflow Operators - use project and location from job in on_kill method.

2021-10-06 Thread GitBox


eladkal commented on a change in pull request #18699:
URL: https://github.com/apache/airflow/pull/18699#discussion_r723034695



##
File path: airflow/providers/google/cloud/hooks/dataflow.py
##
@@ -761,15 +775,23 @@ def start_flex_template(
 .launch(projectId=project_id, body=body, location=location)
 )
 response = request.execute(num_retries=self.num_retries)
-job_id = response["job"]["id"]
+job = response["job"]

Review comment:
   Can you please make sure examples dags of depended operators / sensors 
are aligned with this change?
   For example we have https://github.com/apache/airflow/issues/17180 which 
reported a problem related to the job id.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ephraimbuddy commented on a change in pull request #18757: Make REST API patch user endpoint work the same way as the UI

2021-10-06 Thread GitBox


ephraimbuddy commented on a change in pull request #18757:
URL: https://github.com/apache/airflow/pull/18757#discussion_r723029378



##
File path: airflow/api_connexion/endpoints/user_endpoint.py
##
@@ -124,9 +124,21 @@ def patch_user(username, update_mask=None):
 if user is None:
 detail = f"The User with username `{username}` was not found"
 raise NotFound(title="User not found", detail=detail)
-
-# Get fields to update. 'username' is always excluded (and it's an error to
-# include it in update_maek).
+# Check unique username
+new_username = data.get('username')
+if new_username:
+usr = security_manager.find_user(username=new_username)
+if usr and usr != user:
+raise AlreadyExists(detail=f"The username `{new_username}` already 
exists")
+
+# Check unique email
+email = data.get('email')
+if email:
+usr = security_manager.find_user(email=email)
+if usr and usr != user:
+raise AlreadyExists(detail=f"The email `{email}` already exists")

Review comment:
   In my opinion, since the UI has those fields as required, making them 
required in the OpenAPI document would be more explicit: the API documentation 
would show those fields as required, just like the UI does. Having it 
implicitly enforced would not look very good from a usage standpoint. WDYT?




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] eladkal commented on issue #12103: Tasks can be stuck in running state indefinitely

2021-10-06 Thread GitBox


eladkal commented on issue #12103:
URL: https://github.com/apache/airflow/issues/12103#issuecomment-935781841


   This issue is reported against an old version of Airflow that has reached 
end of life.
   If the issue is reproducible on the latest Airflow version, please let us know


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] uranusjr commented on a change in pull request #18757: Make REST API patch user endpoint work the same way as the UI

2021-10-06 Thread GitBox


uranusjr commented on a change in pull request #18757:
URL: https://github.com/apache/airflow/pull/18757#discussion_r723022857



##
File path: airflow/api_connexion/endpoints/user_endpoint.py
##
@@ -124,9 +124,21 @@ def patch_user(username, update_mask=None):
 if user is None:
 detail = f"The User with username `{username}` was not found"
 raise NotFound(title="User not found", detail=detail)
-
-# Get fields to update. 'username' is always excluded (and it's an error to
-# include it in update_maek).
+# Check unique username
+new_username = data.get('username')
+if new_username:
+usr = security_manager.find_user(username=new_username)
+if usr and usr != user:
+raise AlreadyExists(detail=f"The username `{new_username}` already 
exists")
+
+# Check unique email
+email = data.get('email')
+if email:
+usr = security_manager.find_user(email=email)
+if usr and usr != user:
+raise AlreadyExists(detail=f"The email `{email}` already exists")

Review comment:
   I think instead of `required` we should set `minLength: 1`? This would 
allow either field to be missing (implying unchanged), but if it's present, it 
must not be empty.
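
   The proposed schema semantics can be sketched in plain Python (a 
hypothetical `validate_patch_payload` helper, not the actual Connexion 
validation): a field may be omitted entirely, meaning unchanged, but if 
present it must contain at least one character.

```python
def validate_patch_payload(data):
    # Mirrors an OpenAPI schema where username/email are optional but
    # carry minLength: 1 (hypothetical helper, not Airflow's validator)
    updates = {}
    for field in ("username", "email"):
        if field not in data:
            continue  # omitted means "leave unchanged" under PATCH semantics
        if len(data[field]) < 1:
            raise ValueError(f"{field} must not be an empty string")
        updates[field] = data[field]
    return updates
```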




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil merged pull request #18522: Coerce datetime to pendulum for timetable

2021-10-06 Thread GitBox


kaxil merged pull request #18522:
URL: https://github.com/apache/airflow/pull/18522


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch main updated (2a6cadb -> c9bf5f3)

2021-10-06 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 2a6cadb  Small improvements for Airflow UI (#18715)
 add c9bf5f3  Coerce datetime to pendulum for timetable (#18522)

No new revisions were added by this update.

Summary of changes:
 airflow/api_connexion/endpoints/dag_run_endpoint.py | 8 ++--
 1 file changed, 6 insertions(+), 2 deletions(-)


[GitHub] [airflow] BasPH commented on issue #18582: Allow Pools to be set via ENV

2021-10-06 Thread GitBox


BasPH commented on issue #18582:
URL: https://github.com/apache/airflow/issues/18582#issuecomment-935766452


   Yes; I always liked this hot reload functionality of Prometheus: 
https://prometheus.io/docs/prometheus/latest/configuration/configuration/. +1 
for rewriting the configuration, it has other nasty problems e.g. 
https://github.com/apache/airflow/issues/8255


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] uranusjr commented on a change in pull request #18764: Enable AWS Secrets Manager backend to retrieve conns using different fields

2021-10-06 Thread GitBox


uranusjr commented on a change in pull request #18764:
URL: https://github.com/apache/airflow/pull/18764#discussion_r723016863



##
File path: airflow/providers/amazon/aws/secrets/secrets_manager.py
##
@@ -74,22 +74,26 @@ class SecretsManagerBackend(BaseSecretsBackend, 
LoggingMixin):
 However, these lists can be extended using the configuration parameter 
``extra_conn_words``.
 
 :param connections_prefix: Specifies the prefix of the secret to read to 
get Connections.
-If set to None (null), requests for connections will not be sent to 
AWS Secrets Manager
+If set to None (null value in the configuration), requests for 
connections will not be
+sent to AWS Secrets Manager. If you don't want a connections_prefix, 
set it as an empty string
 :type connections_prefix: str
 :param variables_prefix: Specifies the prefix of the secret to read to get 
Variables.
-If set to None (null), requests for variables will not be sent to AWS 
Secrets Manager
+If set to None (null value in the configuration), requests for 
variables will not be sent to
+AWS Secrets Manager. If you don't want a variables_prefix, set it as 
an empty string
 :type variables_prefix: str
 :param config_prefix: Specifies the prefix of the secret to read to get 
Configurations.
-If set to None (null), requests for configurations will not be sent to 
AWS Secrets Manager
+If set to None (null value in the configuration), requests for 
configurations will not be sent to
+AWS Secrets Manager. If you don't want a config_prefix, set it as an 
empty string
 :type config_prefix: str
 :param profile_name: The name of a profile to use. If not given, then the 
default profile is used.
 :type profile_name: str
 :param sep: separator used to concatenate secret_prefix and secret_id. 
Default: "/"
 :type sep: str
 :param full_url_mode: if True, the secrets must be stored as one conn URI 
in just one field per secret.
-Otherwise, you can store the secret using different fields (password, 
user...)
+Otherwise, you can store the secret using different fields (password, 
user...). For using it
+as False, set as str "False" in backend_kwargs.

Review comment:
   Not sure (since I'm not too familiar with AWS Secrets to begin with). 
Maybe something like *to disable full URL mode entirely*?




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] JavierLopezT commented on a change in pull request #18764: Enable AWS Secrets Manager backend to retrieve conns using different fields

2021-10-06 Thread GitBox


JavierLopezT commented on a change in pull request #18764:
URL: https://github.com/apache/airflow/pull/18764#discussion_r723014871



##
File path: airflow/providers/amazon/aws/secrets/secrets_manager.py
##
@@ -74,22 +74,26 @@ class SecretsManagerBackend(BaseSecretsBackend, 
LoggingMixin):
 However, these lists can be extended using the configuration parameter 
``extra_conn_words``.
 
 :param connections_prefix: Specifies the prefix of the secret to read to 
get Connections.
-If set to None (null), requests for connections will not be sent to 
AWS Secrets Manager
+If set to None (null value in the configuration), requests for 
connections will not be
+sent to AWS Secrets Manager. If you don't want a connections_prefix, 
set it as an empty string
 :type connections_prefix: str
 :param variables_prefix: Specifies the prefix of the secret to read to get 
Variables.
-If set to None (null), requests for variables will not be sent to AWS 
Secrets Manager
+If set to None (null value in the configuration), requests for 
variables will not be sent to
+AWS Secrets Manager. If you don't want a variables_prefix, set it as 
an empty string
 :type variables_prefix: str
 :param config_prefix: Specifies the prefix of the secret to read to get 
Configurations.
-If set to None (null), requests for configurations will not be sent to 
AWS Secrets Manager
+If set to None (null value in the configuration), requests for 
configurations will not be sent to
+AWS Secrets Manager. If you don't want a config_prefix, set it as an 
empty string
 :type config_prefix: str
 :param profile_name: The name of a profile to use. If not given, then the 
default profile is used.
 :type profile_name: str
 :param sep: separator used to concatenate secret_prefix and secret_id. 
Default: "/"
 :type sep: str
 :param full_url_mode: if True, the secrets must be stored as one conn URI 
in just one field per secret.
-Otherwise, you can store the secret using different fields (password, 
user...)
+Otherwise, you can store the secret using different fields (password, 
user...). For using it
+as False, set as str "False" in backend_kwargs.

Review comment:
   It means having the value False. I wanted to point out that you can't pass 
the bool `False` to get `full_url_mode=False`; you have to pass the string 
`"False"`. I'll try to improve the descriptions. Any suggestions anyway?
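
   The string-vs-bool quirk comes from `backend_kwargs` values arriving from 
the configuration as strings. A coercion helper along these lines (a 
hypothetical `coerce_bool`, not the provider's actual code) would let either a 
real bool or the string `"False"` work:

```python
def coerce_bool(value, default=True):
    # backend_kwargs set in airflow.cfg arrive as strings, so "False"
    # must be coerced explicitly (hypothetical helper, illustration only)
    if value is None:
        return default
    if isinstance(value, bool):
        return value
    return str(value).strip().lower() not in ("false", "0", "no")
```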




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] uranusjr commented on a change in pull request #18764: Enable AWS Secrets Manager backend to retrieve conns using different fields

2021-10-06 Thread GitBox


uranusjr commented on a change in pull request #18764:
URL: https://github.com/apache/airflow/pull/18764#discussion_r723012643



##
File path: airflow/providers/amazon/aws/secrets/secrets_manager.py
##
@@ -74,22 +74,26 @@ class SecretsManagerBackend(BaseSecretsBackend, 
LoggingMixin):
 However, these lists can be extended using the configuration parameter 
``extra_conn_words``.
 
 :param connections_prefix: Specifies the prefix of the secret to read to 
get Connections.
-If set to None (null), requests for connections will not be sent to 
AWS Secrets Manager
+If set to None (null value in the configuration), requests for 
connections will not be
+sent to AWS Secrets Manager. If you don't want a connections_prefix, 
set it as an empty string
 :type connections_prefix: str
 :param variables_prefix: Specifies the prefix of the secret to read to get 
Variables.
-If set to None (null), requests for variables will not be sent to AWS 
Secrets Manager
+If set to None (null value in the configuration), requests for 
variables will not be sent to
+AWS Secrets Manager. If you don't want a variables_prefix, set it as 
an empty string
 :type variables_prefix: str
 :param config_prefix: Specifies the prefix of the secret to read to get 
Configurations.
-If set to None (null), requests for configurations will not be sent to 
AWS Secrets Manager
+If set to None (null value in the configuration), requests for 
configurations will not be sent to
+AWS Secrets Manager. If you don't want a config_prefix, set it as an 
empty string
 :type config_prefix: str
 :param profile_name: The name of a profile to use. If not given, then the 
default profile is used.
 :type profile_name: str
 :param sep: separator used to concatenate secret_prefix and secret_id. 
Default: "/"
 :type sep: str
 :param full_url_mode: if True, the secrets must be stored as one conn URI 
in just one field per secret.
-Otherwise, you can store the secret using different fields (password, 
user...)
+Otherwise, you can store the secret using different fields (password, 
user...). For using it
+as False, set as str "False" in backend_kwargs.

Review comment:
   What does "using it as False" mean?




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil commented on pull request #18759: Movin trigger dag to operations folder

2021-10-06 Thread GitBox


kaxil commented on pull request #18759:
URL: https://github.com/apache/airflow/pull/18759#issuecomment-935745429


   Let's wait on merging this -- we can include it in 2.1.1


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[airflow] branch main updated (767a4f5 -> 2a6cadb)

2021-10-06 Thread kaxilnaik
This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git.


from 767a4f5  ECSOperator: airflow exception on edge case when cloudwatch 
log stream is not found (#18733)
 add 2a6cadb  Small improvements for Airflow UI (#18715)

No new revisions were added by this update.

Summary of changes:
 UPDATING.md   | 2 +-
 airflow/hooks/filesystem.py   | 4 ++--
 airflow/models/baseoperator.py| 2 +-
 airflow/models/dag.py | 2 +-
 airflow/providers/openfaas/hooks/openfaas.py  | 2 +-
 airflow/utils/file.py | 2 +-
 airflow/www/forms.py  | 4 +++-
 airflow/www/static/js/dags.js | 4 ++--
 airflow/www/templates/airflow/confirm.html| 4 ++--
 airflow/www/templates/airflow/dags.html   | 8 
 airflow/www/views.py  | 4 ++--
 docs/apache-airflow-providers-google/example-dags.rst | 2 +-
 docs/apache-airflow-providers-google/operators/cloud/dataprep.rst | 2 +-
 .../secrets-backends/google-cloud-secret-manager-backend.rst  | 4 ++--
 .../operators/azure_blob_to_gcs.rst   | 2 +-
 docs/apache-airflow/howto/connection.rst  | 8 
 docs/apache-airflow/logging-monitoring/logging-tasks.rst  | 2 +-
 docs/helm-chart/quick-start.rst   | 2 +-
 tests/core/test_impersonation_tests.py| 4 ++--
 tests/jobs/test_scheduler_job.py  | 4 ++--
 tests/www/views/test_views_tasks.py   | 4 ++--
 21 files changed, 37 insertions(+), 35 deletions(-)


[GitHub] [airflow] kaxil merged pull request #18715: Small improvements for Airflow UI

2021-10-06 Thread GitBox


kaxil merged pull request #18715:
URL: https://github.com/apache/airflow/pull/18715


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil edited a comment on pull request #13893: Fix race condition when using Dynamic DAGs

2021-10-06 Thread GitBox


kaxil edited a comment on pull request #13893:
URL: https://github.com/apache/airflow/pull/13893#issuecomment-935740190


   @AramisN There are many reasons for getting "not found in serialized_dag 
table" -- you might be facing a different error. This PR fixed the scheduler 
breaking, and from the stacktrace you are seeing this error on the webserver.
   
   Please create a new issue with steps to reproduce


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil edited a comment on pull request #13893: Fix race condition when using Dynamic DAGs

2021-10-06 Thread GitBox


kaxil edited a comment on pull request #13893:
URL: https://github.com/apache/airflow/pull/13893#issuecomment-935740190


   @AramisN There are many reasons for getting "not found in serialized_dag 
table" -- you might be facing a different error. The PR fixed the scheduler 
breaking, and from the stacktrace you are seeing this error on the webserver.
   
   Please create a new issue with steps to reproduce.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] kaxil commented on pull request #13893: Fix race condition when using Dynamic DAGs

2021-10-06 Thread GitBox


kaxil commented on pull request #13893:
URL: https://github.com/apache/airflow/pull/13893#issuecomment-935740190


   @AramisN There are many reasons for getting "not found in serialized_dag 
table" -- you might be facing a different error. The PR fixed the scheduler 
breaking, and from the stacktrace you are seeing this error on the webserver.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on pull request #13541: added support for device_requests in DockerOperator

2021-10-06 Thread GitBox


potiuk commented on pull request #13541:
URL: https://github.com/apache/airflow/pull/13541#issuecomment-935722042


   If you want to take over, submit it, and lead it to completion - feel free 
:). Please create a new PR.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] lwyszomi opened a new issue #18765: Access Denied when access to profile page

2021-10-06 Thread GitBox


lwyszomi opened a new issue #18765:
URL: https://github.com/apache/airflow/issues/18765


   ### Apache Airflow version
   
   2.1.4 (latest released)
   
   ### Operating System
   
   Debian GNU/Linux rodete (x86-64)
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   Minimum required installation with remote user or OAuth mode authorization.
   
   ### What happened
   
   After logging in to Airflow using a different authorization mode (like 
Remote User or OAuth), we get an Access Denied error when we try to open the 
Profile page.
   
   ### What you expected to happen
   
   The Profile page should load correctly with all user information.
   
   ### How to reproduce
   
   * set up a new env with OAuth authorization
   * log in to Airflow
   * go to the Profile page
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] boring-cyborg[bot] commented on issue #18765: Access Denied when access to profile page

2021-10-06 Thread GitBox


boring-cyborg[bot] commented on issue #18765:
URL: https://github.com/apache/airflow/issues/18765#issuecomment-935721554


   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] github-actions[bot] commented on pull request #18764: Enable AWS Secrets Manager backend to retrieve conns using different fields

2021-10-06 Thread GitBox


github-actions[bot] commented on pull request #18764:
URL: https://github.com/apache/airflow/pull/18764#issuecomment-935720784


   The PR is likely OK to be merged with just a subset of tests for default 
Python and Database versions without running the full matrix of tests, because 
it does not modify the core of Airflow. If the committers decide that the full 
tests matrix is needed, they will add the label 'full tests needed'. Then you 
should rebase to the latest main or amend the last commit of the PR, and push 
it with --force-with-lease.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] potiuk commented on pull request #18761: Fix missing permissions for LDAP Auth (add can create)

2021-10-06 Thread GitBox


potiuk commented on pull request #18761:
URL: https://github.com/apache/airflow/pull/18761#issuecomment-935708550


   I think we need to decide first what we want to do. There is a new 
discussion https://github.com/apache/airflow/pull/18670 related to that (API 
doing the same thing) and my conclusion now is that blanket adding of users 
might be a security issue. I am leaning towards only allowing a user to be 
added if there is a corresponding LDAP user (this is not checked currently) in 
case LDAP auth is used. But before we merge that one we need to reach 
consensus on how to do it. I think it would be good if you @pawsok and 
@uranusjr added your opinions to the discussion there 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ephraimbuddy commented on a change in pull request #18757: Make REST API patch user endpoint work the same way as the UI

2021-10-06 Thread GitBox


ephraimbuddy commented on a change in pull request #18757:
URL: https://github.com/apache/airflow/pull/18757#discussion_r722983609



##
File path: airflow/api_connexion/endpoints/user_endpoint.py
##
@@ -124,9 +124,21 @@ def patch_user(username, update_mask=None):
 if user is None:
 detail = f"The User with username `{username}` was not found"
 raise NotFound(title="User not found", detail=detail)
-
-# Get fields to update. 'username' is always excluded (and it's an error to
-# include it in update_maek).
+# Check unique username
+new_username = data.get('username')
+if new_username:
+usr = security_manager.find_user(username=new_username)
+if usr and usr != user:
+raise AlreadyExists(detail=f"The username `{new_username}` already 
exists")
+
+# Check unique email
+email = data.get('email')
+if email:
+usr = security_manager.find_user(email=email)
+if usr and usr != user:
+raise AlreadyExists(detail=f"The email `{email}` already exists")

Review comment:
   I have made them required fields; does that solve it?




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] eladkal commented on issue #18664: [Oracle] Oracle Hook - make it possible to define a schema in the connection parameters

2021-10-06 Thread GitBox


eladkal commented on issue #18664:
URL: https://github.com/apache/airflow/issues/18664#issuecomment-935698467


   I can understand the source of the confusion.
   We probably need to customize the connection fields in the UI and set 
Service Name & DB Schema with explanations to avoid confusion.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ephraimbuddy commented on a change in pull request #18757: Make REST API patch user endpoint work the same way as the UI

2021-10-06 Thread GitBox


ephraimbuddy commented on a change in pull request #18757:
URL: https://github.com/apache/airflow/pull/18757#discussion_r722983033



##
File path: airflow/api_connexion/endpoints/user_endpoint.py
##
@@ -124,9 +124,21 @@ def patch_user(username, update_mask=None):
 if user is None:
 detail = f"The User with username `{username}` was not found"
 raise NotFound(title="User not found", detail=detail)
-
-# Get fields to update. 'username' is always excluded (and it's an error to
-# include it in update_maek).
+# Check unique username
+new_username = data.get('username')
+if new_username:
+usr = security_manager.find_user(username=new_username)
+if usr and usr != user:

Review comment:
   Done, thanks. I have also made some fields required, as is done in the UI.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] AramisN commented on pull request #13893: Fix race condition when using Dynamic DAGs

2021-10-06 Thread GitBox


AramisN commented on pull request #13893:
URL: https://github.com/apache/airflow/pull/13893#issuecomment-935694176


   It said it's been fixed in 2.0.x but I see it again in 2.1.x :|
   
   Something bad has happened.
   Please consider letting us know by creating a bug report using GitHub.
   
   Python version: 3.8.5
   Airflow version: 2.1.1
   Node: f723608ad587
   
---
   Traceback (most recent call last):
 File "/root/miniconda3/lib/python3.8/site-packages/flask/app.py", line 
2447, in wsgi_app
   response = self.full_dispatch_request()
 File "/root/miniconda3/lib/python3.8/site-packages/flask/app.py", line 
1952, in full_dispatch_request
   rv = self.handle_user_exception(e)
 File "/root/miniconda3/lib/python3.8/site-packages/flask/app.py", line 
1821, in handle_user_exception
   reraise(exc_type, exc_value, tb)
 File "/root/miniconda3/lib/python3.8/site-packages/flask/_compat.py", line 
39, in reraise
   raise value
 File "/root/miniconda3/lib/python3.8/site-packages/flask/app.py", line 
1950, in full_dispatch_request
   rv = self.dispatch_request()
 File "/root/miniconda3/lib/python3.8/site-packages/flask/app.py", line 
1936, in dispatch_request
   return self.view_functions[rule.endpoint](**req.view_args)
 File "/root/miniconda3/lib/python3.8/site-packages/airflow/www/auth.py", 
line 34, in decorated
   return func(*args, **kwargs)
 File 
"/root/miniconda3/lib/python3.8/site-packages/airflow/www/decorators.py", line 
97, in view_func
   return f(*args, **kwargs)
 File 
"/root/miniconda3/lib/python3.8/site-packages/airflow/www/decorators.py", line 
60, in wrapper
   return f(*args, **kwargs)
 File 
"/root/miniconda3/lib/python3.8/site-packages/airflow/utils/session.py", line 
70, in wrapper
   return func(*args, session=session, **kwargs)
 File "/root/miniconda3/lib/python3.8/site-packages/airflow/www/views.py", 
line 2170, in graph
   dag = current_app.dag_bag.get_dag(dag_id)
 File 
"/root/miniconda3/lib/python3.8/site-packages/airflow/utils/session.py", line 
70, in wrapper
   return func(*args, session=session, **kwargs)
 File 
"/root/miniconda3/lib/python3.8/site-packages/airflow/models/dagbag.py", line 
186, in get_dag
   self._add_dag_from_db(dag_id=dag_id, session=session)
 File 
"/root/miniconda3/lib/python3.8/site-packages/airflow/models/dagbag.py", line 
258, in _add_dag_from_db
   raise SerializedDagNotFound(f"DAG '{dag_id}' not found in serialized_dag 
table")
   airflow.exceptions.SerializedDagNotFound: DAG 'DLS_SYS_REGION' not found in 
serialized_dag table


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] alberthier commented on pull request #13541: added support for device_requests in DockerOperator

2021-10-06 Thread GitBox


alberthier commented on pull request #13541:
URL: https://github.com/apache/airflow/pull/13541#issuecomment-935688549


   The proposed PR has been closed as there was no activity on it, but the code 
works properly and fixes this issue.
   
   Airflow now depends on docker 5.0.2, so the `device_requests` parameter is 
now available in `APIClient`.
   
   Is there any chance to see this PR merged? Or should a new one be submitted?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] JavierLopezT opened a new pull request #18764: Enable AWS Secrets Manager backend to retrieve conns using different fields

2021-10-06 Thread GitBox


JavierLopezT opened a new pull request #18764:
URL: https://github.com/apache/airflow/pull/18764


   Fixing bugs for rc2 release


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] uranusjr commented on a change in pull request #18757: Make REST API patch user endpoint work the same way as the UI

2021-10-06 Thread GitBox


uranusjr commented on a change in pull request #18757:
URL: https://github.com/apache/airflow/pull/18757#discussion_r722968549



##
File path: airflow/api_connexion/endpoints/user_endpoint.py
##
@@ -124,9 +124,21 @@ def patch_user(username, update_mask=None):
 if user is None:
 detail = f"The User with username `{username}` was not found"
 raise NotFound(title="User not found", detail=detail)
-
-# Get fields to update. 'username' is always excluded (and it's an error to
-# include it in update_maek).
+# Check unique username
+new_username = data.get('username')
+if new_username:
+usr = security_manager.find_user(username=new_username)
+if usr and usr != user:

Review comment:
   Sorry, the previous message is meant to be read in the context of the 
one below (about email; the same applies to both). Instead of checking for 
object equality, this could simply check whether `new_username` is equal to the 
username from the URL (i.e. `user.username`), which is both slightly cheaper 
and (more importantly) saves a db roundtrip when the username does not change.
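   A minimal sketch of the suggested check, with a plain function and an 
in-memory lookup standing in for the security manager's `find_user` (all names 
here are illustrative, not the endpoint's actual code):

```python
# Sketch of the suggested shortcut: compare against the username from the URL
# first, and only query the database when the username actually changes.
# `find_user` is a hypothetical stand-in for the security manager's lookup.

def validate_new_username(current_username, new_username, find_user):
    """Return True if the patched username is acceptable for this user."""
    if not new_username or new_username == current_username:
        return True  # unchanged -> no db roundtrip needed
    return find_user(new_username) is None  # free only if nobody else has it

existing = {"alice", "bob"}  # in-memory stand-in for the user table
lookup = lambda name: name if name in existing else None

print(validate_new_username("alice", "alice", lookup))  # True  (no change)
print(validate_new_username("alice", "bob", lookup))    # False (taken)
print(validate_new_username("alice", "carol", lookup))  # True  (available)
```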




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] ephraimbuddy commented on a change in pull request #18757: Make REST API patch user endpoint work the same way as the UI

2021-10-06 Thread GitBox


ephraimbuddy commented on a change in pull request #18757:
URL: https://github.com/apache/airflow/pull/18757#discussion_r722966808



##
File path: airflow/api_connexion/endpoints/user_endpoint.py
##
@@ -124,9 +124,21 @@ def patch_user(username, update_mask=None):
 if user is None:
 detail = f"The User with username `{username}` was not found"
 raise NotFound(title="User not found", detail=detail)
-
-# Get fields to update. 'username' is always excluded (and it's an error to
-# include it in update_maek).
+# Check unique username
+new_username = data.get('username')
+if new_username:
+usr = security_manager.find_user(username=new_username)
+if usr and usr != user:

Review comment:
   This is for the case of changing usernames. The `user` is from the endpoint 
URL and the `usr` is from the patch data.
   If the two are equal, it means that `username` is in the data and also in 
the URL but is not changing. However, if they are not equal and the username 
from the data exists in the DB, then we should not set the username because 
it's in use.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] eladkal commented on a change in pull request #18755: Support mysql to s3 in parquet format

2021-10-06 Thread GitBox


eladkal commented on a change in pull request #18755:
URL: https://github.com/apache/airflow/pull/18755#discussion_r722946136



##
File path: airflow/providers/amazon/aws/transfers/mysql_to_s3.py
##
@@ -54,12 +59,17 @@ class MySQLToS3Operator(BaseOperator):
 You can specify this argument if you want to use a different
 CA cert bundle than the one used by botocore.
 :type verify: bool or str
-:param pd_csv_kwargs: arguments to include in pd.to_csv (header, index, 
columns...)
+:param pd_csv_kwargs: deprecated. Use pd_kwargs instead.
+Arguments to include in pd.to_csv (header, index, columns...)

Review comment:
   There is no need to change the docstring.
   You just need to deprecate the param:
   
   if pd_csv_kwargs:
       warnings.warn(
           "pd_csv_kwargs is deprecated. Please use pd_kwargs.",
           DeprecationWarning,
           stacklevel=2,
       )
       self.pd_kwargs = pd_csv_kwargs
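   As a runnable illustration of the deprecation pattern above (with a toy 
class standing in for the real operator):

```python
import warnings

class ToyOperator:
    """Toy stand-in for the operator, just to exercise the pattern above."""

    def __init__(self, pd_kwargs=None, pd_csv_kwargs=None):
        self.pd_kwargs = pd_kwargs or {}
        if pd_csv_kwargs:
            warnings.warn(
                "pd_csv_kwargs is deprecated. Please use pd_kwargs.",
                DeprecationWarning,
                stacklevel=2,
            )
            self.pd_kwargs = pd_csv_kwargs

# The deprecated argument still works, but now emits a DeprecationWarning.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    op = ToyOperator(pd_csv_kwargs={"header": False})

print(op.pd_kwargs)                 # {'header': False}
print(caught[0].category.__name__)  # DeprecationWarning
```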




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] GuidoTournois commented on a change in pull request #18763: Add standard hook fields to pagerdutyHook to make hook show up in UI

2021-10-06 Thread GitBox


GuidoTournois commented on a change in pull request #18763:
URL: https://github.com/apache/airflow/pull/18763#discussion_r722946720



##
File path: airflow/providers/pagerduty/provider.yaml
##
@@ -37,6 +37,13 @@ integrations:
 tags: [service]
 
 
+hook-class-names: # deprecated - to be removed after providers add dependency 
on Airflow 2.2.0+

Review comment:
   I have looked at the code of other providers, and this comment is present 
in most of their `provider.yaml` files, so I added it here as well. E.g.
   - airflow/providers/microsoft/azure/provider.yaml
   - airflow/providers/jira/provider.yaml
   - airflow/providers/slack/provider.yaml
   
   I will remove it if you deem it unnecessary.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] JavierLopezT removed a comment on pull request #17448: Aws secrets manager backend

2021-10-06 Thread GitBox


JavierLopezT removed a comment on pull request #17448:
URL: https://github.com/apache/airflow/pull/17448#issuecomment-933282983


   Hello, this might sound ridiculous but I am trying to test this MR for 
https://github.com/apache/airflow/issues/18638#issuecomment-932991258 and I am 
unable to do it. Could you help me, please?
   
   While I was coding the MR and for several months, I have been using the 
Secrets Backend extending my docker image as follow:
   ```
   COPY plugins/aws_secrets_manager_backend.py 
/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/amazon/aws/secrets/secrets_manager.py
   COPY aws_config /home/airflow/.aws/config
   COPY aws_credentials /home/airflow/.aws/credentials
   ```
   In the code I had the custom defaults I needed, so no need for backend 
kwargs. Thus, my docker-compose only needed one env variable for having the 
backend working:
   `  AIRFLOW__SECRETS__BACKEND: 
airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend`
   
   Now, I have pinned in my requirements file the amazon library 
`apache-airflow-providers-amazon==2.3.0rc1`. I have deleted the COPY command of 
the secrets file from the Dockerfile and added the kwargs as env variable as 
follow:
   `  AIRFLOW__SECRETS__BACKEND_KWARGS: '{"connections_prefix": "data", 
"full_url_mode": False, "sep": "_"}'`
   
   I'm trying to connect to the secret `data_db_airflow_snowflake`. I have 
tried also with `connections_prefix` as an empty string (and the full name of 
the connection as an argument). 
   
   In any case, I am having an error like the following:
   
   ```
 File "/opt/airflow/plugins/snowflake_hook.py", line 213, in get_pandas_df
   df = super(SnowflakeHook, self).get_pandas_df(sql, parameters, **kwargs)
 File 
"/home/airflow/.local/lib/python3.9/site-packages/airflow/hooks/dbapi.py", line 
116, in get_pandas_df
   with closing(self.get_conn()) as conn:
 File "/opt/airflow/plugins/snowflake_hook.py", line 124, in get_conn
   conn_config = self._get_conn_params()
 File "/opt/airflow/plugins/snowflake_hook.py", line 63, in _get_conn_params
   conn = self.get_connection(self.snowflake_conn_id)
 File 
"/home/airflow/.local/lib/python3.9/site-packages/airflow/hooks/base.py", line 
67, in get_connection
   conn = Connection.get_connection_from_secrets(conn_id)
 File 
"/home/airflow/.local/lib/python3.9/site-packages/airflow/models/connection.py",
 line 379, in get_connection_from_secrets
   raise AirflowNotFoundException(f"The conn_id `{conn_id}` isn't defined")
   airflow.exceptions.AirflowNotFoundException: The conn_id `db_snowflake_***` 
isn't defined
   ```
   
   Entering in the container, the env variables, as well as the secrets file, 
seems to be right.
   
   Any idea? Thank you very much
   @mik-laj @potiuk @uranusjr 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] uranusjr commented on a change in pull request #18763: Add standard hook fields to pagerdutyHook to make hook show up in UI

2021-10-06 Thread GitBox


uranusjr commented on a change in pull request #18763:
URL: https://github.com/apache/airflow/pull/18763#discussion_r722943801



##
File path: airflow/providers/pagerduty/provider.yaml
##
@@ -37,6 +37,13 @@ integrations:
 tags: [service]
 
 
+hook-class-names: # deprecated - to be removed after providers add dependency 
on Airflow 2.2.0+

Review comment:
   2.2.0 is going to be the next version, so this comment does not seem to 
make a lot of sense. Why is it here?




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] boring-cyborg[bot] commented on pull request #18763: Add standard hook fields to pagerdutyHook to make hook show up in UI

2021-10-06 Thread GitBox


boring-cyborg[bot] commented on pull request #18763:
URL: https://github.com/apache/airflow/pull/18763#issuecomment-935614072


   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything, please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, mypy and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst)
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for 
testing locally, it’s a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better .
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] GuidoTournois opened a new pull request #18763: Add standard hook fields to pagerdutyHook to make hook show up in UI

2021-10-06 Thread GitBox


GuidoTournois opened a new pull request #18763:
URL: https://github.com/apache/airflow/pull/18763


   Pagerduty is not shown as a connection type in the Connection view. That is 
for two reasons:
   1. The PagerdutyHook class is not listed in the provider.yaml, neither in 
the `connection-types` field nor in the soon-to-be-deprecated 
`hook-class-names` field. This PR fixes this.
   2. Even after fixing 1, the hook could not be added because some required 
fields were missing: `conn_name_attr`, `default_conn_name`, `conn_type`, 
`hook_name`. This PR adds those.
   
   closes: #18748
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] pawsok commented on pull request #18761: Fix missing permissions for LDAP Auth (add can create)

2021-10-06 Thread GitBox


pawsok commented on pull request #18761:
URL: https://github.com/apache/airflow/pull/18761#issuecomment-935599833


   > I think this needs a test.
   
   I was not able to find any existing tests for LDAP auth.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] mehmax commented on issue #18664: [Oracle] Oracle Hook - make it possible to define a schema in the connection parameters

2021-10-06 Thread GitBox


mehmax commented on issue #18664:
URL: https://github.com/apache/airflow/issues/18664#issuecomment-935595790


   @nikochiko 
   In Oracle, a schema is something different from a service_name. I don't 
know why it was coded like this.
   The service name is needed to connect to the database.
   
   I am no Oracle expert, but let me explain with an example.
   Tables in Oracle are created in a schema. To query a table you have to use 
the following syntax:
   `SELECT * FROM SCHEMA.TABLE` .
   If you set a specific schema as the current_schema with
   `ALTER SESSION SET CURRENT_SCHEMA = SCHEMA`, you do not have to specify it 
in the SQL syntax anymore:
   `SELECT * FROM TABLE` .
   
   Please also refer to these links:
   
https://docs.oracle.com/cd/B28359_01/server.111/b28310/general009.htm#ADMIN02101
   How it works in cx_Oracle:
   
https://cx-oracle.readthedocs.io/en/6.4.1/connection.html#Connection.current_schema
   
   In my opinion, it can be solved by adding the following lines:
   
   ```
   schema = conn.schema
   ...
   conn = cx_Oracle.connect(**conn_config)
   conn.current_schema = schema
   
   ```
   or (equivalent, noting that in cx_Oracle the statement must go through a 
cursor, not the connection itself)
   ```
   schema = conn.schema
   ...
   conn = cx_Oracle.connect(**conn_config)
   conn.cursor().execute(f"ALTER SESSION SET CURRENT_SCHEMA={schema}")
   ```
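   For illustration, here is a runnable sketch of the two equivalent options 
above, with a stub standing in for a real cx_Oracle connection (an actual 
Oracle database is out of scope here, so the stub only mimics the two features 
discussed):

```python
# Stub illustration of the two equivalent options above; StubConnection mimics
# just the cx_Oracle features discussed (current_schema and SQL execution).

class StubCursor:
    def __init__(self, log):
        self._log = log

    def execute(self, sql):
        self._log.append(sql)  # record the statement instead of running it

class StubConnection:
    def __init__(self):
        self.current_schema = None
        self.executed = []

    def cursor(self):
        return StubCursor(self.executed)

def apply_schema(conn, schema):
    # Option 1: set cx_Oracle's current_schema attribute on the connection
    conn.current_schema = schema
    # Option 2 (equivalent): issue ALTER SESSION explicitly via a cursor
    conn.cursor().execute(f"ALTER SESSION SET CURRENT_SCHEMA={schema}")
    return conn

conn = apply_schema(StubConnection(), "HR")
print(conn.current_schema)  # HR
print(conn.executed[0])     # ALTER SESSION SET CURRENT_SCHEMA=HR
```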
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [airflow] uranusjr commented on a change in pull request #18755: Support mysql to s3 in parquet format

2021-10-06 Thread GitBox


uranusjr commented on a change in pull request #18755:
URL: https://github.com/apache/airflow/pull/18755#discussion_r722926861



##
File path: airflow/providers/amazon/aws/transfers/mysql_to_s3.py
##
@@ -92,15 +108,27 @@ def __init__(
         self.aws_conn_id = aws_conn_id
         self.verify = verify
 
-        self.pd_csv_kwargs = pd_csv_kwargs or {}
-        if "path_or_buf" in self.pd_csv_kwargs:
-            raise AirflowException('The argument path_or_buf is not allowed, please remove it')
-        if "index" not in self.pd_csv_kwargs:
-            self.pd_csv_kwargs["index"] = index
-        if "header" not in self.pd_csv_kwargs:
-            self.pd_csv_kwargs["header"] = header
+        if file_format == "csv":
+            self.file_format = FILE_FORMAT.CSV
+        else:
+            self.file_format = FILE_FORMAT.PARQUET
 
-    def _fix_int_dtypes(self, df: pd.DataFrame) -> None:
+        self.pd_kwargs = pd_kwargs or pd_csv_kwargs or {}
+        if self.file_format == FILE_FORMAT.CSV:
+            if "path_or_buf" in self.pd_kwargs:
+                raise AirflowException('The argument path_or_buf is not allowed, please remove it')
+            if "index" not in self.pd_kwargs:
+                self.pd_kwargs["index"] = index
+            if "header" not in self.pd_kwargs:
+                self.pd_kwargs["header"] = header
+        else:
+            if pd_csv_kwargs is not None:
+                raise AirflowException(
+                    f"The destination file format is parquet so pd_csv_kwargs shouldn't be set."
+                )

Review comment:
   This should be a TypeError with a more… stdlib-sounding? message like
   
   ```
   pd_csv_kwargs may not be specified when file_format="parquet"
   ```
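   
   A hedged sketch of what that validation could look like (the function and
   parameter names here are illustrative, not the operator's actual code):
   
   ```
   # Illustrative validation following the reviewer's suggestion: reject
   # pd_csv_kwargs for parquet output with a TypeError, mirroring how the
   # stdlib words invalid-argument errors.

   def validate_format_kwargs(file_format, pd_csv_kwargs=None):
       if file_format == "parquet" and pd_csv_kwargs is not None:
           raise TypeError('pd_csv_kwargs may not be specified when file_format="parquet"')

   # CSV with kwargs is fine; parquet with kwargs raises.
   validate_format_kwargs("csv", {"index": False})
   try:
       validate_format_kwargs("parquet", {"index": False})
   except TypeError as exc:
       print(exc)  # pd_csv_kwargs may not be specified when file_format="parquet"
   ```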








[GitHub] [airflow] mnojek commented on a change in pull request #18715: Small improvements for Airflow UI

2021-10-06 Thread GitBox


mnojek commented on a change in pull request #18715:
URL: https://github.com/apache/airflow/pull/18715#discussion_r722925502



##
File path: airflow/www/templates/airflow/confirm.html
##
@@ -21,7 +21,7 @@
 
 {% block content %}
   {{ super() }}
-  Wait a minute.
+  Wait a minute

Review comment:
   This dot just doesn't look nice in the UI. Short sentences or instructions 
don't need a dot at the end, especially if they are not part of a bigger wall 
of text but are just a short notification or instruction. I know it's 
subjective, but I think it's commonly accepted: sentences like 'Click OK', 
'Check your filters', 'Insert value' do not require a full stop at the end. 
Also, a trailing period is sometimes perceived as insincere or even rude 
nowadays. In my opinion, it just looks better in this case.








[GitHub] [airflow] uranusjr commented on pull request #18761: Fix missing permissions for LDAP Auth (add can create)

2021-10-06 Thread GitBox


uranusjr commented on pull request #18761:
URL: https://github.com/apache/airflow/pull/18761#issuecomment-935570942


   I think this needs a test.






[GitHub] [airflow] boring-cyborg[bot] commented on pull request #18761: Fix missing permissions for LDAP Auth (add can create) #18545

2021-10-06 Thread GitBox


boring-cyborg[bot] commented on pull request #18761:
URL: https://github.com/apache/airflow/pull/18761#issuecomment-935541991


   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything, please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (flake8, mypy and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature add useful documentation (in docstrings or in 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst).
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for 
testing locally, it’s a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   






[GitHub] [airflow] pawsok opened a new pull request #18761: Fix missing permissions for LDAP Auth (add can create) #18545

2021-10-06 Thread GitBox


pawsok opened a new pull request #18761:
URL: https://github.com/apache/airflow/pull/18761


   Hi,
   
   This is to add an option to add new users from the Users view if LDAP auth is 
enabled (i.e. if you simply do not want automated user registration with LDAP).
   The whole discussion can be found here: 
[#18290](https://github.com/apache/airflow/discussions/18290)






[GitHub] [airflow] uranusjr closed issue #18669: Failed to record duration of externally triggered DAGs

2021-10-06 Thread GitBox


uranusjr closed issue #18669:
URL: https://github.com/apache/airflow/issues/18669


   





