[GitHub] [airflow] ephraimbuddy commented on issue #29199: TaskFlow AirflowSkipException causes downstream step to fail when multiple_outputs is true

2023-02-21 Thread via GitHub


ephraimbuddy commented on issue #29199:
URL: https://github.com/apache/airflow/issues/29199#issuecomment-1439556135

   > Where does the query go? It may (or may not) make sense depending on the 
situation.
   > 
   > I think another problem is that _technically_ the error is correct in that 
the XCom value is indeed missing. If we want to return None for SKIPPED, 
arguably the same should be done for other states as well.
   
   Yes. The XCom value is missing, but the downstream task has the trigger rule 
`none_failed_min_one_success`, so the downstream task still tries to run and 
then fails because one of its upstream tasks was skipped.
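
   As a toy model of the semantics being discussed (not Airflow's actual
trigger-rule implementation, which is considerably more involved), the rule can
be thought of as a predicate over upstream states:

   ```python
   def none_failed_min_one_success(upstream_states):
       # the downstream task runs when no upstream failed and at least one
       # succeeded; skipped upstreams do not block it, which is why it can
       # start and then hit the missing XCom of a skipped upstream
       return "failed" not in upstream_states and "success" in upstream_states

   print(none_failed_min_one_success(["success", "skipped"]))  # True
   print(none_failed_min_one_success(["failed", "success"]))   # False
   ```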


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] ecodina commented on issue #28647: Intermitent log on deferrable operator

2023-02-21 Thread via GitHub


ecodina commented on issue #28647:
URL: https://github.com/apache/airflow/issues/28647#issuecomment-1439555165

   Hi,
   
   We have not written any custom log backend: we use ``self.log`` from within 
the custom operator, and the only related configuration we've changed in 
``airflow.cfg`` is ``base_log_folder``.
   
   In the coming weeks we'll upgrade to the latest version of Airflow to check 
whether it is still happening.





[GitHub] [airflow] ephraimbuddy commented on issue #29199: TaskFlow AirflowSkipException causes downstream step to fail when multiple_outputs is true

2023-02-21 Thread via GitHub


ephraimbuddy commented on issue #29199:
URL: https://github.com/apache/airflow/issues/29199#issuecomment-1439554856

   Here's the full diff:
   ```diff
   diff --git a/airflow/models/xcom_arg.py b/airflow/models/xcom_arg.py
   index 133fd4280b..046bc4a6e2 100644
   --- a/airflow/models/xcom_arg.py
   +++ b/airflow/models/xcom_arg.py
   @@ -33,6 +33,7 @@ from airflow.utils.context import Context
    from airflow.utils.edgemodifier import EdgeModifier
    from airflow.utils.mixins import ResolveMixin
    from airflow.utils.session import NEW_SESSION, provide_session
   +from airflow.utils.state import State
    from airflow.utils.types import NOTSET, ArgNotSet
    
    if TYPE_CHECKING:
   @@ -340,6 +341,14 @@ class PlainXComArg(XComArg):
                return result
            if self.key == XCOM_RETURN_KEY:
                return None
   +        from airflow.models.taskinstance import TaskInstance as TI
   +
   +        pti = session.query(TI).filter(
   +            TI.dag_id == ti.dag_id,
   +            TI.task_id == task_id,
   +            TI.run_id == ti.run_id,
   +        ).first()
   +        if pti.state == State.SKIPPED:
   +            return None
            raise XComNotFound(ti.dag_id, task_id, self.key)
   ```





[GitHub] [airflow] eladkal commented on a diff in pull request #29548: Syedahsn/ec2 create terminate operators

2023-02-21 Thread via GitHub


eladkal commented on code in PR #29548:
URL: https://github.com/apache/airflow/pull/29548#discussion_r1113927353


##
airflow/providers/amazon/aws/operators/ec2.py:
##
@@ -116,3 +116,139 @@ def execute(self, context: Context):
             target_state="stopped",
             check_interval=self.check_interval,
         )
+
+
+class EC2CreateInstanceOperator(BaseOperator):
+    """
+    Create and start an EC2 Instance using boto3
+
+    .. seealso::
+        For more information on how to use this operator, take a look at the guide:
+        :ref:`howto/operator:EC2CreateInstanceOperator`
+
+    :param image_id: ID of the AMI used to create the instance.
+    :param max_count: Maximum number of instances to launch. Defaults to 1.
+    :param min_count: Minimum number of instances to launch. Defaults to 1.
+    :param aws_conn_id: AWS connection to use
+    :param region_name: AWS region name associated with the client.
+    :param poll_interval: Number of seconds to wait before attempting to
+        check state of instance. Only used if wait_for_completion is True. Default is 20.
+    :param max_attempts: Maximum number of attempts when checking state of instance.
+        Only used if wait_for_completion is True. Default is 20.
+    :param config: Dictionary for arbitrary parameters to the boto3 run_instances call.
+    :param wait_for_completion: If True, the operator will wait for the instance to be
+        in the `running` state before returning.
+    """
+
+    template_fields: Sequence[str] = (
+        "image_id",
+        "max_count",
+        "min_count",
+        "aws_conn_id",
+        "region_name",
+        "config",
+        "wait_for_completion",

Review Comment:
   Do we have an issue for this task?






[GitHub] [airflow] uranusjr commented on a diff in pull request #29548: Syedahsn/ec2 create terminate operators

2023-02-21 Thread via GitHub


uranusjr commented on code in PR #29548:
URL: https://github.com/apache/airflow/pull/29548#discussion_r1113923418


##
airflow/providers/amazon/aws/operators/ec2.py:
##
@@ -116,3 +116,139 @@ def execute(self, context: Context):
             target_state="stopped",
             check_interval=self.check_interval,
         )
+
+
+class EC2CreateInstanceOperator(BaseOperator):
+    """
+    Create and start an EC2 Instance using boto3
+
+    .. seealso::
+        For more information on how to use this operator, take a look at the guide:
+        :ref:`howto/operator:EC2CreateInstanceOperator`
+
+    :param image_id: ID of the AMI used to create the instance.
+    :param max_count: Maximum number of instances to launch. Defaults to 1.
+    :param min_count: Minimum number of instances to launch. Defaults to 1.
+    :param aws_conn_id: AWS connection to use
+    :param region_name: AWS region name associated with the client.
+    :param poll_interval: Number of seconds to wait before attempting to
+        check state of instance. Only used if wait_for_completion is True. Default is 20.
+    :param max_attempts: Maximum number of attempts when checking state of instance.
+        Only used if wait_for_completion is True. Default is 20.
+    :param config: Dictionary for arbitrary parameters to the boto3 run_instances call.
+    :param wait_for_completion: If True, the operator will wait for the instance to be
+        in the `running` state before returning.
+    """
+
+    template_fields: Sequence[str] = (
+        "image_id",
+        "max_count",
+        "min_count",
+        "aws_conn_id",
+        "region_name",
+        "config",
+        "wait_for_completion",

Review Comment:
   Currently template fields are also used for XComArg, where a boolean makes 
sense. There are talks about separating those two ideas, but for now having the 
field can be useful.






[GitHub] [airflow] uranusjr commented on issue #29366: default_args feature incompatible with Dynamic Task Mapping

2023-02-21 Thread via GitHub


uranusjr commented on issue #29366:
URL: https://github.com/apache/airflow/issues/29366#issuecomment-1439541245

   Intuitively it feels possible, except `default_args` is much trickier to get 
right since it deals with unspecified arguments from indefinite operator 
classes. A PoC would make discussion much easier.





[GitHub] [airflow] uranusjr commented on issue #29199: TaskFlow AirflowSkipException causes downstream step to fail when multiple_outputs is true

2023-02-21 Thread via GitHub


uranusjr commented on issue #29199:
URL: https://github.com/apache/airflow/issues/29199#issuecomment-1439532491

   Where does the query go? It may (or may not) make sense depending on the 
situation.
   
   I think another problem is that _technically_ the error is correct in that 
the XCom value is indeed missing. If we want to return None for SKIPPED, 
arguably the same should be done for other states as well.





[GitHub] [airflow] uranusjr commented on a diff in pull request #29245: fix code checking job names in sagemaker

2023-02-21 Thread via GitHub


uranusjr commented on code in PR #29245:
URL: https://github.com/apache/airflow/pull/29245#discussion_r1113909968


##
airflow/providers/amazon/aws/operators/sagemaker.py:
##
@@ -106,6 +108,41 @@ def _create_integer_fields(self) -> None:
         """
         self.integer_fields = []
 
+    def _get_unique_job_name(
+        self, proposed_name: str, fail_if_exists: bool, describe_func: Callable[[str], Any]
+    ) -> str:
+        """
+        Returns the proposed name if it doesn't already exist, otherwise returns it with a random suffix.
+
+        :param proposed_name: Base name.
+        :param fail_if_exists: Will throw an error if a job with that name already exists
+            instead of finding a new name.
+        :param describe_func: The `describe_` function for that kind of job.
+            We use it as an O(1) way to check if a job exists.
+        """
+        job_name = proposed_name
+        while self._check_if_job_exists(job_name, describe_func):
+            # this while should loop only once in most cases, just setting it this way to regenerate a name
+            # in case there is a random number collision.
+            if fail_if_exists:
+                raise AirflowException(f"A SageMaker job with name {job_name} already exists.")
+            else:
+                job_name = f"{proposed_name}-{random.randint(0, 9):09}"

Review Comment:
   Regarding reproduction, would it be a good idea to use a time-based token, 
say `time.time()`, here? It would still be somewhat “random” but more 
predictable.
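
   A minimal sketch of the suggestion (the name `unique_job_name` is
hypothetical; the real operator code would keep its existence-check loop around
this):

   ```python
   import time

   def unique_job_name(proposed_name: str) -> str:
       # a seconds-since-epoch suffix is effectively unique across runs
       # while staying predictable enough to correlate with logs
       return f"{proposed_name}-{int(time.time())}"

   name = unique_job_name("training-job")
   ```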






[GitHub] [airflow] github-actions[bot] commented on issue #19221: OpenAPI Generator generated python wrapper with a bug. It returns ApiValueError when calling get_variables().

2023-02-21 Thread via GitHub


github-actions[bot] commented on issue #19221:
URL: https://github.com/apache/airflow/issues/19221#issuecomment-1439526971

   This issue has been automatically marked as stale because it has been open 
for 365 days without any activity. There have been several Airflow releases 
since the last activity on this issue. Please recheck the report against the 
latest Airflow version and let us know if the issue is reproducible. The issue 
will be closed in the next 30 days if no further activity occurs from the issue 
author.





[GitHub] [airflow] github-actions[bot] commented on issue #19511: With large number of DAGS, dag_run state takes long time to update after last task

2023-02-21 Thread via GitHub


github-actions[bot] commented on issue #19511:
URL: https://github.com/apache/airflow/issues/19511#issuecomment-1439526941

   This issue has been automatically marked as stale because it has been open 
for 365 days without any activity. There have been several Airflow releases 
since the last activity on this issue. Please recheck the report against the 
latest Airflow version and let us know if the issue is reproducible. The issue 
will be closed in the next 30 days if no further activity occurs from the issue 
author.





[GitHub] [airflow] github-actions[bot] commented on issue #19934: Reloading of Logging and ORM in Dag Processing can be disruptive in spawn mode

2023-02-21 Thread via GitHub


github-actions[bot] commented on issue #19934:
URL: https://github.com/apache/airflow/issues/19934#issuecomment-1439526909

   This issue has been automatically marked as stale because it has been open 
for 365 days without any activity. There have been several Airflow releases 
since the last activity on this issue. Please recheck the report against the 
latest Airflow version and let us know if the issue is reproducible. The issue 
will be closed in the next 30 days if no further activity occurs from the issue 
author.





[GitHub] [airflow] github-actions[bot] commented on issue #20125: Airflow Gantt View is improper

2023-02-21 Thread via GitHub


github-actions[bot] commented on issue #20125:
URL: https://github.com/apache/airflow/issues/20125#issuecomment-1439526872

   This issue has been automatically marked as stale because it has been open 
for 365 days without any activity. There have been several Airflow releases 
since the last activity on this issue. Please recheck the report against the 
latest Airflow version and let us know if the issue is reproducible. The issue 
will be closed in the next 30 days if no further activity occurs from the issue 
author.





[GitHub] [airflow] ephraimbuddy commented on issue #29199: TaskFlow AirflowSkipException causes downstream step to fail when multiple_outputs is true

2023-02-21 Thread via GitHub


ephraimbuddy commented on issue #29199:
URL: https://github.com/apache/airflow/issues/29199#issuecomment-1439526904

   > This would change the behaviour of `zip`. `ArgNotSet` is `fillvalue`’s 
default value, and it needs to cause `XComNotFound` for the mechanism to work 
correctly. So either further logic is needed to distinguish between the cases, 
or one of these code paths needs to change the sentinel value (not use 
`ArgNotSet`).
   > 
   > How does the `XComNotFound` exception propagate? It is likely better to 
catch this elsewhere in the stack and coerce to None.
   
   This worked, but I'm not sure how it will behave in the case of mapped tasks 
since I'm not querying with map_index:
   
   ```python
   from airflow.models.taskinstance import TaskInstance as TI

   pti = session.query(TI).filter(
       TI.dag_id == ti.dag_id,
       TI.task_id == task_id,
       TI.run_id == ti.run_id,
   ).first()
   if pti.state == State.SKIPPED:
       return None
   ```
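
   To make the mapped-task concern concrete, here is a toy in-memory model (no
database or Airflow involved; all names and records are hypothetical) of the
same lookup with `map_index` included, so each mapped task instance resolves
its own upstream's state:

   ```python
   # hypothetical records standing in for TaskInstance rows
   records = [
       {"dag_id": "d", "task_id": "t", "run_id": "r", "map_index": 0, "state": "skipped"},
       {"dag_id": "d", "task_id": "t", "run_id": "r", "map_index": 1, "state": "success"},
   ]

   def upstream_state(dag_id, task_id, run_id, map_index):
       # without the map_index filter, a .first()-style lookup would pick an
       # arbitrary mapped instance; with it, the lookup is unambiguous
       for rec in records:
           key = (rec["dag_id"], rec["task_id"], rec["run_id"], rec["map_index"])
           if key == (dag_id, task_id, run_id, map_index):
               return rec["state"]
       return None

   print(upstream_state("d", "t", "r", 0))  # skipped
   print(upstream_state("d", "t", "r", 1))  # success
   ```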





[GitHub] [airflow] potiuk commented on issue #29679: Fix Quarantined `test_cli_internal_api_background`

2023-02-21 Thread via GitHub


potiuk commented on issue #29679:
URL: https://github.com/apache/airflow/issues/29679#issuecomment-1439524562

   cc: @mhenc 





[GitHub] [airflow] uranusjr commented on pull request #29644: Remove <2.0.0 limit on google-cloud-bigtable

2023-02-21 Thread via GitHub


uranusjr commented on PR #29644:
URL: https://github.com/apache/airflow/pull/29644#issuecomment-1439520760

   The error is from a bug in our CI setup code. I opened a PR (linked above) 
to fix it.





[GitHub] [airflow] uranusjr opened a new pull request, #29685: Do not crash when a version fails to parse

2023-02-21 Thread via GitHub


uranusjr opened a new pull request, #29685:
URL: https://github.com/apache/airflow/pull/29685

   Old files on PyPI may contain version numbers that are non-standard and 
can't be properly parsed. Those are no longer allowed for new versions, so we 
can safely ignore those versions since they must be ancient.
   
   The error is hit by #29644
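
   A simplified stand-in for the idea (the real change presumably catches the
parsing library's exception; the regex here only accepts plain dotted release
versions and is an assumption for illustration):

   ```python
   import re

   # accept only plain dotted release versions; anything that fails to
   # parse is skipped instead of crashing the whole lookup
   STANDARD_RELEASE = re.compile(r"^\d+(\.\d+)*$")

   def usable_versions(raw_versions):
       return [v for v in raw_versions if STANDARD_RELEASE.match(v)]

   print(usable_versions(["1.0.0", "2004d", "2.4.3"]))  # ['1.0.0', '2.4.3']
   ```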





[GitHub] [airflow] uranusjr commented on a diff in pull request #29599: fix do_xcom_push=False bug in SnowflakeOperator

2023-02-21 Thread via GitHub


uranusjr commented on code in PR #29599:
URL: https://github.com/apache/airflow/pull/29599#discussion_r1113899583


##
tests/providers/common/sql/operators/test_sql.py:
##
@@ -89,6 +92,8 @@ def test_dont_xcom_push(self, mock_get_db_hook):
             handler=None,
             return_last=True,
         )
+        mock_process_output.assert_not_called()
+

Review Comment:
   ```suggestion
   ```






[GitHub] [airflow] s0neq commented on a diff in pull request #29635: YandexCloud provider: support Yandex SDK feature "endpoint"

2023-02-21 Thread via GitHub


s0neq commented on code in PR #29635:
URL: https://github.com/apache/airflow/pull/29635#discussion_r1113895749


##
airflow/providers/yandex/hooks/yandex.py:
##
@@ -78,6 +78,11 @@ def get_connection_form_widgets() -> dict[str, Any]:
                 description="Optional. This key will be placed to all created Compute nodes"
                 "to let you have a root shell there",
             ),
+            "endpoint": StringField(
+                lazy_gettext("API endpoint"),
+                widget=BS3TextFieldWidget(),
+                description="Optional.",

Review Comment:
   Hi! I fixed the description. If everything's OK, let's merge? It seems only 
people with write access can do it.






[GitHub] [airflow] sangrpatil2 commented on issue #29555: Dag Fails while creating a Dynamic Tasks using airflow variables.

2023-02-21 Thread via GitHub


sangrpatil2 commented on issue #29555:
URL: https://github.com/apache/airflow/issues/29555#issuecomment-1439511002

   @hussein-awala / @ephraimbuddy 
   
   Dag is only failing while creating dynamic tasks using airflow variables, 
the rest of the tasks/flow is working fine. You can refer to the sample code 
given below:
   
   ```python
   batch_list = [1, 2]  # [3, 4] for next run
   domain_list = [A, B, C]


   start >> fetch_batch >> load_batch

   for domain in domain_list:
       with TaskGroup(group_id=domain) as domain_tg:
           step_task_name = 'load_' + domain
           task_list = []

           for i in range(0, len(batch_list)):
               batch_id = batch_list[i]
               task_list.append(create_python_operator(
                   dag=dag,
                   task_name=step_task_name + batch_id,
                   op_kwargs={
                       "command": 
                   },
                   python_callable=ssm_send_command,
                   trigger_rule="all_success"
               ))
               if i == 0:
                   load_batch >> task_list[0]
               else:
                   task_list[i-1] >> task_list[i]

       domain_tg >> finish
   ```
   
   Let me know if you need more details about it.
   





[GitHub] [airflow] nervoussidd commented on pull request #29683: this file adds a on_kill method to Dataproc operator.

2023-02-21 Thread via GitHub


nervoussidd commented on PR #29683:
URL: https://github.com/apache/airflow/pull/29683#issuecomment-1439486888

   Thanks, I'll check it out.
   
   On Wed, Feb 22, 2023, 11:35 AM Tzu-ping Chung ***@***.***>
   wrote:
   
   > Start with reading one of the CONTRIBUTING guides listed here:
   > https://github.com/apache/airflow/





[GitHub] [airflow] uranusjr commented on pull request #29683: this file adds a on_kill method to Dataproc operator.

2023-02-21 Thread via GitHub


uranusjr commented on PR #29683:
URL: https://github.com/apache/airflow/pull/29683#issuecomment-1439485650

   Start with reading one of the CONTRIBUTING guides listed here: 
https://github.com/apache/airflow/
   





[GitHub] [airflow] boring-cyborg[bot] commented on issue #29684: Not able to submit FlinkDeployment to Azure AKS Kubernetes Cluster

2023-02-21 Thread via GitHub


boring-cyborg[bot] commented on issue #29684:
URL: https://github.com/apache/airflow/issues/29684#issuecomment-1439479817

   Thanks for opening your first issue here! Be sure to follow the issue 
template!
   





[GitHub] [airflow] tusharg1993 opened a new issue, #29684: Not able to submit FlinkDeployment to Azure AKS Kubernetes Cluster

2023-02-21 Thread via GitHub


tusharg1993 opened a new issue, #29684:
URL: https://github.com/apache/airflow/issues/29684

   ### Apache Airflow Provider(s)
   
   apache-flink
   
   ### Versions of Apache Airflow Providers
   
   1.0.0
   
   ### Apache Airflow version
   
   2.4.3
   
   ### Operating System
   
   Not sure
   
   ### Deployment
   
   Microsoft ADF Managed Airflow
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   Hello, I am trying to use Flink Airflow operator to submit a Flink job. My 
AKS cluster already has Flink K8s controller installed on it and works 
perfectly using kubectl.
   
   However, trying to do the same through Airflow results in the following 
error:
   
   
   ```
   [2023-02-22T05:46:11.473+] {flink_kubernetes.py:103} INFO - Creating 
flinkApplication with Context: None and op_context: {'conf': 
, 'dag': 
, 'dag_run': , 
'data_interval_end': DateTime(2023, 2, 22, 5, 41, 8, 519537, 
tzinfo=Timezone('UTC')), 'data_interval_start': DateTime(2023, 2, 21, 5, 41, 8, 
519537, tzinfo=Timezone('UTC')), 'ds': '2023-02-22', 'ds_nodash': '20230222', 
'execution_date': DateTime(2023, 2, 22, 5, 41, 8, 519537, 
tzinfo=Timezone('UTC')), 'inlets': [], 'logical_date': DateTime(2023, 2, 22, 5, 
41, 8, 519537, tzinfo=Timezone('UTC')), 'macros': , 
'next_ds': '2023-02-22', 'next_ds_nodash': '20230222', 
 'next_execution_date': DateTime(2023, 2, 22, 5, 41, 8, 519537, 
tzinfo=Timezone('UTC')), 'outlets': [], 'params': {}, 
'prev_data_interval_start_success': DateTime(2023, 2, 21, 5, 32, 5, 541590, 
tzinfo=Timezone('UTC')), 'prev_data_interval_end_success': DateTime(2023, 2, 
22, 5, 32, 5, 541590, tzinfo=Timezone('UTC')), 'prev_ds': '2023-02-22', 
'prev_ds_nodash': '20230222', 'prev_execution_date': DateTime(2023, 2, 22, 5, 
41, 8, 519537, tzinfo=Timezone('UTC')), 'prev_execution_date_success': None, 
'prev_start_date_success': DateTime(2023, 2, 22, 5, 32, 6, 249219, 
tzinfo=Timezone('UTC')), 'run_id': 'manual__2023-02-22T05:41:08.519537+00:00', 
'task': , 'task_instance': 
, 'task_instance_key_str': 
'tutorial__sample_flink_task__20230222', 'test_mode': False, 'ti': 
, 'tomorrow_ds': '2023-02-23', 'tomorrow_ds_nodash': '20230223', 
'triggering_dataset_events': .get_triggering_events at 
0x7f419749caf0>>, 'ts': '2023-02-22T05:41:08.519537+00:00', 'ts_nodash': 
'20230222T054108', 'ts_nodash_with_tz': '20230222T054108.519537+', 'var': 
{'json': None, 'value': None}, 'conn': None, 'yesterday_ds': '2023-02-21', 
'yesterday_ds_nodash': '20230221'}
   [2023-02-22T05:46:11.474+] {taskinstance.py:1851} ERROR - Task failed 
with exception
   Traceback (most recent call last):
 File 
"/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/apache/flink/operators/flink_kubernetes.py",
 line 107, in execute
   self.hook.custom_object_client.list_cluster_custom_object(
   AttributeError: 'KubernetesHook' object has no attribute 
'custom_object_client'
   ```
   
   ### What you think should happen instead
   
   _No response_
   
   ### How to reproduce
   
   The error can be reproduced by configuring an AKS Kubernetes connection in 
Airflow and then using the Flink operator to submit a FlinkDeployment job.
   
   ```python
   TEST_VALID_APPLICATION_JSON = """
   apiVersion: flink.apache.org/v1beta1
   kind: FlinkDeployment
   metadata:
     name: basic-example
   spec:
     image: flink:1.16
     flinkVersion: v1_16
     flinkConfiguration:
       taskmanager.numberOfTaskSlots: "2"
     serviceAccount: flink
     jobManager:
       resource:
         memory: "2048m"
         cpu: 1
     taskManager:
       resource:
         memory: "2048m"
         cpu: 1
     job:
       jarURI: local:///opt/flink/examples/streaming/StateMachineExample.jar
       parallelism: 2
       upgradeMode: stateless
   """

   t6 = FlinkKubernetesOperator(
       application_file=TEST_VALID_APPLICATION_JSON,
       in_cluster=False,
       namespace="default",
       kubernetes_conn_id="tgoyal_aks",
       task_id="sample_flink_task",
   )
   ```
   
   
   
   ### Anything else
   
   I validated that the Kubernetes connection is configured correctly by using 
the following operator success

[GitHub] [airflow] nervoussidd commented on pull request #29683: this file adds a on_kill method to Dataproc operator.

2023-02-21 Thread via GitHub


nervoussidd commented on PR #29683:
URL: https://github.com/apache/airflow/pull/29683#issuecomment-143947

   Hello Tzu,
   No, I didn't test it. I was looking for guidance but didn't know whom to 
approach, so I just learned some basics about the libraries and methods and 
wrote this code.
   Can you please guide me so that I can work on this and get it merged?
   And thank you for the feedback.
   
   Thank you
   
   On Wed, Feb 22, 2023, 11:02 AM Tzu-ping Chung ***@***.***>
   wrote:
   
   > This is not valid Python. Did you test it?
   





[GitHub] [airflow] uranusjr commented on pull request #29683: this file adds a on_kill method to Dataproc operator.

2023-02-21 Thread via GitHub


uranusjr commented on PR #29683:
URL: https://github.com/apache/airflow/pull/29683#issuecomment-1439463713

   This is not valid Python. Did you test it?





[GitHub] [airflow] nervoussidd closed pull request #29683: this file adds a on_kill method to Dataproc operator.

2023-02-21 Thread via GitHub


nervoussidd closed pull request #29683: this file adds a on_kill method to 
Dataproc operator.
URL: https://github.com/apache/airflow/pull/29683





[GitHub] [airflow] boring-cyborg[bot] commented on pull request #29683: this file adds a on_kill method to Dataproc operator.

2023-02-21 Thread via GitHub


boring-cyborg[bot] commented on PR #29683:
URL: https://github.com/apache/airflow/pull/29683#issuecomment-1439451836

   Congratulations on your first Pull Request and welcome to the Apache Airflow 
community! If you have any issues or are unsure about anything, please check 
our Contribution Guide 
(https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst)
   Here are some useful points:
   - Pay attention to the quality of your code (ruff, mypy and type 
annotations). Our [pre-commits]( 
https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#prerequisites-for-pre-commit-hooks)
 will help you with that.
   - In case of a new feature, add useful documentation (in docstrings or in the 
`docs/` directory). Adding a new operator? Check this short 
[guide](https://github.com/apache/airflow/blob/main/docs/apache-airflow/howto/custom-operator.rst).
 Consider adding an example DAG that shows how users should use it.
   - Consider using [Breeze 
environment](https://github.com/apache/airflow/blob/main/BREEZE.rst) for 
testing locally, it's a heavy docker but it ships with a working Airflow and a 
lot of integrations.
   - Be patient and persistent. It might take some time to get a review or get 
the final approval from Committers.
   - Please follow [ASF Code of 
Conduct](https://www.apache.org/foundation/policies/conduct) for all 
communication including (but not limited to) comments on Pull Requests, Mailing 
list and Slack.
   - Be sure to read the [Airflow Coding style]( 
https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#coding-style-and-best-practices).
   Apache Airflow is a community-driven project and together we are making it 
better 🚀.
   In case of doubts contact the developers at:
   Mailing List: d...@airflow.apache.org
   Slack: https://s.apache.org/airflow-slack
   





[GitHub] [airflow] nervoussidd opened a new pull request, #29683: this file adds a on_kill method to Dataproc operator.

2023-02-21 Thread via GitHub


nervoussidd opened a new pull request, #29683:
URL: https://github.com/apache/airflow/pull/29683

   
   
   ---
   **^ Add meaningful description above**
   
   Read the **[Pull Request 
Guidelines](https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst#pull-request-guidelines)**
 for more information.
   In case of fundamental code changes, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvement+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in a 
newsfragment file, named `{pr_number}.significant.rst` or 
`{issue_number}.significant.rst`, in 
[newsfragments](https://github.com/apache/airflow/tree/main/newsfragments).
   





[GitHub] [airflow] uranusjr commented on issue #29199: TaskFlow AirflowSkipException causes downstream step to fail when multiple_outputs is true

2023-02-21 Thread via GitHub


uranusjr commented on issue #29199:
URL: https://github.com/apache/airflow/issues/29199#issuecomment-1439451086

   This would change the behaviour of `zip`. `ArgNotSet` is `fillvalue`’s 
default value, and it needs to cause `XComNotFound` for the mechanism to work 
correctly. So either further logic is needed to distinguish between the cases, 
or one of these code paths needs to change the sentinel value (not use 
`ArgNotSet`).
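   A minimal sketch of the ambiguity (all names here are illustrative 
stand-ins, not Airflow's actual internals): a single shared sentinel cannot 
tell "no fillvalue was given" apart from "the upstream value is missing", 
whereas two distinct sentinels can:

   ```python
   class _Sentinel:
       """Named sentinel with a readable repr (illustrative helper)."""

       def __init__(self, name: str) -> None:
           self.name = name

       def __repr__(self) -> str:
           return self.name


   # Two distinct sentinels: one means "the user passed no fillvalue to zip()",
   # the other means "the upstream task pushed no XCom value".
   NO_FILLVALUE = _Sentinel("NO_FILLVALUE")
   MISSING_XCOM = _Sentinel("MISSING_XCOM")


   def resolve(value, fillvalue=NO_FILLVALUE):
       """Return the resolved value, the fillvalue, or raise if neither applies."""
       if value is MISSING_XCOM:
           if fillvalue is NO_FILLVALUE:
               # Stand-in for Airflow's XComNotFound: no value and no fallback.
               raise LookupError("XCom value not found")
           return fillvalue
       return value
   ```

   With `fillvalue` left at its default a missing value raises, and with an 
explicit `fillvalue` it is substituted, which is the behaviour `zip` needs.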





[GitHub] [airflow] nervoussidd commented on issue #10385: Add on_kill method to DataprocWorkflowTemplateInstantiate[Inline]Operator

2023-02-21 Thread via GitHub


nervoussidd commented on issue #10385:
URL: https://github.com/apache/airflow/issues/10385#issuecomment-1439417565

   .take-issue





[GitHub] [airflow] boushphong commented on issue #25024: Database connection with get_sqlalchemy_engine() results in an error because of a quoted uri

2023-02-21 Thread via GitHub


boushphong commented on issue #25024:
URL: https://github.com/apache/airflow/issues/25024#issuecomment-1439413393

   @shubham0473 Are you still working on this? Do you mind if I take over this 
issue?





[GitHub] [airflow] josh-fell commented on pull request #29549: Fix and augment `check-for-inclusive-language` CI check

2023-02-21 Thread via GitHub


josh-fell commented on PR #29549:
URL: https://github.com/apache/airflow/pull/29549#issuecomment-1439398230

   Force pushing to hopefully get past the test_cli_internal_api_background 
test now that it's quarantined.





[GitHub] [airflow] r-richmond commented on issue #24730: Google CloudRun job operator

2023-02-21 Thread via GitHub


r-richmond commented on issue #24730:
URL: https://github.com/apache/airflow/issues/24730#issuecomment-1439340843

   > It also currently won't build with apache-airflow-providers-google because 
of incompatible protobuf support (see 
https://github.com/googleapis/python-run/issues/70).
   
   >The only issue is resolving protobuf versions between the 2 (see 
https://github.com/googleapis/python-run/issues/70). The Google team will not 
solve this on their side so someone will need to solve it in the Google 
providers code.
   
   FWIW I have https://github.com/apache/airflow/pull/29644 open, which solves 
the protobuf==3.2.0 issue. However, I've hit some CI/CD issues that I'm not 
sure how to solve. If someone wants to take a look, take over, or pass along 
some suggestions, I'm all for it. This protobuf pin is the source of many of my 
headaches.
   
   
[Link](https://apache-airflow.slack.com/archives/CCPRP7943/p1676932204181159) 
to the Airflow Slack thread if that makes it easier to discuss.





[GitHub] [airflow] josh-fell commented on pull request #29678: FIx formatting of Dataset inlet/outlet note in TaskFlow concepts

2023-02-21 Thread via GitHub


josh-fell commented on PR #29678:
URL: https://github.com/apache/airflow/pull/29678#issuecomment-1439340653

   > Can it make sense to replace all (or almost all) such "Note: " places in 
".rst" with the corresponding formatting?
   > 
   > For instance,
   > 
   > 
https://airflow.apache.org/docs/apache-airflow-providers-google/stable/operators/cloud/video_intelligence.html
   > 
   > 
https://github.com/apache/airflow/blob/main/docs/apache-airflow-providers-google/operators/cloud/video_intelligence.rst#more-information
   > 
   > https://user-images.githubusercontent.com/38596482/220464962-e732ebb9-5b38-42e5-82c2-dec8161b637e.png";>
   
   @vemikhaylov Maybe? This was fixing readability, and it looked like the rST 
had a directive that was started but never finished (i.e. just a hanging `..`); 
however, if you feel there are improvements to be made to the docs, PRs are 
absolutely welcome! If that includes converting these explicit notes to 
formatted ones, I'd say go for it.
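   For reference, the kind of conversion being suggested replaces a literal 
"Note:" paragraph with Sphinx's note directive, roughly like this (the note 
text is illustrative):

   ```rst
   .. How the plain-text note reads today:

   Note: this operator requires the relevant Google API to be enabled.

   .. The directive form, which Sphinx renders as a highlighted admonition:

   .. note::

       This operator requires the relevant Google API to be enabled.
   ```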





[GitHub] [airflow] dimberman commented on pull request #29680: Google Cloud Providers - Introduce GoogleCloudBaseOperator

2023-02-21 Thread via GitHub


dimberman commented on PR #29680:
URL: https://github.com/apache/airflow/pull/29680#issuecomment-1439325320

   LGTM; if the tests pass we can merge. That said, could you please add some 
context at the top? I realize that there is more info in the ticket, but it 
would be nice to have a few sentences here as well to give larger context, so 
future users who are running git blame don't have to dive through a hyperlink 
rabbit hole to find out why this change was made.





[GitHub] [airflow] 1yangsh commented on a diff in pull request #29378: Add gitSync optional env description

2023-02-21 Thread via GitHub


1yangsh commented on code in PR #29378:
URL: https://github.com/apache/airflow/pull/29378#discussion_r1113741287


##
chart/values.yaml:
##
@@ -1919,6 +1919,19 @@ dags:
 
 extraVolumeMounts: []
 env: []
+ # Change permissions on the checked-out files to the specified mode.
+ # - name: GIT_SYNC_PERMISSION
+ #   value: "0755"
+ # The time to wait before retrying a failed --exechook-command.
+ # - name: GIT_SYNC_EXECHOOK_BACKOFF
+ #   value: "3s"
+ # An optional command to be executed after syncing a new hash of the 
remote repository.
+ # - name: GIT_SYNC_EXECHOOK_COMMAND
+ #   value: "./scripts/entrypoint.sh"
+ # The timeout for the --exechook-command.
+ # - name: GIT_SYNC_EXECHOOK_TIMEOUT
+ #   value: "30s"

Review Comment:
   I was thinking of listing only the frequently used env values, but it would 
be better to give a general guide.






[GitHub] [airflow] Taragolis commented on pull request #29681: Small Extension Fix of Internal API Configuration Path in init_views.py

2023-02-21 Thread via GitHub


Taragolis commented on PR #29681:
URL: https://github.com/apache/airflow/pull/29681#issuecomment-1439266217

   Does this fix also solve https://github.com/apache/airflow/issues/29679?
   
   
   [Tests / Quarantined 
tests](https://github.com/apache/airflow/actions/runs/4237992453/jobs/7364657398#step:5:640)
 
   ```console
 = test session starts 
==
 platform linux -- Python 3.7.16, pytest-7.2.1, pluggy-1.0.0
 rootdir: /opt/airflow, configfile: pytest.ini
 plugins: rerunfailures-11.1.1, anyio-3.6.2, timeouts-1.2.1, 
capture-warnings-0.0.4, cov-4.0.0, xdist-3.2.0, asyncio-0.20.3, 
instafail-0.4.2, httpx-0.21.3, requests-mock-1.10.0, time-machine-2.9.0
 asyncio: mode=strict
 setup timeout: 60.0s, execution timeout: 60.0s, teardown timeout: 60.0s
 collected 14730 items / 14729 deselected / 1 skipped / 1 selected
 
 tests/cli/commands/test_internal_api_command.py .
[100%]
 
  generated xml file: /files/test_result-Quarantined-sqlite.xml 
-
  slowest 100 durations 
=
 10.11s call 
tests/cli/commands/test_internal_api_command.py::TestCliInternalAPi::test_cli_internal_api_background
 6.04s setup
tests/cli/commands/test_internal_api_command.py::TestCliInternalAPi::test_cli_internal_api_background
 0.05s teardown 
tests/cli/commands/test_internal_api_command.py::TestCliInternalAPi::test_cli_internal_api_background
 === 1 passed, 1 skipped, 14729 deselected, 68 warnings in 131.00s 
(0:02:10) 
 Number of warnings: 0 /files/warnings-Quarantined-sqlite.txt
 All tests successful
 No stopped containers
   ```





[GitHub] [airflow] bugraoz93 opened a new pull request, #29682: Migrate BaseJob.heartbeat to InternalAPI

2023-02-21 Thread via GitHub


bugraoz93 opened a new pull request, #29682:
URL: https://github.com/apache/airflow/pull/29682

   Migration of `BaseJob.heartbeat` to Internal API.
   
   closes: #29315 





[GitHub] [airflow] Taragolis commented on pull request #29661: Disable unixodbc and related packages from Microsoft APT repo

2023-02-21 Thread via GitHub


Taragolis commented on PR #29661:
URL: https://github.com/apache/airflow/pull/29661#issuecomment-1439255547

   Right at the moment when you decide not to wait anymore:
   
   
https://github.com/microsoft/linux-package-repositories/issues/36#issuecomment-1439222008
 🤣 🤣 
   
   >Closing as the original issue (unixodbc.h) has been resolved.
   For issues with libodbc1 vs libodbc2, refer to 
https://github.com/microsoft/linux-package-repositories/issues/39.





[GitHub] [airflow] bugraoz93 opened a new pull request, #29681: Fix Extension of Internal API Configuration Path in init_views.py

2023-02-21 Thread via GitHub


bugraoz93 opened a new pull request, #29681:
URL: https://github.com/apache/airflow/pull/29681

   It is just a tiny extension change for the configuration path.
   
   **Context:**
   While working on [BaseJob.heartbeat to 
InternalAPI#29315](https://github.com/apache/airflow/issues/29315), I realised 
that the `Webserver` wasn't picking up the configuration for the Internal API 
correctly. This prevented the `Webserver` from booting up, failing with a 
`FileNotFoundError` when trying to run with `[webserver] run_internal_api = 
True`





[airflow] branch revert-29408-docker-compose-change-example updated (2b07e01cdc -> ab54df3039)

2023-02-21 Thread taragolis
This is an automated email from the ASF dual-hosted git repository.

taragolis pushed a change to branch revert-29408-docker-compose-change-example
in repository https://gitbox.apache.org/repos/asf/airflow.git


 discard 2b07e01cdc Revert "Improve health checks in example docker-compose and 
clarify usage (#29408)"
 add 6c13f04365 AWS Glue job hook: Make s3_bucket parameter optional 
(#29659)
 add ab54df3039 Revert "Improve health checks in example docker-compose and 
clarify usage (#29408)"

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (2b07e01cdc)
\
 N -- N -- N   
refs/heads/revert-29408-docker-compose-change-example (ab54df3039)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

No new revisions were added by this update.

Summary of changes:
 airflow/providers/amazon/aws/hooks/glue.py| 15 -
 tests/providers/amazon/aws/hooks/test_glue.py | 48 +++
 2 files changed, 54 insertions(+), 9 deletions(-)



[airflow] branch main updated (9de301da2a -> 6c13f04365)

2023-02-21 Thread eladkal
This is an automated email from the ASF dual-hosted git repository.

eladkal pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 9de301da2a FIx formatting of Dataset inlet/outlet note in TaskFlow 
concepts (#29678)
 add 6c13f04365 AWS Glue job hook: Make s3_bucket parameter optional 
(#29659)

No new revisions were added by this update.

Summary of changes:
 airflow/providers/amazon/aws/hooks/glue.py| 15 -
 tests/providers/amazon/aws/hooks/test_glue.py | 48 +++
 2 files changed, 54 insertions(+), 9 deletions(-)



[GitHub] [airflow] eladkal merged pull request #29659: AWS Glue job hook: Make s3_bucket parameter optional

2023-02-21 Thread via GitHub


eladkal merged PR #29659:
URL: https://github.com/apache/airflow/pull/29659





[GitHub] [airflow] eladkal closed issue #29423: GlueJobOperator throws error after migration to newest version of Airflow

2023-02-21 Thread via GitHub


eladkal closed issue #29423: GlueJobOperator throws error after migration to 
newest version of Airflow
URL: https://github.com/apache/airflow/issues/29423





[GitHub] [airflow] IKholopov commented on pull request #29518: Google Cloud Providers - Fix _MethodDefault deepcopy failure

2023-02-21 Thread via GitHub


IKholopov commented on PR #29518:
URL: https://github.com/apache/airflow/pull/29518#issuecomment-1439217827

   Sounds good, created https://github.com/apache/airflow/pull/29680





[GitHub] [airflow] IKholopov commented on a diff in pull request #29518: Google Cloud Providers - Fix _MethodDefault deepcopy failure

2023-02-21 Thread via GitHub


IKholopov commented on code in PR #29518:
URL: https://github.com/apache/airflow/pull/29518#discussion_r1113667482


##
airflow/providers/google/cloud/operators/cloud_base.py:
##
@@ -0,0 +1,34 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains a Google API base operator."""
+from __future__ import annotations
+
+from google.api_core.gapic_v1.method import DEFAULT
+
+from airflow.models import BaseOperator
+
+
+class GoogleCloudBaseOperator(BaseOperator):
+"""
+Abstract base class that takes care of common specifics of the operators 
built
+on top of Google API client libraries.
+"""
+
+def __deepcopy__(self, memo):
+memo[id(DEFAULT)] = DEFAULT

Review Comment:
   Done.
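   For context, the memo-seeding pattern from the diff above can be 
demonstrated in isolation. This is a sketch: `DEFAULT` below is a stand-in for 
`google.api_core.gapic_v1.method.DEFAULT`, and the class is a simplified 
illustration, not the provider code.

   ```python
   import copy

   # Stand-in for google.api_core.gapic_v1.method.DEFAULT (illustrative only).
   DEFAULT = object()


   class GoogleCloudBaseOperator:
       """Sketch of an operator whose default sentinel must survive deepcopy."""

       def __init__(self, retry=DEFAULT):
           self.retry = retry

       def __deepcopy__(self, memo):
           # Pre-seeding the memo makes copy.deepcopy return the sentinel
           # itself, preserving `is DEFAULT` identity checks downstream.
           memo[id(DEFAULT)] = DEFAULT
           cls = self.__class__
           result = cls.__new__(cls)
           memo[id(self)] = result
           for key, value in self.__dict__.items():
               setattr(result, key, copy.deepcopy(value, memo))
           return result


   clone = copy.deepcopy(GoogleCloudBaseOperator())
   assert clone.retry is DEFAULT  # identity preserved through the copy
   ```

   Without the memo entry, a plain deepcopy would clone the sentinel into a 
new object, breaking any `is DEFAULT` comparison.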






[GitHub] [airflow] IKholopov opened a new pull request, #29680: Google Cloud Providers - Introduce GoogleCloudBaseOperator

2023-02-21 Thread via GitHub


IKholopov opened a new pull request, #29680:
URL: https://github.com/apache/airflow/pull/29680

   As part of addressing #28751, the GoogleCloudBaseOperator is introduced to 
provide a common fix for all Google Cloud operators.
   
   Refactored into a separate PR per @potiuk's request.





[GitHub] [airflow] Taragolis commented on pull request #26946: Fix assume role if user explicit set credentials

2023-02-21 Thread via GitHub


Taragolis commented on PR #26946:
URL: https://github.com/apache/airflow/pull/26946#issuecomment-1439190448

   > Hi, I'm currently using Airflow 2.4.3 with apache-airflow-providers-amazon 
6.0.0. The issue I'm facing is that I add the keys of an AWS user and also add 
the role_arn in the extra field, but the connection is not assuming the role 
and instead uses the AWS user's identity, so it looks like the problem is 
there...
   
   Upgrade to `apache-airflow-providers-amazon>=6.1.0`; for more details, see the 
[Changelog for Amazon 
Provider](https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/index.html#id16)
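   For reference, an assume-role setup of the kind described stores the access 
keys on the connection itself and the role in the connection's `Extra` field, 
roughly like this (the account ID and role name are placeholders):

   ```json
   {
     "role_arn": "arn:aws:iam::123456789012:role/example-role"
   }
   ```

   With a provider version that includes the fix above, the hook should assume 
the role given by `role_arn` via STS instead of using the user's own identity.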





[GitHub] [airflow] crua-godaddy commented on pull request #26946: Fix assume role if user explicit set credentials

2023-02-21 Thread via GitHub


crua-godaddy commented on PR #26946:
URL: https://github.com/apache/airflow/pull/26946#issuecomment-1439178281

   Hi, I'm currently using Airflow 2.4.3 with apache-airflow-providers-amazon 
6.0.0. The issue I'm facing is that I add the keys of an AWS user and also add 
the role_arn in the extra field, but the connection is not assuming the role 
and instead uses the AWS user's identity, so it looks like the problem is 
there...





[airflow] branch constraints-main updated: Updating constraints. Github run id:4237083783

2023-02-21 Thread github-bot
This is an automated email from the ASF dual-hosted git repository.

github-bot pushed a commit to branch constraints-main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/constraints-main by this push:
 new d6d0562b70 Updating constraints. Github run id:4237083783
d6d0562b70 is described below

commit d6d0562b709dce5a2c282e65396559c4cb648a33
Author: Automated GitHub Actions commit 
AuthorDate: Tue Feb 21 22:28:39 2023 +

Updating constraints. Github run id:4237083783

This update in constraints is automatically committed by the CI 
'constraints-push' step based on
'refs/heads/main' in the 'apache/airflow' repository with commit sha 
9de301da2a44385f57be5407e80e16ee376f3d39.

The action that build those constraints can be found at 
https://github.com/apache/airflow/actions/runs/4237083783/

The image tag used for that build was: 
9de301da2a44385f57be5407e80e16ee376f3d39. You can enter Breeze environment
with this image by running 'breeze shell --image-tag 
9de301da2a44385f57be5407e80e16ee376f3d39'

All tests passed in this build so we determined we can push the updated 
constraints.

See 
https://github.com/apache/airflow/blob/main/README.md#installing-from-pypi for 
details.
---
 constraints-3.10.txt  | 78 +-
 constraints-3.7.txt   | 80 ++-
 constraints-3.8.txt   | 78 +-
 constraints-3.9.txt   | 78 +-
 constraints-no-providers-3.10.txt |  4 +-
 constraints-no-providers-3.7.txt  |  4 +-
 constraints-no-providers-3.8.txt  |  4 +-
 constraints-no-providers-3.9.txt  |  4 +-
 constraints-source-providers-3.10.txt | 60 +-
 constraints-source-providers-3.7.txt  | 62 ++-
 constraints-source-providers-3.8.txt  | 60 +-
 constraints-source-providers-3.9.txt  | 60 +-
 12 files changed, 288 insertions(+), 284 deletions(-)

diff --git a/constraints-3.10.txt b/constraints-3.10.txt
index ba7d89eac1..a3782db965 100644
--- a/constraints-3.10.txt
+++ b/constraints-3.10.txt
@@ -1,5 +1,5 @@
 #
-# This constraints file was automatically generated on 2023-02-20T04:49:36Z
+# This constraints file was automatically generated on 2023-02-21T22:28:27Z
 # via "eager-upgrade" mechanism of PIP. For the "main" branch of Airflow.
 # This variant of constraints install uses the HEAD of the branch version for 
'apache-airflow' but installs
 # the providers from PIP-released packages at the moment of the constraint 
generation.
@@ -60,44 +60,44 @@ ansiwrap==0.8.4
 anyio==3.6.2
 apache-airflow-providers-airbyte==3.2.0
 apache-airflow-providers-alibaba==2.2.0
-apache-airflow-providers-amazon==7.2.0
-apache-airflow-providers-apache-beam==4.2.0
+apache-airflow-providers-amazon==7.2.1
+apache-airflow-providers-apache-beam==4.3.0
 apache-airflow-providers-apache-cassandra==3.1.1
 apache-airflow-providers-apache-drill==2.3.1
 apache-airflow-providers-apache-druid==3.3.1
 apache-airflow-providers-apache-flink==1.0.0
 apache-airflow-providers-apache-hdfs==3.2.0
-apache-airflow-providers-apache-hive==5.1.2
+apache-airflow-providers-apache-hive==5.1.3
 apache-airflow-providers-apache-impala==1.0.0
 apache-airflow-providers-apache-kylin==3.1.0
 apache-airflow-providers-apache-livy==3.2.0
 apache-airflow-providers-apache-pig==4.0.0
 apache-airflow-providers-apache-pinot==4.0.1
 apache-airflow-providers-apache-spark==4.0.0
-apache-airflow-providers-apache-sqoop==3.1.0
+apache-airflow-providers-apache-sqoop==3.1.1
 apache-airflow-providers-arangodb==2.1.1
 apache-airflow-providers-asana==2.1.0
 apache-airflow-providers-atlassian-jira==2.0.0
 apache-airflow-providers-celery==3.1.0
 apache-airflow-providers-cloudant==3.1.0
-apache-airflow-providers-cncf-kubernetes==5.2.0
+apache-airflow-providers-cncf-kubernetes==5.2.1
 apache-airflow-providers-databricks==4.0.0
 apache-airflow-providers-datadog==3.1.0
 apache-airflow-providers-dbt-cloud==3.0.0
 apache-airflow-providers-dingding==3.1.0
 apache-airflow-providers-discord==3.1.0
-apache-airflow-providers-docker==3.5.0
+apache-airflow-providers-docker==3.5.1
 apache-airflow-providers-elasticsearch==4.4.0
 apache-airflow-providers-exasol==4.1.3
 apache-airflow-providers-facebook==3.1.0
 apache-airflow-providers-github==2.2.0
-apache-airflow-providers-google==8.9.0
+apache-airflow-providers-google==8.10.0
 apache-airflow-providers-grpc==3.1.0
 apache-airflow-providers-hashicorp==3.2.0
 apache-airflow-providers-influxdb==2.1.0
 apache-airflow-providers-jdbc==3.3.0
 apache-airflow-providers-jenkins==3.2.0
-apache-airflow-providers-microsoft-azure==5.2.0
+apache-airflow-providers-microsoft-azure==5.2.1
 apache-airflow-providers-microsoft-mssql==3.3.2
 apache-airflow-providers-microsoft-winrm==3.1

[GitHub] [airflow] Taragolis commented on pull request #29661: Disable unixodbc and related packages from Microsoft APT repo

2023-02-21 Thread via GitHub


Taragolis commented on PR #29661:
URL: https://github.com/apache/airflow/pull/29661#issuecomment-1439171423

   For comparison, output from the latest official Airflow Docker image. By 
default the PROD image also contains Debian's packages; however, if a user 
upgrades packages, the `unixodbc`-related packages will be upgraded to the 
versions from the Microsoft repo.
   
   ```console
   ❯ docker run -it --rm --user root apache/airflow:2.5.1-python3.9 bash
   The container is run as root user. For security, consider using a regular 
user account.
   
   root@ed86d0174f98:/opt/airflow# dpkg --list | grep -E 
"(libodbc1|odbcinst1debian2|odbcinst|unixodbc-dev|unixodbc)"
   ii  libodbc1:arm64 2.3.6-0.1+b1   arm64
ODBC library for Unix
   ii  odbcinst   2.3.6-0.1+b1   arm64
Helper program for accessing odbc ini files
   ii  odbcinst1debian2:arm64 2.3.6-0.1+b1   arm64
Support library for accessing odbc ini files
   ii  unixodbc   2.3.6-0.1+b1   arm64
Basic ODBC tools
   
   root@ed86d0174f98:/opt/airflow# apt update
   ...  
   Reading package lists... Done
   Building dependency tree... Done
   Reading state information... Done
   16 packages can be upgraded. Run 'apt list --upgradable' to see them.
   
   root@ed86d0174f98:/opt/airflow# apt list --upgradable
   Listing... Done
   bind9-host/stable-security 1:9.16.37-1~deb11u1 arm64 [upgradable from: 
1:9.16.33-1~deb11u1]
   bind9-libs/stable-security 1:9.16.37-1~deb11u1 arm64 [upgradable from: 
1:9.16.33-1~deb11u1]
   curl/stable-security 7.74.0-1.3+deb11u5 arm64 [upgradable from: 
7.74.0-1.3+deb11u3]
   libcurl4/stable-security 7.74.0-1.3+deb11u5 arm64 [upgradable from: 
7.74.0-1.3+deb11u3]
   libgnutls30/stable-security 3.7.1-5+deb11u3 arm64 [upgradable from: 
3.7.1-5+deb11u2]
   libodbc1/bullseye 2.3.11 arm64 [upgradable from: 2.3.6-0.1+b1]
   libpq5/bullseye-pgdg 15.2-1.pgdg110+1 arm64 [upgradable from: 
15.1-1.pgdg110+1]
   libssl1.1/stable-security 1.1.1n-0+deb11u4 arm64 [upgradable from: 
1.1.1n-0+deb11u3]
   msodbcsql18/bullseye 18.2.1.1-1 arm64 [upgradable from: 18.1.2.1-1]
   odbcinst1debian2/bullseye 2.3.11 arm64 [upgradable from: 2.3.6-0.1+b1]
   odbcinst/bullseye 2.3.11 arm64 [upgradable from: 2.3.6-0.1+b1]
   openssl/stable-security 1.1.1n-0+deb11u4 arm64 [upgradable from: 
1.1.1n-0+deb11u3]
   postgresql-client-15/bullseye-pgdg 15.2-1.pgdg110+1 arm64 [upgradable from: 
15.1-1.pgdg110+1]
   postgresql-client-common/bullseye-pgdg 247.pgdg110+1 all [upgradable from: 
246.pgdg110+1]
   postgresql-client/bullseye-pgdg 15+247.pgdg110+1 all [upgradable from: 
15+246.pgdg110+1]
   unixodbc/bullseye 2.3.11 arm64 [upgradable from: 2.3.6-0.1+b1]
   
   root@ed86d0174f98:/opt/airflow# apt-cache policy unixodbc
   unixodbc:
 Installed: 2.3.6-0.1+b1
 Candidate: 2.3.11
 Version table:
2.3.11 500
   500 https://packages.microsoft.com/debian/11/prod bullseye/main 
arm64 Packages
*** 2.3.6-0.1+b1 500
   500 http://deb.debian.org/debian bullseye/main arm64 Packages
   100 /var/lib/dpkg/status
   ```


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch revert-29408-docker-compose-change-example updated (a8124e3bbe -> 2b07e01cdc)

2023-02-21 Thread taragolis
This is an automated email from the ASF dual-hosted git repository.

taragolis pushed a change to branch revert-29408-docker-compose-change-example
in repository https://gitbox.apache.org/repos/asf/airflow.git


 discard a8124e3bbe Revert "Improve health checks in example docker-compose and 
clarify usage (#29408)"
 add 37a317286a docs: fix typo (#29658)
 add 66a8d102fc Quarantine `test_cli_internal_api_background` (#29665)
 add 1f1f97e666 Add Maciej Obuchowski to triage to help with AIP-53 issues 
(#29668)
 add 6699f953e6 Avoid modifying PRs in Recheck old bug reports workflow 
(#29653)
 add e8aa957439 AWS system test sagemaker-endpoint: archive logs (#29581)
 add 9de301da2a FIx formatting of Dataset inlet/outlet note in TaskFlow 
concepts (#29678)
 add 2b07e01cdc Revert "Improve health checks in example docker-compose and 
clarify usage (#29408)"

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (a8124e3bbe)
\
 N -- N -- N   
refs/heads/revert-29408-docker-compose-change-example (2b07e01cdc)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

No new revisions were added by this update.

Summary of changes:
 .asf.yaml  |  1 +
 .../{stale.yml => recheck-old-bug-report.yml}  | 31 ++
 .github/workflows/stale.yml| 19 -
 .../logging/s3-task-handler.rst|  2 +-
 docs/apache-airflow/core-concepts/taskflow.rst |  5 ++--
 tests/cli/commands/test_internal_api_command.py|  1 +
 .../amazon/aws/example_sagemaker_endpoint.py   | 15 +--
 7 files changed, 20 insertions(+), 54 deletions(-)
 copy .github/workflows/{stale.yml => recheck-old-bug-report.yml} (65%)



[GitHub] [airflow] steren commented on a diff in pull request #28525: Add CloudRunExecuteJobOperator

2023-02-21 Thread via GitHub


steren commented on code in PR #28525:
URL: https://github.com/apache/airflow/pull/28525#discussion_r1113612570


##
airflow/providers/google/cloud/operators/cloud_run.py:
##
@@ -0,0 +1,115 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains Google Cloud Run Jobs operators."""
+from __future__ import annotations
+
+from typing import TYPE_CHECKING, Sequence
+
+from airflow.models import BaseOperator
+from airflow.providers.google.cloud.hooks.cloud_run import CloudRunJobHook
+from airflow.providers.google.cloud.links.cloud_run import 
CloudRunJobExecutionLink
+
+if TYPE_CHECKING:
+from airflow.utils.context import Context
+
+
+class CloudRunExecuteJobOperator(BaseOperator):
+"""
+Executes an existing Cloud Run job.
+
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:CloudRunExecuteJobOperator`
+
+:param job_name: The name of the cloud run job to execute
+:param region: The region of the Cloud Run job (for example europe-west1)
+:param project_id: The ID of the GCP project that owns the job.
+If set to ``None`` or missing, the default project_id
+from the GCP connection is used.
+:param gcp_conn_id: The connection ID to use to connect to Google Cloud.
+:param delegate_to: The account to impersonate, if any.
+For this to work, the service account making the request
+must have domain-wide delegation enabled.
+:param wait_until_finished: If True, wait for the end of job execution
+before exiting. If False (default), only submits job.
+:param impersonation_chain: Optional service account to impersonate
+using short-term credentials, or chained list of accounts required
+to get the access_token of the last account in the list,
+which will be impersonated in the request. If set as a string, the
+account must grant the originating account the Service Account 
Token
+Creator IAM role. If set as a sequence, the identities from the 
list
+must grant Service Account Token Creator IAM role to the directly
+preceding identity, with first account from the list granting this
+role to the originating account (templated).
+"""
+
+template_fields: Sequence[str] = ("job_name", "region", "project_id", 
"gcp_conn_id")
+operator_extra_links = (CloudRunJobExecutionLink(),)
+
+def __init__(
+self,
+job_name: str,

Review Comment:
   Ah ok, this doesn't create a job, but executes an existing one, ok sounds 
good.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] steren commented on a diff in pull request #28525: Add CloudRunExecuteJobOperator

2023-02-21 Thread via GitHub


steren commented on code in PR #28525:
URL: https://github.com/apache/airflow/pull/28525#discussion_r1113612282


##
airflow/providers/google/cloud/hooks/cloud_run.py:
##
@@ -0,0 +1,245 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+"""This module contains a Google Cloud Run Hook."""
+from __future__ import annotations
+
+import json
+import time
+from typing import Any, Callable, Dict, List, Sequence, Union, cast
+
+from google.api_core.client_options import ClientOptions
+from googleapiclient.discovery import build
+
+from airflow.providers.google.common.hooks.base_google import GoogleBaseHook
+from airflow.utils.log.logging_mixin import LoggingMixin
+
+DEFAULT_CLOUD_RUN_REGION = "us-central1"
+
+
+class CloudRunJobSteps:
+"""
+Helper class with Cloud Run job status.
+Reference: 
https://cloud.google.com/run/docs/reference/rest/v1/namespaces.jobs#JobStatus

Review Comment:
   Using the V2 API doesn't mean using the Cloud Run Python client library. 
   Here, it seems like you are calling the API via HTTP directly? 
   You could hit the /v2/ endpoint instead of the /v1/; it's much easier to use.
   
   This is optional, since you have it working on v1.
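To make the v1/v2 distinction concrete, here is a minimal sketch of the two URL shapes involved. The URLs below are assumptions based on the public Cloud Run Admin API reference, not code from this PR: v1 uses per-region endpoints with Knative-style namespace paths, while v2 uses the global endpoint with fully qualified resource names.

```python
# Sketch contrasting the two Cloud Run Admin API surfaces discussed above.
# URL shapes are assumptions from the public API reference, not PR code.

def v1_job_run_url(region: str, project: str, job: str) -> str:
    # v1 requires a regional endpoint and a namespaces/{project} path.
    return (
        f"https://{region}-run.googleapis.com/apis/run.googleapis.com/"
        f"v1/namespaces/{project}/jobs/{job}:run"
    )

def v2_job_run_url(region: str, project: str, job: str) -> str:
    # v2 uses the global endpoint with a full resource name.
    return (
        f"https://run.googleapis.com/v2/projects/{project}/"
        f"locations/{region}/jobs/{job}:run"
    )

print(v1_job_run_url("us-central1", "my-project", "my-job"))
print(v2_job_run_url("us-central1", "my-project", "my-job"))
```

The v2 resource naming is what makes it easier to consume: a single host and one `projects/{p}/locations/{l}/jobs/{j}` identifier, instead of choosing a regional host per request.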



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] Taragolis opened a new issue, #29679: Fix Quarantined `test_cli_internal_api_background`

2023-02-21 Thread via GitHub


Taragolis opened a new issue, #29679:
URL: https://github.com/apache/airflow/issues/29679

   ### Body
   
   Recently, [this 
test](https://github.com/apache/airflow/blob/9de301da2a44385f57be5407e80e16ee376f3d39/tests/cli/commands/test_internal_api_command.py#L134-L137)
 began to fail with a timeout error, which affected all tests in a single CI 
run.
   
   We should figure out why this happens and try to resolve it.
   
   
   
   ### Committer
   
   - [X] I acknowledge that I am a maintainer/committer of the Apache Airflow 
project.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] eladkal commented on a diff in pull request #29635: YandexCloud provider: support Yandex SDK feature "endpoint"

2023-02-21 Thread via GitHub


eladkal commented on code in PR #29635:
URL: https://github.com/apache/airflow/pull/29635#discussion_r1113600318


##
airflow/providers/yandex/hooks/yandex.py:
##
@@ -78,6 +78,11 @@ def get_connection_form_widgets() -> dict[str, Any]:
 description="Optional. This key will be placed to all created 
Compute nodes"
 "to let you have a root shell there",
 ),
+"endpoint": StringField(
+lazy_gettext("API endpoint"),
+widget=BS3TextFieldWidget(),
+description="Optional.",

Review Comment:
   I think it's better to have a more meaningful description here



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] eladkal closed issue #29607: Status of testing Providers that were prepared on February 18, 2023

2023-02-21 Thread via GitHub


eladkal closed issue #29607: Status of testing Providers that were prepared on 
February 18, 2023
URL: https://github.com/apache/airflow/issues/29607


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] eladkal commented on issue #29607: Status of testing Providers that were prepared on February 18, 2023

2023-02-21 Thread via GitHub


eladkal commented on issue #29607:
URL: https://github.com/apache/airflow/issues/29607#issuecomment-1439130411

   Thank you everyone.
   Providers are released.
   I invite everyone to help improve providers for the next release, a list of 
open issues can be found 
[here](https://github.com/apache/airflow/issues?q=is%3Aopen+is%3Aissue+label%3Aarea%3Aproviders).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] vemikhaylov commented on pull request #29678: FIx formatting of Dataset inlet/outlet note in TaskFlow concepts

2023-02-21 Thread via GitHub


vemikhaylov commented on PR #29678:
URL: https://github.com/apache/airflow/pull/29678#issuecomment-1439128327

   Would it make sense to replace all (or almost all) such "Note: " occurrences 
in the ``.rst`` files with the corresponding formatting?
   
   For instance,
   
   
https://airflow.apache.org/docs/apache-airflow-providers-google/stable/operators/cloud/video_intelligence.html
   

https://github.com/apache/airflow/blob/main/docs/apache-airflow-providers-google/operators/cloud/video_intelligence.rst#more-information
   
   https://user-images.githubusercontent.com/38596482/220464962-e732ebb9-5b38-42e5-82c2-dec8161b637e.png
   
   
   
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch main updated (e8aa957439 -> 9de301da2a)

2023-02-21 Thread taragolis
This is an automated email from the ASF dual-hosted git repository.

taragolis pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from e8aa957439 AWS system test sagemaker-endpoint: archive logs (#29581)
 add 9de301da2a FIx formatting of Dataset inlet/outlet note in TaskFlow 
concepts (#29678)

No new revisions were added by this update.

Summary of changes:
 docs/apache-airflow/core-concepts/taskflow.rst | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)



[GitHub] [airflow] Taragolis merged pull request #29678: FIx formatting of Dataset inlet/outlet note in TaskFlow concepts

2023-02-21 Thread via GitHub


Taragolis merged PR #29678:
URL: https://github.com/apache/airflow/pull/29678


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow-site] eladkal merged pull request #737: Add documentation for packages - 2023-02-18

2023-02-21 Thread via GitHub


eladkal merged PR #737:
URL: https://github.com/apache/airflow-site/pull/737


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow-site] branch add-documentation-2023-02-18 updated (c873115a9f -> 9f492b15f4)

2023-02-21 Thread eladkal
This is an automated email from the ASF dual-hosted git repository.

eladkal pushed a change to branch add-documentation-2023-02-18
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


omit c873115a9f Add documentation for packages - 2023-02-18
 add 9f492b15f4 Add documentation for packages - 2023-02-18

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs
when a user --force pushes a change and generates a repository
containing something like this:

 * -- * -- B -- O -- O -- O   (c873115a9f)
\
 N -- N -- N   refs/heads/add-documentation-2023-02-18 (9f492b15f4)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

No new revisions were added by this update.

Summary of changes:



svn commit: r60245 - /dev/airflow/providers/ /release/airflow/providers/

2023-02-21 Thread eladkal
Author: eladkal
Date: Tue Feb 21 21:19:07 2023
New Revision: 60245

Log:
Release Airflow Providers on 2023-02-21

Added:
release/airflow/providers/apache-airflow-providers-amazon-7.2.1.tar.gz
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-amazon-7.2.1.tar.gz
release/airflow/providers/apache-airflow-providers-amazon-7.2.1.tar.gz.asc
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-amazon-7.2.1.tar.gz.asc

release/airflow/providers/apache-airflow-providers-amazon-7.2.1.tar.gz.sha512
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-amazon-7.2.1.tar.gz.sha512
release/airflow/providers/apache-airflow-providers-apache-beam-4.3.0.tar.gz
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-apache-beam-4.3.0.tar.gz

release/airflow/providers/apache-airflow-providers-apache-beam-4.3.0.tar.gz.asc
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-apache-beam-4.3.0.tar.gz.asc

release/airflow/providers/apache-airflow-providers-apache-beam-4.3.0.tar.gz.sha512
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-apache-beam-4.3.0.tar.gz.sha512
release/airflow/providers/apache-airflow-providers-apache-hive-5.1.3.tar.gz
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-apache-hive-5.1.3.tar.gz

release/airflow/providers/apache-airflow-providers-apache-hive-5.1.3.tar.gz.asc
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-apache-hive-5.1.3.tar.gz.asc

release/airflow/providers/apache-airflow-providers-apache-hive-5.1.3.tar.gz.sha512
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-apache-hive-5.1.3.tar.gz.sha512
release/airflow/providers/apache-airflow-providers-apache-sqoop-3.1.1.tar.gz
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-apache-sqoop-3.1.1.tar.gz

release/airflow/providers/apache-airflow-providers-apache-sqoop-3.1.1.tar.gz.asc
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-apache-sqoop-3.1.1.tar.gz.asc

release/airflow/providers/apache-airflow-providers-apache-sqoop-3.1.1.tar.gz.sha512
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-apache-sqoop-3.1.1.tar.gz.sha512

release/airflow/providers/apache-airflow-providers-cncf-kubernetes-5.2.1.tar.gz
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-cncf-kubernetes-5.2.1.tar.gz

release/airflow/providers/apache-airflow-providers-cncf-kubernetes-5.2.1.tar.gz.asc
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-cncf-kubernetes-5.2.1.tar.gz.asc

release/airflow/providers/apache-airflow-providers-cncf-kubernetes-5.2.1.tar.gz.sha512
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-cncf-kubernetes-5.2.1.tar.gz.sha512
release/airflow/providers/apache-airflow-providers-docker-3.5.1.tar.gz
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-docker-3.5.1.tar.gz
release/airflow/providers/apache-airflow-providers-docker-3.5.1.tar.gz.asc
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-docker-3.5.1.tar.gz.asc

release/airflow/providers/apache-airflow-providers-docker-3.5.1.tar.gz.sha512
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-docker-3.5.1.tar.gz.sha512
release/airflow/providers/apache-airflow-providers-google-8.10.0.tar.gz
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-google-8.10.0.tar.gz
release/airflow/providers/apache-airflow-providers-google-8.10.0.tar.gz.asc
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-google-8.10.0.tar.gz.asc

release/airflow/providers/apache-airflow-providers-google-8.10.0.tar.gz.sha512
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-google-8.10.0.tar.gz.sha512
release/airflow/providers/apache-airflow-providers-http-4.2.0.tar.gz
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-http-4.2.0.tar.gz
release/airflow/providers/apache-airflow-providers-http-4.2.0.tar.gz.asc
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-http-4.2.0.tar.gz.asc
release/airflow/providers/apache-airflow-providers-http-4.2.0.tar.gz.sha512
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-http-4.2.0.tar.gz.sha512

release/airflow/providers/apache-airflow-providers-microsoft-azure-5.2.1.tar.gz
  - copied unchanged from r60244, 
dev/airflow/providers/apache-airflow-providers-microsoft-azure-5.2.1.tar.gz
  

[airflow] branch main updated (6699f953e6 -> e8aa957439)

2023-02-21 Thread taragolis
This is an automated email from the ASF dual-hosted git repository.

taragolis pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 6699f953e6 Avoid modifying PRs in Recheck old bug reports workflow 
(#29653)
 add e8aa957439 AWS system test sagemaker-endpoint: archive logs (#29581)

No new revisions were added by this update.

Summary of changes:
 .../providers/amazon/aws/example_sagemaker_endpoint.py| 15 ++-
 1 file changed, 6 insertions(+), 9 deletions(-)



[GitHub] [airflow] Taragolis merged pull request #29581: AWS system test sagemaker-endpoint: archive logs

2023-02-21 Thread via GitHub


Taragolis merged PR #29581:
URL: https://github.com/apache/airflow/pull/29581


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] fritz-astronomer commented on pull request #29599: fix do_xcom_push=False bug in SnowflakeOperator

2023-02-21 Thread via GitHub


fritz-astronomer commented on PR #29599:
URL: https://github.com/apache/airflow/pull/29599#issuecomment-1439063340

   Did a quick rebase just to clean up commits. 
   Tests are added and passing in breeze ✅ 
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] josh-fell opened a new pull request, #29678: FIx formatting of Dataset inlet/outlet note in TaskFlow concepts

2023-02-21 Thread via GitHub


josh-fell opened a new pull request, #29678:
URL: https://github.com/apache/airflow/pull/29678

   **Before**
   https://user-images.githubusercontent.com/48934154/220452755-2f674ada-b405-4bc7-ab12-77215c8bcfd7.png
   
   **After**
   https://user-images.githubusercontent.com/48934154/220452840-69c8d89c-406b-4781-8006-7cc12818b485.png
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch josh-fell-patch-1 created (now 773f21d95e)

2023-02-21 Thread joshfell
This is an automated email from the ASF dual-hosted git repository.

joshfell pushed a change to branch josh-fell-patch-1
in repository https://gitbox.apache.org/repos/asf/airflow.git


  at 773f21d95e FIx formatting of Dataset inlet/outlet note in TaskFlow 
concepts

This branch includes the following new commits:

 new 773f21d95e FIx formatting of Dataset inlet/outlet note in TaskFlow 
concepts

The 1 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.




[airflow] 01/01: FIx formatting of Dataset inlet/outlet note in TaskFlow concepts

2023-02-21 Thread joshfell
This is an automated email from the ASF dual-hosted git repository.

joshfell pushed a commit to branch josh-fell-patch-1
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 773f21d95e3f3281f9d4ddfb4e3280c7553a1006
Author: Josh Fell <48934154+josh-f...@users.noreply.github.com>
AuthorDate: Tue Feb 21 15:36:17 2023 -0500

FIx formatting of Dataset inlet/outlet note in TaskFlow concepts
---
 docs/apache-airflow/core-concepts/taskflow.rst | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/docs/apache-airflow/core-concepts/taskflow.rst 
b/docs/apache-airflow/core-concepts/taskflow.rst
index 92daf5dd5c..6f46cdb4cd 100644
--- a/docs/apache-airflow/core-concepts/taskflow.rst
+++ b/docs/apache-airflow/core-concepts/taskflow.rst
@@ -91,9 +91,10 @@ need to be able to be serialized. Airflow out of the box 
supports all built-in t
 supports objects that are decorated with ``@dataclass`` or ``@attr.define``. 
The following example shows the use of
 a ``Dataset``, which is ``@attr.define`` decorated, together with TaskFlow.
 
-::
+.. note::
+
+An additional benefit of using ``Dataset`` is that it automatically 
registers as an ``inlet`` in case it is used as an input argument. It also auto 
registers as an ``outlet`` if the return value of your task is a ``dataset`` or 
a ``list[Dataset]]``.
 
-  Note: An additional benefit of using ``Dataset`` is that it automatically 
registers as an ``inlet`` in case it is used as an input argument. It also auto 
registers as an ``outlet`` if the return value of your task is a ``dataset`` or 
a ``list[Dataset]]``.
 
 .. code-block:: python
 



[GitHub] [airflow] vemikhaylov commented on pull request #29608: Enable passing --xcom-args to tasks test CLI command

2023-02-21 Thread via GitHub


vemikhaylov commented on PR #29608:
URL: https://github.com/apache/airflow/pull/29608#issuecomment-1439054387

   Regarding the JSON direction, a potentially better approach would be to make 
it more explicitly structured, like:
   
   ```
   {"get_python_echo_message": {"key": "return_value", "value": "test xcom 
arg"}}
   ```
   or just
   ```
   [{"task_id": "get_python_echo_message", "key": "return_value", "value": 
"test xcom arg"}, ...]
   ```
   
   `key` may be optional with `"return_value"` as the default.
   
   ```
   [{"task_id": "get_python_echo_message", "value": "test xcom arg"}, ...]
   ```
   
   It may look a bit more wordy and cumbersome, but at the same time it is 
clearer and more extensible in the future. Before adjusting the implementation, 
I would like to get more eyes on this to see if there are any concerns or if 
there are good alternatives to the JSON way.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] r-richmond commented on pull request #29644: Remove <2.0.0 limit on google-cloud-bigtable

2023-02-21 Thread via GitHub


r-richmond commented on PR #29644:
URL: https://github.com/apache/airflow/pull/29644#issuecomment-1439042843

   This hit the same error that my local run of `find newer dependencies` did. 
Not sure what to do from here.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] vandonr-amz commented on pull request #29245: fix code checking job names in sagemaker

2023-02-21 Thread via GitHub


vandonr-amz commented on PR #29245:
URL: https://github.com/apache/airflow/pull/29245#issuecomment-1439042642

   Hello @dimberman, we just had a new system test failure because of this issue 
(getting throttled by too many listing requests). Any chance you can re-assess 
your review? It'd be nice to get this merged.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] vincbeck commented on a diff in pull request #29627: Clarify `service_config` in AWS Connection

2023-02-21 Thread via GitHub


vincbeck commented on code in PR #29627:
URL: https://github.com/apache/airflow/pull/29627#discussion_r1113526645


##
docs/apache-airflow-providers-amazon/connections/aws.rst:
##
@@ -332,6 +317,29 @@ The following settings may be used within the 
``assume_role_with_saml`` containe
 - https://pypi.org/project/requests-gssapi/
 
 
+.. _howto/connection:aws:per-service-configuration:
+
+Per service configuration
+^
+
+S3 Bucket configurations
+
+
+For use S3 Bucket name per connection in 
:class:`~airflow.providers.amazon.aws.hooks.s3.S3Hook` methods
+provide selected options in Connection Extra.

Review Comment:
   ```suggestion
   To use S3 bucket name per connection in 
:class:`~airflow.providers.amazon.aws.hooks.s3.S3Hook` methods,
   provide selected options in Connection Extra.
   ```
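To make the documented Connection Extra shape concrete, here is a small sketch. The extra JSON shape follows this PR's `service_config` discussion; the lookup helper is illustrative, not the provider's actual implementation:

```python
import json

# Connection Extra shape discussed in this PR: a per-service config block
# with an S3 bucket name. The helper below is a sketch, not provider code.
extra = json.loads(
    '{"service_config": {"s3": {"bucket_name": "awesome-bucket"}}}'
)

def bucket_from_extra(extra: dict, default=None):
    # Fall back to the caller-supplied bucket when the connection
    # does not configure one.
    return (
        extra.get("service_config", {}).get("s3", {}).get("bucket_name", default)
    )

print(bucket_from_extra(extra))  # awesome-bucket
```

A hook method could use such a lookup to let the connection supply a default bucket while still allowing an explicit `bucket_name` argument to win.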



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[GitHub] [airflow] vincbeck commented on a diff in pull request #29657: Add `wait_for_completion` param in `RedshiftCreateClusterOperator`

2023-02-21 Thread via GitHub


vincbeck commented on code in PR #29657:
URL: https://github.com/apache/airflow/pull/29657#discussion_r1113524200


##
tests/providers/amazon/aws/operators/test_redshift_cluster.py:
##
@@ -76,6 +77,10 @@ def test_create_single_node_cluster(self, mock_get_conn):
 **params,
 )
 
+
mock_get_conn.return_value.get_waiter.return_value.wait.assert_called_once_with(

Review Comment:
   We should test that the waiter is not called when `wait_for_completion` is 
`False`



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org



[airflow] branch main updated (1f1f97e666 -> 6699f953e6)

2023-02-21 Thread eladkal
This is an automated email from the ASF dual-hosted git repository.

eladkal pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 1f1f97e666 Add Maciej Obuchowski to triage to help with AIP-53 issues 
(#29668)
 add 6699f953e6 Avoid modifying PRs in Recheck old bug reports workflow 
(#29653)

No new revisions were added by this update.

Summary of changes:
 .../{stale.yml => recheck-old-bug-report.yml}  | 31 ++
 .github/workflows/stale.yml| 19 -
 2 files changed, 8 insertions(+), 42 deletions(-)
 copy .github/workflows/{stale.yml => recheck-old-bug-report.yml} (65%)



[GitHub] [airflow] eladkal merged pull request #29653: Avoid modifying PRs in Recheck old bug reports workflow

2023-02-21 Thread via GitHub


eladkal merged PR #29653:
URL: https://github.com/apache/airflow/pull/29653





[GitHub] [airflow] vincbeck commented on pull request #29580: Allow to specify which connection, variable or config are being looked up in the backend using *_lookup_pattern parameters

2023-02-21 Thread via GitHub


vincbeck commented on PR #29580:
URL: https://github.com/apache/airflow/pull/29580#issuecomment-1439024161

   Thanks @dimberman for the feedback. I addressed it; feel free to review 
when you get a chance :)





[GitHub] [airflow] potiuk commented on issue #29656: Generate URI/JSON for a connection from the Airflow Connection UI

2023-02-21 Thread via GitHub


potiuk commented on issue #29656:
URL: https://github.com/apache/airflow/issues/29656#issuecomment-1439013943

   Sure, if you think so. Without sensitive info it might be fine; the 'get as 
uri' is kinda cumbersome though :)





[GitHub] [airflow] romibuzi commented on pull request #29659: AWS Glue job hook: Make s3_bucket parameter optional

2023-02-21 Thread via GitHub


romibuzi commented on PR #29659:
URL: https://github.com/apache/airflow/pull/29659#issuecomment-1438999584

   @vincbeck I have pushed the fixes; the checks are currently running, it 
should be OK once they finish





[GitHub] [airflow] vincbeck commented on a diff in pull request #28241: Create Lambda create operator and sensor

2023-02-21 Thread via GitHub


vincbeck commented on code in PR #28241:
URL: https://github.com/apache/airflow/pull/28241#discussion_r1113495450


##
airflow/providers/amazon/aws/operators/lambda_function.py:
##
@@ -20,35 +20,122 @@
 import json
 from typing import TYPE_CHECKING, Sequence
 
+from airflow.compat.functools import cached_property
 from airflow.models import BaseOperator
 from airflow.providers.amazon.aws.hooks.lambda_function import LambdaHook
 
 if TYPE_CHECKING:
 from airflow.utils.context import Context
 
 
-class AwsLambdaInvokeFunctionOperator(BaseOperator):
+class LambdaCreateFunctionOperator(BaseOperator):
 """
-Invokes an AWS Lambda function.
-You can invoke a function synchronously (and wait for the response),
-or asynchronously.
-To invoke a function asynchronously,
-set `invocation_type` to `Event`. For more details,
-review the boto3 Lambda invoke docs.
+Creates an AWS Lambda function.
+
+More information regarding parameters of this operator can be found here
+
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/lambda.html#Lambda.Client.create_function
+
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:LambdaCreateFunctionOperator`
 
 :param function_name: The name of the AWS Lambda function, version, or 
alias.
-:param log_type: Set to Tail to include the execution log in the response. 
Otherwise, set to "None".
-:param qualifier: Specify a version or alias to invoke a published version 
of the function.
-:param invocation_type: One of RequestResponse / Event / DryRun
-:param client_context: Up to 3,583 bytes of base64-encoded data about the 
invoking client
-to pass to the function in the context object.
-:param payload: The JSON string that you want to provide to your Lambda 
function as input.
+:param runtime: The identifier of the function's runtime. Runtime is 
required if the deployment package
+is a .zip file archive.
+:param role: The Amazon Resource Name (ARN) of the function's execution 
role.
+:param handler: The name of the method within your code that Lambda calls 
to run your function.
+Handler is required if the deployment package is a .zip file archive.
+:param code: The code for the function.
+:param description: A description of the function.
+:param timeout: The amount of time (in seconds) that Lambda allows a 
function to run before stopping it.
+:param config: Optional dictionary for arbitrary parameters to the boto 
API create_lambda call.
+:param wait_for_completion: If True, the operator will wait until the 
function is active.
 :param aws_conn_id: The AWS connection ID to use
+"""
+
+template_fields: Sequence[str] = (
+"function_name",
+"runtime",
+"role",
+"handler",
+"code",
+"config",
+)
+ui_color = "#ff7300"
+
+def __init__(
+self,
+*,
+function_name: str,
+runtime: str | None = None,
+role: str,
+handler: str | None = None,
+code: dict,
+description: str | None = None,
+timeout: int | None = None,
+config: dict = {},
+wait_for_completion: bool = False,
+aws_conn_id: str = "aws_default",
+**kwargs,
+):
+super().__init__(**kwargs)
+self.function_name = function_name
+self.runtime = runtime
+self.role = role
+self.handler = handler
+self.code = code
+self.description = description
+self.timeout = timeout
+self.config = config
+self.wait_for_completion = wait_for_completion
+self.aws_conn_id = aws_conn_id
+
+@cached_property
+def hook(self) -> LambdaHook:
+return LambdaHook(aws_conn_id=self.aws_conn_id)
+
+def execute(self, context: Context):
+self.log.info("Creating AWS Lambda function: %s", self.function_name)
+response = self.hook.create_lambda(
+function_name=self.function_name,
+runtime=self.runtime,
+role=self.role,
+handler=self.handler,
+code=self.code,
+description=self.description,
+timeout=self.timeout,
+**self.config,
+)
+self.log.info("Lambda response: %r", response)
+
+if self.wait_for_completion:
+self.log.info("Wait for Lambda function to be active")
+waiter = self.hook.conn.get_waiter("function_active_v2")
+waiter.wait(
+FunctionName=self.function_name,
+)
+
+return response.get("FunctionArn")
+
+
+class AwsLambdaInvokeFunctionOperator(BaseOperator):

Review Comment:
   You're totally correct. However, this is not related to this PR; this class 
existed before it, and the diff just makes it confusing. I actually don't add 
`AwsLambdaI

[GitHub] [airflow] vincbeck opened a new issue, #29677: Rename AWS lambda related resources

2023-02-21 Thread via GitHub


vincbeck opened a new issue, #29677:
URL: https://github.com/apache/airflow/issues/29677

   ### Apache Airflow Provider(s)
   
   amazon
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Apache Airflow version
   
   2.5.0
   
   ### Operating System
   
   MacOS
   
   ### Deployment
   
   Virtualenv installation
   
   ### Deployment details
   
   The AWS Lambda resources in the Amazon provider package do not follow the 
convention from #20296. The hook, operators and sensors related to AWS Lambda 
need to be renamed to follow this convention. Here are the proposed changes to 
fix it:
   - Rename `airflow/providers/amazon/aws/operators/lambda_function.py` to 
`airflow/providers/amazon/aws/operators/lambda.py`
   - Rename `airflow/providers/amazon/aws/sensors/lambda_function.py` to 
`airflow/providers/amazon/aws/sensors/lambda.py`
   - Rename `airflow/providers/amazon/aws/hooks/lambda_function.py` to 
`airflow/providers/amazon/aws/hooks/lambda.py`
   - Rename `AwsLambdaInvokeFunctionOperator` to `LambdaInvokeFunctionOperator`
   
   Since all these changes are breaking changes, it will have to be done 
following the deprecation pattern:
   - Copy/paste the files with the new name
   - Update the existing hook, operators and sensors to inherit from these new 
classes
   - Deprecate these classes by sending deprecation warnings. See an example 
[here](airflow/providers/amazon/aws/operators/aws_lambda.py)
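   A minimal sketch of that deprecation pattern is below. The class names are the proposed ones from this issue, and both classes here are standalone stand-ins rather than the actual provider operators:

```python
import warnings


class LambdaInvokeFunctionOperator:
    """The new, convention-compliant class (stand-in for the real operator)."""

    def __init__(self, *, function_name: str, **kwargs):
        self.function_name = function_name


class AwsLambdaInvokeFunctionOperator(LambdaInvokeFunctionOperator):
    """Deprecated shim: keeps the old name working while warning users."""

    def __init__(self, *args, **kwargs):
        # Emit a deprecation warning, then delegate to the new class.
        warnings.warn(
            "AwsLambdaInvokeFunctionOperator is deprecated; "
            "use LambdaInvokeFunctionOperator instead.",
            DeprecationWarning,
            stacklevel=2,
        )
        super().__init__(*args, **kwargs)
```

   Instantiating the old name still works, but surfaces a `DeprecationWarning` pointing at the replacement.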
   
   ### What happened
   
   _No response_
   
   ### What you think should happen instead
   
   _No response_
   
   ### How to reproduce
   
   N/A
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   





[GitHub] [airflow] vincbeck commented on pull request #29659: AWS Glue job hook: Make s3_bucket parameter optional

2023-02-21 Thread via GitHub


vincbeck commented on PR #29659:
URL: https://github.com/apache/airflow/pull/29659#issuecomment-1438972523

   There are some static check failures though. Please read 
[documentation](https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst#id4)
 in order to fix them.





[GitHub] [airflow] mobuchowski opened a new issue, #29676: Implement system tests that confirm OpenLineage integration in selected provider works

2023-02-21 Thread via GitHub


mobuchowski opened a new issue, #29676:
URL: https://github.com/apache/airflow/issues/29676

   ### Body
   
   Airflow uses system tests as a way to test the interaction of providers with 
external systems. We should use this mechanism to confirm that lineage 
collection works as designed.
   
   ### Committer
   
   - [X] I acknowledge that I am a maintainer/committer of the Apache Airflow 
project.





[GitHub] [airflow] mobuchowski opened a new issue, #29675: Implement native OpenLineage integration for chosen other Provider

2023-02-21 Thread via GitHub


mobuchowski opened a new issue, #29675:
URL: https://github.com/apache/airflow/issues/29675

   ### Body
   
   In a first phase, we want to choose one specific provider, implement the 
OpenLineage integration with it, and make sure it plays well with the prepared 
framework. It seems to me that Snowflake is a nice candidate: it matches well 
with a lot of the things we want to provide in the first phase of the 
OL-Airflow provider. 
   
   This should also include unit tests that confirm the implementation works. 
   ### Committer
   
   - [X] I acknowledge that I am a maintainer/committer of the Apache Airflow 
project.





[GitHub] [airflow] mobuchowski opened a new issue, #29674: Migrate OpenLineage integration unit tests to Airflow test framework

2023-02-21 Thread via GitHub


mobuchowski opened a new issue, #29674:
URL: https://github.com/apache/airflow/issues/29674

   ### Body
   
   As part of this task, we also need to make sure they are properly run in CI.
   ### Committer
   
   - [X] I acknowledge that I am a maintainer/committer of the Apache Airflow 
project.





[GitHub] [airflow] mobuchowski opened a new issue, #29673: Provide OL SQL parser as internal OpenLineage provider API

2023-02-21 Thread via GitHub


mobuchowski opened a new issue, #29673:
URL: https://github.com/apache/airflow/issues/29673

   ### Body
   
   To easily provide OpenLineage events from SQL-based operators, we want to 
add a SQL parser API that other providers can use. The API should be stable.
   
   There are two proposed options:
   
   - Provide a literal API: classes that other providers could import and use 
directly.
   - Pass a parser class to the `get_openlineage_facets_*` methods that the 
operators could use, but we need to be careful not to pass it to 
openlineage-airflow-defined methods, as those won't accept it. 
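   To make the first option concrete, here is a hedged sketch of what a "literal API" could look like. The class and method names are assumptions for illustration, and the toy parsing logic stands in for the real openlineage-sql parser:

```python
from dataclasses import dataclass, field


@dataclass
class SqlMeta:
    """Tables a statement reads from and writes to (illustrative shape)."""
    in_tables: list = field(default_factory=list)
    out_tables: list = field(default_factory=list)


class SQLParser:
    """Hypothetical stable class other providers would import directly."""

    def parse(self, sql: str) -> SqlMeta:
        # Toy heuristic standing in for a real SQL parser: look at the token
        # following FROM/JOIN (inputs) and INTO (outputs).
        meta = SqlMeta()
        tokens = sql.replace(",", " ").split()
        for i, tok in enumerate(tokens[:-1]):
            if tok.upper() in ("FROM", "JOIN"):
                meta.in_tables.append(tokens[i + 1])
            elif tok.upper() == "INTO":
                meta.out_tables.append(tokens[i + 1])
        return meta
```

   A SQL-based operator could then call `SQLParser().parse(self.sql)` inside its `get_openlineage_facets_*` method without any coupling to openlineage-airflow internals.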
   
   
   ### Committer
   
   - [X] I acknowledge that I am a maintainer/committer of the Apache Airflow 
project.





[GitHub] [airflow] vincbeck commented on a diff in pull request #29580: Allow to specify which connection, variable or config are being looked up in the backend using *_lookup_pattern parameters

2023-02-21 Thread via GitHub


vincbeck commented on code in PR #29580:
URL: https://github.com/apache/airflow/pull/29580#discussion_r1113446539


##
airflow/providers/amazon/aws/secrets/secrets_manager.py:
##
@@ -264,14 +286,14 @@ def get_conn_uri(self, conn_id: str) -> str | None:
 
 def get_variable(self, key: str) -> str | None:
 """
-Get Airflow Variable from Environment Variable
+Get Airflow Variable

Review Comment:
   In the context of this class, we are actually getting it from only one 
location: AWS Secrets Manager. If this function returns None, it is then 
fetched from the other locations (environment variables, then the metastore), 
but that is done outside of this class. [See the documentation 
here](https://airflow.apache.org/docs/apache-airflow/1.10.10/howto/use-alternative-secrets-backend.html).
 Let me know if you still think I should update the documentation






[GitHub] [airflow] mobuchowski opened a new issue, #29672: Adapt OpenLineage extractor framework code to make sure externally defined extractors work

2023-02-21 Thread via GitHub


mobuchowski opened a new issue, #29672:
URL: https://github.com/apache/airflow/issues/29672

   ### Body
   
   The OpenLineage provider should respect externally defined extractors, even 
if they are based on openlineage-airflow data types and classes.
   
   
   ### Committer
   
   - [X] I acknowledge that I am a maintainer/committer of the Apache Airflow 
project.





[GitHub] [airflow] mobuchowski opened a new issue, #29671: Adapt OpenLineage default extractor to properly accept all OL implementation

2023-02-21 Thread via GitHub


mobuchowski opened a new issue, #29671:
URL: https://github.com/apache/airflow/issues/29671

   ### Body
   
   Adapt the default extractor to accept any valid type returned from an 
Operator's `get_openlineage_facets_*` method. 
   This needs to ensure compatibility with operators made with external 
extractors for the current openlineage-airflow integration.
   
   
   ### Committer
   
   - [X] I acknowledge that I am a maintainer/committer of the Apache Airflow 
project.





[GitHub] [airflow] eladkal commented on a diff in pull request #28241: Create Lambda create operator and sensor

2023-02-21 Thread via GitHub


eladkal commented on code in PR #28241:
URL: https://github.com/apache/airflow/pull/28241#discussion_r1113441726


##
airflow/providers/amazon/aws/operators/lambda_function.py:
##
@@ -20,35 +20,122 @@
 import json
 from typing import TYPE_CHECKING, Sequence
 
+from airflow.compat.functools import cached_property
 from airflow.models import BaseOperator
 from airflow.providers.amazon.aws.hooks.lambda_function import LambdaHook
 
 if TYPE_CHECKING:
 from airflow.utils.context import Context
 
 
-class AwsLambdaInvokeFunctionOperator(BaseOperator):
+class LambdaCreateFunctionOperator(BaseOperator):
 """
-Invokes an AWS Lambda function.
-You can invoke a function synchronously (and wait for the response),
-or asynchronously.
-To invoke a function asynchronously,
-set `invocation_type` to `Event`. For more details,
-review the boto3 Lambda invoke docs.
+Creates an AWS Lambda function.
+
+More information regarding parameters of this operator can be found here
+
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/lambda.html#Lambda.Client.create_function
+
+.. seealso::
+For more information on how to use this operator, take a look at the 
guide:
+:ref:`howto/operator:LambdaCreateFunctionOperator`
 
 :param function_name: The name of the AWS Lambda function, version, or 
alias.
-:param log_type: Set to Tail to include the execution log in the response. 
Otherwise, set to "None".
-:param qualifier: Specify a version or alias to invoke a published version 
of the function.
-:param invocation_type: One of RequestResponse / Event / DryRun
-:param client_context: Up to 3,583 bytes of base64-encoded data about the 
invoking client
-to pass to the function in the context object.
-:param payload: The JSON string that you want to provide to your Lambda 
function as input.
+:param runtime: The identifier of the function's runtime. Runtime is 
required if the deployment package
+is a .zip file archive.
+:param role: The Amazon Resource Name (ARN) of the function's execution 
role.
+:param handler: The name of the method within your code that Lambda calls 
to run your function.
+Handler is required if the deployment package is a .zip file archive.
+:param code: The code for the function.
+:param description: A description of the function.
+:param timeout: The amount of time (in seconds) that Lambda allows a 
function to run before stopping it.
+:param config: Optional dictionary for arbitrary parameters to the boto 
API create_lambda call.
+:param wait_for_completion: If True, the operator will wait until the 
function is active.
 :param aws_conn_id: The AWS connection ID to use
+"""
+
+template_fields: Sequence[str] = (
+"function_name",
+"runtime",
+"role",
+"handler",
+"code",
+"config",
+)
+ui_color = "#ff7300"
+
+def __init__(
+self,
+*,
+function_name: str,
+runtime: str | None = None,
+role: str,
+handler: str | None = None,
+code: dict,
+description: str | None = None,
+timeout: int | None = None,
+config: dict = {},
+wait_for_completion: bool = False,
+aws_conn_id: str = "aws_default",
+**kwargs,
+):
+super().__init__(**kwargs)
+self.function_name = function_name
+self.runtime = runtime
+self.role = role
+self.handler = handler
+self.code = code
+self.description = description
+self.timeout = timeout
+self.config = config
+self.wait_for_completion = wait_for_completion
+self.aws_conn_id = aws_conn_id
+
+@cached_property
+def hook(self) -> LambdaHook:
+return LambdaHook(aws_conn_id=self.aws_conn_id)
+
+def execute(self, context: Context):
+self.log.info("Creating AWS Lambda function: %s", self.function_name)
+response = self.hook.create_lambda(
+function_name=self.function_name,
+runtime=self.runtime,
+role=self.role,
+handler=self.handler,
+code=self.code,
+description=self.description,
+timeout=self.timeout,
+**self.config,
+)
+self.log.info("Lambda response: %r", response)
+
+if self.wait_for_completion:
+self.log.info("Wait for Lambda function to be active")
+waiter = self.hook.conn.get_waiter("function_active_v2")
+waiter.wait(
+FunctionName=self.function_name,
+)
+
+return response.get("FunctionArn")
+
+
+class AwsLambdaInvokeFunctionOperator(BaseOperator):

Review Comment:
   Why the Aws prefix?
   This is not aligned with the convention 
https://github.com/apache/airflow/issues/20296




[GitHub] [airflow] mobuchowski opened a new issue, #29670: Integrate OpenLineage provider configuration with general Airflow configuration

2023-02-21 Thread via GitHub


mobuchowski opened a new issue, #29670:
URL: https://github.com/apache/airflow/issues/29670

   ### Body
   
   Currently, OL-Airflow is configured either with environment variables (for 
some things) or an `openlineage.yml` file. 
   We need to add a way to configure the provider using the Airflow config, the 
same way other providers and core Airflow are configured.
   At the same time, we need to respect the openlineage.yml and env-based 
configs; the structure of the new config should be consistent.
   
   
   ### Committer
   
   - [X] I acknowledge that I am a maintainer/committer of the Apache Airflow 
project.





[GitHub] [airflow] ephraimbuddy commented on issue #29199: TaskFlow AirflowSkipException causes downstream step to fail when multiple_outputs is true

2023-02-21 Thread via GitHub


ephraimbuddy commented on issue #29199:
URL: https://github.com/apache/airflow/issues/29199#issuecomment-1438935777

   This fixes it:
   ```diff
   diff --git a/airflow/models/xcom_arg.py b/airflow/models/xcom_arg.py
   index 133fd4280b..1c7290f794 100644
   --- a/airflow/models/xcom_arg.py
   +++ b/airflow/models/xcom_arg.py
   @@ -340,6 +340,8 @@ class PlainXComArg(XComArg):
return result
if self.key == XCOM_RETURN_KEY:
return None
   +if isinstance(result, ArgNotSet):
   +return None
raise XComNotFound(ti.dag_id, task_id, self.key)
   ```
   But I'm wondering what the implications are. WDYT @uranusjr 





[GitHub] [airflow] mobuchowski opened a new issue, #29669: Create OpenLineage provider and migrate existing code to it.

2023-02-21 Thread via GitHub


mobuchowski opened a new issue, #29669:
URL: https://github.com/apache/airflow/issues/29669

   ### Body
   
   As the first step in the AIP-53 implementation, we want to migrate the 
[existing OpenLineage Airflow 
integration](https://github.com/OpenLineage/OpenLineage/tree/main/integration/airflow)
 to basically have the same relevant functionality that the OL-Airflow 
integration has now, but in the `airflow.providers.openlineage` namespace.
   
   This does not mean the provider will be ready to release or have all the 
expected features, but it provides a canvas for the next tasks that will add 
more of the relevant functionality according to the implementation doc: 
https://docs.google.com/document/d/1YQMg4xePhetyjXCBHXwU7jbJpi5TEcqhwN8YbQ32P4Q/edit?usp=sharing
   
   ### Committer
   
   - [X] I acknowledge that I am a maintainer/committer of the Apache Airflow 
project.





[GitHub] [airflow] shubham22 commented on pull request #29659: AWS Glue job hook: Make s3_bucket parameter optional

2023-02-21 Thread via GitHub


shubham22 commented on PR #29659:
URL: https://github.com/apache/airflow/pull/29659#issuecomment-1438930972

   cc: @vincbeck @syedahsn @ferruzzi @vandonr-amz - can you please review this 
if you have a chance?





[GitHub] [airflow] Taragolis commented on pull request #29665: Quarantine `test_cli_internal_api_background`

2023-02-21 Thread via GitHub


Taragolis commented on PR #29665:
URL: https://github.com/apache/airflow/pull/29665#issuecomment-1438930914

   If I do not forget (50/50), I will create a task in Issues





[GitHub] [airflow] fritz-astronomer commented on issue #29656: Generate URI/JSON for a connection from the Airflow Connection UI

2023-02-21 Thread via GitHub


fritz-astronomer commented on issue #29656:
URL: https://github.com/apache/airflow/issues/29656#issuecomment-1438926634

   There is an API route to get the connection, and as mentioned a CLI route. 
   
   I think this request is reasonable: expose the same functionality in the 
UI, potentially with the password/sensitive info redacted, or only in the 
connection creation modal, as the newest users are likely to use that.





[GitHub] [airflow] jedcunningham commented on a diff in pull request #29378: Add gitSync optional env description

2023-02-21 Thread via GitHub


jedcunningham commented on code in PR #29378:
URL: https://github.com/apache/airflow/pull/29378#discussion_r1113416556


##
chart/values.yaml:
##
@@ -1919,6 +1919,19 @@ dags:
 
 extraVolumeMounts: []
 env: []
+ # Change permissions on the checked-out files to the specified mode.
+ # - name: GIT_SYNC_PERMISSION
+ #   value: "0755"
+ # The time to wait before retrying a failed --exechook-command.
+ # - name: GIT_SYNC_EXECHOOK_BACKOFF
+ #   value: "3s"
+ # An optional command to be executed after syncing a new hash of the 
remote repository.
+ # - name: GIT_SYNC_EXECHOOK_COMMAND
+ #   value: "./scripts/entrypoint.sh"
+ # The timeout for the --exechook-command.
+ # - name: GIT_SYNC_EXECHOOK_TIMEOUT
+ #   value: "30s"

Review Comment:
   ```suggestion
# Supported env vars for gitsync can be found at 
https://github.com/kubernetes/git-sync
# - name: ""
#   value: ""

   ```
   
   Let's be general here, I don't want to end up duplicating the whole gitsync 
docs here.






[GitHub] [airflow] pollynesterovich commented on pull request #29347: SSH Provider: Add cmd_timeout to ssh connection extra

2023-02-21 Thread via GitHub


pollynesterovich commented on PR #29347:
URL: https://github.com/apache/airflow/pull/29347#issuecomment-1438908550

   Please include these changes in the latest Helm chart version (1.8.0) so 
they take effect for DAGs.





[airflow] branch main updated (66a8d102fc -> 1f1f97e666)

2023-02-21 Thread jedcunningham
This is an automated email from the ASF dual-hosted git repository.

jedcunningham pushed a change to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


from 66a8d102fc Quarantine `test_cli_internal_api_background` (#29665)
 add 1f1f97e666 Add Maciej Obuchowski to triage to help with AIP-53 issues 
(#29668)

No new revisions were added by this update.

Summary of changes:
 .asf.yaml | 1 +
 1 file changed, 1 insertion(+)


