amoghrajesh commented on code in PR #47644:
URL: https://github.com/apache/airflow/pull/47644#discussion_r1994772666


##########
task-sdk/src/airflow/sdk/execution_time/task_runner.py:
##########
@@ -711,6 +712,10 @@ def _execute_task(context: Context, ti: RuntimeTaskInstance):
     # Populate the context var so ExecutorSafeguard doesn't complain
     ctx.run(ExecutorSafeguard.tracker.set, task)
 
+    # Export context to make it available for operators to use.

Review Comment:
   ```suggestion
       # Export context in os.environ to make it available for operators to use.
   ```
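
   For reference, a minimal standalone sketch of what the export presumably looks like (the key names below are illustrative; the real mapping lives in `context_to_airflow_vars` / `AIRFLOW_VAR_NAME_FORMAT_MAPPING`):

```python
import os

# Illustrative stand-in for context_to_airflow_vars: map a few context
# fields to their AIRFLOW_CTX_* environment-variable names. The actual
# mapping comes from AIRFLOW_VAR_NAME_FORMAT_MAPPING in the SDK.
def context_to_env_vars(context):
    mapping = {
        "dag_id": "AIRFLOW_CTX_DAG_ID",
        "task_id": "AIRFLOW_CTX_TASK_ID",
        "run_id": "AIRFLOW_CTX_DAG_RUN_ID",
    }
    return {env: str(context[key]) for key, env in mapping.items() if key in context}

ctx = {"dag_id": "example_dag", "task_id": "example_task"}
os.environ.update(context_to_env_vars(ctx))
```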



##########
providers/apache/hive/src/airflow/providers/apache/hive/operators/hive.py:
##########
@@ -26,8 +26,7 @@
 from airflow.configuration import conf
 from airflow.models import BaseOperator
 from airflow.providers.apache.hive.hooks.hive import HiveCliHook
-from airflow.utils import operator_helpers
-from airflow.utils.operator_helpers import context_to_airflow_vars
+from airflow.sdk.execution_time.context import AIRFLOW_VAR_NAME_FORMAT_MAPPING, context_to_airflow_vars

Review Comment:
   We can just use the Airflow 3 version check here too.



##########
task-sdk/tests/task_sdk/execution_time/test_task_runner.py:
##########
@@ -890,6 +890,34 @@ def test_run_with_inlets_and_outlets(
     mock_supervisor_comms.send_request.assert_any_call(msg=last_expected_msg, log=mock.ANY)
 
 
[email protected]("airflow.sdk.execution_time.task_runner.context_to_airflow_vars")
[email protected](os.environ, {}, clear=True)
+def test_execute_task_exports_env_vars(
+    mock_context_to_airflow_vars, create_runtime_ti, mock_supervisor_comms
+):
+    """Test that _execute_task exports airflow context to environment variables."""
+
+    def test_function():
+        return "test function"
+
+    task = PythonOperator(
+        task_id="test_task",
+        python_callable=test_function,
+    )
+
+    ti = create_runtime_ti(task=task, dag_id="dag_with_env_vars")
+    mock_supervisor_comms.get_message.return_value = OKResponse(
+        ok=True,
+    )

Review Comment:
   I don't think this is the right thing?



##########
providers/apache/hive/src/airflow/providers/apache/hive/hooks/hive.py:
##########
@@ -36,9 +36,9 @@
 from airflow.exceptions import AirflowException
 from airflow.hooks.base import BaseHook
 from airflow.providers.common.sql.hooks.sql import DbApiHook
+from airflow.sdk.execution_time.context import AIRFLOW_VAR_NAME_FORMAT_MAPPING

Review Comment:
   When we move things into the SDK, one thing to note is back-compat.
   
   To do that, we usually do a version check and import from the right location for that version; one such example is here:
   
   https://github.com/apache/airflow/pull/47008/files#diff-90e854af5044e923499f865ed449cfacc40ca2db5afc47ac0cca278a29fcc9bfR47-R52
   
   This will have to be done for all providers
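   
   A sketch of that choice, under the assumption that Airflow 3+ hosts these helpers in the Task SDK while Airflow 2 keeps them in `airflow.utils.operator_helpers` (the helper name below is hypothetical; the linked PR shows the real pattern, typically a try/except ImportError or an `AIRFLOW_V_3_0_PLUS` flag):

```python
# Hypothetical helper: pick the import path for AIRFLOW_VAR_NAME_FORMAT_MAPPING
# based on the installed Airflow major version (illustrative only).
def pick_import_path(airflow_version: str) -> str:
    major = int(airflow_version.split(".")[0])
    if major >= 3:
        return "airflow.sdk.execution_time.context"
    return "airflow.utils.operator_helpers"
```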



##########
task-sdk/src/airflow/sdk/execution_time/context.py:
##########
@@ -411,3 +448,70 @@ def context_update_for_unmapped(context: Context, task: BaseOperator) -> None:
     context["params"] = process_params(
         context["dag"], task, context["dag_run"].conf, suppress_exception=False
     )
+
+
+def context_to_airflow_vars(context: Mapping[str, Any], in_env_var_format: bool = False) -> dict[str, str]:

Review Comment:
   The type of `context` is probably `Context` now.
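   
   I.e. something like the following (sketch only; `Context` is shown here as a plain mapping alias rather than the real SDK type):

```python
from typing import Any, Mapping

# Stand-in alias for the SDK Context type, for illustration only.
Context = Mapping[str, Any]

def context_to_airflow_vars(context: Context, in_env_var_format: bool = False) -> dict[str, str]:
    ...  # body elided; only the annotation change is the point here
```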



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

Reply via email to