nivangio commented on code in PR #29840:
URL: https://github.com/apache/airflow/pull/29840#discussion_r1123399094


##########
airflow/providers/databricks/operators/databricks.py:
##########
@@ -285,12 +285,12 @@ def __init__(
         *,
         json: Any | None = None,
         tasks: list[object] | None = None,
-        spark_jar_task: dict[str, str] | None = None,
-        notebook_task: dict[str, str] | None = None,
-        spark_python_task: dict[str, str | list[str]] | None = None,
-        spark_submit_task: dict[str, list[str]] | None = None,
-        pipeline_task: dict[str, str] | None = None,
-        dbt_task: dict[str, str | list[str]] | None = None,

Review Comment:
   Ok, this might be unrelated to the specific issue we are discussing here, 
but passing a `dict[str, dict[str, str]]` for the `notebook_task` parameter, 
for example, is also accepted and produces the expected outcome.
   
   However, the mypy checks fail for it. I agree that `object` might be too 
vague and will need more thought, but at the very least the value type of the 
`notebook_task` dict param (and probably of the rest too) should accept 
`dict`, `str`, and `int`, in line with what was accepted before this PR, and 
eventually `XComArg` and `PlainXComArg` to support the changes introduced 
here.
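   
   For concreteness, here is a minimal sketch of what I mean (illustrative 
only, not the merged code; the function names are hypothetical and only 
`notebook_task` mirrors the actual operator parameter):
   
   ```python
   from __future__ import annotations
   
   # Narrow annotation as in this PR's diff: nested dict values are rejected.
   def narrow(notebook_task: dict[str, str] | None = None) -> None:
       ...
   
   # Widened annotation as suggested above; XComArg / PlainXComArg could be
   # added to the union once the lazy-resolution changes from this PR land.
   def widened(
       notebook_task: dict[str, str | int | dict[str, str]] | None = None,
   ) -> None:
       ...
   
   # Both calls work at runtime, but mypy rejects the first because the
   # nested base_parameters dict is not a str.
   narrow({"notebook_path": "/Shared/run_me", "base_parameters": {"x": "1"}})
   widened({"notebook_path": "/Shared/run_me", "base_parameters": {"x": "1"}})
   ```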



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org