ghostp13409 commented on code in PR #34894:
URL: https://github.com/apache/airflow/pull/34894#discussion_r1358698178


##########
airflow/providers/apache/spark/operators/spark_submit.py:
##########
@@ -160,6 +160,11 @@ def on_kill(self) -> None:
             self._hook = self._get_hook()
         self._hook.on_kill()
 
+    def property_files(self) -> None:
+        if self._hook is None:
+            self._hook = self._get_hook()
+        self._hook.property_files()
+
     def _get_hook(self) -> SparkSubmitHook:

Review Comment:
   I've added the properties-file param to the Operator and Hook files. However, I haven't added the default file path handling logic, since that already seems to be handled by the `spark-submit` command itself, per the [docs](https://books.japila.pl/apache-spark-internals/tools/spark-submit/#driver-cores). I will follow up with the test case changes in the next commit; let me know if there is any feedback or concerns.
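   For context, here is a minimal sketch of what the hook side could look like (the subclass name and `properties_file` argument are illustrative, not the merged change): the hook only supplies `--properties-file` when a path is given, and otherwise leaves `spark-submit` to fall back to its own default (`conf/spark-defaults.conf`).

```python
from __future__ import annotations

from typing import Any

from airflow.providers.apache.spark.hooks.spark_submit import SparkSubmitHook


class PropertiesFileSparkSubmitHook(SparkSubmitHook):
    """Illustrative subclass: optionally pass a properties file to spark-submit."""

    def __init__(self, *, properties_file: str | None = None, **kwargs: Any) -> None:
        super().__init__(**kwargs)
        self._properties_file = properties_file

    def _build_spark_submit_command(self, application: str) -> list[str]:
        # Start from the command the stock hook already builds.
        command = super()._build_spark_submit_command(application)
        if self._properties_file:
            # spark-submit flags must precede the application path, so splice
            # the flag in right before it instead of appending at the end.
            idx = command.index(application)
            command[idx:idx] = ["--properties-file", self._properties_file]
        return command
```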
   
   > you have to accept the property_files param value as one of the parameters 
that can be used when creating SparkSubmitOperator. This code doesn't involve 
those changes.
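   
   Agreed. As a usage sketch of what that could look like once the operator accepts it (the `properties_file` argument name is assumed here, mirroring the other spark-submit options), the value would simply be forwarded to the hook and rendered as `--properties-file`:

```python
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

# Hypothetical usage once SparkSubmitOperator accepts the new keyword.
submit_pi = SparkSubmitOperator(
    task_id="submit_pi",
    application="/opt/spark/examples/src/main/python/pi.py",
    conn_id="spark_default",
    properties_file="/opt/spark/conf/job.properties",  # assumed parameter name
)
```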
   
   


