[jira] [Commented] (SPARK-44460) Pass user auth credential to Python workers for foreachBatch and listener
[ https://issues.apache.org/jira/browse/SPARK-44460?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17757087#comment-17757087 ]

Wei Liu commented on SPARK-44460:
---------------------------------

[~rangadi] This seems to be a Databricks-internal issue. See the updates in SC-138245.

> Pass user auth credential to Python workers for foreachBatch and listener
> -------------------------------------------------------------------------
>
>                 Key: SPARK-44460
>                 URL: https://issues.apache.org/jira/browse/SPARK-44460
>             Project: Spark
>          Issue Type: Task
>          Components: Connect, Structured Streaming
>    Affects Versions: 3.4.1
>            Reporter: Raghu Angadi
>            Priority: Major
>
> No user-specific credentials are sent to the Python worker that runs user
> functions like foreachBatch() and the streaming listener.
> We might need to pass these in.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-44460) Pass user auth credential to Python workers for foreachBatch and listener
[ https://issues.apache.org/jira/browse/SPARK-44460?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17755293#comment-17755293 ]

Raghu Angadi commented on SPARK-44460:
--------------------------------------

[~WweiL] I think you looked into this issue. Could you add an update? This might currently be postponed.
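The gap the issue describes is that user functions such as foreachBatch() and StreamingQueryListener callbacks run in a separate Python worker process that never receives the submitting user's credential. As a minimal sketch of one possible forwarding mechanism, the snippet below passes a token to a child Python process through its environment. Everything here is hypothetical: the `SPARK_CONNECT_USER_TOKEN` variable name and the `run_worker` helper are illustrative assumptions, not actual Spark APIs.

```python
# Hypothetical sketch: forwarding a user auth token from a driver-side process
# to a spawned Python worker via an environment variable. The variable name
# SPARK_CONNECT_USER_TOKEN is invented for illustration only.
import os
import subprocess
import sys


def run_worker(user_token: str) -> str:
    """Launch a worker process with the user's credential in its environment."""
    env = dict(os.environ)
    env["SPARK_CONNECT_USER_TOKEN"] = user_token  # hypothetical channel
    # The worker echoes the credential it received, proving it arrived intact.
    worker_code = "import os; print(os.environ['SPARK_CONNECT_USER_TOKEN'])"
    result = subprocess.run(
        [sys.executable, "-c", worker_code],
        env=env,
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout.strip()


if __name__ == "__main__":
    print(run_worker("token-abc123"))
```

In a real implementation the credential would more likely travel over the existing driver-to-worker socket during worker setup (as Spark already does for the worker secret), rather than through the environment; this sketch only shows that the worker needs an explicit hand-off.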