[ https://issues.apache.org/jira/browse/SPARK-39940?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Jungtaek Lim resolved SPARK-39940.
----------------------------------
    Fix Version/s: 3.4.0
       Resolution: Fixed

Issue resolved by pull request 37368
[https://github.com/apache/spark/pull/37368]

> Batch query cannot read the updates from streaming query if streaming query
> writes to the catalog table via DSv1 sink
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-39940
>                 URL: https://issues.apache.org/jira/browse/SPARK-39940
>             Project: Spark
>          Issue Type: Bug
>          Components: Structured Streaming
>    Affects Versions: 3.4.0, 3.3.1, 3.2.3
>            Reporter: Jungtaek Lim
>            Assignee: Jungtaek Lim
>          Priority: Major
>           Fix For: 3.4.0
>
> (I think this is an ancient issue, but there is no good way to list all
> affected versions, so I just picked the most recent version in each release
> line.)
>
> When a streaming query writes to a catalog table via a DSv1 sink, the
> destination table is not refreshed/invalidated after a commit. As a result,
> querying the destination table with a batch query is not guaranteed to read
> the latest "committed" updates.
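
For readers of the thread, below is a minimal sketch of the scenario described
above (not of the fix), assuming a Parquet-backed table so the write goes
through a DSv1 file sink. The table name "rate_sink", the checkpoint path, and
the trigger interval are hypothetical and only for illustration.

  // Sketch: a streaming query writes to a catalog table through the Parquet
  // file sink (DSv1), and a batch query on the same session reads it back.
  import org.apache.spark.sql.SparkSession
  import org.apache.spark.sql.streaming.Trigger

  val spark = SparkSession.builder()
    .appName("spark-39940-sketch")
    .master("local[2]")
    .getOrCreate()

  // Streaming write to a catalog table; names/paths are made up.
  val query = spark.readStream
    .format("rate")
    .load()
    .writeStream
    .format("parquet")                                      // DSv1 file sink
    .option("checkpointLocation", "/tmp/spark-39940/ckpt")  // hypothetical path
    .trigger(Trigger.ProcessingTime("5 seconds"))
    .toTable("rate_sink")

  // Give the stream time to commit a few micro-batches.
  Thread.sleep(20000)

  // Without invalidation of the destination table after each commit, this
  // batch read may return a stale count on affected versions.
  println(spark.table("rate_sink").count())

  // Workaround on affected versions: refresh the table metadata explicitly
  // before the batch read.
  spark.catalog.refreshTable("rate_sink")
  println(spark.table("rate_sink").count())

  query.stop()

On affected versions the explicit spark.catalog.refreshTable call (or an
equivalent invalidation) is what makes the batch read pick up newly committed
data; per the resolution above, this should no longer be necessary once the
fix from pull request 37368 lands in 3.4.0.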