[ https://issues.apache.org/jira/browse/SPARK-45178?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Jungtaek Lim resolved SPARK-45178.
----------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 42940
[https://github.com/apache/spark/pull/42940]

> Fallback to use single batch executor for Trigger.AvailableNow with unsupported sources rather than using wrapper
> -----------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-45178
>                 URL: https://issues.apache.org/jira/browse/SPARK-45178
>             Project: Spark
>          Issue Type: Bug
>          Components: Structured Streaming
>    Affects Versions: 4.0.0
>            Reporter: Jungtaek Lim
>            Assignee: Jungtaek Lim
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
>
> We have observed cases where the wrapper implementation of Trigger.AvailableNow (AvailableNowDataStreamWrapper and its subclasses) is not fully compatible with 3rd party data sources and leads to correctness issues.
>
> While we could persuade a 3rd party data source to support Trigger.AvailableNow, persuading all 3rd parties to do so is too aggressive and a goal we will never be able to achieve. It may also not be possible to come up with a wrapper implementation that has zero issues with every arbitrary source.
>
> As a mitigation, we want to make a slight behavioral change for such cases, falling back to single batch execution (a.k.a. Trigger.Once) rather than using the wrapper implementation. The exact behaviors of Trigger.AvailableNow and Trigger.Once differ, so this is technically a behavioral change, but it is probably a lot less surprising than failing the query.
>
> For the extreme case where users are confident that there will be no issue at all with using the wrapper, we will come up with a flag to provide the previous behavior.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
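
[Editor's sketch, not part of the original message: a minimal Scala example of how the two trigger modes discussed above are specified on a Structured Streaming query. Trigger.AvailableNow() and Trigger.Once() are the real Spark APIs; the input path, checkpoint locations, and source/sink formats are illustrative placeholders only.]

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.streaming.Trigger

    object TriggerModesExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("trigger-availablenow-vs-once")
          .master("local[*]")
          .getOrCreate()

        // A file-based streaming source (placeholder input path).
        val input = spark.readStream
          .format("text")
          .load("/tmp/streaming-input")

        // Trigger.AvailableNow(): process all data available at query start,
        // possibly across multiple micro-batches, then stop.
        val availableNowQuery = input.writeStream
          .format("console")
          .option("checkpointLocation", "/tmp/ckpt-available-now")
          .trigger(Trigger.AvailableNow())
          .start()
        availableNowQuery.awaitTermination()

        // Trigger.Once(): process everything available in a single micro-batch,
        // then stop -- the execution mode the fallback described above uses.
        val onceQuery = input.writeStream
          .format("console")
          .option("checkpointLocation", "/tmp/ckpt-once")
          .trigger(Trigger.Once())
          .start()
        onceQuery.awaitTermination()

        spark.stop()
      }
    }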