[jira] [Resolved] (KAFKA-8434) Make global stream time consistent over all stream tasks
[ https://issues.apache.org/jira/browse/KAFKA-8434?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Richard Yu resolved KAFKA-8434.
-------------------------------
    Resolution: Fixed

> Make global stream time consistent over all stream tasks
> --------------------------------------------------------
>
>          Key: KAFKA-8434
>          URL: https://issues.apache.org/jira/browse/KAFKA-8434
>      Project: Kafka
>   Issue Type: Improvement
>   Components: streams
>     Reporter: Richard Yu
>     Priority: Major
>       Labels: kip, needs-discussion

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
[jira] [Created] (KAFKA-8434) Make global stream time consistent over all stream tasks
Richard Yu created KAFKA-8434:
------------------------------

     Summary: Make global stream time consistent over all stream tasks
         Key: KAFKA-8434
         URL: https://issues.apache.org/jira/browse/KAFKA-8434
     Project: Kafka
  Issue Type: Improvement
    Reporter: Richard Yu
[jira] [Created] (KAFKA-8433) Give the opportunity to use serializers and deserializers with IntegrationTestUtils
Anthony Callaert created KAFKA-8433:
------------------------------------

     Summary: Give the opportunity to use serializers and deserializers with IntegrationTestUtils
         Key: KAFKA-8433
         URL: https://issues.apache.org/jira/browse/KAFKA-8433
     Project: Kafka
  Issue Type: Improvement
  Components: streams
Affects Versions: 2.3.0
    Reporter: Anthony Callaert

Currently, the static methods that use a producer or a consumer do not allow serializers or deserializers to be passed as arguments. Because of that, we are not able to mock the schema registry (for example) or other producer/consumer-specific attributes. To resolve this, we just need to add methods that take serializers or deserializers as arguments. The Kafka producer and consumer constructors already accept null serializers and deserializers.
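To make the proposed overload pattern concrete, here is a minimal, self-contained sketch. The `Serializer` interface and `serializeValues` helper below are simplified stand-ins for illustration only, not the real Kafka or IntegrationTestUtils API:

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class SerializerOverloadSketch {
    // Simplified stand-in for Kafka's Serializer interface.
    interface Serializer<T> {
        byte[] serialize(T data);
    }

    // Hypothetical overload shape: instead of reading serializer class names
    // from a config object, accept serializer instances directly, so tests can
    // pass mocks (e.g. a serializer backed by a mock schema registry).
    static <K, V> List<byte[]> serializeValues(List<V> values, Serializer<V> valueSerializer) {
        List<byte[]> out = new ArrayList<>();
        for (V v : values) {
            out.add(valueSerializer.serialize(v));
        }
        return out;
    }

    public static void main(String[] args) {
        Serializer<String> utf8 = s -> s.getBytes(StandardCharsets.UTF_8);
        List<byte[]> bytes = serializeValues(List.of("a", "bc"), utf8);
        System.out.println(bytes.get(0).length + "," + bytes.get(1).length); // prints "1,2"
    }
}
```

The key point is that the serializer instance flows in as a parameter, so any mock or custom implementation can be injected without touching configuration.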
[jira] [Resolved] (KAFKA-8246) refactor topic/group instance id validation condition
[ https://issues.apache.org/jira/browse/KAFKA-8246?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Boyang Chen resolved KAFKA-8246.
--------------------------------
    Resolution: Not A Problem

Since it's only a one-time validation, we don't need to refactor.

> refactor topic/group instance id validation condition
> -----------------------------------------------------
>
>          Key: KAFKA-8246
>          URL: https://issues.apache.org/jira/browse/KAFKA-8246
>      Project: Kafka
>   Issue Type: Improvement
>     Reporter: Boyang Chen
>     Assignee: Boyang Chen
>     Priority: Major
[jira] [Created] (KAFKA-8432) Add static membership to Sticky assignor
Boyang Chen created KAFKA-8432:
-------------------------------

     Summary: Add static membership to Sticky assignor
         Key: KAFKA-8432
         URL: https://issues.apache.org/jira/browse/KAFKA-8432
     Project: Kafka
  Issue Type: Sub-task
    Reporter: Boyang Chen
Re: [DISCUSS] KIP-401 TransformerSupplier/ProcessorSupplier enhancements
Per Matthias's suggestion from a while ago, I actually implemented a good amount of option B to get a sense of the user experience and documentation requirements. For a few reasons mentioned below, I think it's not my favorite option, and I prefer option C. But since I did the work and it can help discussion, I may as well share: https://github.com/apache/kafka/pull/6821.

Things I learned along the way implementing Option B:

- For the name of the interface, I like ConnectedStoreProvider. It isn't perfect, but it seems to capture the general gist without being overly verbose. I get that from a strict standpoint it's not "providing connected stores" but is instead "providing stores to be connected," but I think that in context and with documentation, the risk of someone being confused by that is low.

- I definitely felt the discoverability issue while trying to write clear documentation; you really have to make sure to connect the dots for the user when the interface isn't connected to anything.

- Another problem with a separate interface, found while writing tests/examples: a ProcessorSupplier that also implements ConnectedStoreProvider cannot be defined anonymously, since you can't define an anonymous class in Java that implements multiple interfaces. I actually consider this a fairly major usability issue - it means a user always has to have a custom class rather than doing it inline. We could provide an abstract class that implements the two, but at that point, we're not that far from option A or C anyway.

I updated the KIP with my current thinking, which as mentioned is Matthias's option C. Once again for clarity, that *is not* what is in the linked pull request. The current KIP is my proposal.

Thanks everyone for the input!

P.S. What do folks use to edit the HTML documentation, e.g. processor-api.html?
I looked at doing it by hand, but it kind of looked like agony with all the small tags required for formatting code, so I'm sort of assuming there's tooling for it.

On Fri, May 24, 2019 at 12:49 AM Matthias J. Sax wrote:

> I think the discussion mixed approaches a little bit, hence, let me
> rephrase my understanding:
>
> A) Add a new method with a default implementation to `ProcessorSupplier`:
>
> For this case, we don't add a new interface, but only add a new method
> to `ProcessorSupplier` -- to keep backward compatibility, we need to add
> a default implementation. Users opt into the new feature by overwriting
> the default implementation.
>
> B) We add a new interface with a new method:
>
> For this case, the `ProcessorSupplier` interface is not changed, and it
> also does _not_ extend the new interface. Because `ProcessorSupplier` is
> not changed, it's naturally backward compatible. Users opt into the new
> feature by adding the new interface to their ProcessorSupplier
> implementation, and they need to implement the new method because there
> is no default implementation. Kafka Streams can use `instanceof` to
> detect whether the new interface is used or not, and thus do the right
> thing.
>
> What was also discussed is a mix of both:
>
> C) We add a new interface with a new method and let `ProcessorSupplier`
> extend the new interface:
>
> Here, we need to add a default implementation to preserve backward
> compatibility. Similar to (A), users opt into the feature by overwriting
> the default implementation.
>
> Option (C) is the same as (A) from a user point of view, because a user
> won't care about the new interface. It only makes a difference for our
> code base, as we can share the default implementation of the new method.
> This is only a small gain, as the implementation is trivial, but also a
> small drawback, as we add a new public interface that is useless to the
> user, because the user would never implement the interface directly.
>
> For (A/C), it might be simpler for users to detect the feature. For (B),
> we have the advantage that users must implement the method if they use
> the new interface.
>
> Overall, it seems that (A) might be the best choice because it makes the
> feature more easily discoverable and does not add a "useless" interface.
> If you want to go with (C) to share the default implementation code,
> that's also fine with me. I am convinced now (even if I brought it up)
> that (B) might not be optimal, because feature discoverability seems to
> be important.
>
> About `null` vs `emptyList`: I still tend to like `null` better, but
> it's really a detail and not too important. Note that the question only
> arises for (A/C), not for (B), because for (B) we don't need a default
> implementation.
>
> @Paul: It's unclear to me atm what your final proposal is, because you
> mentioned that you might want to rename `StateStoreConnector`. It's also
> unclear to me atm whether you prefer (A), (B), or (C).
>
> Maybe you can update the KIP if necessary and clearly state what your
> final proposal is. Besides this, it seems we can move to a VOTE?
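To make the trade-off between the options concrete at the call site, here is a small self-contained Java sketch of option (C). All names and signatures below are simplified stand-ins for discussion, not the final Kafka Streams API:

```java
import java.util.Collections;
import java.util.Set;

public class Kip401OptionsSketch {
    // Simplified stand-in for StoreBuilder<?>.
    static class StoreBuilder {
        final String name;
        StoreBuilder(String name) { this.name = name; }
    }

    // Option C: a new interface with a default method...
    interface ConnectedStoreProvider {
        // Users opt in by overriding; the default means "no stores to connect".
        default Set<StoreBuilder> stores() { return Collections.emptySet(); }
    }

    // ...which ProcessorSupplier extends, so every supplier has it for free.
    interface ProcessorSupplier extends ConnectedStoreProvider {
        Runnable get(); // stand-in for get() returning a Processor
    }

    public static void main(String[] args) {
        // Because the default method lives on an extended interface, a plain
        // supplier can still be written inline as a lambda. Under option B (a
        // separate, unextended interface), an anonymous class could not
        // implement both interfaces at once.
        ProcessorSupplier plain = () -> () -> {};
        System.out.println(plain.stores().size()); // prints "0": did not opt in

        // Opting in requires only overriding the default method.
        ProcessorSupplier optedIn = new ProcessorSupplier() {
            public Runnable get() { return () -> {}; }
            public Set<StoreBuilder> stores() {
                return Set.of(new StoreBuilder("my-store"));
            }
        };
        System.out.println(optedIn.stores().size()); // prints "1"
    }
}
```

The sketch also shows why feature discovery differs: with (A/C) the framework can always call `stores()`, whereas with (B) it would need an `instanceof ConnectedStoreProvider` check.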
Re: Contributor Apply
nit: You can *now* self-assign tickets.

On Fri, May 24, 2019 at 11:30 PM Matthias J. Sax wrote:

> Added you to the list on contributors. You can not self-assign tickets.
>
> Please read
> https://cwiki.apache.org/confluence/display/KAFKA/Contributing+Code+Changes
> to get started.
>
> -Matthias
>
> On 5/24/19 11:47 AM, gongfuboych...@gmail.com wrote:
> > hi,
> >
> > my name is LiMing Zhou from China, who is java developer. want to be
> > contributor to kafka project
> > JIRA ID: GongFuBoy
> >
> > thanks
> > LiMing Zhou
[jira] [Created] (KAFKA-8431) Add a onTimeoutExpired callback to Kafka Consumer
Richard Yu created KAFKA-8431:
------------------------------

     Summary: Add a onTimeoutExpired callback to Kafka Consumer
         Key: KAFKA-8431
         URL: https://issues.apache.org/jira/browse/KAFKA-8431
     Project: Kafka
  Issue Type: Improvement
  Components: consumer
    Reporter: Richard Yu

Currently, after the changes introduced in KIP-266, many methods in Kafka Consumer have a bounded execution time given by a user-specified {{Duration}} parameter. However, in some cases, a method cannot complete its operation within the allocated timeout. In this case, the user might wish to have an {{onTimeoutExpired}} callback which would be called should a blocking method time out before any results could be returned. The user can implement something like this themselves, but Kafka can spare the user the trouble of coding such a feature by supporting one out of the box. One possible use of this callback is to retry the method (e.g. the {{onTimeoutExpired}} callback triggers another call to the same method after some allocated time).
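To illustrate the idea, here is a self-contained sketch of what a user might build today, and roughly what the proposed callback would spare them. The names `TimeoutCallback` and `callWithTimeout` are hypothetical, not part of any Kafka API:

```java
import java.time.Duration;
import java.util.Optional;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class TimeoutCallbackSketch {
    // Hypothetical callback interface, invoked when a bounded call runs out of time.
    interface TimeoutCallback {
        void onTimeoutExpired(Duration timeout);
    }

    // Runs `call` with a deadline; on timeout, notifies the callback and
    // returns empty instead of a result.
    static <T> Optional<T> callWithTimeout(Callable<T> call, Duration timeout, TimeoutCallback cb) {
        ExecutorService ex = Executors.newSingleThreadExecutor();
        try {
            Future<T> f = ex.submit(call);
            return Optional.ofNullable(f.get(timeout.toMillis(), TimeUnit.MILLISECONDS));
        } catch (TimeoutException e) {
            cb.onTimeoutExpired(timeout); // e.g. the callback could schedule a retry here
            return Optional.empty();
        } catch (Exception e) {
            throw new RuntimeException(e);
        } finally {
            ex.shutdownNow();
        }
    }

    public static void main(String[] args) {
        Optional<String> r = callWithTimeout(
            () -> { Thread.sleep(5_000); return "late"; },  // simulates a slow blocking method
            Duration.ofMillis(50),
            timeout -> System.out.println("timed out after " + timeout.toMillis() + " ms"));
        System.out.println(r.isPresent()); // prints "false"
    }
}
```

The ticket's point is that this wrapper boilerplate (executor, future, timeout handling) is exactly what the consumer could absorb if the callback were supported natively.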
[DISCUSS] KIP-474: To deprecate WindowStore#put(key, value)
We propose to deprecate WindowStore#put(key, value), as it does not take a timestamp parameter. The window store requires a timestamp to map the key to a window frame, and this method falls back to the current record timestamp (as specified in the description of the method). A method that takes an explicit timestamp parameter already exists and can be used instead.
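A minimal sketch of why the explicit-timestamp variant is preferred. `WindowStoreLike` is a simplified stand-in for illustration, not the real WindowStore interface, and the 60-second window size is an arbitrary example:

```java
import java.util.HashMap;
import java.util.Map;

public class WindowStorePutSketch {
    static final long WINDOW_SIZE_MS = 60_000L; // arbitrary example window size

    // Simplified stand-in: keys a value by (key, windowStartTimestamp).
    static class WindowStoreLike<K, V> {
        final Map<String, V> data = new HashMap<>();

        // Preferred shape, mirroring WindowStore#put(key, value, windowStartTimestamp):
        // the caller supplies the window start explicitly, so the store never has
        // to guess which record timestamp to fall back to.
        void put(K key, V value, long windowStartTimestamp) {
            data.put(key + "@" + windowStartTimestamp, value);
        }
    }

    // Maps a record timestamp onto its window frame's start.
    static long windowStart(long recordTimestamp) {
        return recordTimestamp - (recordTimestamp % WINDOW_SIZE_MS);
    }

    public static void main(String[] args) {
        WindowStoreLike<String, Integer> store = new WindowStoreLike<>();
        long recordTs = 125_000L;                    // record timestamp
        store.put("k", 1, windowStart(recordTs));    // window start = 120000
        System.out.println(store.data.containsKey("k@120000")); // prints "true"
    }
}
```

With the two-argument put, the window start would have to be derived implicitly from processing context, which is exactly the ambiguity the deprecation removes.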