Re: [PR] KAFKA-15307: update/note deprecated configs [kafka]
mjsax commented on PR #14360: URL: https://github.com/apache/kafka/pull/14360#issuecomment-2114110881 Thanks @gharris1727 @Cerchie -- can you add fixes into #14448? -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: jira-unsubscr...@kafka.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org
gharris1727 commented on PR #14360: URL: https://github.com/apache/kafka/pull/14360#issuecomment-2108456039 Hi @mjsax @Cerchie FYI this causes some javadoc build issues:
```
streams/src/main/java/org/apache/kafka/streams/StreamsConfig.java:661: warning - Tag @link: reference not found: JMX_REPORTER"jmx.reporter"
streams/src/main/java/org/apache/kafka/streams/StreamsConfig.java:575: warning - invalid usage of tag {@code default.windowed.key.serde.inner
streams/src/main/java/org/apache/kafka/streams/StreamsConfig.java:583: warning - invalid usage of tag {@code default.windowed.value.serde.inner
streams/src/main/java/org/apache/kafka/streams/StreamsConfig.java:661: warning - Tag @link: reference not found: JMX_REPORTER"jmx.reporter"
streams/src/main/java/org/apache/kafka/streams/StreamsConfig.java:660: warning - invalid usage of tag {@code auto.include.jmx.reporter
streams/src/main/java/org/apache/kafka/streams/StreamsConfig.java:661: warning - Tag @link: reference not found: JMX_REPORTER"jmx.reporter"
streams/src/main/java/org/apache/kafka/streams/StreamsConfig.java:661: warning - Tag @link: reference not found: JMX_REPORTER"jmx.reporter"
```
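For context: the `invalid usage of tag {@code` warnings typically mean the `{@code ...}` brace was never closed, and `reference not found` means the `{@link}` target is not an actual field. A minimal sketch of the fixed patterns — the class and constant names here are hypothetical stand-ins, not the real StreamsConfig layout:

```java
/**
 * Minimal sketch of the javadoc patterns that avoid the warnings above:
 * every {@code ...} brace is closed, and {@link} points at a field that exists.
 * Class and constant names are hypothetical stand-ins, not the real StreamsConfig.
 */
public class ConfigDocSketch {

    /** {@code statestore.cache.max.bytes} */
    public static final String CACHE_MAX_BYTES_CONFIG = "statestore.cache.max.bytes";

    /**
     * {@code cache.max.bytes.buffering}
     *
     * @deprecated since 3.4.0. Use {@link #CACHE_MAX_BYTES_CONFIG "statestore.cache.max.bytes"} instead.
     */
    @Deprecated
    public static final String CACHE_MAX_BYTES_BUFFERING_CONFIG = "cache.max.bytes.buffering";

    public static void main(String[] args) {
        // Only the javadoc changed; both constants still resolve normally.
        System.out.println(CACHE_MAX_BYTES_CONFIG);
    }
}
```

Note the `{@link #FIELD "label"}` form matches the style suggested later in this thread.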
mjsax commented on PR #14360: URL: https://github.com/apache/kafka/pull/14360#issuecomment-2103760848 Thanks for the PR @Cerchie! Merged to `trunk`.
mjsax merged PR #14360: URL: https://github.com/apache/kafka/pull/14360
mjsax commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1596187852 ## docs/streams/developer-guide/config-streams.html: ## @@ -300,12 +306,12 @@ num.standby.replicasnull - default.windowed.key.serde.inner + default.windowed.key.serde.inner (Deprecated. Use windowed.inner.class.serde instead.) Medium Default serializer/deserializer for the inner class of windowed keys, implementing the Serde interface. null - default.windowed.value.serde.inner + default.windowed.value.serde.inner (Deprecated. Use windowed.inner.class.serde instead.) Medium Default serializer/deserializer for the inner class of windowed values, implementing the Serde interface. null Review Comment: I just saw by chance that we use different formatting for marking a deprecated config: ``` default.dsl.store Low DEPRECATED] The default state store type used by DSL operators. Deprecated in favor of dsl.store.suppliers.class ``` I think we should use unified formatting -- I don't have a preference which one.
mjsax commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1586935977 ## docs/streams/developer-guide/config-streams.html: ## @@ -240,24 +240,29 @@ num.standby.replicas - acceptable.recovery.lag + acceptable.recovery.lag Medium The maximum acceptable lag (number of offsets to catch up) for an instance to be considered caught-up and ready for the active task. 1 - application.server + application.server Low A host:port pair pointing to an embedded user defined endpoint that can be used for discovering the locations of state stores within a single Kafka Streams application. The value of this must be different for each instance of the application. the empty string - buffered.records.per.partition + buffered.records.per.partition Low The maximum number of records to buffer per partition. 1000 - cache.max.bytes.buffering + statestore.cache.max.bytes +Medium +Maximum number of memory bytes to be used for record caches across all threads. +10485760 + + cache.max.bytes.buffering (Deprecated. Use cache.max.bytes instead.) Review Comment: ```suggestion cache.max.bytes.buffering (Deprecated. Use statestore.cache.max.bytes instead.) ``` ## streams/src/main/java/org/apache/kafka/streams/StreamsConfig.java: ## @@ -564,25 +565,22 @@ public class StreamsConfig extends AbstractConfig { static final String DSL_STORE_SUPPLIERS_CLASS_DOC = "Defines which store implementations to plug in to DSL operators. 
Must implement the org.apache.kafka.streams.state.DslStoreSuppliers interface."; static final Class DSL_STORE_SUPPLIERS_CLASS_DEFAULT = BuiltInDslStoreSuppliers.RocksDBDslStoreSuppliers.class; -/** {@code default.windowed.key.serde.inner} */ +/** {@code default.windowed.key.serde.inner + * @deprecated since 3.0.0} */ @SuppressWarnings("WeakerAccess") @Deprecated public static final String DEFAULT_WINDOWED_KEY_SERDE_INNER_CLASS = "default.windowed.key.serde.inner"; private static final String DEFAULT_WINDOWED_KEY_SERDE_INNER_CLASS_DOC = "Default serializer / deserializer for the inner class of a windowed key. Must implement the " + "org.apache.kafka.common.serialization.Serde interface."; -/** {@code default.windowed.value.serde.inner} */ +/** {@code default.windowed.value.serde.inner + * @deprecated since 3.0.0 } */ @SuppressWarnings("WeakerAccess") @Deprecated public static final String DEFAULT_WINDOWED_VALUE_SERDE_INNER_CLASS = "default.windowed.value.serde.inner"; private static final String DEFAULT_WINDOWED_VALUE_SERDE_INNER_CLASS_DOC = "Default serializer / deserializer for the inner class of a windowed value. Must implement the " + "org.apache.kafka.common.serialization.Serde interface."; -public static final String WINDOWED_INNER_CLASS_SERDE = "windowed.inner.class.serde"; Review Comment: Seems we cannot just remove it, but should only mark it as deprecated instead? Also, given that the deprecation is part of KIP-1020, should this be done only in the KIP-1020 PR, with this PR doing only the docs cleanup? ## docs/streams/developer-guide/config-streams.html: ## @@ -326,8 +331,15 @@ num.standby.replicasDefault serializer/deserializer for the inner class of windowed keys, implementing the Serde interface. Review Comment: Seems this line does not belong to the `dsl.store.suppliers.class` config?
## docs/streams/developer-guide/config-streams.html: ## @@ -300,7 +305,7 @@ num.standby.replicasnull - default.windowed.key.serde.inner + default.windowed.key.serde.inner (Deprecated.) Review Comment: Below is `default.window.value.serde.inner` which was also deprecated, right? (L308 original, new L313) ## docs/streams/developer-guide/config-streams.html: ## @@ -326,8 +331,15 @@ num.standby.replicasDefault serializer/deserializer for the inner class of windowed keys, implementing the Serde interface. +null - max.task.idle.ms + default.windowed.value.serde.inner (Deprecated.) Review Comment: It seems `default.windowed.value.serde.inner` exist already? Why do we add it here? ## streams/src/main/java/org/apache/kafka/streams/StreamsConfig.java: ## @@ -647,7 +645,8 @@ public class StreamsConfig extends AbstractConfig { @SuppressWarnings("WeakerAccess") public static final String METRICS_SAMPLE_WINDOW_MS_CONFIG = CommonClientConfigs.METRICS_SAMPLE_WINDOW_MS_CONFIG; -/** {@code auto.include.jmx.reporter} */ +/** {@code auto.include.jmx.r
Cerchie commented on PR #14360: URL: https://github.com/apache/kafka/pull/14360#issuecomment-2087132901 tagging @mjsax here, made some edits in response to the last round
Cerchie commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1585489940 ## docs/streams/developer-guide/config-streams.html: ## @@ -257,7 +258,12 @@ num.standby.replicasThe maximum number of records to buffer per partition. 1000 - cache.max.bytes.buffering + statestore.cache.max.bytes +Medium +Maximum number of memory bytes to be used for record caches across all threads. Note that at the debug level you can use cache.size to monitor the actual size of the Kafka Streams cache. +10485760 + + cache.max.bytes.buffering (Deprecated. Use cache.max.bytes instead.) Review Comment: went through and did it manually, I think there was more than just this instance
Cerchie commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1585481048 ## docs/streams/developer-guide/config-streams.html: ## @@ -257,7 +258,12 @@ num.standby.replicasThe maximum number of records to buffer per partition. 1000 - cache.max.bytes.buffering + statestore.cache.max.bytes +Medium +Maximum number of memory bytes to be used for record caches across all threads. Note that at the debug level you can use cache.size to monitor the actual size of the Kafka Streams cache. Review Comment: found it in some Confluent docs -- looks like it's been partially implemented. https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=186878390 will remove
mjsax commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1508421651 ## docs/streams/developer-guide/config-streams.html: ## @@ -300,8 +306,10 @@ num.standby.replicasnull - default.windowed.key.serde.inner + default.windowed.key.serde.inner (Deprecated. Use windowed.inner.class.serde instead.) Medium +<<< HEAD +<<< HEAD Review Comment: Seems you missed deleting some conflict marker lines when resolving conflicts during rebasing. (more below) ## docs/streams/developer-guide/config-streams.html: ## @@ -1010,6 +1016,18 @@ topology.optimization + windowed.inner.class.serde + + + +Serde for the inner class of a windowed record. Must implement the org.apache.kafka.common.serialization.Serde interface. + + +Note that setting this config in KafkaStreams application would result in an error as it is meant to be used only from Plain consumer client. Review Comment: I just did some more digging, and now I actually think that @ableegoldman is right, we might want to treat `windowed.inner.serde.class` similar to `window.size`... (ie, maybe remove from StreamsConfig -- we could add this to the KIP Lucia started). I also understand now why the docs say that using it would result in an error (for both configs): Kafka Streams will always pass window-size and inner-serde via the _constructor_ and we will also verify that we don't get a parameter set twice (or zero times), and throw an error inside the `configure()` method of the windowed serdes... Thus, we might not want to add `windowed.inner.serde.class` to the docs in this PR to begin with... Sorry for the back and forth. Reading and understanding code is hard... ## docs/streams/developer-guide/config-streams.html: ## @@ -326,6 +334,18 @@ num.standby.replicasDefault serializer/deserializer for the inner class of windowed keys, implementing the Serde interface. Deprecated. +=== +Default serializer/deserializer for the inner class of windowed keys, implementing the Serde interface. Review Comment: Duplicate line (both are not 100% the same) -- seems a conflict was not resolved correctly. ## docs/streams/developer-guide/config-streams.html: ## @@ -257,7 +258,12 @@ num.standby.replicasThe maximum number of records to buffer per partition. 1000 - cache.max.bytes.buffering + statestore.cache.max.bytes +Medium +Maximum number of memory bytes to be used for record caches across all threads. 
Note that at the debug level you can use cache.size to monitor the actual size of the Kafka Streams cache. Review Comment: > Note that at the debug level you can use cache.size to monitor the actual size of the Kafka Streams cache. What does this mean? Cannot follow. ## docs/streams/developer-guide/config-streams.html: ## @@ -257,7 +258,12 @@ num.standby.replicasThe maximum number of records to buffer per partition. 1000 - cache.max.bytes.buffering + statestore.cache.max.bytes +Medium +Maximum number of memory bytes to be used for record caches across all threads. Note that at the debug level you can use cache.size to monitor the actual size of the Kafka Streams cache. +10485760 + + cache.max.bytes.buffering (Deprecated. Use cache.max.bytes instead.) Review Comment: If we insert a new row, we need to change "even / odd" for all rows below... super annoying... (otherwise we get two rows with the same background color instead of nicely interleaved rows)
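The constructor-vs-`configure()` validation mjsax describes above (inner serde passed either via constructor or via config, but never twice and never zero times) can be sketched roughly like this; this is a hypothetical stand-in, not the real TimeWindowedSerializer code:

```java
import java.util.Map;

/**
 * Hypothetical sketch of the validation described in the thread: the inner
 * serde can arrive via the constructor (Kafka Streams) or via configure()
 * (plain clients), but setting it twice or not at all is an error.
 */
public class WindowedSerializerSketch {

    private String innerSerde; // stand-in for a real Serde instance

    public WindowedSerializerSketch() { }             // plain clients: default ctor, then configure()

    public WindowedSerializerSketch(String inner) {   // Kafka Streams: inner serde via constructor
        this.innerSerde = inner;
    }

    public void configure(Map<String, ?> configs) {
        Object fromConfig = configs.get("windowed.inner.class.serde");
        if (innerSerde != null && fromConfig != null)
            throw new IllegalStateException("inner serde set twice (constructor and config)");
        if (innerSerde == null && fromConfig == null)
            throw new IllegalStateException("inner serde not set at all");
        if (fromConfig != null)
            innerSerde = (String) fromConfig;
    }

    public String inner() { return innerSerde; }

    public static void main(String[] args) {
        WindowedSerializerSketch viaConfig = new WindowedSerializerSketch();
        viaConfig.configure(Map.of("windowed.inner.class.serde", "StringSerde"));
        System.out.println(viaConfig.inner());
    }
}
```

The point of the check is exactly what the thread concludes: setting `windowed.inner.class.serde` inside a Kafka Streams application (where the constructor already supplied the serde) trips the "set twice" branch and throws.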
Cerchie commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1498343776 ## streams/src/main/java/org/apache/kafka/streams/StreamsConfig.java: ## @@ -482,7 +482,8 @@ public class StreamsConfig extends AbstractConfig { public static final String BUILT_IN_METRICS_VERSION_CONFIG = "built.in.metrics.version"; private static final String BUILT_IN_METRICS_VERSION_DOC = "Version of the built-in metrics to use."; -/** {@code cache.max.bytes.buffering} */ +/** {@code cache.max.bytes.buffering} + * @deprecated since 3.4.0 Use cache.max.bytes instead with the cache.size metric at the DEBUG level. */ Review Comment: I believe I had gleaned that from a release blog but I can't find it now... happy to commit the suggestion
mjsax commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1483595167 ## docs/streams/developer-guide/config-streams.html: ## @@ -1010,6 +1016,18 @@ topology.optimization + windowed.inner.class.serde + + + +Serde for the inner class of a windowed record. Must implement the org.apache.kafka.common.serialization.Serde interface. + + +Note that setting this config in KafkaStreams application would result in an error as it is meant to be used only from Plain consumer client. Review Comment: > Are you proposing to move it to ClientConfig or just not .define it in any class? In the end, we added it via KIP and would need to move/remove it via KIP, and I'm not sure it's worth doing a KIP over. We can just leave it out of the Streams config docs Yes, the idea was to maybe do a small KIP and remove it from `StreamsConfig` as it does not belong there. If Lucia is interested in doing it -- I agree it's not super critical, and just removing from the docs is also fine with me. About `window.inner.serde.class`, it is indeed used by Kafka Streams itself (cf `TimeWindowedSerializer` and others). If one would set the default serde to `TimeWindowedSerializer` it would be used, so I think it's ok to have it in the docs, and also correct to have it as `StreamsConfig` registered via `define()`.
ableegoldman commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1479177626 ## docs/streams/developer-guide/config-streams.html: ## @@ -1010,6 +1016,18 @@ topology.optimization + windowed.inner.class.serde + + + +Serde for the inner class of a windowed record. Must implement the org.apache.kafka.common.serialization.Serde interface. + + +Note that setting this config in KafkaStreams application would result in an error as it is meant to be used only from Plain consumer client. Review Comment: I guess `window.size.ms` is just in a weird place somewhere between a client config and a streams config. Given that it's not for use in an actual Kafka Streams application, it makes sense to me that we should not include it in the Streams config docs. Are you proposing to move it to ClientConfig or just not `.define` it in any class? In the end, we added it via KIP and would need to move/remove it via KIP, and I'm not sure it's worth doing a KIP over. We can just leave it out of the Streams config docs Also -- this applies to both `window.size.ms` and `window.inner.serde.class`, no? I don't understand this bit: > window.inner.serde.class is a KS config and should just be documented in the regular way. They are essentially the same kind of config, they just refer to the two different parameters of a windowed serde. If `window.size.ms` is not a KS config, then neither is `window.inner.serde.class` -- and vice versa
Cerchie commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1475018035 ## docs/streams/developer-guide/config-streams.html: ## @@ -1010,6 +1016,18 @@ topology.optimization + windowed.inner.class.serde + + + +Serde for the inner class of a windowed record. Must implement the org.apache.kafka.common.serialization.Serde interface. + + +Note that setting this config in KafkaStreams application would result in an error as it is meant to be used only from Plain consumer client. Review Comment: Going back to the [blog post](https://www.confluent.io/blog/apache-kafka-3-4-0-new-features-and-updates/) I don't know where the cache.size metric is coming from -- because in the quoted [KIP-770](https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=186878390) it appears to be `cache-size-bytes-total`.
mjsax commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1473438104 ## docs/streams/developer-guide/config-streams.html: ## @@ -1010,6 +1016,18 @@ topology.optimization + windowed.inner.class.serde + + + +Serde for the inner class of a windowed record. Must implement the org.apache.kafka.common.serialization.Serde interface. + + +Note that setting this config in KafkaStreams application would result in an error as it is meant to be used only from Plain consumer client. Review Comment: Thanks Sophie. > Note that this config is only used by plain consumer/producer clients For this case, why are we documenting it in KS docs -- should it not be in clients docs? (Also, this applies to `window.size.ms` introduced in 2.8 via https://cwiki.apache.org/confluence/display/KAFKA/KIP-659%3A+Improve+TimeWindowedDeserializer+and+TimeWindowedSerde+to+handle+window+size, right, but not to `windowed.inner.serde.class` which is a KS config added in 3.0 via https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=177047930 In the end, the KS module provides the window-serdes and thus it does not make sense to add `window.size.ms` to `ClientConfig` -- especially, as you pointed out, because it is only necessary for the console consumer. Thus, while `StreamsConfig#WINDOW_SIZE_MS_CONFIG` must exist as a variable name, I am wondering if it's actually correct that we added it as a StreamsConfig, ie, via `define(...)`? Maybe we should do a small KIP and remove it? -- For now, we should not mention `window.size.ms` in KS docs on the web-page (at least not as a "top level config" -- we should either add it to a "windowed serde" section, or to the (console) consumer config section where it belongs)? `window.inner.serde.class` is a KS config and should just be documented in the regular way.
mjsax commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1473419962 ## streams/src/main/java/org/apache/kafka/streams/StreamsConfig.java: ## @@ -482,7 +482,8 @@ public class StreamsConfig extends AbstractConfig { public static final String BUILT_IN_METRICS_VERSION_CONFIG = "built.in.metrics.version"; private static final String BUILT_IN_METRICS_VERSION_DOC = "Version of the built-in metrics to use."; -/** {@code cache.max.bytes.buffering} */ +/** {@code cache.max.bytes.buffering} + * @deprecated since 3.4.0 Use cache.max.bytes instead with the cache.size metric at the DEBUG level. */ Review Comment: ```suggestion * @deprecated since 3.4.0 Use {@link #CACHE_MAX_BYTES_CONFIG "cache.max.bytes"} instead. */ ``` ## streams/src/main/java/org/apache/kafka/streams/StreamsConfig.java: ## @@ -537,14 +538,16 @@ public class StreamsConfig extends AbstractConfig { public static final String ROCKS_DB = "rocksDB"; public static final String IN_MEMORY = "in_memory"; -/** {@code default.windowed.key.serde.inner} */ +/** {@code default.windowed.key.serde.inner + * @deprecated since 3.0.0 Use windowed.inner.class.serde instead.} */ Review Comment: ```suggestion * @deprecated since 3.0.0 Use {@link #WINDOWED_INNER_CLASS_SERDE "windowed.inner.class.serde"} instead.} */ ``` ## streams/src/main/java/org/apache/kafka/streams/StreamsConfig.java: ## @@ -482,7 +482,8 @@ public class StreamsConfig extends AbstractConfig { public static final String BUILT_IN_METRICS_VERSION_CONFIG = "built.in.metrics.version"; private static final String BUILT_IN_METRICS_VERSION_DOC = "Version of the built-in metrics to use."; -/** {@code cache.max.bytes.buffering} */ +/** {@code cache.max.bytes.buffering} + * @deprecated since 3.4.0 Use cache.max.bytes instead with the cache.size metric at the DEBUG level. */ Review Comment: Not sure if I understand the reference to `cache.size` metric? Can you elaborate? 
## streams/src/main/java/org/apache/kafka/streams/StreamsConfig.java: ## @@ -537,14 +538,16 @@ public class StreamsConfig extends AbstractConfig { public static final String ROCKS_DB = "rocksDB"; public static final String IN_MEMORY = "in_memory"; -/** {@code default.windowed.key.serde.inner} */ +/** {@code default.windowed.key.serde.inner + * @deprecated since 3.0.0 Use windowed.inner.class.serde instead.} */ @SuppressWarnings("WeakerAccess") @Deprecated public static final String DEFAULT_WINDOWED_KEY_SERDE_INNER_CLASS = "default.windowed.key.serde.inner"; private static final String DEFAULT_WINDOWED_KEY_SERDE_INNER_CLASS_DOC = "Default serializer / deserializer for the inner class of a windowed key. Must implement the " + "org.apache.kafka.common.serialization.Serde interface."; -/** {@code default.windowed.value.serde.inner} */ +/** {@code default.windowed.value.serde.inner + * @deprecated since 3.0.0 Use windowed.inner.class.serde instead.} */ Review Comment: as above ## streams/src/main/java/org/apache/kafka/streams/StreamsConfig.java: ## @@ -620,7 +623,8 @@ public class StreamsConfig extends AbstractConfig { @SuppressWarnings("WeakerAccess") public static final String METRICS_SAMPLE_WINDOW_MS_CONFIG = CommonClientConfigs.METRICS_SAMPLE_WINDOW_MS_CONFIG; -/** {@code auto.include.jmx.reporter} */ +/** {@code auto.include.jmx.reporter + * @deprecated and will removed in 4.0.0 Use org.apache.kafka.common.metrics.JmxReporter in metric.reporters instead.} */ Review Comment: as above
Cerchie commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1469760873 ## docs/streams/developer-guide/config-streams.html: ## @@ -1010,6 +1016,18 @@ topology.optimization + windowed.inner.class.serde + + + +Serde for the inner class of a windowed record. Must implement the org.apache.kafka.common.serialization.Serde interface. + + +Note that setting this config in KafkaStreams application would result in an error as it is meant to be used only from Plain consumer client. Review Comment: Thank you Sophie! Changed the sentence.
Cerchie commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1469755433 ## docs/streams/developer-guide/config-streams.html: ## @@ -257,7 +258,12 @@ num.standby.replicasThe maximum number of records to buffer per partition. 1000 - cache.max.bytes.buffering + cache.max.bytes +Medium +Maximum number of memory bytes to be used for record caches across all threads. Note that at the debug level you can use cache.size to monitor the actual size of the Kafka Streams cache. Review Comment: I've not done it myself, basing it on what is said in the release https://www.confluent.io/blog/apache-kafka-3-4-0-new-features-and-updates/
ableegoldman commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1465626991 ## docs/streams/developer-guide/config-streams.html: ## @@ -1010,6 +1016,18 @@ topology.optimization + windowed.inner.class.serde + + + +Serde for the inner class of a windowed record. Must implement the org.apache.kafka.common.serialization.Serde interface. + + +Note that setting this config in KafkaStreams application would result in an error as it is meant to be used only from Plain consumer client. Review Comment: Bear with me here because it's kind of a long story in full and I only half remember it...and also this statement is only half true. A more accurate statement would be something like this: "Note that this config is only used by plain consumer/producer clients that set a windowed de/serializer type via configs. For Kafka Streams applications that deal with windowed types you must pass in the inner serde type when you instantiate the windowed serde object for your topology" In other words, there's no concept of a "default windowed serde" type in Streams because you always have to pass in the actual Serde object, and need to pass in the window size and inner serde type at that time. The same can also be done for plain producer/consumer clients but someone pointed out that the console clients have to be configured via properties, which means they can only invoke the default constructor based on the class name passed in for the key/value de/serializer configs. Which meant we needed to add the window size and inner serde type as configs that could be used to configure a windowed Serde that was instantiated via default constructor I guess we should update the docs string in StreamsConfig as well. I assume this error slipped in because of the window.size config, which _is_ only needed for the consumer client (since the producer client throws away the window size when encoding a windowed type). 
But both producer and consumer need to know the inner serde type.
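Sophie's explanation above can be illustrated with a plain-client configuration sketch. The config keys `window.size.ms` and `windowed.inner.class.serde` come from this thread; the deserializer class name and values are illustrative only:

```java
import java.util.Properties;

public class WindowedConsumerConfigSketch {

    // Hypothetical console-consumer style settings: since a plain client can only
    // invoke the deserializer's default constructor, both windowed-serde
    // parameters have to arrive as configs instead of constructor arguments.
    static Properties consoleConsumerProps() {
        Properties props = new Properties();
        props.setProperty("key.deserializer",
                "org.apache.kafka.streams.kstream.TimeWindowedDeserializer");
        // Only the consumer side needs the window size (the producer discards it on encode).
        props.setProperty("window.size.ms", "60000");
        // Both producer and consumer need to know the inner serde type.
        props.setProperty("windowed.inner.class.serde",
                "org.apache.kafka.common.serialization.Serdes$StringSerde");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(consoleConsumerProps().getProperty("window.size.ms"));
    }
}
```

In a Kafka Streams application, by contrast, both parameters are supplied when the windowed Serde object is instantiated for the topology, so neither config is set.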
mjsax commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1456637776 ## docs/streams/developer-guide/config-streams.html: ## @@ -1010,6 +1016,18 @@ topology.optimization + windowed.inner.class.serde + + + +Serde for the inner class of a windowed record. Must implement the org.apache.kafka.common.serialization.Serde interface. + + +Note that setting this config in KafkaStreams application would result in an error as it is meant to be used only from Plain consumer client. Review Comment: This sentence confuses me. I realize that it's also in the Java code, but not sure what it actually means? Maybe @ableegoldman who reviewed the original PR can help out?
mjsax commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1456632988

## docs/streams/developer-guide/config-streams.html: ##
@@ -257,7 +258,12 @@ num.standby.replicasThe maximum number of records to buffer per partition. 1000 - cache.max.bytes.buffering + cache.max.bytes +Medium +Maximum number of memory bytes to be used for record caches across all threads. Note that at the debug level you can use cache.size to monitor the actual size of the Kafka Streams cache.

Review Comment: Not 100% sure what

> Note that at the debug level you can use cache.size to monitor the actual size of the Kafka Streams cache

actually means?
mjsax commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1456632789

## docs/streams/developer-guide/config-streams.html: ##
@@ -257,7 +258,12 @@ num.standby.replicasThe maximum number of records to buffer per partition. 1000 - cache.max.bytes.buffering + cache.max.bytes

Review Comment:
```suggestion
statestore.cache.max.bytes
```
mjsax commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1456631743

## docs/streams/developer-guide/config-streams.html: ##
@@ -257,7 +258,12 @@ num.standby.replicasThe maximum number of records to buffer per partition. 1000 - cache.max.bytes.buffering + cache.max.bytes +Medium +Maximum number of memory bytes to be used for record caches across all threads. Note that at the debug level you can use cache.size to monitor the actual size of the Kafka Streams cache.

Review Comment:
```suggestion
Maximum number of memory bytes to be used for record caches across all threads. Note that at the debug level you can use cache.size to monitor the actual size of the Kafka Streams cache.
```
mjsax commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1456631544

## docs/streams/developer-guide/config-streams.html: ##
@@ -257,7 +258,12 @@ num.standby.replicasThe maximum number of records to buffer per partition. 1000 - cache.max.bytes.buffering + cache.max.bytes +Medium +Maximum number of memory bytes to be used for record caches across all threads. Note that at the debug level you can use cache.size to monitor the actual size of the Kafka Streams cache. +10485760 bytes

Review Comment: The description already says `Maximum number of memory bytes`, so it seems we can drop `bytes` here?
mjsax commented on code in PR #14360: URL: https://github.com/apache/kafka/pull/14360#discussion_r1456631293

## docs/streams/developer-guide/config-streams.html: ##
@@ -257,7 +258,12 @@ num.standby.replicasThe maximum number of records to buffer per partition. 1000 - cache.max.bytes.buffering + cache.max.bytes +Medium +Maximum number of memory bytes to be used for record caches across all threads. Note that at the debug level you can use cache.size to monitor the actual size of the Kafka Streams cache. +10485760 bytes

Review Comment:
```suggestion
10485760
```
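For reference, the rename this table row documents can be sketched as an application config fragment. The config names are taken from the diff and the suggestions above; the value shown is the default under discussion (10485760 = 10 MiB).

```properties
# Deprecated name (still accepted, but superseded):
# cache.max.bytes.buffering=10485760

# Replacement config with the same default and semantics:
# maximum memory (in bytes) for record caches across all threads.
statestore.cache.max.bytes=10485760
```

In Java code, the equivalent would be setting `StreamsConfig.STATESTORE_CACHE_MAX_BYTES_CONFIG` in the `Properties` passed to `KafkaStreams`.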
github-actions[bot] commented on PR #14360: URL: https://github.com/apache/kafka/pull/14360#issuecomment-1868700765

This PR is being marked as stale since it has not had any activity in 90 days. If you would like to keep this PR alive, please ask a committer for review. If the PR has merge conflicts, please update it with the latest from trunk (or appropriate release branch). If this PR is no longer valid or desired, please feel free to close it. If no activity occurs in the next 30 days, it will be automatically closed.