[ https://issues.apache.org/jira/browse/KAFKA-8157?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16811103#comment-16811103 ]

ASF GitHub Bot commented on KAFKA-8157:
---------------------------------------

guozhangwang commented on pull request #6547: KAFKA-8157: fix the incorrect usage of segment.index.bytes (2.2)
URL: https://github.com/apache/kafka/pull/6547
 
 
   Should be cherry-picked to older branches as well.
   
   ### Committer Checklist (excluded from commit message)
   - [ ] Verify design and implementation 
   - [ ] Verify test coverage and CI build status
   - [ ] Verify documentation (including upgrade notes)
   
 
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Missing "key.serializer" exception when setting "segment index bytes"
> ---------------------------------------------------------------------
>
>                 Key: KAFKA-8157
>                 URL: https://issues.apache.org/jira/browse/KAFKA-8157
>             Project: Kafka
>          Issue Type: Bug
>          Components: streams
>    Affects Versions: 2.2.0
>         Environment: ubuntu 18.10, localhost and Aiven too
>            Reporter: Cristian D
>            Priority: Major
>              Labels: beginner, newbie
>
> As a `kafka-streams` user,
> When I set the "segment index bytes" property
> Then I would like to have internal topics with the specified allocated disk space
>  
> At the moment, when setting the "topic.segment.index.bytes" property, the application exits with the following exception:
> {code:java}
> Exception in thread "main" org.apache.kafka.common.config.ConfigException: Missing required configuration "key.serializer" which has no default value.
> {code}
> Tested with `kafka-streams` v2.0.0 and v2.2.0.
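>  
> For reference, a minimal sketch of the kind of setup that triggers it (application id, bootstrap address, serdes, topology, and the size value below are placeholders; the "topic."-prefixed property is the relevant part):
> {code:java}
> import java.util.Properties;
> 
> import org.apache.kafka.common.config.TopicConfig;
> import org.apache.kafka.common.serialization.Serdes;
> import org.apache.kafka.streams.KafkaStreams;
> import org.apache.kafka.streams.StreamsBuilder;
> import org.apache.kafka.streams.StreamsConfig;
> 
> public class Repro {
>     public static void main(String[] args) {
>         Properties props = new Properties();
>         props.put(StreamsConfig.APPLICATION_ID_CONFIG, "segment-index-demo");   // placeholder
>         props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");    // placeholder
>         props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
>         props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
> 
>         // Topic-level override for internal topics:
>         // StreamsConfig.topicPrefix(...) yields "topic.segment.index.bytes"
>         props.put(StreamsConfig.topicPrefix(TopicConfig.SEGMENT_INDEX_BYTES_CONFIG), 10485760); // example value
> 
>         StreamsBuilder builder = new StreamsBuilder();
>         builder.stream("input-topic").to("output-topic");  // placeholder topology
> 
>         // Per the report, this constructor throws
>         // ConfigException: Missing required configuration "key.serializer"
>         new KafkaStreams(builder.build(), props);
>     }
> }
> {code}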
>  
> Stack trace:
> {code:java}
> Exception in thread "main" org.apache.kafka.common.config.ConfigException: Missing required configuration "key.serializer" which has no default value.
>  at org.apache.kafka.common.config.ConfigDef.parseValue(ConfigDef.java:474)
>  at org.apache.kafka.common.config.ConfigDef.parse(ConfigDef.java:464)
>  at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:62)
>  at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:75)
>  at org.apache.kafka.clients.producer.ProducerConfig.<init>(ProducerConfig.java:392)
>  at org.apache.kafka.streams.StreamsConfig.getMainConsumerConfigs(StreamsConfig.java:1014)
>  at org.apache.kafka.streams.processor.internals.StreamThread.create(StreamThread.java:666)
>  at org.apache.kafka.streams.KafkaStreams.<init>(KafkaStreams.java:718)
>  at org.apache.kafka.streams.KafkaStreams.<init>(KafkaStreams.java:634)
>  at org.apache.kafka.streams.KafkaStreams.<init>(KafkaStreams.java:544)
>  at app.Main.main(Main.java:36)
> {code}
> A demo application simulating the exception:
> https://github.com/razorcd/java-snippets-and-demo-projects/tree/master/kafkastreamsdemo
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
