Looks like you can't connect to 10.100.98.100:9092.

I'd validate that this really is the issue using telnet, and then check the
firewall / iptables settings on the broker host.
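
For example (just a sketch — substitute the broker host/port you actually
use, and the iptables/netstat commands assume a Linux broker host):

    # from the producer host: does anything answer on the broker port?
    telnet 10.100.98.100 9092
    # or, if telnet isn't installed:
    nc -vz 10.100.98.100 9092

    # on the broker host: is the broker listening, and is the port filtered?
    netstat -tlnp | grep 9092
    sudo iptables -L -n

"Connection refused" or a hang from telnet/nc usually points at the firewall,
or at the broker not actually listening on that interface.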

On Thu, Dec 18, 2014 at 2:21 PM, Sa Li <sal...@gmail.com> wrote:
> Dear all,
>
> We just built a Kafka production cluster. I can create topics in the
> production cluster from another host, but when I send a very simple
> message as a producer, it generates errors like these:
>
> root@precise64:/etc/kafka# bin/kafka-console-producer.sh --broker-list 10.100.98.100:9092 --topic my-replicated-topic-production
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/etc/kafka/core/build/dependant-libs-2.10.4/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/etc/kafka/core/build/dependant-libs-2.10.4/slf4j-log4j12-1.7.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> my test message 1
> [2014-12-18 21:44:25,830] WARN Failed to send producer request with correlation id 2 to broker 101 with data for partitions [my-replicated-topic-production,1] (kafka.producer.async.DefaultEventHandler)
> java.nio.channels.ClosedChannelException
>         at kafka.network.BlockingChannel.send(BlockingChannel.scala:100)
>         at kafka.producer.SyncProducer.liftedTree1$1(SyncProducer.scala:73)
>         at kafka.producer.SyncProducer.kafka$producer$SyncProducer$$doSend(SyncProducer.scala:72)
>         at kafka.producer.SyncProducer$$anonfun$send$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SyncProducer.scala:103)
>         at kafka.producer.SyncProducer$$anonfun$send$1$$anonfun$apply$mcV$sp$1.apply(SyncProducer.scala:103)
>         at kafka.producer.SyncProducer$$anonfun$send$1$$anonfun$apply$mcV$sp$1.apply(SyncProducer.scala:103)
>         at kafka.metrics.KafkaTimer.time(KafkaTimer.scala:33)
>         at kafka.producer.SyncProducer$$anonfun$send$1.apply$mcV$sp(SyncProducer.scala:102)
>         at kafka.producer.SyncProducer$$anonfun$send$1.apply(SyncProducer.scala:102)
>         at kafka.producer.SyncProducer$$anonfun$send$1.apply(SyncProducer.scala:102)
>         at kafka.metrics.KafkaTimer.time(KafkaTimer.scala:33)
>         at kafka.producer.SyncProducer.send(SyncProducer.scala:101)
>         at kafka.producer.async.DefaultEventHandler.kafka$producer$async$DefaultEventHandler$$send(DefaultEventHandler.scala:256)
>         at kafka.producer.async.DefaultEventHandler$$anonfun$dispatchSerializedData$2.apply(DefaultEventHandler.scala:107)
>         at kafka.producer.async.DefaultEventHandler$$anonfun$dispatchSerializedData$2.apply(DefaultEventHandler.scala:99)
>         at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
>         at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
>         at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
>         at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
>         at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
>         at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
>         at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
>         at kafka.producer.async.DefaultEventHandler.dispatchSerializedData(DefaultEventHandler.scala:99)
>         at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:72)
>         at kafka.producer.async.ProducerSendThread.tryToHandle(ProducerSendThread.scala:105)
>         at kafka.producer.async.ProducerSendThread$$anonfun$processEvents$3.apply(ProducerSendThread.scala:88)
>         at kafka.producer.async.ProducerSendThread$$anonfun$processEvents$3.apply(ProducerSendThread.scala:68)
>         at scala.collection.immutable.Stream.foreach(Stream.scala:547)
>         at kafka.producer.async.ProducerSendThread.processEvents(ProducerSendThread.scala:67)
>         at kafka.producer.async.ProducerSendThread.run(ProducerSendThread.scala:45)
> [2014-12-18 21:44:25,948] WARN Failed to send producer request with correlation id 5 to broker 101 with data for partitions [my-replicated-topic-production,1] (kafka.producer.async.DefaultEventHandler)
> java.nio.channels.ClosedChannelException
>         (same stack trace as above)
> [2014-12-18 21:44:26,129] WARN Failed to send producer request with correlation id 8 to broker 100 with data for partitions [my-replicated-topic-production,0] (kafka.producer.async.DefaultEventHandler)
> java.nio.channels.ClosedChannelException
>         (same stack trace as above)
>
> Any ideas about this type of error?
>
>
> thanks
>
> --
>
> Alec Li
