It looks like replicas never catch up even when there is no load. Am I
missing something?
On Sat, Dec 10, 2016 at 8:09 PM, Mohit Anchlia <mohitanch...@gmail.com>
wrote:
> Does Kafka automatically replicate the under replicated partitions?
>
> I looked at these metrics through jmx
) what the cause might be (e.g. saturating network,
> requests processing slow due to some other resource contention, etc).
>
> -Ewen
>
> On Fri, Dec 9, 2016 at 5:20 PM, Mohit Anchlia <mohitanch...@gmail.com>
> wrote:
>
> > What's the best way to fix NotEnoughReplication
What's the best way to fix NotEnoughReplicasException given all the nodes are
up and running? Zookeeper did go down momentarily. We are on Kafka 0.10
org.apache.kafka.common.errors.NotEnoughReplicasException: Number of insync
replicas for partition [__consumer_offsets,20] is [1], below required
minimum
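For context, NotEnoughReplicasException is raised on acks=all produce requests when a partition's in-sync replica set is smaller than min.insync.replicas; the fix is to get the lagging replicas back into the ISR (or lower min.insync.replicas for the topic). A minimal sketch of the producer-side settings involved, with placeholder broker address and values:

```java
import java.util.Properties;

// Sketch of the producer settings that interact with min.insync.replicas.
// NotEnoughReplicasException only affects acks=all writes: the leader
// rejects them (rather than accept under-replicated data) when the ISR
// has shrunk below min.insync.replicas. Values are illustrative.
public class IsrProducerProps {
    public static Properties build() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        // acks=all: leader waits for every in-sync replica before acking.
        props.put("acks", "all");
        // The exception is retriable; sends succeed once the ISR recovers.
        props.put("retries", "3");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(build().getProperty("acks")); // all
    }
}
```

Since the ISR shrank after a ZooKeeper blip here, restarting the broker that fell out of the ISR (so its fetcher resumes) is usually what lets the replicas rejoin.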
Have you defined
>
> auto.offset.reset: earliest
>
> or otherwise made sure (KafkaConsumer.position()) that the consumer does
> not just wait for *new* messages to arrive?
>
> Harald.
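Harald's suggestion above can be sketched as consumer configuration; note that auto.offset.reset only takes effect when the group has no committed offset for a partition, which is an assumption worth checking. Broker address and group id below are placeholders:

```java
import java.util.Properties;

// Sketch of the consumer settings Harald mentions. An existing committed
// offset always wins over auto.offset.reset, which is why a consumer can
// appear to "wait for *new* messages" even with earliest configured.
// Names and values are illustrative, not from the thread.
public class EarliestConsumerProps {
    public static Properties build() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder
        props.put("group.id", "demo-group");              // placeholder
        // Start from the beginning of the log when no committed offset exists.
        props.put("auto.offset.reset", "earliest");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(build().getProperty("auto.offset.reset")); // earliest
    }
}
```

When a stale committed offset is the real problem, KafkaConsumer.seekToBeginning() (checked with position()) forces the rewind regardless of this setting.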
>
>
>
> On 06.12.2016 20:11, Mohit Anchlia wrote:
>
>> I see this message in the logs:
I see this message in the logs:
[2016-12-06 13:54:16,586] INFO [GroupCoordinator 0]: Preparing to
restabilize group DemoConsumer with old generation 3
(kafka.coordinator.GroupCoordinator)
On Tue, Dec 6, 2016 at 10:53 AM, Mohit Anchlia <mohitanch...@gmail.com>
wrote:
> I have a
I have a consumer polling a topic of Kafka 0.10. Even though the topic has
messages the consumer poll is not fetching the message. The thread dump
reveals:
"main" #1 prio=5 os_prio=0 tid=0x7f3ba4008800 nid=0x798 runnable
[0x7f3baa6c3000]
java.lang.Thread.State: RUNNABLE
at
>
> - -Matthias
>
> On 10/24/16 9:19 AM, Mohit Anchlia wrote:
> > Would this be an issue if I connect to a remote Kafka instance
> > running on the Linux box? Or is this a client issue. What's rockdb
> > used for to keep state?
> >
> > On Mon,
>
> Kafka 0.10.1.0, which was released last week, does contain the fix
> already. The fix will be in CP 3.1 coming up soon!
>
> (sorry that I did mix up versions in a previous email)
>
> - -Matthias
>
> On 10/23/16 12:10 PM, Mohit Anchlia wrote:
> > So if I get it
sion is 3.0.1
> CP 3.1 should be released in the next weeks
>
> So CP 3.2 should be there in about 4 months (Kafka follows a time-based
> release cycle of 4 months and CP usually aligns with Kafka releases)
>
> - -Matthias
>
>
> On 10/20/16 5:10 PM, Mohit Anchlia wrote:
> >
you should use the examples master branch because of API changes from
> 0.10.0.x to 0.10.1 (and thus, changing CP-3.1 to 0.10.1.0 will not be
> compatible and not compile, while changing CP-3.2-SNAPSHOT to 0.10.1.0
> should work -- hopefully ;) )
>
>
> - -Matthias
>
> On 10/20/16
There will be a 0.10.1 examples branch, after CP-3.1 is
> released
>
>
> - -Matthias
>
> On 10/20/16 3:48 PM, Mohit Anchlia wrote:
> > I just now cloned this repo. It seems to be using 10.1
> >
> > https://github.com/confluentinc/examples and running examples in
> >
> On Thursday, October 20, 2016, Mohit Anchlia <mohitanch...@gmail.com>
> wrote:
>
> > I am trying to run the examples from git. While running the wordcount
> > example I see this error:
> >
> > Caused by: *java.lang.RuntimeException*: librocksdbjni
I am trying to run the examples from git. While running the wordcount
example I see this error:
Caused by: *java.lang.RuntimeException*: librocksdbjni-win64.dll was not
found inside JAR.
Am I expected to include this jar locally?
In the 0.9 release it's not clear whether security features such as LDAP
authentication and authorization are available. If authN and authZ are
available, can somebody point me to relevant documentation that shows how to
configure Kafka to enable authN and authZ?
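As far as I know, 0.9 shipped SSL and SASL/Kerberos authentication plus a pluggable Authorizer with ACLs (administered via bin/kafka-acls.sh); LDAP is not supported out of the box. A hedged sketch of the client-side settings, with placeholder paths:

```java
import java.util.Properties;

// Sketch of the 0.9 client security settings. Kafka 0.9 added SSL and
// SASL (Kerberos/GSSAPI) for authentication, and a pluggable Authorizer
// (broker-side authorizer.class.name) for authorization; there is no
// built-in LDAP support. Paths below are placeholders.
public class SecureClientProps {
    public static Properties build() {
        Properties props = new Properties();
        // Authentication: SASL over SSL (SASL_PLAINTEXT also exists).
        props.put("security.protocol", "SASL_SSL");
        props.put("ssl.truststore.location", "/path/to/truststore.jks"); // placeholder
        // Authorization is configured on the broker and administered with
        // bin/kafka-acls.sh, not through client properties.
        return props;
    }

    public static void main(String[] args) {
        System.out.println(build().getProperty("security.protocol")); // SASL_SSL
    }
}
```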
Are there any command line or UI tools available to monitor Kafka?
I am using latest stable release of Kafka and trying to post a message.
However I see this error:
Client:
Exception in thread "main" *kafka.common.FailedToSendMessageException*:
Failed to send messages after 3 tries.
at kafka.producer.async.DefaultEventHandler.handle(
On the server side this is what I see:
[2015-11-20 14:45:31,849] INFO Closing socket connection to /177.40.23.2.
(kafka.network.Processor)
On Fri, Nov 20, 2015 at 11:51 AM, Mohit Anchlia <mohitanch...@gmail.com>
wrote:
> I am using latest stable release of Kafka and trying to post
Is there a tentative date for the release of 0.9.0? I tried looking at the
Jira tickets, however there is no mention of a tentative date when 0.9.0 is
going to be released.
Is there a tentative release date for Kafka 0.9.0?
ethod.
> You don't have to handle errors that way, we are just showing that you can.
>
> On Thu, Oct 22, 2015 at 8:34 PM, Mohit Anchlia <mohitanch...@gmail.com>
> wrote:
> > It's in this link. Most of the examples have some kind of error handling
> >
> > http://pe
handling?
>
> Guozhang
>
> On Thu, Oct 22, 2015 at 5:43 PM, Mohit Anchlia <mohitanch...@gmail.com>
> wrote:
>
> > The examples in the javadoc seem to imply that developers need to manage
> > all of the aspects around failures. Those examples are for rewindi
It looks like the new consumer API expects developers to manage the
failures? Or is there some other API that can abstract the failures,
primarily:
1) Automatically resend failed messages caused by a network issue or some
other issue between the broker and the consumer
2) Ability to acknowledge
ava consumer.
> 3) copycat framework for ingress / egress of Kafka.
>
> Guozhang
>
> On Tue, Oct 20, 2015 at 4:32 PM, Mohit Anchlia <mohitanch...@gmail.com>
> wrote:
>
> > Thanks. Are there any other major changes in .9 release other than the
> > Consumer
never mind, I found the documentation
On Wed, Oct 21, 2015 at 9:50 AM, Mohit Anchlia <mohitanch...@gmail.com>
wrote:
> Thanks. Where can I find new Java consumer API documentation with
> examples?
>
> On Tue, Oct 20, 2015 at 6:37 PM, Guozhang Wang <wang
I read through the documentation however when I try to access Java API
through the link posted on the design page I get "no page found"
http://people.apache.org/~nehanarkhede/kafka-0.9-consumer-javadoc/doc/kafka/clients/consumer/KafkaConsumer.html
On Wed, Oct 21, 2015 at 9:59 AM, Moh
Is there a wiki page where I can find all the major design changes in 0.9.0?
On Mon, Oct 19, 2015 at 4:24 PM, Guozhang Wang <wangg...@gmail.com> wrote:
> It is not released yet, we are shooting for Nov. for 0.9.0.
>
> Guozhang
>
> On Mon, Oct 19, 2015 at 4:08 PM, Moh
ease date, it is not
> complete yet.
>
> Guozhang
>
> On Tue, Oct 20, 2015 at 3:18 PM, Mohit Anchlia <mohitanch...@gmail.com>
> wrote:
>
> > Is there a wiki page where I can find all the major design changes in
> > 0.9.0?
> >
> > On Mon, Oct 19,
te:
> Hi Mohit,
>
> Are you referring to the new Java consumer or the old consumer? Or more
> specifically what examples doc are you referring to?
>
> Guozhang
>
> On Mon, Oct 19, 2015 at 10:01 AM, Mohit Anchlia <mohitanch...@gmail.com>
> wrote:
>
> > I see mo
here in case you are interested in trying it out:
>
> https://cwiki.apache.org/confluence/display/KAFKA/Consumer+Client+Re-Design
>
> Guozhang
>
> On Mon, Oct 19, 2015 at 2:54 PM, Mohit Anchlia <mohitanch...@gmail.com>
> wrote:
>
> > By old consumer you mean ve
I see most of the consumer examples create a while/for loop and then fetch
messages iteratively. Is that the only way by which clients can consume
messages? If this is the preferred way, then how do you deal with failures and
exceptions such that messages are not lost?
Also, please point me to
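The at-least-once contract that the consumer API leaves to applications (reprocess from the last committed offset after a failure) can be sketched without a broker; everything below is illustrative stand-in code, not the Kafka API:

```java
import java.util.Arrays;
import java.util.List;

// Minimal stdlib sketch of the at-least-once pattern: records are re-read
// from the last *committed* offset, so a crash between processing and commit
// causes redelivery, not loss. A plain List stands in for a partition; all
// names are illustrative.
public class AtLeastOnceSketch {
    static int committedOffset = 0;           // what commitSync() would persist

    // Reads from the committed offset, processes, then commits. A failure
    // before commit means the same record is handled again on the next pass.
    static int drain(List<String> partition, boolean crashBeforeCommit) {
        int processed = 0;
        for (int offset = committedOffset; offset < partition.size(); offset++) {
            processed++;                              // "process" the record
            if (crashBeforeCommit) return processed;  // crash: commit never runs
            committedOffset = offset + 1;             // commit after processing
        }
        return processed;
    }

    public static void main(String[] args) {
        List<String> partition = Arrays.asList("a", "b", "c");
        int first = drain(partition, true);   // crashes after the first record
        int second = drain(partition, false); // redelivers "a", then finishes
        System.out.println(first + " " + second); // 1 3: "a" processed twice
    }
}
```

This is why the loop-and-poll shape is idiomatic: commit only after successful processing and the worst case after a failure is a duplicate, never a lost message.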
I am seeing following exception, don't understand the issue here. Is there
a way to resolve this error?
client consumer logs:
Exception in thread "main" kafka.common.ConsumerRebalanceFailedException:
groupB_ip-10-38-19-230-1414174925481-97fa3f2a can't rebalance after 4
retries
at
, Oct 22, 2014 at 11:41 AM, Mohit Anchlia mohitanch...@gmail.com
wrote:
I can't find this property in server.properties file. Is that the right
place to set this parameter?
On Tue, Oct 21, 2014 at 6:27 PM, Jun Rao jun...@gmail.com wrote:
Could you also set replica.fetch.wait.max.ms
, Mohit Anchlia mohitanch...@gmail.com
wrote:
I set the property to 1 in the consumer code that is passed to
createJavaConsumerConnector
code, but it didn't seem to help
props.put("fetch.wait.max.ms", fetchMaxWait);
On Tue, Oct 21, 2014 at 1:21 PM, Guozhang Wang wangg...@gmail.com
Narkhede neha.narkh...@gmail.com
wrote:
Can you give more information about the performance test? Which test? Which
queue? How did you measure the dequeue latency.
On Mon, Oct 20, 2014 at 5:09 PM, Mohit Anchlia mohitanch...@gmail.com
wrote:
I am running a performance test and from what I
be the cause.
-Jay
On Tue, Oct 21, 2014 at 10:50 AM, Mohit Anchlia mohitanch...@gmail.com
wrote:
It's consistently close to 100ms which makes me believe that there are
some
settings that I might have to tweak, however, I am not sure how to
confirm
that assumption :)
On Tue, Oct 21, 2014 at 8
if that fixes the problem).
The reason I suspect this problem is because the default timeout in the
java consumer is 100ms.
-Jay
On Tue, Oct 21, 2014 at 11:06 AM, Mohit Anchlia mohitanch...@gmail.com
wrote:
This is the version I am using: kafka_2.10-0.8.1.1
I think this is fairly
On Tue, Oct 21, 2014 at 11:39 AM, Mohit Anchlia mohitanch...@gmail.com
wrote:
Is this a parameter I need to set it in kafka server or on the client
side?
Also, can you help point out which one exactly is consumer max wait time
from this list?
https://kafka.apache.org/08/configuration.html
0x9515bcb0 (a java.lang.Object)
at kafka.utils.Utils$.read(Utils.scala:375)
On Tue, Oct 21, 2014 at 2:15 PM, Mohit Anchlia mohitanch...@gmail.com
wrote:
I set the property to 1 in the consumer code that is passed to
createJavaConsumerConnector
code, but it didn't seem to help
props.put
I am running a performance test and from what I am seeing, messages are
taking about 100ms to pop from the queue itself, making the test slow. I am
looking for pointers on how I can troubleshoot this issue.
There seems to be plenty of CPU and IO available. I am running 22
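The ~100 ms matches the default of fetch.wait.max.ms in the 0.8 high-level consumer: the broker parks each fetch request for up to that long waiting for fetch.min.bytes of data to accumulate. A sketch of the consumer-side (not server.properties) settings, with placeholder values:

```java
import java.util.Properties;

// Sketch of the 0.8 high-level consumer knobs behind the ~100 ms dequeue
// latency. These go in the consumer's Properties (the ones passed to
// createJavaConsumerConnector), not in server.properties. Lowering the
// wait trades extra broker round-trips for latency. Values illustrative.
public class LowLatencyConsumerProps {
    public static Properties build() {
        Properties props = new Properties();
        props.put("zookeeper.connect", "localhost:2181"); // placeholder
        props.put("group.id", "perf-test");               // placeholder
        // Default is 100 ms, which is exactly the observed dequeue latency.
        props.put("fetch.wait.max.ms", "10");
        // Return as soon as any data exists instead of waiting for a batch.
        props.put("fetch.min.bytes", "1");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(build().getProperty("fetch.wait.max.ms")); // 10
    }
}
```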
Is Kafka supposed to throw an exception if the topic doesn't exist? It appears
that there is no exception thrown even though no messages are delivered and
there are errors logged in Kafka logs.
if needed.
On Fri, Oct 17, 2014 at 2:57 PM, Mohit Anchlia mohitanch...@gmail.com
wrote:
Thanks! How can I tell if I am using async producer? I thought all the
sends are async in nature
On Fri, Oct 17, 2014 at 11:44 AM, Gwen Shapira gshap...@cloudera.com
wrote:
If you have
.
—
Sent from Mailbox
On Fri, Oct 17, 2014 at 3:15 PM, Mohit Anchlia mohitanch...@gmail.com
wrote:
Little confused :) From one of the examples I am using property
request.required.acks=0,
I thought this sets the producer to be async?
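For what it's worth, these are two independent knobs in the 0.8 producer: producer.type controls sync vs. async batching, while request.required.acks only controls how many broker acknowledgements a send waits for. A sketch with placeholder values:

```java
import java.util.Properties;

// Sketch separating the two 0.8 producer settings conflated above:
// producer.type picks sync vs async (a background thread batches sends),
// while request.required.acks sets the acknowledgement level
// (0 = none, 1 = leader, -1 = all in-sync replicas). Values illustrative.
public class OldProducerProps {
    public static Properties build() {
        Properties props = new Properties();
        props.put("metadata.broker.list", "localhost:9092"); // placeholder
        // This, not the acks setting, is what makes sends asynchronous.
        props.put("producer.type", "async");
        // Ack level is orthogonal: acks=0 just means fire-and-forget.
        props.put("request.required.acks", "1");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(build().getProperty("producer.type")); // async
    }
}
```

So request.required.acks=0 does not make the producer async; it only means the producer never learns whether a send succeeded.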
On Fri, Oct 17, 2014 at 11:59 AM, Gwen Shapira
I added the following dependency in my pom file, however after I add the
dependency I get errors:
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka_2.10</artifactId>
  <version>0.8.1.1</version>
</dependency>
Errors:
ArtifactTransferException: Failure to transfer
Could somebody help throw some light on why my commands might be hanging?
What's the easiest way to monitor and debug this problem?
On Mon, Oct 13, 2014 at 5:07 PM, Mohit Anchlia mohitanch...@gmail.com
wrote:
I am new to Kafka and I just installed Kafka. I am getting the following
error
until there is data available.
hope that helps.
-Harsha
On Fri, Oct 10, 2014, at 12:32 PM, Mohit Anchlia wrote:
I am new to Kafka and have very little familiarity with Scala. I see that the
build requires sbt tool, but do I also need to install Scala
separately?
Is there a detailed
I am new to Kafka and I just installed Kafka. I am getting the following
error. Zookeeper seems to be running.
[ec2-user@ip-10-231-154-117 kafka_2.10-0.8.1.1]$
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
, 2014 at 5:29 PM, Jun Rao jun...@gmail.com wrote:
Is that error transient or persistent?
Thanks,
Jun
On Mon, Oct 13, 2014 at 5:07 PM, Mohit Anchlia mohitanch...@gmail.com
wrote:
I am new to Kafka and I just installed Kafka. I am getting the following
error. Zookeeper seems to be running
I am new to Kafka and have very little familiarity with Scala. I see that the
build requires sbt tool, but do I also need to install Scala separately?
Is there a detailed documentation on software requirements on the broker
machine.
I am also looking for 3 different types of Java examples: 1) Follow