Re: Attempting to put a clean entry for key [...] into NamedCache [...] when it already contains a dirty entry for the same key

2016-12-03 Thread Eno Thereska
Hi Mathieu, What version of Kafka are you using? There was recently a fix that went into trunk, just checking if you're using an older version. (To make forward progress, you can turn the cache off, like this: streamsConfiguration.put(StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG, 0); ) Thanks
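
For context, a minimal sketch of a Streams configuration with the record cache turned off as suggested above; only the CACHE_MAX_BYTES_BUFFERING_CONFIG line comes from the reply, and the application id and broker address are placeholder assumptions.

    import java.util.Properties;
    import org.apache.kafka.streams.StreamsConfig;

    public class CacheOffConfig {
        public static Properties build() {
            Properties streamsConfiguration = new Properties();
            // Placeholder application id and broker address (assumptions, not from the thread).
            streamsConfiguration.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-streams-app");
            streamsConfiguration.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            // Disable the record cache entirely, as suggested in the reply above.
            streamsConfiguration.put(StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG, 0);
            return streamsConfiguration;
        }
    }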

Attempting to put a clean entry for key [...] into NamedCache [...] when it already contains a dirty entry for the same key

2016-12-03 Thread Mathieu Fenniak
Hey all, I've just been running a quick test of my kafka-streams application on the latest Kafka trunk (@e43bbce), and came across this error. I was wondering if anyone has seen this error before, has any thoughts on what might cause it, or can suggest a direction for investigating it further.

Re: Initializing StateStores takes *really* long for large datasets

2016-12-03 Thread williamtellme123
Unsubscribe Sent via the Samsung Galaxy S7, an AT&T 4G LTE smartphone Original message From: Guozhang Wang Date: 12/2/16 5:13 PM (GMT-06:00) To: users@kafka.apache.org Subject: Re: Initializing StateStores takes *really* long for large datasets Before we

Re: Kafka windowed table not aggregating correctly

2016-12-03 Thread williamtellme123
Unsubscribe Sent via the Samsung Galaxy S7, an AT&T 4G LTE smartphone Original message From: Guozhang Wang Date: 12/2/16 5:48 PM (GMT-06:00) To: users@kafka.apache.org Subject: Re: Kafka windowed table not aggregating correctly Sachin, One thing to note

How to collect Connect metrics

2016-12-03 Thread Will Du
Hi folks, How can I collect Kafka Connect metrics from Confluent? Is there any API to use? In addition, if one file is very big, can multiple tasks work on the same file simultaneously? Thanks, Will

Re: Suggestions

2016-12-03 Thread Martin Gainty
Vincenzo, nota bene: you can *force* a known-good version (3.4.8) to be resolved by all artifacts downstream of your parent pom.xml. The mechanism is called dependency management: https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#Dependency_Management Buona Fortuna

Re: How to connect Modbus, DNP or IEC61850 data to Kafka

2016-12-03 Thread hans
If the data volumes are low, or you just want a quick prototype as a proof of concept, you could use existing tools like node-red to connect the various input protocols with Kafka as an output protocol. For example, install from http://nodered.org, then install node-red-contrib-modbus, then install

Re: Suggestions

2016-12-03 Thread Vincenzo D'Amore
I found what's wrong, well... finally! The consumer application should load the received data into a Solr instance. Incidentally, my version of Solr is SolrCloud, and the SolrJ client uses ZooKeeper, a different version of ZooKeeper... Now I have specified in my pom.xml the same version of ZooKeeper, 3.4.8

How to connect Modbus, DNP or IEC61850 data to Kafka

2016-12-03 Thread Wang LongTian
Dear all gurus, I'm new to Kafka, and I'm going to connect the real-time data streaming from power system supervision and control devices to Kafka via different communication protocols, for example Modbus, DNP or IEC61850, and then on to a Storm processing system. I'm wondering how I can get these data
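
For the Kafka side of such a pipeline, a minimal sketch of a plain Java producer that forwards polled device readings to a topic; the topic name, broker address, and readDeviceValue() helper are hypothetical, and the actual Modbus/DNP3/IEC 61850 polling is out of scope here (tools like node-red, mentioned in the reply above, can cover that part).

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class DeviceReadingProducer {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                while (true) {
                    // readDeviceValue() is a hypothetical stand-in for whatever
                    // Modbus/DNP3/IEC 61850 client library polls the device.
                    String reading = readDeviceValue();
                    producer.send(new ProducerRecord<>("device-readings", "device-1", reading));
                    Thread.sleep(1000L); // poll once per second (illustrative)
                }
            }
        }

        private static String readDeviceValue() {
            return "42"; // placeholder measurement
        }
    }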

Re: Adding topics to KafkaStreams after ingestion has been started?

2016-12-03 Thread Ali Akhtar
I suppose the topic won't be deleted, but this would be a rare enough occurrence that there won't be too many dormant topics hanging around. Alternatively perhaps I can store the undeleted topics somewhere, and whenever a new node starts, it could check this list and delete them. On Sat, Dec 3,

Re: Adding topics to KafkaStreams after ingestion has been started?

2016-12-03 Thread Matthias J. Sax
Not sure. Would need to think about it more. However, the default commit interval in Streams is 30 sec. You can configure it via the StreamsConfig COMMIT_INTERVAL_MS setting. So using the additional thread and waiting for 5 minutes sounds ok. Question is, what would happen if the JVM goes down before you delete
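
For reference, a minimal sketch of overriding the commit interval mentioned above via StreamsConfig.COMMIT_INTERVAL_MS_CONFIG; the 5000 ms value is purely an illustrative assumption.

    import java.util.Properties;
    import org.apache.kafka.streams.StreamsConfig;

    public class CommitIntervalConfig {
        public static Properties build() {
            Properties props = new Properties();
            // Default commit interval is 30 seconds; 5000 ms here is illustrative only.
            props.put(StreamsConfig.COMMIT_INTERVAL_MS_CONFIG, 5000);
            return props;
        }
    }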

Re: Adding topics to KafkaStreams after ingestion has been started?

2016-12-03 Thread Ali Akhtar
Is there a way to make sure the offsets got committed? Perhaps, after the last msg has been consumed, I can set up a task to run after a safe time (say, 5 mins?) in another thread which would delete the topic? What would be a safe time to use? On Sat, Dec 3, 2016 at 3:04 PM, Matthias J. Sax
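
A minimal sketch of the "delete later from another thread" idea floated above, using a ScheduledExecutorService; the deleteTopic() helper is hypothetical (it would delegate to whatever admin tooling the cluster exposes), and the 5-minute delay is the value suggested in the thread.

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    public class DelayedTopicCleanup {
        private final ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();

        public void scheduleDeletion(String topic) {
            // Wait a "safe" 5 minutes after the last message before deleting,
            // giving the Streams app time to commit its offsets.
            scheduler.schedule(() -> deleteTopic(topic), 5, TimeUnit.MINUTES);
        }

        private void deleteTopic(String topic) {
            // Hypothetical helper: delegate to whatever admin tooling is available
            // (e.g. the kafka-topics.sh script or a cluster admin API).
            System.out.println("Deleting topic " + topic);
        }
    }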

Re: Adding topics to KafkaStreams after ingestion has been started?

2016-12-03 Thread Matthias J. Sax
I guess yes. You might only want to make sure the topic offsets got committed -- not sure if committing offsets of a deleted topic could cause issues (i.e., crashing your Streams app) -Matthias On 12/2/16 11:04 PM, Ali Akhtar wrote: > Thank you very much. Last q - Is it safe to do this from within a