Re: KSQL to search for data in Kafka Topics

2020-05-19 Thread Christopher Smith
I'm not sure what you mean by "not using stream topology". What does that mean to you that you'd rather avoid? You can indeed use KSQL to define streams & tables that process data from a number of topics. However, I think you may have the misimpression that KSQL is designed so you can

KSQL to search for data in Kafka Topics

2020-05-19 Thread M. Manna
Hello, I am quite new to KSQL, so apologies for misunderstanding its concepts. I have a list of topics that I want to search data for. I am not using stream processing, just plain topics whose data is retained for 14 days. All I want to do is search for data in an SQL-like way as long as it's within
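KSQL can indeed search plain topics in an SQL-like way, once a stream is declared over them. A minimal sketch, assuming a hypothetical `orders` topic with JSON values and a KSQL server on localhost (topic name, columns, server URL, and the CLI invocation are illustrative, and exact query syntax varies by KSQL/ksqlDB version):

```shell
# Sketch only: topic name, columns, and server URL are assumptions.
ksql http://localhost:8088 <<'EOF'
-- Read from the beginning of the topic, i.e. the full 14-day retention window
SET 'auto.offset.reset' = 'earliest';

-- Declare a stream over the existing topic; this only attaches a schema,
-- it does not copy or reprocess the topic's data
CREATE STREAM orders_raw (order_id VARCHAR, amount DOUBLE)
  WITH (KAFKA_TOPIC='orders', VALUE_FORMAT='JSON');

-- Ad-hoc search; on older KSQL releases, drop the EMIT CHANGES clause
SELECT * FROM orders_raw WHERE order_id = 'A-1001' EMIT CHANGES LIMIT 10;
EOF
```

Declaring the stream deploys no new topology by itself; it just lets queries filter the retained data.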

Re: Re: Is it possible to send avro serialized data to kafka using kafka-console-producer.sh

2020-05-08 Thread wangl...@geekplus.com.cn
The kafka-avro-console-producer is only in Confluent Kafka, but I am using Apache Kafka. It seems the Apache Kafka kafka-console-producer is not able to send Avro serialized data to Kafka, but kafka-console-consumer can read Avro serialized data. I have tried

Re: Is it possible to send avro serialized data to kafka using kafka-console-producer.sh

2020-05-08 Thread Miguel Silvestre
;,"type":"string"}]}' https://docs.confluent.io/3.0.0/quickstart.html -- Miguel Silvestre On Fri, May 8, 2020 at 10:53 AM wangl...@geekplus.com.cn < wangl...@geekplus.com.cn> wrote: > > I can consume avro serialized data from kafka like this: > > b

Is it possible to send avro serialized data to kafka using kafka-console-producer.sh

2020-05-08 Thread wangl...@geekplus.com.cn
I can consume Avro serialized data from Kafka like this: bin/kafka-console-consumer.sh --bootstrap-server xxx:9092 --topic xxx --property print.key=true --formatter io.confluent.kafka.formatter.AvroMessageFormatter --property schema.registry.url=http://xxx:8088 Is it possible to send Avro
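As the replies above note, the Avro-aware producer ships with the Confluent distribution rather than with Apache Kafka. A hedged sketch of how `kafka-avro-console-producer` is typically invoked (broker address, topic, and schema are placeholders):

```shell
# Requires the Confluent platform; not part of the Apache Kafka tarball.
bin/kafka-avro-console-producer \
  --broker-list localhost:9092 --topic xxx \
  --property schema.registry.url=http://localhost:8081 \
  --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}'
# Then type one JSON record per line matching the schema, e.g. {"f1": "hello"}
```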

Re: How to write data from kafka to CSV file on a Mac

2020-02-26 Thread Richard Rossel
Amin > wrote: > > > > Hello, > > I'm new to kafka and I'd like to write data from kafka to a CSV file in a > > Mac. Please, advise. > > Thank You & Kindest Regards,Doaa. > > > > -- > Richard Rossel > Atlanta - GA > -- Richard Rossel Atlanta - GA

Re: How to write data from kafka to CSV file on a Mac

2020-02-26 Thread Doaa K. Amin
; I'm new to kafka and I'd like to write data from kafka to a CSV file in a > Mac. Please, advise. > Thank You & Kindest Regards,Doaa. -- Richard Rossel Atlanta - GA

Re: How to write data from kafka to CSV file on a Mac

2020-02-25 Thread Richard Rossel
Hello, > I'm new to kafka and I'd like to write data from kafka to a CSV file in a > Mac. Please, advise. > Thank You & Kindest Regards,Doaa. -- Richard Rossel Atlanta - GA

How to write data from kafka to CSV file on a Mac

2020-02-25 Thread Doaa K. Amin
Hello, I'm new to Kafka and I'd like to write data from Kafka to a CSV file on a Mac. Please advise. Thank You & Kindest Regards, Doaa.
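One low-effort sketch, assuming the records are plain text and a broker is reachable locally (topic, broker, and file names are placeholders): replay the topic with the stock console consumer and redirect the output to a file.

```shell
# Key and value are printed separated by a comma, giving a crude two-column CSV.
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic mytopic --from-beginning --timeout-ms 10000 \
  --property print.key=true --property key.separator=, \
  > output.csv
```

Values that themselves contain commas or newlines would need proper CSV escaping, which this sketch does not attempt.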

Re: Question on performance data for Kafka vs NATS

2019-03-22 Thread Adam Bellemare
One more thing to note: You are looking at regular, base NATS. On its own, it is not a direct 1-1 comparison to Kafka because it lacks things like data retention, clustering and replication. Instead, you would want to compare it to NATS-Streaming, (

Re: Question on performance data for Kafka vs NATS

2019-03-21 Thread Hans Jespersen
That's a 4.5-year-old benchmark, and it was run with a single broker node and only 1 producer and 1 consumer, all running on a single MacBook Pro. Definitely not the target production environment for Kafka. -hans > On Mar 21, 2019, at 11:43 AM, M. Manna wrote: > > HI All, > >

Question on performance data for Kafka vs NATS

2019-03-21 Thread M. Manna
Hi All, https://nats.io/about/ shows a general comparison of sender/receiver throughputs for NATS and other messaging systems, including our favourite Kafka. It appears that Kafka, despite taking 2nd place, has a very low throughput. My question is: where does Kafka win over NATS? Is it

Re: Magic byte error when trying to consume Avro data with Kafka Connect

2018-12-07 Thread Robin Moffatt
ned -- Robin Moffatt | Developer Advocate | ro...@confluent.io | @rmoff On Thu, 6 Dec 2018 at 18:30, Marcos Juarez wrote: > We're trying to use Kafka Connect to pull down data from Kafka, but we're > having issues with the Avro deserialization. > > When we attempt to consume data

Re: Magic byte error when trying to consume Avro data with Kafka Connect

2018-12-06 Thread Patrick Plaatje
fka Connect to pull down data from Kafka, but we're > having issues with the Avro deserialization. > > When we attempt to consume data using the kafka-avro-console-consumer, we > can consume it, and deserialize it correctly. Our command is similar to > the following: > > *./kaf

Magic byte error when trying to consume Avro data with Kafka Connect

2018-12-06 Thread Marcos Juarez
We're trying to use Kafka Connect to pull down data from Kafka, but we're having issues with the Avro deserialization. When we attempt to consume data using the kafka-avro-console-consumer, we can consume it, and deserialize it correctly. Our command is similar to the following: *./kafka-avro

Re: Reliable way to purge data from Kafka topics

2018-05-25 Thread Shantanu Deshmukh
e > > > behing. If it is centralized in one place, in could be better to no use > > > mirror maker and have duplication of the consumer. > > > > > > So something looking more like a star schema, let me try some ascii > art : > > > > > > Main DC :

Re: Reliable way to purge data from Kafka topics

2018-05-25 Thread Vincent Maurin
n one place, in could be better to no use > > mirror maker and have duplication of the consumer. > > > > So something looking more like a star schema, let me try some ascii art : > > > > Main DC :Data storage/processing DC : > > Producer --

Re: Reliable way to purge data from Kafka topics

2018-05-25 Thread Shantanu Deshmukh
something looking more like a star schema, let me try some ascii art : > > Main DC :Data storage/processing DC : > Producer --> Kafka |Consumer > Data storage > | /-> > Backup DC : |

Re: Reliable way to purge data from Kafka topics

2018-05-25 Thread Vincent Maurin
some ascii art : Main DC :Data storage/processing DC : Producer --> Kafka |Consumer > Data storage | /-> Backup DC : | / Producer --> Kafka |Consumer / If you have an outage on the main, th

Re: Reliable way to purge data from Kafka topics

2018-05-25 Thread Jörn Franke
Purging will never guarantee that data does not get replicated. There will always be some case (an error while purging, etc.) where it is still replicated. You may reduce the probability, but it will never be impossible. Your application should be able to handle duplicated messages. > On 25. May

Reliable way to purge data from Kafka topics

2018-05-25 Thread Shantanu Deshmukh
Hello, We have cross-data-center replication. Using Kafka MirrorMaker we are replicating data from our primary cluster to a backup cluster. The problem arises when we start operating from the backup cluster, in case of a drill or an actual outage. Data gathered at the backup cluster needs to be reverse-replicated

Re: AW: Exception stops data processing (Kafka Streams)

2018-05-16 Thread Matthias J. Sax
> -----Original Message----- > From: Matthias J. Sax <matth...@confluent.io> > Sent: Tuesday, May 15, 2018 22:58 > To: users@kafka.apache.org > Subject: Re: Exception stops data processing (Kafka Streams) > > Claudia, > > A leader change is a

AW: Exception stops data processing (Kafka Streams)

2018-05-16 Thread Claudia Wegmann
'max.in.flight.requests.per.connection' to 1 to still guarantee ordering, right? Best, Claudia -----Original Message----- From: Matthias J. Sax <matth...@confluent.io> Sent: Tuesday, May 15, 2018 22:58 To: users@kafka.apache.org Subject: Re: Exception stops data processing (Kafka Streams) Claudia, I

Re: Exception stops data processing (Kafka Streams)

2018-05-15 Thread Matthias J. Sax
Claudia, a leader change is a retryable error. What is your producer config for `retries`? You might want to increase it so that the producer does not throw the exception immediately but retries a couple of times -- you might also want to adjust `retry.backoff.ms`, which sets the time to wait until
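A sketch of the producer-level settings discussed in this thread, written out as a properties file a Streams application could load (the file name and values are illustrative, not recommendations):

```shell
cat > streams-producer.properties <<'EOF'
# Retry transient errors such as a leader change instead of failing immediately
retries=10
# Time to wait between retries
retry.backoff.ms=500
# Keep ordering guarantees while retrying, as noted elsewhere in the thread
max.in.flight.requests.per.connection=1
EOF
```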

Exception stops data processing (Kafka Streams)

2018-05-15 Thread Claudia Wegmann
Hey there, I've got a few Kafka Streams services which run smoothly most of the time. Sometimes, however, some of them get an exception "Abort sending since an error caught with a previous record" (see below for a full example). The Streams service hitting this exception just stops its work

Loading and streaming data from Kafka to BigQuery

2017-07-06 Thread Ofir Sharony
Hi guys, I would like to recommend the following post, discussing and comparing techniques for loading data from Kafka to BigQuery. https://medium.com/myheritage-engineering/kafka-to-bigquery-load-a-guide-for-streaming-billions-of-daily-events-cbbf31f4b737 Feedback is welcome. *Ofir Sharony

Re: Deleting/Purging data from Kafka topics (Kafka 0.10)

2017-06-23 Thread karan alang
nks. > --Vahid > > > > > From: karan alang <karan.al...@gmail.com> > To: users@kafka.apache.org > Date: 06/22/2017 11:14 PM > Subject:Re: Deleting/Purging data from Kafka topics (Kafka 0.10) > > > > Hi Vahid, > here is th

Re: Deleting/Purging data from Kafka topics (Kafka 0.10)

2017-06-23 Thread Vahid S Hashemian
`. I hope this answers your question. Thanks. --Vahid From: karan alang <karan.al...@gmail.com> To: users@kafka.apache.org Date: 06/22/2017 11:14 PM Subject:Re: Deleting/Purging data from Kafka topics (Kafka 0.10) Hi Vahid, here is the output of the GetOffsetShell co

Re: Deleting/Purging data from Kafka topics (Kafka 0.10)

2017-06-23 Thread karan alang
est:0:105 > - with `--time -2` you should get test:0:105 > > Could you please advise whether you're seeing a different behavior? > > Thanks. > --Vahid > > > > > From: "Vahid S Hashemian" <vahidhashem...@us.ibm.com> > To: users@kafka.apache

Re: Deleting/Purging data from Kafka topics (Kafka 0.10)

2017-06-22 Thread Vahid S Hashemian
whether you're seeing a different behavior? Thanks. --Vahid From: "Vahid S Hashemian" <vahidhashem...@us.ibm.com> To: users@kafka.apache.org Date: 06/22/2017 06:43 PM Subject:Re: Deleting/Purging data from Kafka topics (Kafka 0.10) Hi Karan, I think the issue is

Re: Deleting/Purging data from Kafka topics (Kafka 0.10)

2017-06-22 Thread Vahid S Hashemian
com> To: users@kafka.apache.org Date: 06/22/2017 06:09 PM Subject:Re: Deleting/Purging data from Kafka topics (Kafka 0.10) Hi Vahid, somehow, the changes suggested don't seem to be taking effect, and i dont see the data being purged from the topic. Here are the steps i followe

Re: Deleting/Purging data from Kafka topics (Kafka 0.10)

2017-06-22 Thread karan alang
r example if this broker config > value is much higher, then the broker doesn't delete old logs regular > enough. > > --Vahid > > > > From: karan alang <karan.al...@gmail.com> > To: users@kafka.apache.org > Date: 06/22/2017 12:27 PM > Subject:Delet

Re: Deleting/Purging data from Kafka topics (Kafka 0.10)

2017-06-22 Thread Vahid S Hashemian
<karan.al...@gmail.com> To: users@kafka.apache.org Date: 06/22/2017 12:27 PM Subject:Deleting/Purging data from Kafka topics (Kafka 0.10) Hi All - How do i go about deleting data from Kafka Topics ? I've Kafka 0.10 installed. I tried setting the parameter of the topi

Deleting/Purging data from Kafka topics (Kafka 0.10)

2017-06-22 Thread karan alang
Hi All - How do I go about deleting data from Kafka topics? I've got Kafka 0.10 installed. I tried setting the parameter of the topic as shown below -> $KAFKA10_HOME/bin/kafka-topics.sh --zookeeper localhost:2161 --alter --topic mmtopic6 --config retention.ms=1000 I was expecting to have the d
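A sketch of the purge sequence this thread converges on, using `kafka-configs.sh` (the ZooKeeper address and topic name are taken from the post above; how quickly the data actually disappears depends on the broker's `log.retention.check.interval.ms`, as noted in the replies):

```shell
# Temporarily shrink retention so old segments become eligible for deletion
$KAFKA10_HOME/bin/kafka-configs.sh --zookeeper localhost:2161 --alter \
  --entity-type topics --entity-name mmtopic6 --add-config retention.ms=1000

# Wait for the broker's log cleanup thread to run, verify the data is gone,
# then restore the original retention
$KAFKA10_HOME/bin/kafka-configs.sh --zookeeper localhost:2161 --alter \
  --entity-type topics --entity-name mmtopic6 --delete-config retention.ms
```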

Re: Data in kafka topic in Json format

2017-06-02 Thread Hans Jespersen
hans >> >> >> >> >>> On Jun 2, 2017, at 8:12 AM, Mina Aslani <aslanim...@gmail.com> wrote: >>> >>> Hi, >>> >>> I would like to add that I use kafka-connect and schema-registery >> version ` >>> 3.2.1-6

Re: Data in kafka topic in Json format

2017-06-02 Thread Mina Aslani
> > > On Fri, Jun 2, 2017 at 10:59 AM, Mina Aslani <aslanim...@gmail.com> > wrote: > > > >> Hi. > >> > >> Is there any way that I get the data into a Kafka topic in Json format? > >> The source that I ingest the data from have th

Re: Data in kafka topic in Json format

2017-06-02 Thread Hans Jespersen
9 AM, Mina Aslani <aslanim...@gmail.com> wrote: > >> Hi. >> >> Is there any way that I get the data into a Kafka topic in Json format? >> The source that I ingest the data from have the data in Json format, >> however when I look that data in the kafka topic, s

Re: Data in kafka topic in Json format

2017-06-02 Thread Hans Jespersen
. -hans > On Jun 2, 2017, at 7:59 AM, Mina Aslani <aslanim...@gmail.com> wrote: > > Hi. > > Is there any way that I get the data into a Kafka topic in Json format? > The source that I ingest the data from have the data in Json format, > however when I look that data

Re: Data in kafka topic in Json format

2017-06-02 Thread Mina Aslani
Hi, I would like to add that I use kafka-connect and schema-registry version `3.2.1-6`. Best regards, Mina On Fri, Jun 2, 2017 at 10:59 AM, Mina Aslani <aslanim...@gmail.com> wrote: > Hi. > > Is there any way that I get the data into a Kafka topic in Json format? > The so

Data in kafka topic in Json format

2017-06-02 Thread Mina Aslani
Hi. Is there any way that I can get the data into a Kafka topic in JSON format? The source that I ingest the data from has the data in JSON format; however, when I look at the data in the Kafka topic, schema and payload fields are added and the data is not in JSON format. I want to avoid implementing
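The extra `schema` and `payload` fields are characteristic of Kafka Connect's `JsonConverter` running with schemas enabled. A sketch of the worker settings that switch the envelope off (the file name is illustrative):

```shell
cat > connect-worker-json.properties <<'EOF'
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# Emit plain JSON instead of the {"schema": ..., "payload": ...} envelope
key.converter.schemas.enable=false
value.converter.schemas.enable=false
EOF
```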

RE: Android app produces data in Kafka

2017-06-01 Thread Tauzell, Dave
: Mireia [mailto:mireya.miguel.alva...@alumnos.upm.es] Sent: Thursday, June 1, 2017 2:51 PM To: users@kafka.apache.org Subject: Android app produces data in Kafka Hi. I am going to do a project at the University in order to finish my master in IoT. I need to know if it is possible to connect an Android

Re: Fast way search data in kafka

2017-03-23 Thread Milind Vaidya
ar < > > an...@systeminsights.com> > > > wrote: > > > > > > > Try Presto https://prestodb.io. It may solve your problem. > > > > > > > > On Sat, 4 Mar 2017, 03:18 Milind Vaidya, <kava...@gmail.com> wrote: > > >

Re: Fast way search data in kafka

2017-03-23 Thread Marko Bonaći
> > > > Try Presto https://prestodb.io. It may solve your problem. > > > > > > On Sat, 4 Mar 2017, 03:18 Milind Vaidya, <kava...@gmail.com> wrote: > > > > > > > I have 6 broker kafka setup. > > > > > > > > I have retent

Re: Fast way search data in kafka

2017-03-23 Thread Milind Vaidya
Mar 4, 2017 at 9:48 AM, Anish Mashankar <an...@systeminsights.com> > wrote: > > > Try Presto https://prestodb.io. It may solve your problem. > > > > On Sat, 4 Mar 2017, 03:18 Milind Vaidya, <kava...@gmail.com> wrote: > > > > > I have 6 broker kafka setup. > > &g

Re: Writing data from kafka-streams to remote database

2017-03-06 Thread Michael Noll
I'd use option 2 (Kafka Connect). Advantages of #2: - The code is decoupled from the processing code and easier to refactor in the future. (same as #4) - The runtime/uptime/scalability of your Kafka Streams app (processing) is decoupled from the runtime/uptime/scalability of the data ingestion

Re: Writing data from kafka-streams to remote database

2017-03-05 Thread Shimi Kiviti
Thanks Eno, yes, I am aware of that. It indeed looks like a very useful feature. The result of the processing in Kafka Streams is only a small amount of data that is required by our service. Currently it makes more sense for us to update the remote database where we have more data that our

Re: Writing data from kafka-streams to remote database

2017-03-05 Thread Eno Thereska
Hi Shimi, Could you tell us more about your scenario? Kafka Streams uses embedded databases (RocksDB) to store its state, so often you don't need to write anything to an external database and you can query your streams state directly from Streams. Have a look at this blog if that matches your

Writing data from kafka-streams to remote database

2017-03-05 Thread Shimi Kiviti
Hi Everyone, I was wondering about writing data to remote database. I see 4 possible options: 1. Read from a topic and write to the database. 2. Use kafka connect 3. Write from anywhere in kafka streams. 4. Register a CachedStateStore FlushListener that will send a batch of
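A sketch of option 2, a Kafka Connect JDBC sink pushing the result topic into a remote database. The connector class is from Confluent's JDBC connector, which must be installed on the worker; the topic name, connection URL, and credentials are all placeholders:

```shell
cat > jdbc-sink.json <<'EOF'
{
  "name": "results-jdbc-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "stream-results",
    "connection.url": "jdbc:postgresql://db-host:5432/app",
    "connection.user": "app",
    "connection.password": "secret",
    "insert.mode": "upsert",
    "pk.mode": "record_key",
    "auto.create": "true"
  }
}
EOF
# Submit to a running Connect worker (assumed at localhost:8083):
# curl -X POST -H 'Content-Type: application/json' \
#      --data @jdbc-sink.json http://localhost:8083/connectors
```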

Re: Fast way search data in kafka

2017-03-04 Thread Guozhang Wang
ry Presto https://prestodb.io. It may solve your problem. > > On Sat, 4 Mar 2017, 03:18 Milind Vaidya, <kava...@gmail.com> wrote: > > > I have 6 broker kafka setup. > > > > I have retention period of 48 hrs. > > > > To debug if certain data has rea

Re: Fast way search data in kafka

2017-03-04 Thread Anish Mashankar
Try Presto https://prestodb.io. It may solve your problem. On Sat, 4 Mar 2017, 03:18 Milind Vaidya, <kava...@gmail.com> wrote: > I have 6 broker kafka setup. > > I have retention period of 48 hrs. > > To debug if certain data has reached kafka or not I am using co
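For completeness, the brute-force approach the original post implies is to replay the retained window and filter client-side (topic, broker, and search string are placeholders; over 48 hours of data this can be very slow, which is why an indexed store like Presto is suggested above):

```shell
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic events --from-beginning --timeout-ms 60000 \
  | grep -F 'needle-record-id'
```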

Updating data in a Kafka topic based on changes in data sources.

2017-02-27 Thread VIVEK KUMAR MISHRA 13BIT0066
Hi All, Is it possible to update kafka topic data based on changes in data sources using python? <https://www.quora.com/unanswered/Is-it-possible-to-update-kafka-topic-data-based-on-changes-in-data-sources-using-python>

sending mailchimp data to kafka cluster using producer api

2017-02-11 Thread VIVEK KUMAR MISHRA 13BIT0066
Hello sir, I want to send Mailchimp data to a Kafka broker (topic) using the producer API. Could you please help me?

Re: How to connect Modbus, DNP or IEC61850 data to Kafka

2016-12-03 Thread hans
protocols for example Modbus, DNP or IEC61850 and next to Storm > processing system. > I'm wondering how can I get these data via Kafka and I don't know whether > that's supported or not. > > Any suggestion and hint are warmly welcomed! > > Regards, > Long Tian > >

How to connect Modbus, DNP or IEC61850 data to Kafka

2016-12-03 Thread Wang LongTian
Dear all gurus, I'm new to Kafka and I'm going to connect the real-time data streaming from power system supervision and control devices to Kafka via different communication protocols, for example Modbus, DNP or IEC61850, and next to a Storm processing system. I'm wondering how I can get these data

Re: Accumulating data in Kafka Connect source tasks

2016-01-29 Thread Randall Hauch
On January 28, 2016 at 7:07:02 PM, Ewen Cheslack-Postava (e...@confluent.io) wrote: Randall, Great question. Ideally you wouldn't need this type of state since it should really be available in the source system. In your case, it might actually make sense to be able to grab that information

Re: Accumulating data in Kafka Connect source tasks

2016-01-29 Thread Ewen Cheslack-Postava
On Fri, Jan 29, 2016 at 7:06 AM, Randall Hauch wrote: > On January 28, 2016 at 7:07:02 PM, Ewen Cheslack-Postava ( > e...@confluent.io) wrote: > > Randall, > > Great question. Ideally you wouldn't need this type of state since it > should really be available in the source

Re: Accumulating data in Kafka Connect source tasks

2016-01-29 Thread James Cheng
> On Jan 29, 2016, at 7:06 AM, Randall Hauch wrote: > > On January 28, 2016 at 7:07:02 PM, Ewen Cheslack-Postava (e...@confluent.io) > wrote: > Randall, > > Great question. Ideally you wouldn't need this type of state since it > should really be available in the source system.

Re: Accumulating data in Kafka Connect source tasks

2016-01-28 Thread James Cheng
> On Jan 28, 2016, at 5:06 PM, Ewen Cheslack-Postava wrote: > > Randall, > > Great question. Ideally you wouldn't need this type of state since it > should really be available in the source system. In your case, it might > actually make sense to be able to grab that

Re: Accumulating data in Kafka Connect source tasks

2016-01-28 Thread Ewen Cheslack-Postava
Randall, Great question. Ideally you wouldn't need this type of state since it should really be available in the source system. In your case, it might actually make sense to be able to grab that information from the DB itself, although that will also have issues if, for example, there have been

Re: Accumulating data in Kafka Connect source tasks

2016-01-28 Thread Randall Hauch
Rather than leave this thread so open ended, perhaps I can narrow down to what I think is the best approach. These accumulations are really just additional information from the source that don’t get written to the normal topics. Instead, each change to the accumulated state can be emitted as

Accumulating data in Kafka Connect source tasks

2016-01-27 Thread Randall Hauch
I’m creating a custom Kafka Connect source connector, and I’m running into a situation for which Kafka Connect doesn’t seem to provide a solution out of the box. I thought I’d first post to the users list in case I’m just missing a feature that’s already there. My connector’s SourceTask

Error while sending data to kafka producer

2015-12-09 Thread Ritesh Sinha
Hi, I am trying to send a message to Kafka via the producer using encryption and authentication. After creating the key and everything successfully, while passing the value through the console I am getting this error: ERROR Error when sending message to topic test with key: null, value: 2 bytes with error:

Re: Error while sending data to kafka producer

2015-12-09 Thread Ben Stopford
what is your server config? > On 9 Dec 2015, at 18:21, Ritesh Sinha > wrote: > > Hi, > > I am trying to send message to kafka producer using encryption and > authentication.After creating the key and everything successfully.While > passing the value through

Re: Error while sending data to kafka producer

2015-12-09 Thread Ritesh Sinha
# Licensed to the Apache Software Foundation (ASF) under one or more # contributor license agreements. See the NOTICE file distributed with # this work for additional information regarding copyright ownership. # The ASF licenses this file to You under the Apache License, Version 2.0 # (the

Re: Error while sending data to kafka producer

2015-12-09 Thread Ben Stopford
Hi Ritesh, your config on both sides looks fine. There may be something wrong with your truststore, although you should see exceptions in either the client or server log files if that is the case. As you appear to be running locally, try creating the JKS files using the shell script included

Re: Error while sending data to kafka producer

2015-12-09 Thread Ritesh Sinha
After editing my server.properties I didn't start Kafka again. That was causing the issue. Silly mistake. Thanks a lot Ben for your replies. On Thu, Dec 10, 2015 at 2:27 AM, Ben Stopford wrote: > Hi Ritesh > > Your config on both sides looks fine. There may be something wrong
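For reference, a minimal sketch of the two sides of the SSL setup discussed in this thread (paths and passwords are placeholders; and, as the resolution above shows, the broker must be restarted after `server.properties` changes take effect):

```shell
# Broker side (fragment to merge into server.properties)
cat > server-ssl-snippet.properties <<'EOF'
listeners=SSL://localhost:9093
ssl.keystore.location=/var/private/ssl/kafka.server.keystore.jks
ssl.keystore.password=changeit
ssl.key.password=changeit
ssl.truststore.location=/var/private/ssl/kafka.server.truststore.jks
ssl.truststore.password=changeit
EOF

# Client side (fragment passed to the console producer via --producer.config)
cat > producer-ssl-snippet.properties <<'EOF'
security.protocol=SSL
ssl.truststore.location=/var/private/ssl/kafka.client.truststore.jks
ssl.truststore.password=changeit
EOF
```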

Fwd: Spark Streaming + kakfa (Check that you get the data from kafka producer)

2015-06-19 Thread Raghav Joshi
lines.foreachRDD(new Function<JavaRDD<String>, Void>() { @Override public Void call(JavaRDD<String> rdd) throws Exception { List<String> collect = rdd.collect(); for (String data : collect) { try { // save data in the log.txt file Path filePath = Paths.get(rdd save file); if (!Files.exists(filePath)) {

framework to load streamed data from kafka into relational database

2015-03-10 Thread Vadim Keylis
Good evening. Can someone suggest an existing framework that allows one to reliably load data from Kafka into a relational database like Oracle in real time? Thanks so much in advance, Vadim

Re: framework to load streamed data from kafka into relational database

2015-03-10 Thread Joe Stein
wrote: Good evening. Can someone suggest existing framework that allows to reliably load data from kafka into relation database like Oracle in real time? Thanks so much in advance, Vadim

Fwd: Help: KafkaSpout not getting data from Kafka

2014-12-19 Thread Banias H
Hi folks, I am new to both Kafka and Storm and I have a problem getting KafkaSpout to fetch data from Kafka in our three-node environment with Kafka 0.8.1.1 and Storm 0.9.3. What is working: - I have a Kafka producer (a Java application) to generate random strings to a topic and I was able to run

Re: How to Ingest data into kafka

2014-12-10 Thread nitin sharma
Hi Kishore, You can use the Kafka Producer API for this. You can find sample code on the Kafka Quick Start page: http://kafka.apache.org/07/quickstart.html Regards, Nitin Kumar Sharma. On Wed, Dec 10, 2014 at 2:14 AM, kishore kumar akishore...@gmail.com wrote: Hi kafkars, I want to write a

How to Ingest data into kafka

2014-12-09 Thread kishore kumar
Hi kafkars, I want to write Java code to ingest CSV files into Kafka. Any help? Thanks, Kishore.
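Besides the Producer API route suggested in the reply, a no-code sketch using the stock console producer also works for simple cases (file, topic, and broker names are placeholders):

```shell
# tail -n +2 skips the CSV header row; each remaining row becomes one message
tail -n +2 input.csv | bin/kafka-console-producer.sh \
  --broker-list localhost:9092 --topic csv-ingest
```

The Producer API is preferable when you need keys, partitioning control, or delivery guarantees.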

Re: Storing data in kafka keys

2014-11-06 Thread Jun Rao
will over time garbage collect all messages with the same key, except for the most recent message. Thanks, Jun On Wed, Nov 5, 2014 at 10:15 AM, Ivan Balashov ibalas...@gmail.com wrote: Hi, It looks like it is a general practice to avoid storing data in kafka keys. Some examples

Storing data in kafka keys

2014-11-05 Thread Ivan Balashov
Hi, It looks like it is a general practice to avoid storing data in Kafka keys. Some examples of this: Camus and Secor both do not use keys. Even such a swiss-army tool as kafkacat doesn't seem to have the ability to display keys (although I might be wrong). Also, the console producer does not display
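For what it's worth, the stock console tools can handle keys on reasonably recent Kafka versions (a hedged sketch; flag names vary across releases, and topic/broker names are placeholders):

```shell
# Produce "key<TAB>value" lines as keyed messages
printf 'user42\t{"action":"login"}\n' | bin/kafka-console-producer.sh \
  --broker-list localhost:9092 --topic keyed-events \
  --property parse.key=true --property key.separator=$'\t'

# Replay the topic with keys displayed
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic keyed-events --from-beginning \
  --property print.key=true
```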

Re: Data into Kafka

2014-07-10 Thread Joe Stein
/display/KAFKA/Consumer+Group+Example As far as data going in you are going to find the most breadth of difference because a (the) goal is to get all of your data into Kafka and creating a unified log http://engineering.linkedin.com/distributed-systems/log-what-every-software-engineer-should-know-about

Data into Kafka

2014-07-09 Thread HQ Li
Dear experts, I'm new to Kafka and am doing some study around overall real-time data integration architecture. What are the common ways of pushing data into Kafka? Does anyone use an ESB or other tools to feed various message streams into Kafka in real time / an event-driven fashion? Thanks. -HQ

Re: Data into Kafka

2014-07-09 Thread Jun Rao
, I'm new to Kafka and am doing some study around overall real-time data integration architecture. What is the common ways of pushing data into Kafka? Does anyone use ESB or others to feed various message streams into Kafka in real-time / an event-drvien fashion? Thanks. -HQ

Re: Data into Kafka

2014-07-09 Thread Alex Li
-time data integration architecture. What is the common ways of pushing data into Kafka? Does anyone use ESB or others to feed various message streams into Kafka in real-time / an event-drvien fashion? Thanks. -HQ

Re: Help in processing huge data through Kafka-storm cluster

2014-06-19 Thread Shaikh Ahmed
side Storm is consuming with a max speed of 1100 messages per second. It means Storm is consuming messages 4 times slower than Kafka is producing. We are running these systems in production and I am a bit worried about data loss. Kafka is pushing 35 million in 2 hours and Storm is taking 7-8 hours

Re: Help in processing huge data through Kafka-storm cluster

2014-06-19 Thread hsy...@gmail.com
handle much more data than kafka boundary. :) Best, Siyuan On Thu, Jun 19, 2014 at 4:30 PM, Shaikh Ahmed rnsr.sha...@gmail.com wrote: Hi All, Thanks for your valuable comments. Sure, I will give a try with Samza and Data Torrent. Meanwhile, I sharing screenshot of Storm UI. Please have

Re: Help in processing huge data through Kafka-storm cluster

2014-06-17 Thread hsy...@gmail.com
are downloaded 28 Million of messages and Monthly it goes up to 800+ million. We want to process this amount of data through our kafka and storm cluster and would like to store in HBase cluster. We are targeting to process one month of data in one day. Is it possible? We have setup our cluster thinking

Re: Help in processing huge data through Kafka-storm cluster

2014-06-17 Thread Robert Rodgers
of data through our kafka and storm cluster and would like to store in HBase cluster. We are targeting to process one month of data in one day. Is it possible? We have setup our cluster thinking that we can process million of messages in one sec as mentioned on web. Unfortunately, we have

Re: Help in processing huge data through Kafka-storm cluster

2014-06-17 Thread Neha Narkhede
up to 800+ million. We want to process this amount of data through our kafka and storm cluster and would like to store in HBase cluster. We are targeting to process one month of data in one day. Is it possible? We have setup our cluster thinking that we can process million

Re: Help in processing huge data through Kafka-storm cluster

2014-06-15 Thread pushkar priyadarshi
wrote: Hi, Daily we are downloaded 28 Million of messages and Monthly it goes up to 800+ million. We want to process this amount of data through our kafka and storm cluster and would like to store in HBase cluster. We are targeting to process one month of data in one day. Is it possible

Re: Help in processing huge data through Kafka-storm cluster

2014-06-15 Thread pushkar priyadarshi
to 800+ million. We want to process this amount of data through our kafka and storm cluster and would like to store in HBase cluster. We are targeting to process one month of data in one day. Is it possible? We have setup our cluster thinking that we can process million of messages in one

Re: Help in processing huge data through Kafka-storm cluster

2014-06-15 Thread Robert Hodges
are downloaded 28 Million of messages and Monthly it goes up to 800+ million. We want to process this amount of data through our kafka and storm cluster and would like to store in HBase cluster. We are targeting to process one month of data in one day. Is it possible? We have setup our

Re: Help in processing huge data through Kafka-storm cluster

2014-06-15 Thread Robert Hodges
of messages and Monthly it goes up to 800+ million. We want to process this amount of data through our kafka and storm cluster and would like to store in HBase cluster. We are targeting to process one month of data in one day. Is it possible? We have setup our cluster thinking that we

Help in processing huge data through Kafka-storm cluster

2014-06-14 Thread Shaikh Ahmed
Hi, Daily we download 28 million messages, and monthly it goes up to 800+ million. We want to process this amount of data through our Kafka and Storm cluster and would like to store it in an HBase cluster. We are targeting to process one month of data in one day. Is it possible? We have

Data generator loses some data if Kafka is restarted

2013-12-10 Thread Nishant Kumar
Hi All, I am using kafka 0.8. My producer configurations are as follows: kafka8.bytearray.producer.type=sync kafka8.producer.batch.num.messages=100 kafka8.producer.topic.metadata.refresh.interval.ms=60 kafka8.producer.retry.backoff.ms=100

Re: Data generator loses some data if Kafka is restarted

2013-12-10 Thread Jun Rao
You will need to configure request.required.acks properly. See http://kafka.apache.org/documentation.html#producerconfigs for details. Thanks, Jun On Tue, Dec 10, 2013 at 1:55 AM, Nishant Kumar nish.a...@gmail.com wrote: Hi All, I am using kafka 0.8. My producers configurations are as
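A sketch of the settings Jun points at, in the same `kafka8.*`-style 0.8 producer-config register as the listing above (the values are illustrative):

```shell
cat > producer-acks-snippet.properties <<'EOF'
# Wait for the leader (1) or all in-sync replicas (-1) to acknowledge each
# send, so messages are not silently lost across a broker restart
request.required.acks=-1
# Retry sends that fail during the restart window
message.send.max.retries=5
retry.backoff.ms=100
EOF
```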

Re: Binary Data and Kafka

2013-05-08 Thread Jun Rao
No. Kafka broker stores the binary data as it is. The binary data may be compressed, if compression is enabled at the producer. Thanks, Jun On Wed, May 8, 2013 at 5:57 AM, Sybrandy, Casey casey.sybra...@six3systems.com wrote: All, Does the Kafka broker Base64 encode the messages? We are

RE: Binary Data and Kafka

2013-05-08 Thread Sybrandy, Casey
That's what I would have assumed. And no, we're not using compression. Thanks. From: Jun Rao [mailto:jun...@gmail.com] Sent: Wednesday, May 08, 2013 11:26 AM To: users@kafka.apache.org Cc: Sybrandy, Casey Subject: Re: Binary Data and Kafka No. Kafka broker stores the binary data