How can I do the same thing in the .avro format?
On Friday, October 19, 2018, Jacob Sheck wrote:
> Can you use a union with null? This would be the IDL example.
>
> record RecordConfig {
> ...
> }
>
> union {null, RecordConfig} record = null;
>
> On Fri, Oct 19,
Hi,
I am adding an Avro schema to the Kafka messages; however, I would like to
know how I can make a field of "type": "record" optional.
*Note*: "default": null does not help.
Any idea? Can you elaborate on the solution/workaround with an example, please?
Best regards,
Mina
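For reference, a minimal sketch of how the same union can be written in a
.avsc (JSON) schema; the outer record, the field name and the inner fields
are illustrative. Note that "null" has to be the first branch of the union
for "default": null to be accepted:

{
  "type": "record",
  "name": "OuterRecord",
  "fields": [
    {
      "name": "config",
      "type": [
        "null",
        {
          "type": "record",
          "name": "RecordConfig",
          "fields": [
            { "name": "id", "type": "string" }
          ]
        }
      ],
      "default": null
    }
  ]
}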
> AFAIK, Spark does not pass this config to the consumer on purpose...
> It's not a Kafka issue -- IIRC, there is a Spark JIRA ticket for this.
>
> -Matthias
>
> On 2/12/18 11:04 AM, Mina Aslani wrote:
> > Hi,
> >
> > I am getting below error
> > Caused b
Hi,
I am getting the below error:
Caused by: org.apache.kafka.clients.consumer.OffsetOutOfRangeException:
Offsets out of range with no configured reset policy for partitions:
{topic1-0=304337}
as soon as I submit a Spark app to my cluster.
I am using the below dependency:
name: 'spark-streaming-kafka-0-10_
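For context, a minimal sketch (class name, group id and address are
illustrative) of the consumer settings typically handed to the
spark-streaming-kafka-0-10 direct stream via ConsumerStrategies.Subscribe;
the reset policy at the end is the config that, per the reply above, Spark
does not simply pass through to the consumers on the executors:

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.util.HashMap;
import java.util.Map;

// Illustrative consumer settings for a spark-streaming-kafka-0-10 direct stream.
public final class ConsumerParams {
    public static Map<String, Object> kafkaParams() {
        Map<String, Object> params = new HashMap<>();
        params.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");        // illustrative address
        params.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        params.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        params.put(ConsumerConfig.GROUP_ID_CONFIG, "my-group");                        // illustrative group id
        // The reset policy the error message says is "not configured"; Spark may
        // deliberately override this on the executors, as noted above.
        params.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        return params;
    }
}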
> In Kafka Connect, the equivalent of a
> serializer/deserializer (serdes) is called a “converter”.
> Check which converter you have configured for your source connector and if
> it is overriding whatever the default converter is configured for the
> connect worker it is running in.
>
> -hans
>
>
>
>
>
Hi,
I would like to add that I use kafka-connect and schema-registry version `
3.2.1-6`.
Best regards,
Mina
On Fri, Jun 2, 2017 at 10:59 AM, Mina Aslani wrote:
> Hi.
>
> Is there any way that I can get the data into a Kafka topic in JSON format?
> The source that I ingest the data f
Hi.
Is there any way that I can get the data into a Kafka topic in JSON format?
The source that I ingest the data from has the data in JSON format;
however, when I look at the data in the Kafka topic, schema and payload
fields are added and the data is not in JSON format.
I want to avoid implementing a tra
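The extra "schema" and "payload" fields are what the JSON converter adds when
schemas are enabled. A minimal sketch of the relevant worker/connector
settings for getting plain JSON instead; the property names are the standard
Kafka Connect ones, the choice of values is illustrative:

key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false

These can be set on the Connect worker or overridden per connector; whichever
converter the connector ends up with determines what the data in the topic
looks like.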
.g. you can use
> kubernetes, mesos, Yarn, or anything else)
>
> On 4/27/17, 10:52 AM, "Mina Aslani" wrote:
>
> Hi,
>
> I created a kafka stream app and as I was informed I created a docker
> image
> with the app and launched it as a container
Hi,
I created a Kafka Streams app and, as I was informed, I created a docker image
with the app and launched it as a container. However, I have a couple of
questions:
- Would every Kafka streaming job require a new docker image and deployment
of the container/service?
- How should I structure things d
running, in my container should I run Java
> -cp ... same as
> > https://github.com/confluentinc/examples/blob/3.2.x/kafka-streams/src/main/java/io/confluent/examples/streams/WordCountLambdaExample.java#L55-L62?
>
> Yes.
>
>
> -Michael
>
>
>
> On Th
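For completeness, a sketch of what that launch command might look like inside
the container, assuming the app was packaged as a standalone ("fat") jar; the
jar path is illustrative:

java -cp target/kafka-streams-examples-standalone.jar \
  io.confluent.examples.streams.WordCountLambdaExample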
https://github.com/confluentinc/examples/blob/3.2.x/kafka-streams/src/main/java/io/confluent/examples/streams/WordCountLambdaExample.java#L55-L62?
Regards,
Mina
On Tue, Mar 21, 2017 at 4:49 PM, Mina Aslani wrote:
> Hi Michael,
>
> Thank you very much for the prompt response, really appreciate it!
es of your app.
>
> Also, what do you mean by "in a cluster of Kafka containers" and "in the
> cluster of Kafkas"?
>
> On Tue, Mar 21, 2017 at 9:08 PM, Mina Aslani wrote:
>
> > Hi,
> >
> > I am trying to understand how I can use a kafka stream app(
Hi,
I am trying to understand how I can use a Kafka Streams app (jar file) in a
cluster of Kafka containers.
Kafka does not have a master/slave concept (unlike Spark), so how should I run
my app in the cluster of Kafkas (e.g. on one or multiple docker-machine/s)?
I use the below command line when having on
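Since Kafka Streams has no master/slave roles, a common pattern is simply to
start more containers running the same app with the same application.id; the
instances then split the input topic's partitions among themselves. A minimal
sketch, with the class name, topic names and broker address illustrative:

import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStreamBuilder;

import java.util.Properties;

public class MyStreamsApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Every instance started with this application.id joins the same group
        // and takes over a share of the input topic's partitions.
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-streams-app");
        // Broker address reachable from inside the container (illustrative).
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka:29092");

        KStreamBuilder builder = new KStreamBuilder();
        // Trivial pass-through topology; the real app would do its processing here.
        builder.stream("input-topic").to("output-topic");

        KafkaStreams streams = new KafkaStreams(builder, props);
        streams.start();
    }
}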
Hi,
I get ERROR Error when sending message to topic my-topic with key: null,
value: ... bytes with error:
(org.apache.kafka.clients.producer.internals.ErrorLoggingCallback)
org.apache.kafka.common.errors.TimeoutException: Expiring 11 record(s) for
my-topic-0: 1732 ms has passed since last append
> > http://docs.confluent.io/3.2.0/streams/quickstart.html#goal-of-this-quickstart and in
> > docker-machine ran /usr/bin/kafka-run-class
> > org.apache.kafka.streams.examples.wordcount.WordCountDemo.
> >
> > How come running same program out of docker-machine does not output t
o was
> working correctly. You were simply unaware that the WordCount example does
> not write its output to the console.
>
> Best,
> Michael
>
>
>
>
>
> On Wed, Mar 15, 2017 at 6:14 AM, Mina Aslani wrote:
>
> > Hi,
> > I just checked st
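In other words, the counts do land in the output topic; they just have to be
read from there, for example with the console consumer. The topic name and
port below are illustrative, and the Long deserializer matches the count
values:

kafka-console-consumer --bootstrap-server localhost:29092 \
  --topic wordcount-output --from-beginning \
  --property print.key=true \
  --property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer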
How come running the same program out of docker-machine does not output to the
output topic?
Should I package the program as a jar, deploy it to docker-machine and run it
using ./bin/kafka-run-class?
Best regards,
Mina
On Tue, Mar 14, 2017 at 11:11 PM, Mina Aslani wrote:
> I even tried h
On Tue, Mar 14, 2017 at 9:56 PM, Mina Aslani wrote:
> And the port for kafka is 29092 and for zookeeper 32181.
>
> On Tue, Mar 14, 2017 at 9:06 PM, Mina Aslani wrote:
>
>> Hi,
>>
>> I forgot to add in my previous email 2 questions.
>>
>> To setup my env, shall I
env?
How can I check "whether streams (that is just an app) can reach Kafka"?
Regards,
Mina
On Tue, Mar 14, 2017 at 9:00 PM, Mina Aslani wrote:
> Hi Eno,
>
> Sorry! That is a typo!
>
> I have a docker-machine with different containers (setup as directed @
> http:
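One quick way to check reachability independently of the Streams app is to
point the stock command-line tools at the same addresses and ports (the host
below is a placeholder; 32181 and 29092 are the ZooKeeper and Kafka ports
mentioned in this thread):

kafka-topics --list --zookeeper <docker-machine-ip>:32181
kafka-console-producer --broker-list <docker-machine-ip>:29092 --topic test-topic

If these work from wherever the Streams app runs but the app itself cannot
connect, the problem is likely the bootstrap.servers value the app is using.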
And the port for kafka is 29092 and for zookeeper 32181.
On Tue, Mar 14, 2017 at 9:06 PM, Mina Aslani wrote:
> Hi,
>
> I forgot to add in my previous email 2 questions.
>
> To set up my env, shall I use https://raw.githubusercontent.com/confluentinc/cp-docker-images/mas
ything to do with streams, but rather with the Kafka
> configuration and whether streams (that is just an app) can reach Kafka at
> all. If you provide the above information we can look further.
>
>
>
> Thanks
> Eno
>
> > On 14 Mar 2017, at 18:42, Mina Aslani wrote:
Is there any book or document that provides information on how to use Kafka Streams?
On Tue, Mar 14, 2017 at 2:42 PM, Mina Aslani wrote:
> I reset and still not working!
>
> My env is setup using http://docs.confluent.io/3.2.0/cp-docker-images/docs/quickstart.html
>
> I just
> https://github.com/confluentinc/examples/blob/3.2.x/kafka-streams/src/main/java/io/confluent/examples/streams/WordCountLambdaExample.java#L178-L181
>
>
> Hope this helps.
>
> -Matthias
>
>
> On 3/13/17 7:30 PM, Mina Aslani wrote:
> > Hi Matthias,
> >
> > Thank you for the qui
Hi,
I am using the below code to read from a topic, count the words and write to
another topic. The example is the one on GitHub.
My Kafka container is in the VM. I do not get any error, but I do not see
any result/output in my output topic (WordCount-output) either. The program
does not stop either!
Tables buffer internally, and thus, you might
> only see data on commit.
>
> Try to reduce commit interval or disable caching by setting
> "cache.max.bytes.buffering" to zero in your StreamsConfig.
>
>
> -Matthias
>
> On 3/13/17 12:29 PM, Mina Aslani wrote:
>
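A sketch of the two settings mentioned above as they would be added where the
app builds its StreamsConfig properties; the commit interval value is
illustrative, and zero disables the record cache entirely:

// Where the app builds its StreamsConfig properties:
Properties props = new Properties();
// Disable record caching so KTable updates are forwarded immediately.
props.put(StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG, 0);
// Alternatively (or additionally), commit more often, e.g. every second.
props.put(StreamsConfig.COMMIT_INTERVAL_MS_CONFIG, 1000);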
Hi,
This is the first time that I am using Kafka Streams. I would like to read
from an input topic and write to an output topic. However, I do not see the word
count when I try to run the below example. It looks like it does not connect
to Kafka. I do not see any error though. I tried my localhost Kafka as w
Hi,
I am new to Kafka/Kafka Connect. I would like to use a Kafka Connect
transformer to get specific fields from my data in a Kafka topic.
I was not able to find information/examples/documents about how to use
Kafka Connect transformers.
I would really appreciate it if I could get some info on that!
Best rega
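Field selection like this is what Kafka Connect's Single Message Transforms
(available from Kafka 0.10.2) are for. A minimal sketch of connector
properties that keep only certain fields of the value; the transform alias
and field names are illustrative:

transforms=keepFields
transforms.keepFields.type=org.apache.kafka.connect.transforms.ReplaceField$Value
transforms.keepFields.whitelist=field1,field2

ReplaceField keeps or drops whole fields; ExtractField$Value can instead pull
a single field out and use it as the new value.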
Hi,
I would like to subscribe to user mailing list.
Best regards,
Mina