Hi,
It’s better to create a script that deletes the Kafka log folder where the
topic data lives and then creates the topic again if needed.
BR
Eduardo Costa Alfaia
Ph.D. Student in Telecommunications Engineering
Università degli Studi di Brescia
Tel: +39 3209333018
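A minimal sketch of such a script, assuming Kafka's bundled kafka-topics.sh at a hypothetical install path and a local ZooKeeper (topic deletion also requires delete.topic.enable=true on the brokers; all paths and addresses below are placeholders):

```python
import subprocess

KAFKA_BIN = "/opt/kafka/bin/kafka-topics.sh"  # assumed install path
ZOOKEEPER = "localhost:2181"                  # assumed ZooKeeper address

def delete_cmd(topic):
    # Needs delete.topic.enable=true on the brokers.
    return [KAFKA_BIN, "--zookeeper", ZOOKEEPER, "--delete", "--topic", topic]

def create_cmd(topic, partitions=1, replication=1):
    return [KAFKA_BIN, "--zookeeper", ZOOKEEPER, "--create", "--topic", topic,
            "--partitions", str(partitions),
            "--replication-factor", str(replication)]

def recreate(topic, partitions=1, replication=1, run=subprocess.check_call):
    # Delete the topic, then create it again; `run` is injectable so the
    # command construction can be checked without a live cluster.
    run(delete_cmd(topic))
    run(create_cmd(topic, partitions, replication))
```

For example, `recreate("test", partitions=3)` would issue the delete and then the create against the cluster.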
On 5/11/16, 09:48, "Sneh
Hi Guys,
How could I solve this problem?
% Failed to produce message: Local: Queue full
% Failed to produce message: Local: Queue full
Thanks
--
Informativa sulla Privacy: http://www.unibs.it/node/8155
Hi Magnus
I think this is the answer:
c) producing messages at a higher rate than the network or broker can
handle
How could I manage this?
> On 26 Oct 2015, at 17:45, Magnus Edenhill wrote:
>
> c) producing messages at a higher rate than the network or broker can
> handle
--
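For the “Queue full” case above: the producer’s local buffer fills when messages are produced faster than they can be shipped. With librdkafka the usual remedies are to call rd_kafka_poll() regularly so delivery reports free buffer slots, to retry the produce after a short wait, or to raise queue.buffering.max.messages. A plain-Python sketch of the retry-with-backoff pattern, with a bounded queue standing in for the producer buffer (no Kafka involved):

```python
import queue
import threading
import time

def produce_with_retry(q, msg, max_retries=50, backoff_s=0.01):
    # Mirrors the librdkafka pattern: on "queue full", wait briefly
    # (where you would call rd_kafka_poll()) and try again.
    for _ in range(max_retries):
        try:
            q.put_nowait(msg)
            return True
        except queue.Full:
            time.sleep(backoff_s)  # let the background sender drain the queue
    return False

def demo(n_messages=200, queue_size=10):
    q = queue.Queue(maxsize=queue_size)  # stands in for the local producer queue
    delivered = []

    def drain():
        # Stands in for the network/broker consuming queued messages.
        while len(delivered) < n_messages:
            delivered.append(q.get())

    t = threading.Thread(target=drain, daemon=True)
    t.start()
    sent = sum(produce_with_retry(q, i) for i in range(n_messages))
    t.join(timeout=10)
    return sent, len(delivered)
```

With the backoff in place, all 200 messages eventually get through even though the local queue only holds 10 at a time.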
Hi Guys,
I have some doubts about Kafka. The first is: why do some applications
prefer to connect to ZooKeeper instead of the brokers? Connecting to ZooKeeper
could create overhead, because we are inserting another element between producer
and consumer. Another question is about the
Hi Guys,
I would like to put the Kafka parameter into the KafkaWordCount Scala code: val
kafkaParams = Map("fetch.message.max.bytes" -> "400"). I’ve put this
variable like this:
val KafkaDStreams = (1 to numStreams) map { _ =>
Hi All,
I am having an issue when using Kafka with librdkafka. I've changed
message.max.bytes to 2 MB in my server.properties config file; that is the size
of my message. When I run the command line ./rdkafka_performance -C -t test -p
0 -b computer49:9092, after consuming some messages the
fetch.message.max.bytes=400 to your
command line.
Regards,
Magnus
2015-01-19 17:52 GMT+01:00 Eduardo Costa Alfaia e.costaalf...@unibs.it:
Hi All,
I am having an issue when using kafka with librdkafka. I've changed the
message.max.bytes to 2MB in my server.properties config file, that is the
size
Hi Guys
Could anyone explain this information to me?
208K), 0.0086120 secs] [Times: user=0.06 sys=0.00, real=0.01 secs]
2014-11-06T12:20:55.673+0100: 1256.382: [GC2014-11-06T12:20:55.674+0100:
1256.382: [ParNew: 551115K->2816K(613440K), 0.0204130 secs]
560218K->13933K(4126208K), 0.0205130 secs]
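For reference, a ParNew line reads before->after(capacity) for the young generation, then before->after(capacity) for the whole heap, then the stop-the-world pause. A small parser, as a sketch, for lines in that shape:

```python
import re

# Young gen before->after(capacity), its pause, then whole-heap
# before->after(capacity) and the total stop-the-world pause.
GC_RE = re.compile(
    r"\[ParNew: (\d+)K->(\d+)K\((\d+)K\), [\d.]+ secs\]\s*"
    r"(\d+)K->(\d+)K\((\d+)K\), ([\d.]+) secs\]"
)

def parse_parnew(line):
    """Parse one ParNew GC log line into sizes (KB) and pause (seconds)."""
    m = GC_RE.search(line)
    if m is None:
        return None
    yb, ya, yc, hb, ha, hc, secs = m.groups()
    return {
        "young_before_k": int(yb),
        "young_after_k": int(ya),
        "young_capacity_k": int(yc),
        "heap_before_k": int(hb),
        "heap_after_k": int(ha),
        "heap_capacity_k": int(hc),
        "pause_secs": float(secs),
    }
```

Applied to the line quoted above, it shows the young generation shrinking from ~551 MB to ~2.8 MB in a ~20 ms pause, which is normal minor-GC behavior.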
Hi Guys,
How could I use the Consumer and Producer configs in my Kafka environment?
Thanks
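For context: config/producer.properties and config/consumer.properties ship with Kafka and are read by the console producer/consumer tools; the broker itself only reads server.properties. A minimal sketch of the two files using 0.8-era settings, with the hostnames from this thread as illustrative values:

```properties
# config/producer.properties (illustrative values)
metadata.broker.list=computer49:9092
producer.type=sync
compression.codec=none

# config/consumer.properties (illustrative values)
zookeeper.connect=computer49:2181
group.id=test-consumer-group
fetch.message.max.bytes=2097152
```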
Hi Guys,
I am doing some tests with Spark Streaming and Kafka, but I have seen something
strange. I have modified the JavaKafkaWordCount to use reduceByKeyAndWindow and
to print on the screen the accumulated counts of the words. In the beginning
Spark works very well; in each iteration the
On Thu, Nov 6, 2014 at 9:32 AM, Eduardo Costa Alfaia e.costaalf...@unibs.it
wrote:
Hi Guys,
I am doing some tests with Spark Streaming and Kafka, but I have seen
something strange, I have modified the JavaKafkaWordCount to use
ReducebyKeyandWindow and to print in the screen
in Spark should grow up. - The Spark
word-count example doesn't accumulate:
it gets an RDD every n seconds and counts the words in that RDD, so we
don't expect the count to go up.
On Mon, Nov 3, 2014 at 6:57 AM, Eduardo Costa Alfaia
e.costaalf...@unibs.it
wrote:
Hi Guys,
Anyone could
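To actually accumulate counts across batches, Spark Streaming offers stateful operators such as updateStateByKey. The idea, sketched in plain Python without Spark (each batch's word counts folded into a running state), is:

```python
from collections import Counter

def update_state(running, batch_words):
    # Analogue of updateStateByKey: merge this batch's counts
    # into the running state instead of starting from zero.
    running.update(Counter(batch_words))
    return running

def run_batches(batches):
    # Feed each micro-batch through the state update in order.
    state = Counter()
    for batch in batches:
        state = update_state(state, batch)
    return dict(state)
```

With two batches ["a", "b", "a"] and ["b", "c"], the running totals keep growing across batches instead of resetting, which is the behavior the question expected from reduceByKeyAndWindow.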
Hi Dudes,
I would like to know whether the producer and consumer properties files in the
config folder should be configured. I have configured only server.properties;
is that enough? I am doing some performance tests, for example network
throughput. My scenario is:
As producer I
Hi Guys,
Could anyone explain to me how to make Kafka work with Spark? I am using
JavaKafkaWordCount.java as a test and the command line is:
./run-example org.apache.spark.streaming.examples.JavaKafkaWordCount
spark://192.168.0.13:7077 computer49:2181 test-consumer-group unibs.it 3
and as a
Hi Guys,
Is there a way to clean a Kafka queue after the consumer has consumed
the messages?
Thanks
the log
more info on that config here
https://kafka.apache.org/08/configuration.html
If you want to delete a message after the consumer has processed it,
there is no API for that.
-Harsha
On Tue, Oct 21, 2014, at 08:00 AM, Eduardo Costa Alfaia wrote:
Hi Guys,
Is there a manner of cleaning
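On cleaning: Kafka discards messages by time or size retention on the log, not per consumed message. The broker-side knobs (in server.properties; values here are illustrative) look like:

```properties
# server.properties retention settings (illustrative values)
# Log segments older than this are eligible for deletion:
log.retention.hours=1
# ...or once a partition's log exceeds this many bytes (1 GB here):
log.retention.bytes=1073741824
# Segment size; only whole closed segments are deleted:
log.segment.bytes=536870912
```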