> SK_SER_2)) // does not work
>
> Margus (margusja) Roo
> http://margus.roo.ee
> skype: margusja
> https://www.facebook.com/allan.tuuring
> +372 51 48 780
On 15/09/2017 21:50, Margus Roo wrote:
Hi
I tested spark.streaming.receiver.maxRate and
spark.streaming.backpressure.enabled settings using socketStream and
it works.
But if I am using nifi-spark-receiver
(https://mvnrepository.com/artifact/org.apache.nifi/nifi-spark-receiver)
then it does not. Using a simple
but static solution I tried spark.streaming.receiver.maxRate.
I set it to spark.streaming.receiver.maxRate=1. As I understand it from
the Spark manual: "If the batch processing time is more than batch interval
then obviously the receiver's memory will start filling up and will en
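For concreteness, the two settings being tested above can be collected as the `--conf` pairs they would be handed to spark-submit as. A minimal sketch: the `maxRate=1` mirrors the experiment described in the message; everything else is an assumption, not something stated in the thread.

```python
# Illustrative sketch only: the rate-limit settings discussed above,
# rendered as the --conf key=value pairs spark-submit would receive.
conf = {
    # hard cap: max records/second accepted by EACH receiver
    "spark.streaming.receiver.maxRate": "1",
    # dynamic rate control (available since Spark 1.5): adapts the
    # ingestion rate to observed batch scheduling delays
    "spark.streaming.backpressure.enabled": "true",
}

args = " ".join(f"--conf {k}={v}" for k, v in sorted(conf.items()))
print(args)
```

Whether a given receiver honors these settings depends on it extending Spark's rate-limited receiver machinery, which is the crux of the NiFi receiver question above.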
Hi Sai, I am running in local mode and there is only one receiver. Verified
that.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Kinesis-Receiver-not-respecting-spark-streaming-receiver-maxRate-tp27754p27760.html
Sent from the Apache Spark User List
Hi Aravindh,
spark.streaming.receiver.maxRate is per receiver. You should multiply the
max rate by the number of receivers.
Regards,
Sai
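Sai's point can be made concrete with simple arithmetic. A sketch with hypothetical figures (none of these numbers come from the thread):

```python
def max_records_per_batch(max_rate_per_receiver, num_receivers, batch_interval_sec):
    """Upper bound on records per batch when spark.streaming.receiver.maxRate
    is applied independently to each receiver."""
    return max_rate_per_receiver * num_receivers * batch_interval_sec

# e.g. maxRate=100 with 3 receivers and 2-second batches
print(max_records_per_batch(100, 3, 2))  # 600
```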
On Mon, Sep 19, 2016 at 9:31 AM, Aravindh [via Apache Spark User List] <
ml-node+s1001560n27754...@n3.nabble.com> wrote:
> I am trying to throttle
>> >
>> > On May 10, 2016, at 8:02 AM, chandan prakash <chandanbaran...@gmail.com>
>> > wrote:
>> >
>> > Hi,
>> > I am using Spark Streaming with Direct kafka approach.
>> > Want to limit number of event records coming in my batches.
>> > Have question regarding following 2 parameters :
>> > 1. spark.streaming.receiver.maxRate
>> > 2. spark.streaming.kafka.maxRatePerPartition
>> >
>> > The documentation
>> > (http://spark.apache.org/docs/latest/streaming-programmin
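As a hedged summary of how the Spark documentation distinguishes these two knobs: spark.streaming.receiver.maxRate applies to receiver-based streams and has no effect on the direct approach, where spark.streaming.kafka.maxRatePerPartition caps each Kafka partition instead. A sketch of the resulting per-batch bound, with hypothetical figures:

```python
def direct_kafka_batch_cap(max_rate_per_partition, num_partitions, batch_interval_sec):
    """Per-batch record bound for the direct Kafka stream, where
    spark.streaming.kafka.maxRatePerPartition limits EACH partition."""
    return max_rate_per_partition * num_partitions * batch_interval_sec

# e.g. maxRatePerPartition=1000 on a 4-partition topic with 5-second batches
print(direct_kafka_batch_cap(1000, 4, 5))  # 20000
```

Note that the bound scales with partition count, so repartitioning the topic changes the effective batch-size limit.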
Hi,
I have set spark.streaming.receiver.maxRate to 100. My batch interval is
4sec but still sometimes there are more than 400 records per batch. I am using
spark 1.2.0.
Regards, Laeeq
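The 400-record expectation in this report follows from maxRate times the batch interval; a quick sketch of the arithmetic:

```python
max_rate = 100       # spark.streaming.receiver.maxRate, records/second
batch_interval = 4   # seconds

# With a single receiver, no batch should exceed this many records:
cap = max_rate * batch_interval
print(cap)  # 400 -- batches larger than this suggest the limit is not enforced
```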
This might be related:
SPARK-6985
Cheers
On Wed, Jul 1, 2015 at 10:27 AM, Laeeq Ahmed laeeqsp...@yahoo.com.invalid
wrote:
Hi,
I have set spark.streaming.receiver.maxRate to 100. My batch interval
is 4sec but still sometimes there are more than 400 records per batch. I am
using spark 1.2.0
/29051579/pausing-throttling-spark-spark-streaming-application
?
Cheers
On Wed, May 27, 2015 at 4:11 PM, dgoldenberg dgoldenberg...@gmail.com
wrote:
Hi,
With the no receivers approach to streaming from Kafka, is there a way to
set something like spark.streaming.receiver.maxRate so as not to overwhelm
the Spark consumers?
What would be some of the ways to throttle the streamed messages so that the
consumers don't run out of memory?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark
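The usual answer to this question, per the Spark configuration documentation rather than anything stated in the thread itself: with the no-receiver (direct) Kafka stream, spark.streaming.receiver.maxRate has no effect, and throttling is done per Kafka partition. A sketch with an illustrative value:

```python
# Sketch (values illustrative): throttling knobs for the direct Kafka stream.
conf = {
    # max records/second pulled from EACH Kafka partition
    "spark.streaming.kafka.maxRatePerPartition": "500",
    # or let Spark adapt the rate automatically (Spark 1.5+)
    "spark.streaming.backpressure.enabled": "true",
}
for key, value in sorted(conf.items()):
    print(f"{key}={value}")
```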