Thanks Alonso,

Sorry, I can't share the code because of some security reservations.

But you can assume the receiver is equivalent to a JMS-based custom
receiver: we register a message listener, and each message delivered by
JMS is stored by calling the receiver's store method.


Something like :
https://github.com/tbfenet/spark-jms-receiver/blob/master/src/main/scala/org/apache/spark/streaming/jms/JmsReceiver.scala

The difference is that this JMS receiver uses a block generator and then
calls store, whereas I call store directly for each message as I receive
it, without a block generator.
Not sure if that is what makes the memory balloon up.
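To make the comparison concrete, below is a minimal sketch of a batching variant (not the original code -- the ConnectionFactory, queue name, batch size, and storage level are all assumptions). Instead of calling store(msg) once per message inside onMessage, the listener only enqueues, and a separate thread drains the queue and hands whole batches to Spark via store(iterator), roughly what BlockGenerator does internally:

```scala
import java.util.concurrent.{ArrayBlockingQueue, TimeUnit}

import javax.jms.{Connection, ConnectionFactory, Message, MessageListener, Session, TextMessage}

import scala.collection.JavaConverters._

import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// Hypothetical receiver for illustration; parameters are assumptions.
class BatchingJmsReceiver(factory: ConnectionFactory,
                          queueName: String,
                          batchSize: Int = 500)
  extends Receiver[String](StorageLevel.MEMORY_AND_DISK_SER) {

  @volatile private var connection: Connection = _
  // Bounded buffer: put() blocks when full, giving crude back-pressure
  // instead of letting unstored messages pile up on the receiver node.
  private val buffer = new ArrayBlockingQueue[String](batchSize * 10)

  def onStart(): Unit = {
    connection = factory.createConnection()
    val session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE)
    val consumer = session.createConsumer(session.createQueue(queueName))
    consumer.setMessageListener(new MessageListener {
      // Enqueue only -- no store() call here.
      def onMessage(msg: Message): Unit = msg match {
        case t: TextMessage => buffer.put(t.getText)
        case _              => // non-text messages skipped in this sketch
      }
    })
    connection.start()

    // Drain the buffer in batches and store whole blocks at once.
    new Thread("jms-batch-store") {
      override def run(): Unit = {
        while (!isStopped()) {
          val head = buffer.poll(200, TimeUnit.MILLISECONDS)
          if (head != null) {
            val drained = new java.util.ArrayList[String](batchSize)
            drained.add(head)
            buffer.drainTo(drained, batchSize - 1)
            store(drained.iterator().asScala)
          }
        }
      }
    }.start()
  }

  def onStop(): Unit = {
    if (connection != null) connection.close()
  }
}
```

This is only a sketch of the batching idea under those assumptions; whether per-message store() vs batched store(iterator) is actually the cause of the OOM would need a heap dump to confirm.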

Btw, I also ran the same message consumer code standalone and never
saw this memory issue.

On Sun, May 21, 2017 at 10:20 AM, Alonso Isidoro Roman <alons...@gmail.com>
wrote:

> could you share the code?
>
> Alonso Isidoro Roman
> https://about.me/alonso.isidoro.roman
>
> 2017-05-20 7:54 GMT+02:00 Manish Malhotra <manish.malhotra.w...@gmail.com>
> :
>
>> Hello,
>>
>> I have implemented a Java-based custom receiver, which consumes from a
>> messaging system, say JMS.
>> Once a message is received, I call store(object) ... I'm storing a Spark
>> Row object.
>>
>> It runs for around 8 hrs and then goes OOM, and the OOM is happening on
>> the receiver nodes.
>> I also tried running multiple receivers to distribute the load, but faced
>> the same issue.
>>
>> There is something we are fundamentally doing wrong that should tell the
>> custom receiver/Spark to release the memory,
>> but I'm not able to crack it, at least till now.
>>
>> Any help is appreciated!!
>>
>> Regards,
>> Manish
>>
>>
>
