Re: Fail a batch in Spark Streaming forcefully based on business rules

2016-07-31 Thread Lars Albertsson
I don't know your context, so I don't have a solution for you. If you provide more information, the list might be able to suggest a solution. IIUYC, however, it sounds like you could benefit from decoupling operational failure from business level failure. E.g. if there is a failure according to
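The decoupling Lars describes can be sketched in plain Python: operational errors still fail the batch, while business-level failures are collected on a side channel instead of crashing processing. All names here (`BusinessRuleViolation`, `process_batch`, `process_record`) are illustrative, not part of Spark's API.

```python
class BusinessRuleViolation(Exception):
    """Raised when a record violates an application-level business rule."""


def process_batch(records, process_record):
    """Process records, collecting business failures instead of crashing.

    Anything that is not a BusinessRuleViolation propagates and fails
    the batch (operational failure); business-level failures are
    recorded and can be reported out of band.
    """
    violations = []
    for record in records:
        try:
            process_record(record)
        except BusinessRuleViolation as exc:
            # Side channel: note the violation, keep the batch going.
            violations.append((record, str(exc)))
    return violations
```

In a streaming job, the returned violations could be written to a metrics sink or an error topic rather than surfacing as a job failure.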

Re: Fail a batch in Spark Streaming forcefully based on business rules

2016-07-28 Thread Hemalatha A
Another use case for this: if Exception A is caught, I should just print it and ignore it, but if Exception B occurs, I have to end the batch, fail it, and stop processing it. Is it possible to achieve this? Any hints on this please. On Wed, Jul 27, 2016 at 10:42 AM, Hemalatha A
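The "ignore Exception A, fail on Exception B" behavior asked about above can be sketched as an exception-type dispatch. In Spark Streaming this logic would typically live inside a `foreachRDD` block; it is shown here as plain Python so the control flow is clear. `ExceptionA`, `ExceptionB`, and `stop_streaming` are illustrative names, not Spark API.

```python
class ExceptionA(Exception):
    """A recoverable, log-and-ignore error."""


class ExceptionB(Exception):
    """A fatal error that should fail the batch and stop processing."""


def handle_batch(batch_fn, stop_streaming):
    try:
        batch_fn()
    except ExceptionA as exc:
        # Recoverable: log it and move on to the next batch.
        print(f"ignoring recoverable error: {exc}")
    except ExceptionB:
        # Fatal: stop the streaming job (e.g. ssc.stop() in Spark),
        # then re-raise so the batch is marked as failed.
        stop_streaming()
        raise
```

Re-raising after stopping is what makes the batch itself fail rather than silently end.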

Fail a batch in Spark Streaming forcefully based on business rules

2016-07-26 Thread Hemalatha A
Hello, I have a use case where I have to fail certain batches in my streaming job, based on my application-specific business rules. Ex: if in a batch of 2 seconds I don't receive 100 messages, I should fail the batch and move on. How can I achieve this behavior? -- Regards Hemalatha
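The count-based rule in the original question can be sketched as a threshold check run once per batch. In Spark Streaming this would sit inside `foreachRDD`, using `rdd.count()` to size the batch; here the batch is a plain list, and `on_batch_failed` / `on_batch_ok` are hypothetical hooks standing in for whatever failure handling the application needs.

```python
MIN_MESSAGES = 100  # business rule: a 2-second batch needs at least 100 messages


def check_batch(messages, on_batch_failed, on_batch_ok):
    """Apply the business rule before processing a batch.

    Undersized batches are reported as failed and skipped; everything
    else is handed to the normal processing path.
    """
    if len(messages) < MIN_MESSAGES:
        on_batch_failed(len(messages))  # record the failure, skip processing
    else:
        on_batch_ok(messages)
```

Whether "fail" means raising to mark the batch failed in the Spark UI, or merely logging and skipping, depends on what the surrounding job should do, which is why it is left to the hooks.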