Re: OOM on Specific Emails
Hi,

On 04/26/2017 07:58 PM, Jerry Malcolm wrote:
> I've decided to take this in steps. My first objective is to kill the
> email the first time it causes an OOM and at least stop it from going
> into an infinite loop trying to re-process it. My plan is to modify
> SetMimeHeader.java and add to the catch block. But I need help on what
> I can do to kill the email. Is there anything I can set in the Mail
> object or the MimeMessage object that will cause JAMES to just kill it,
> or at least get it into a suspended state so JAMES won't restart it?

Just set its state to GHOST and it won't go further in the pipeline.

> BTW... my next step is to analyze the message in that same catch block
> before killing it, hopefully figure out the characteristic that is
> causing the OOM, and kill it before the OOM can even occur. But that's
> phase 2. Just want to stop the infinite loop first.

The easiest thing to do is to dump it to a file and log an error with the path of the file, IMO.

Cheers,

--
Matthieu Baechler
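Taken together, the two suggestions above (ghost the mail, dump it to disk first) could look roughly like the following inside the mailet's catch block. This is an untested sketch against the Mailet API: the dump directory path is a placeholder, the broad `Throwable` catch is a deliberate simplification, and the surrounding `SetMimeHeader` code is elided.

```java
// Hypothetical catch block for a mailet's service() method: on failure
// (including OutOfMemoryError), dump the raw MIME message to a file,
// log its path, and ghost the mail so James stops re-processing it.
try {
    // ... normal SetMimeHeader processing ...
} catch (Throwable t) {
    File dumpDir = new File("/var/tmp/james-bad-mail"); // placeholder path
    dumpDir.mkdirs();
    File dump = new File(dumpDir, mail.getName() + ".eml");
    try (OutputStream out = new FileOutputStream(dump)) {
        mail.getMessage().writeTo(out); // raw MIME dump for later analysis
    } catch (Exception dumpFailure) {
        log("Could not dump failing mail " + mail.getName(), dumpFailure);
    }
    log("Ghosting mail " + mail.getName() + ", dumped to " + dump.getPath(), t);
    mail.setState(Mail.GHOST); // mail goes no further in the pipeline
}
```

Ghosting rather than throwing is the key point: a rethrown exception would put the mail back on the error path and recreate the retry loop described below in the thread.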
Re: OOM on Specific Emails
I've decided to take this in steps. My first objective is to kill the email the first time it causes an OOM and at least stop it from going into an infinite loop trying to re-process it. My plan is to modify SetMimeHeader.java and add to the catch block. But I need help on what I can do to kill the email. Is there anything I can set in the Mail object or the MimeMessage object that will cause JAMES to just kill it, or at least get it into a suspended state so JAMES won't restart it?

Thx.

BTW... my next step is to analyze the message in that same catch block before killing it, hopefully figure out the characteristic that is causing the OOM, and kill it before the OOM can even occur. But that's phase 2. Just want to stop the infinite loop first.
Re: OOM on Specific Emails
Benoit,

Thanks so much for the information. This has definitely been a frustration; I'm glad there is some hope of getting around it. You mentioned a SizeGreaterThan matcher. Two questions:

1) How can I determine the threshold/limit size? I don't want to kill off good emails just because they are a bit large, but I definitely want to get rid of the ones that are causing the problem.

2) If I do get a hit, what processor do I call that will discard the email and not risk further processing that might still cause the OOM?

Thanks again for the help.

Jerry
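One way to approach question 1 would be to log the size of every mail for a while before enforcing anything, then pick a cutoff comfortably above the largest legitimate message observed. A rough, untested sketch against the Mailet API (the class name and log wording are made up here):

```java
import javax.mail.MessagingException;
import org.apache.mailet.Mail;
import org.apache.mailet.base.GenericMailet;

// Hypothetical pass-through mailet that records each message's size so a
// sensible SizeGreaterThan threshold can be chosen from real traffic.
public class LogMessageSize extends GenericMailet {
    @Override
    public void service(Mail mail) throws MessagingException {
        // getMessageSize() may be an estimate depending on the repository,
        // but it is good enough for calibrating a cutoff.
        log("Mail " + mail.getName() + " size=" + mail.getMessageSize() + " bytes");
        // No state change: the mail continues down the pipeline untouched.
    }
}
```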
Re: OOM on Specific Emails
Hi,

You are processing a too-large email. The SetMimeHeader mailet is modifying your email, but fails while allocating more resources. This may be caused by a missing pre-allocation parameter in ByteArrayOutputStream. From the stack trace, this sounds like a limitation of the javax implementation. (You might have very long headers on this mail, I guess.)

We had some recent work with javax, and it would be interesting to know if this limitation is still there. You get the error on a recurrent basis because James marks the processing as failed and will re-attempt it.

You might want to position a "SizeGreaterThan" matcher to defend against this.

Hope this helps, and thanks for the report,

Benoit
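The pre-allocation point can be illustrated in isolation. A default-constructed ByteArrayOutputStream starts with a tiny internal buffer and doubles it on every overflow, so copying a large message performs repeated re-allocations and transiently holds old and new buffers at once; passing an expected size up front performs a single allocation. A minimal, self-contained sketch (the 10 MB figure is just an illustration):

```java
import java.io.ByteArrayOutputStream;

public class Preallocation {
    // Copy a payload into a ByteArrayOutputStream, either pre-sized
    // (one allocation) or default-sized (starts at 32 bytes and doubles
    // on overflow, re-copying the buffer each time it grows).
    static ByteArrayOutputStream copy(byte[] payload, boolean presize) {
        ByteArrayOutputStream out = presize
                ? new ByteArrayOutputStream(payload.length)
                : new ByteArrayOutputStream();
        out.write(payload, 0, payload.length);
        return out;
    }

    public static void main(String[] args) {
        byte[] payload = new byte[10 * 1024 * 1024]; // stand-in for a large message
        System.out.println(copy(payload, true).size());  // pre-sized: prints 10485760
        System.out.println(copy(payload, false).size()); // grown by doubling: prints 10485760
    }
}
```

Both paths produce the same bytes; the difference is only in how much garbage and peak heap the growth path generates, which is exactly what hurts when the heap is already near its limit.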
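Benoit's SizeGreaterThan suggestion would be wired into the mailet pipeline configuration. A hedged sketch for mailetcontainer.xml (the 1m threshold and the target processor name are illustrative, and the exact file layout depends on the James 3 build in use); placing it early in the transport processor keeps SetMimeHeader from ever seeing the oversized mail:

```xml
<!-- Illustrative fragment: route anything over ~1 MB to another
     processor before SetMimeHeader and friends can touch it. -->
<mailet match="SizeGreaterThan=1m" class="ToProcessor">
    <processor>error</processor>
    <notice>Discarded: message exceeded the size limit</notice>
</mailet>
```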
OOM on Specific Emails
I have James R3B5. It's been working fine for over 2 years, but in the last few months I randomly started getting repeated OutOfMemory exceptions. The only fix I've found is to delete the var folder and reboot. The OOM entry in the log is always preceded by an error related to one particular email. The error and an OOM repeat continually for that email until I delete the var folder.

Since it will sometimes go a week or two without an OOM and then for a few days it will happen every few hours, I'm pretty certain it has to do with some specific spam message coming in that JAMES is not handling. I don't know enough about the inner workings of JAMES to interpret this stack trace. I can make a patch and rebuild if I can just figure out where to patch a workaround for this. Can someone point me in the right direction?

Log file (this block is replicated 20-25+ times in the log file for the same msg id; it looks to me like the Camel processor is in an infinite loop until the OOM occurs):

BTW... my heap is now set to 1024m. I kept increasing it hoping for a change, but it looks like it's going to fill up no matter what size I make it.

ERROR 15:04:19,491 | org.apache.camel.processor.DefaultErrorHandler | Failed delivery for (MessageId: ID-p3965917-56475-1492805025650-0-1 on ExchangeId: ID-p3965917-56475-1492805025650-0-62).
Exhausted after delivery attempt: 1 caught: org.apache.camel.CamelExecutionException: Exception occurred during execution on the exchange: Exchange[Message: org.apache.james.core.MailImpl@bf4cc8]
org.apache.camel.CamelExecutionException: Exception occurred during execution on the exchange: Exchange[Message: org.apache.james.core.MailImpl@bf4cc8]
    at org.apache.camel.util.ObjectHelper.wrapCamelExecutionException(ObjectHelper.java:1287)
    at org.apache.camel.impl.DefaultExchange.setException(DefaultExchange.java:282)
    at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:64)
    at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)
    at org.apache.camel.processor.DelegateAsyncProcessor.processNext(DelegateAsyncProcessor.java:99)
    at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)
    at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)
    at org.apache.camel.processor.DelegateAsyncProcessor.processNext(DelegateAsyncProcessor.java:99)
    at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)
    at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:73)
    at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)
    at org.apache.camel.processor.DelegateAsyncProcessor.processNext(DelegateAsyncProcessor.java:99)
    at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)
    at org.apache.camel.processor.interceptor.TraceInterceptor.process(TraceInterceptor.java:91)
    at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)
    at org.apache.camel.processor.RedeliveryErrorHandler.processErrorHandler(RedeliveryErrorHandler.java:334)
    at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:220)
    at org.apache.camel.processor.RouteContextProcessor.processNext(RouteContextProcessor.java:45)
    at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)
    at org.apache.camel.processor.interceptor.DefaultChannel.process(DefaultChannel.java:303)
    at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)
    at org.apache.camel.processor.DelegateAsyncProcessor.processNext(DelegateAsyncProcessor.java:99)
    at org.apache.camel.processor.ChoiceProcessor.process(ChoiceProcessor.java:81)
    at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)
    at org.apache.camel.processor.DelegateAsyncProcessor.processNext(DelegateAsyncProcessor.java:99)
    at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)
    at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:73)
    at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)
    at org.apache.camel.processor.DelegateAsyncProcessor.processNext(DelegateAsyncProcessor.java:99)
    at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)
    at org.apache.camel.processor.interceptor.TraceInterceptor.process(TraceInterceptor.java:91)
    at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)
    at