Hi Bariša,
have you had the chance to analyze the memory usage in more detail? An
OutOfMemoryError might be an indication of a memory leak, which should be
fixed rather than worked around by lowering some memory configuration
parameter. Or is it that the off-heap memory is not actually used, but its
allocation prevents the JVM from using that memory for other things?
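
One way to get more detail (a sketch, assuming the TaskManager JVMs can be
restarted with extra flags): enable JVM Native Memory Tracking and inspect it
with jcmd, e.g.

    # flink-conf.yaml: pass extra JVM options to the TaskManagers
    env.java.opts.taskmanager: "-XX:NativeMemoryTracking=summary"

    # then, inside the container, against the TaskManager JVM's PID
    jcmd <taskmanager-pid> VM.native_memory summary

Note that NMT does not account for memory allocated by native libraries
outside the JVM, so a large gap between the NMT total and the container's
RSS would itself be a useful data point.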

Best,
Matthias

On Fri, Feb 26, 2021 at 10:05 AM Timo Walther <twal...@apache.org> wrote:

> Hi Barisa,
>
> judging from the 1.8 documentation [1], it was already possible to
> configure the off-heap memory there, and other memory options were
> available as well. So I don't think you need an upgrade to 1.11
> immediately. Please let us know if you manage to fix your problem;
> otherwise we can try to loop in other people who know this area better.
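>
> For reference, a minimal sketch of the 1.8-era options (placeholder
> values, to be adjusted to your setup; see [1] for the exact semantics):
>
>     # flink-conf.yaml (Flink 1.8)
>     taskmanager.heap.size: 4096m        # TaskManager memory size
>     taskmanager.memory.off-heap: true   # allocate managed memory and network buffers off-heap
>     taskmanager.memory.fraction: 0.7    # share reserved for Flink's managed memory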
>
> Regards,
> Timo
>
>
> [1]
>
> https://ci.apache.org/projects/flink/flink-docs-release-1.8/ops/config.html#taskmanager-memory-off-heap
>
>
>
> On 25.02.21 15:50, Bariša wrote:
> > Small update:
> >   we believe that the off-heap memory is used by the Parquet writer
> > (used in the sink to write to S3).
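> >
> >   A sketch of one experiment, assuming the off-heap growth comes from
> >   direct ByteBuffers allocated by the Parquet/S3 stack: cap the JVM's
> >   direct memory explicitly, so exceeding the budget surfaces as an
> >   OutOfMemoryError inside the JVM instead of a kernel OOM kill of the
> >   container.
> >
> >       # flink-conf.yaml (illustrative value; whether the 1.8 start
> >       # scripts already set this flag depends on the deployment)
> >       env.java.opts.taskmanager: "-XX:MaxDirectMemorySize=1g"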
> >
> > On Wed, 24 Feb 2021 at 23:25, Bariša <barisa.obrado...@gmail.com> wrote:
> >
> >     I'm running Flink 1.8.2 in a container, and under heavy load the
> >     container gets OOM-killed by the kernel.
> >     I'm guessing that the reason for the kernel OOM is the large size of
> >     the off-heap memory. Is there a way I can limit it in Flink 1.8.2?
> >
> >     I can see that newer versions of Flink have a config param for this,
> >     so I'm asking here whether it's possible to do something similar in
> >     Flink 1.8.2, without a Flink upgrade?
> >
> >     Cheers,
> >     Barisa
> >
>
