Hi,

I think the reason is that your records are too large to do an in-memory combine.
You can try to disable your combiner.
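Roughly, the idea is this (the class and tuple types below are just placeholders, adapt them
to your job): in the DataSet API a group-reduce gets a combiner when the user function also
implements GroupCombineFunction (or is marked combinable), so removing that interface makes
Flink skip the in-memory combine and go straight to the sort-merge reduce. A minimal sketch:

    import org.apache.flink.api.common.functions.GroupCombineFunction;
    import org.apache.flink.api.common.functions.GroupReduceFunction;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.util.Collector;

    // With a combiner: the function implements both interfaces, so Flink
    // pre-aggregates records in memory before shuffling them.
    public static class MyReduce
            implements GroupReduceFunction<Tuple2<String, Long>, Tuple2<String, Long>>,
                       GroupCombineFunction<Tuple2<String, Long>, Tuple2<String, Long>> {
        @Override
        public void reduce(Iterable<Tuple2<String, Long>> values,
                           Collector<Tuple2<String, Long>> out) { /* ... */ }
        @Override
        public void combine(Iterable<Tuple2<String, Long>> values,
                            Collector<Tuple2<String, Long>> out) { /* ... */ }
    }

    // Without a combiner: only GroupReduceFunction is implemented, so no
    // in-memory combine is attempted on the sending side.
    public static class MyReduceNoCombine
            implements GroupReduceFunction<Tuple2<String, Long>, Tuple2<String, Long>> {
        @Override
        public void reduce(Iterable<Tuple2<String, Long>> values,
                           Collector<Tuple2<String, Long>> out) { /* ... */ }
    }

    // Usage: data.groupBy(0).reduceGroup(new MyReduceNoCombine());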

Best,
Kurt

On Mon, Jun 12, 2017 at 9:55 PM, Sebastian Neef <gehax...@mailbox.tu-berlin.de> wrote:

> Hi,
>
> when I run my Flink job on a small dataset, it finishes successfully.
> However, when a larger dataset is used, I get multiple exceptions:
>
> -  Caused by: java.io.IOException: Cannot write record to fresh sort
> buffer. Record too large.
> - Thread 'SortMerger Reading Thread' terminated due to an exception: null
>
> A full stack trace can be found here [0].
>
> I tried reducing taskmanager.memory.fraction (or a similarly named option) and
> also lowering the parallelism, but that did not help much.
>
> Flink 1.0.3-Hadoop2.7 was used.
>
> Any tips are appreciated.
>
> Kind regards,
> Sebastian
>
> [0]:
> http://paste.gehaxelt.in/?1f24d0da3856480d#/dR8yriXd/VQn5zTfZACS52eWiH703bJbSTZSifegwI=
>
