Hi Sohi,
There seem to be no Avro implementations of the Encoder interface used in
StreamingFileSink, but one could probably be implemented based
on AvroKeyValueWriter without too much effort.
There is also a DefaultRollingPolicy, which is based on time and part-file
size. It might create a
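The Encoder contract mentioned above is small: a single method that serializes one element onto an already-open output stream. A minimal self-contained sketch of that shape, with a plain-text encoder standing in for a real Avro-based one (the class names below are hypothetical, not Flink code):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

public class EncoderSketch {

    // Mirrors the shape of org.apache.flink.api.common.serialization.Encoder<IN>.
    interface Encoder<IN> {
        void encode(IN element, OutputStream stream) throws IOException;
    }

    // Hypothetical stand-in for an Avro-based encoder: newline-delimited
    // UTF-8 records. A real version would delegate to an Avro DatumWriter.
    static class LineEncoder implements Encoder<String> {
        @Override
        public void encode(String element, OutputStream stream) throws IOException {
            stream.write(element.getBytes(StandardCharsets.UTF_8));
            stream.write('\n');
        }
    }

    // Encode several records onto one stream, as a sink would per part file.
    static String encodeAll(String... records) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        Encoder<String> encoder = new LineEncoder();
        for (String record : records) {
            encoder.encode(record, out);
        }
        return out.toString("UTF-8");
    }

    public static void main(String[] args) throws IOException {
        System.out.print(encodeAll("a", "b"));
    }
}
```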
Hi Erik,
I am still not able to understand the reason behind this exception.
Is this exception causing the failure and restart of the job, or is it
occurring after the failure/restart is triggered?
Thanks
Sohi
--
Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/
Hi Andrey,
I am using AvroSinkWriter (with BucketingSink) with compression enabled.
It looks like StreamingFileSink does not have direct support for
AvroSinkWriter. A sequence file format is there for StreamingFileSink, but
it looks like it rolls files on every checkpoint (OnCheckpointRollingPolicy).
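For comparison, the size/time-based decision that DefaultRollingPolicy makes (rather than rolling on every checkpoint, as OnCheckpointRollingPolicy does) can be sketched independently of Flink. The class name and thresholds below are illustrative only:

```java
// Standalone sketch (not Flink code) of a DefaultRollingPolicy-style
// decision: roll the in-progress part file once it exceeds a size bound
// or has been open longer than a rollover interval.
public class RollingPolicySketch {
    private final long maxPartSizeBytes;
    private final long rolloverIntervalMs;

    public RollingPolicySketch(long maxPartSizeBytes, long rolloverIntervalMs) {
        this.maxPartSizeBytes = maxPartSizeBytes;
        this.rolloverIntervalMs = rolloverIntervalMs;
    }

    // Roll when the part file is too big or has been open too long.
    public boolean shouldRoll(long currentSizeBytes, long openedAtMs, long nowMs) {
        return currentSizeBytes >= maxPartSizeBytes
                || (nowMs - openedAtMs) >= rolloverIntervalMs;
    }

    public static void main(String[] args) {
        RollingPolicySketch policy = new RollingPolicySketch(1024, 60_000);
        System.out.println(policy.shouldRoll(2048, 0, 1)); // size bound exceeded
    }
}
```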
Thanks Andrey.
Yeah, I will upgrade and see if the same gets reproduced.
-Sohi
Hi Sohi,
I would also recommend trying the newer StreamingFileSink which is
available in Flink 1.7.x [1].
Best,
Andrey
[1]
https://ci.apache.org/projects/flink/flink-docs-release-1.7/dev/connectors/streamfile_sink.html
On Sun, Feb 24, 2019 at 4:14 AM sohimankotia wrote:
Hi Erik,
Are you suggesting all options together?
Which version of Flink has this solved? I am currently using 1.5.5.
-Thanks
Sohi
Hi sohimankotia,
My advice, from also having had to sub-class BucketingSink:
* rebase your changes on the BucketingSink that comes with the Flink
version you are using
* use the same super completely ugly hack I had to deploy, as described
here:
Hi Team,
Any help/update on this?
This is still an issue where I am using the bucketing sink in production.
Thanks
Sohi
Hi,
Yes, the issue is with BucketingSink. I removed it and replaced the sink
with a Kafka sink, and it worked fine.
What could be causing
TimerException{java.nio.channels.ClosedByInterruptException}
at
Hi Andrey,
Yes. CustomBucketingSink is a custom class copied from BucketingSink itself.
A few changes were added:
1. Add timestamp in part files
2. A few logging statements
Note: It looks like I copied it from version 1.4 (I don't know if that could
be the reason for the failure)
Did it override
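The "timestamp in part files" tweak described above can be illustrated with a small, self-contained naming helper. The naming scheme here is hypothetical; the real change lives inside the copied BucketingSink class:

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class PartFileNaming {
    // UTC timestamp formatter so part-file names sort lexicographically.
    static final DateTimeFormatter FMT =
            DateTimeFormatter.ofPattern("yyyyMMdd-HHmmss").withZone(ZoneOffset.UTC);

    // Build a part-file name embedding the creation time, e.g.
    // "part-<subtask>-<timestamp>". Illustrative scheme only.
    static String partFileName(String prefix, int subtaskIndex, long epochMillis) {
        return prefix + "-" + subtaskIndex + "-" + FMT.format(Instant.ofEpochMilli(epochMillis));
    }

    public static void main(String[] args) {
        System.out.println(partFileName("part", 3, 0L));
    }
}
```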
Hi Sohi,
Something was originally interrupted in DFSOutputStream$DataStreamer.run.
It was thrown in the timer callback which processed files in
CustomBucketingSink.
The task reported the failure to the JM, and the JM then triggered job
cancellation.
I do not see this CustomBucketingSink in the Flink code. Is it
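The interrupt/close interaction described above can be reproduced with the JDK alone: NIO channels are interruptible, so an I/O call made while the thread's interrupt flag is set closes the channel and surfaces as ClosedByInterruptException. This is the same mechanism behind the DFSOutputStream failure during task cancellation. A minimal sketch:

```java
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.ClosedByInterruptException;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class InterruptDemo {
    // Returns true if writing while interrupted raised ClosedByInterruptException.
    static boolean writeWhileInterrupted() throws IOException {
        Path tmp = Files.createTempFile("interrupt-demo", ".bin");
        try (FileChannel ch = FileChannel.open(tmp, StandardOpenOption.WRITE)) {
            Thread.currentThread().interrupt();        // simulate task cancellation
            ch.write(ByteBuffer.wrap(new byte[]{1}));  // interruptible I/O entry point
            return false;                              // not reached
        } catch (ClosedByInterruptException expected) {
            return true;                               // channel was closed by the interrupt
        } finally {
            Thread.interrupted();                      // clear the flag again
            Files.deleteIfExists(tmp);
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(writeWhileInterrupted());
    }
}
```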
Hi Andrey ,
Please find the logs. Attaching Dropbox links, as the logs are large.
Job Manager . : https://www.dropbox.com/s/q0rd60coydupl6w/full.log.gz?dl=0
Application :
https://www.dropbox.com/s/cn3yrd273wd99f2/jm-sohan.log.gz?dl=0
Thanks
Sohi
Hi Sohi,
This still looks like Task Manager logs; could you post the Job Master logs,
please?
Best,
Andrey
On Tue, Jan 15, 2019 at 7:49 AM sohimankotia wrote:
Hi,
Any update/help, please?
Hi Stefan,
Attaching logs:
You can search for "2019-01-09 19:34:44,170 INFO
org.apache.flink.runtime.taskmanager.Task - Attempting
to cancel task Source:" in the first 2 log files.
f3-part-aa.gz
Hi,
Could you also provide the job master log?
Best,
Stefan
> On 9. Jan 2019, at 12:02, sohimankotia wrote:
>
> Hi,
>
> I am running Flink Streaming Job with 1.5.5 version.
>
> - Job is basically reading from Kafka , windowing on 2 minutes , and writing
> to hdfs using AvroBucketing Sink .
Hi,
I am running Flink Streaming Job with 1.5.5 version.
- The job is basically reading from Kafka, windowing on 2 minutes, and
writing to HDFS using an Avro BucketingSink.
- Job is running with parallelism 132
- Checkpointing is enabled with interval of 1 minute.
- Savepoint is enabled and getting