Hello

Based on the exception stack provided, the processor was expecting to read
X bytes from the stream and hit end of file (EOF) before that point. This
suggests that those bytes either were never truly written to disk or were
corrupted/modified relative to what the flowfile repository expected. That
would be possible after a hard restart with default properties (which allow
a write delay), or in the event of a bug in core framework code. The NiFi
1.10.0 release includes a pretty massive number of bug fixes and features,
so it may be worth looking through them to see if any sound relevant to
your case.
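
As an aside, the "write delay" I'm referring to is governed by the repository
sync settings in conf/nifi.properties. A minimal sketch of the relevant entries
(values shown are the defaults as I recall them; please verify the names and
values against your own install):

  # conf/nifi.properties (sketch; confirm on your nodes)
  # With always.sync=false, NiFi leaves flushing to the operating system, so a
  # hard stop or power loss can drop content bytes that the flowfile repository
  # already references.
  nifi.flowfile.repository.always.sync=false
  nifi.content.repository.always.sync=false
  # Setting these to true forces a sync to disk on every update, trading
  # throughput for durability.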

Thanks

On Sat, Dec 21, 2019 at 10:36 AM Purushotham Pushpavanthar <
pushpavant...@gmail.com> wrote:

> Hi Lei,
> Could you please help me understand what caused this problem and how to
> avoid it?
>
> Regards,
> Purushotham Pushpavanth
>
>
>
> On Thu, 19 Dec 2019 at 16:55, wangl...@geekplus.com.cn <
> wangl...@geekplus.com.cn> wrote:
>
> > Hi Purushotham,
> >
> > Since you are using cluster mode, just delete the flow.xml.gz file and
> > restart the node; the flow will be synced from the other two nodes.
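> >
> > A minimal sketch of that procedure on the affected node (paths assume the
> > default install layout; adjust for your environment):
> >
> >   # run from the NiFi install directory of the broken node
> >   bin/nifi.sh stop
> >   mv conf/flow.xml.gz conf/flow.xml.gz.bak   # keep a copy, just in case
> >   bin/nifi.sh start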
> >
> > Regards,
> > Lei
> >
> >
> >
> > wangl...@geekplus.com.cn
> >
> > From: Purushotham Pushpavanthar
> > Date: 2019-12-19 17:05
> > To: dev
> > Subject: Unable to access flowfile content
> > Hi,
> >
> > We've had a 3-node production cluster running seamlessly for almost 8 months
> > with manageable ups and downs. However, yesterday we ran into an issue in
> > one of the processors due to which CPU shot up and the node went down. On
> > restart, the contents of a few enqueued flowfiles went missing all of a
> > sudden (I was unable to view their content from the content viewer in the
> > UI). This also resulted in the exception below, which was blocking the
> > downstream processor from processing any flowfiles.
> > We are using version 1.9.2. It would be very helpful if you could help me
> > debug this issue.
> > 2019-12-19 07:05:03,653 ERROR [Timer-Driven Process Thread-4] o.apache.nifi.processors.hive.PutHiveQL PutHiveQL[id=c820350d-d6fd-183d-a3d5-006a2b14d10a] PutHiveQL[id=c820350d-d6fd-183d-a3d5-006a2b14d10a] failed to process session due to java.lang.RuntimeException: Failed to execute due to org.apache.nifi.processor.exception.FlowFileAccessException: Could not read from StandardFlowFileRecord[uuid=253e1652-6e3f-49c3-b190-3788fcbc1480,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1576648697457-40, container=default, section=40], offset=10977, length=83],offset=0,name=hid_1004.ejuserstruct2.2019121100.sql,size=83]; Processor Administratively Yielded for 1 sec: java.lang.RuntimeException: Failed to execute due to org.apache.nifi.processor.exception.FlowFileAccessException: Could not read from StandardFlowFileRecord[uuid=253e1652-6e3f-49c3-b190-3788fcbc1480,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1576648697457-40, container=default, section=40], offset=10977, length=83],offset=0,name=hid_1004.ejuserstruct2.2019121100.sql,size=83]
> > java.lang.RuntimeException: Failed to execute due to org.apache.nifi.processor.exception.FlowFileAccessException: Could not read from StandardFlowFileRecord[uuid=253e1652-6e3f-49c3-b190-3788fcbc1480,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1576648697457-40, container=default, section=40], offset=10977, length=83],offset=0,name=hid_1004.ejuserstruct2.2019121100.sql,size=83]
> >   at org.apache.nifi.processor.util.pattern.Put.onTrigger(Put.java:145)
> >   at org.apache.nifi.processors.hive.PutHiveQL.lambda$onTrigger$6(PutHiveQL.java:295)
> >   at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
> >   at org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
> >   at org.apache.nifi.processors.hive.PutHiveQL.onTrigger(PutHiveQL.java:295)
> >   at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1162)
> >   at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:209)
> >   at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
> >   at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110)
> >   at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> >   at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
> >   at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
> >   at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
> >   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> >   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> >   at java.lang.Thread.run(Thread.java:748)
> > Caused by: org.apache.nifi.processor.exception.FlowFileAccessException: Could not read from StandardFlowFileRecord[uuid=253e1652-6e3f-49c3-b190-3788fcbc1480,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1576648697457-40, container=default, section=40], offset=10977, length=83],offset=0,name=hid_1004.ejuserstruct2.2019121100.sql,size=83]
> >   at org.apache.nifi.controller.repository.io.FlowFileAccessInputStream.read(FlowFileAccessInputStream.java:93)
> >   at org.apache.nifi.controller.repository.io.TaskTerminationInputStream.read(TaskTerminationInputStream.java:68)
> >   at org.apache.nifi.stream.io.StreamUtils.fillBuffer(StreamUtils.java:89)
> >   at org.apache.nifi.stream.io.StreamUtils.fillBuffer(StreamUtils.java:72)
> >   at org.apache.nifi.processors.hive.AbstractHiveQLProcessor$1.process(AbstractHiveQLProcessor.java:92)
> >   at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2212)
> >   at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2180)
> >   at org.apache.nifi.processors.hive.AbstractHiveQLProcessor.getHiveQL(AbstractHiveQLProcessor.java:89)
> >   at org.apache.nifi.processors.hive.PutHiveQL.lambda$new$4(PutHiveQL.java:220)
> >   at org.apache.nifi.processor.util.pattern.Put.putFlowFiles(Put.java:59)
> >   at org.apache.nifi.processor.util.pattern.Put.onTrigger(Put.java:102)
> >   ... 15 common frames omitted
> > Caused by: java.io.EOFException: null
> >   at org.apache.nifi.stream.io.StreamUtils.skip(StreamUtils.java:270)
> >   at org.apache.nifi.controller.repository.FileSystemRepository.read(FileSystemRepository.java:859)
> >   at org.apache.nifi.controller.repository.io.ContentClaimInputStream.formDelegate(ContentClaimInputStream.java:154)
> >   at org.apache.nifi.controller.repository.io.ContentClaimInputStream.getDelegate(ContentClaimInputStream.java:51)
> >   at org.apache.nifi.controller.repository.io.ContentClaimInputStream.read(ContentClaimInputStream.java:89)
> >   at org.apache.nifi.controller.repository.io.DisableOnCloseInputStream.read(DisableOnCloseInputStream.java:49)
> >   at org.apache.nifi.controller.repository.io.LimitedInputStream.read(LimitedInputStream.java:86)
> >   at org.apache.nifi.controller.repository.io.DisableOnCloseInputStream.read(DisableOnCloseInputStream.java:49)
> >   at org.apache.nifi.stream.io.ByteCountingInputStream.read(ByteCountingInputStream.java:51)
> >   at java.io.FilterInputStream.read(FilterInputStream.java:133)
> >   at org.apache.nifi.controller.repository.io.FlowFileAccessInputStream.read(FlowFileAccessInputStream.java:82)
> >   ... 25 common frames omitted
> > 2019-12-19 07:05:03,654 WARN [Timer-Driven Process Thread-4] o.a.n.controller.tasks.ConnectableTask Administratively Yielding PutHiveQL[id=c820350d-d6fd-183d-a3d5-006a2b14d10a] due to uncaught Exception: java.lang.RuntimeException: Failed to execute due to org.apache.nifi.processor.exception.FlowFileAccessException: Could not read from StandardFlowFileRecord[uuid=253e1652-6e3f-49c3-b190-3788fcbc1480,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1576648697457-40, container=default, section=40], offset=10977, length=83],offset=0,name=hid_1004.ejuserstruct2.2019121100.sql,size=83]
> > java.lang.RuntimeException: Failed to execute due to org.apache.nifi.processor.exception.FlowFileAccessException: Could not read from StandardFlowFileRecord[uuid=253e1652-6e3f-49c3-b190-3788fcbc1480,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1576648697457-40, container=default, section=40], offset=10977, length=83],offset=0,name=hid_1004.ejuserstruct2.2019121100.sql,size=83]
> >   at org.apache.nifi.processor.util.pattern.Put.onTrigger(Put.java:145)
> >   at org.apache.nifi.processors.hive.PutHiveQL.lambda$onTrigger$6(PutHiveQL.java:295)
> >   at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
> >   at org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
> >   at org.apache.nifi.processors.hive.PutHiveQL.onTrigger(PutHiveQL.java:295)
> >   at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1162)
> >   at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:209)
> >   at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
> >   at org.apache.nifi.engine.FlowEngine$2.run(FlowEngine.java:110)
> >   at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> >   at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
> >   at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
> >   at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
> >   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> >   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> >   at java.lang.Thread.run(Thread.java:748)
> > Caused by: org.apache.nifi.processor.exception.FlowFileAccessException: Could not read from StandardFlowFileRecord[uuid=253e1652-6e3f-49c3-b190-3788fcbc1480,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1576648697457-40, container=default, section=40], offset=10977, length=83],offset=0,name=hid_1004.ejuserstruct2.2019121100.sql,size=83]
> >   at org.apache.nifi.controller.repository.io.FlowFileAccessInputStream.read(FlowFileAccessInputStream.java:93)
> >   at org.apache.nifi.controller.repository.io.TaskTerminationInputStream.read(TaskTerminationInputStream.java:68)
> >   at org.apache.nifi.stream.io.StreamUtils.fillBuffer(StreamUtils.java:89)
> >   at org.apache.nifi.stream.io.StreamUtils.fillBuffer(StreamUtils.java:72)
> >   at org.apache.nifi.processors.hive.AbstractHiveQLProcessor$1.process(AbstractHiveQLProcessor.java:92)
> >   at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2212)
> >   at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2180)
> >   at org.apache.nifi.processors.hive.AbstractHiveQLProcessor.getHiveQL(AbstractHiveQLProcessor.java:89)
> >   at org.apache.nifi.processors.hive.PutHiveQL.lambda$new$4(PutHiveQL.java:220)
> >   at org.apache.nifi.processor.util.pattern.Put.putFlowFiles(Put.java:59)
> >   at org.apache.nifi.processor.util.pattern.Put.onTrigger(Put.java:102)
> >   ... 15 common frames omitted
> > Caused by: java.io.EOFException: null
> >   at org.apache.nifi.stream.io.StreamUtils.skip(StreamUtils.java:270)
> >   at org.apache.nifi.controller.repository.FileSystemRepository.read(FileSystemRepository.java:859)
> >   at org.apache.nifi.controller.repository.io.ContentClaimInputStream.formDelegate(ContentClaimInputStream.java:154)
> >   at org.apache.nifi.controller.repository.io.ContentClaimInputStream.getDelegate(ContentClaimInputStream.java:51)
> >   at org.apache.nifi.controller.repository.io.ContentClaimInputStream.read(ContentClaimInputStream.java:89)
> >   at org.apache.nifi.controller.repository.io.DisableOnCloseInputStream.read(DisableOnCloseInputStream.java:49)
> >   at org.apache.nifi.controller.repository.io.LimitedInputStream.read(LimitedInputStream.java:86)
> >   at org.apache.nifi.controller.repository.io.DisableOnCloseInputStream.read(DisableOnCloseInputStream.java:49)
> >   at org.apache.nifi.stream.io.ByteCountingInputStream.read(ByteCountingInputStream.java:51)
> >   at java.io.FilterInputStream.read(FilterInputStream.java:133)
> >   at org.apache.nifi.controller.repository.io.FlowFileAccessInputStream.read(FlowFileAccessInputStream.java:82)
> >   ... 25 common frames omitted
> >
> >
> > Regards,
> > Purushotham Pushpavanth
> >
>
