Joe
Thanks for the options.
The Kafka/Flink option is interesting but too big at the moment.
I’ll build a custom processor for this occasion.
Much appreciated!
Craig Knell
> On 4 Jun 2019, at 21:56, Joe Witt wrote:
>
> ...from the description it isn't clear what you're trying to achieve
Thanks. I'll look at that.
On Tue, Jun 4, 2019 at 12:33 PM Mark Payne wrote:
> Mike,
>
> You may want to look at DataTypeUtils.mergeDataTypes( final DataType
> thisDataType, final DataType otherDataType )
> I don't believe this is exactly what you are looking for, but will likely
> give you a
Mike,
You may want to look at DataTypeUtils.mergeDataTypes( final DataType
thisDataType, final DataType otherDataType )
I don't believe this is exactly what you are looking for, but will likely give
you a good starting point, if you
were to implement such a helper method.
Thanks
-Mark
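A helper along the lines Mike describes might be sketched as below. This is a hypothetical illustration with a stand-in enum, not NiFi's actual RecordFieldType or DataTypeUtils API; the names `FieldKind` and `isCompatible` are invented for the sketch, and the widening rules are a simplification of what `mergeDataTypes` does.

```java
// Hypothetical sketch of a source-to-target compatibility check for record
// field types. FieldKind is a stand-in enum, ordered from narrowest to widest.
public class FieldCompatibility {

    public enum FieldKind { BYTE, SHORT, INT, LONG, FLOAT, DOUBLE, STRING }

    // A source type is compatible with a target if the target is the same
    // type or a wider one in this simple ordering; everything can be
    // represented as a string, loosely mirroring how mergeDataTypes falls
    // back to a broader type when the two inputs differ.
    public static boolean isCompatible(FieldKind source, FieldKind target) {
        if (target == FieldKind.STRING) {
            return true;
        }
        return source.ordinal() <= target.ordinal();
    }

    public static void main(String[] args) {
        // int widens safely to long -> compatible
        System.out.println(isCompatible(FieldKind.INT, FieldKind.LONG));
        // double narrowed to int would lose information -> not compatible
        System.out.println(isCompatible(FieldKind.DOUBLE, FieldKind.INT));
    }
}
```

A real implementation would also have to handle nested types (records, arrays, maps, choices), which is where `mergeDataTypes` is the useful starting point.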
Are there any helper functions that would be useful for checking a
source RecordFieldType against a target RecordFieldType and telling you
whether the source is compatible with the target?
At this point, I don't have a Record object. It's simply evaluating
RecordFieldTypes from two RecordSchema objects.
...from the description it isn't clear what you're trying to achieve, so
let's first try to expand the detail on the use case.
We should distinguish whether you're wanting to 'combine various objects in
a data stream together on some time bound' from 'processing various objects
in a data stream to
Craig,
If you have a timestamp set as an attribute on the processor, then this is kind
of possible.
Have a regular MergeContent processor with "Maximum Group Size" set to 1 MB
and "Max Bin Age" set to 3 min; you may need to tweak the settings to get the
right cadence, but these are generally the
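Roughly, the suggested setup might look like this ("Maximum Group Size" and "Max Bin Age" are standard MergeContent properties mentioned in this thread; the values below are the ones suggested here and will need tuning for the actual data rate):

```
MergeContent
  Maximum Group Size : 1 MB    (flush a bin once ~1 minute of data has accumulated)
  Max Bin Age        : 3 min   (force a flush on the 3-minute cadence even if underfull)
```

Note this approximates tumbling bins on a timestamp attribute rather than a true 5-minute window sliding every 3 minutes; overlapping windows would require duplicating records into more than one bin.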
Hi Folks
We have a stream of data that I need to window into 5-minute windows, with the
window sliding every 3 minutes. Each minute is 1 MB, so I need to deliver 5 MB
every 3 minutes.
What is the best way of achieving this in NiFi?
Best regards
Craig