Hi,
Perhaps broadcast state is a natural fit for this scenario.
https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/stream/state/broadcast_state.html
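For illustration, here is a minimal pure-Java simulation of the broadcast-state idea (outside Flink, so it stays self-contained): every event-processing task sees the same, latest copy of the rules. In a real job this would map to a MapStateDescriptor, DataStream.broadcast(), and a (Keyed)BroadcastProcessFunction; all names below are illustrative.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the broadcast-state pattern: a rules map shared by all
// tasks, updated from the broadcast side and read on the event side.
public class BroadcastStateSketch {
    // Broadcast side: rule name -> threshold, seen by every task.
    private final Map<String, Integer> rules = new HashMap<>();

    // Analogue of processBroadcastElement(): update the shared rules.
    public void onRule(String name, int threshold) {
        rules.put(name, threshold);
    }

    // Analogue of processElement(): evaluate an event against the rules.
    public boolean matches(String rule, int value) {
        Integer t = rules.get(rule);
        return t != null && value >= t;
    }
}
```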
Thanks,
Selvaraj C
On Fri, 22 Jan 2021 at 8:45 PM, Kumar Bolar, Harshith
wrote:
> Hi all,
>
> The external database consists of a se
Hi Users,
We want to have real-time aggregation (KPI).
We are maintaining aggregation counters in keyed value state.
The key could be customer activation date and type.
Many counters are maintained against that key.
If we want to add one more counter for the existing keys which are in the
s
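As a sketch of the setup described above (outside Flink, so it is self-contained): several named counters kept per key, as one might in a keyed ValueState or MapState. A counter introduced later simply starts at zero for existing keys the first time it is incremented. The key format and counter names are illustrative, not from the original thread.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: per-key map of named counters, mimicking keyed state.
public class KeyedCounters {
    private final Map<String, Map<String, Long>> state = new HashMap<>();

    // Increment a named counter for a key, creating it on first use.
    public void increment(String key, String counter) {
        state.computeIfAbsent(key, k -> new HashMap<>())
             .merge(counter, 1L, Long::sum);
    }

    // Read a counter; absent counters read as 0 (so new counters can
    // be added for existing keys without migration).
    public long get(String key, String counter) {
        return state.getOrDefault(key, Map.of()).getOrDefault(counter, 0L);
    }
}
```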
Could you please try modifying conf/logback.xml?
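A minimal sketch of what that change could look like, assuming the job's own loggers live under com.mycompany.xyz (the package mentioned in the thread); the appender name, file path, and pattern are illustrative:

```xml
<!-- sketch: route the job's loggers to a dedicated file at DEBUG -->
<configuration>
  <appender name="JOB" class="ch.qos.logback.core.FileAppender">
    <file>job.log</file>
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <logger name="com.mycompany.xyz" level="DEBUG" additivity="false">
    <appender-ref ref="JOB"/>
  </logger>
</configuration>
```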
Regards,
Selvaraj C
On Mon, Feb 11, 2019 at 4:32 PM simpleusr wrote:
> Hi Gary,
>
> By "job logs" I mean all the loggers under a subpackage of
> com.mycompany.xyz
> .
>
> We are using the ./bin/flink run command for job execution; that's why I modified
>
I have faced the same problem.
https://stackoverflow.com/questions/54286486/two-kafka-consumer-in-same-group-and-one-partition
On Wed, Jan 30, 2019 at 6:11 PM Avi Levi wrote:
> Ok, if you guys think it should be like that then so be it. All I am
> saying is that it is not standard behaviour from
com> wrote:
> Hi Selvaraj
>
> In your POJO, add a data member such as status, and set it to
> error in case the record is invalid. Pass the output of the flatMap
> to the split operator; there you can split the stream.
>
> On Tue, Jan 29, 2019 at 6:39 PM Selvaraj chennappan <
fast and other(c1)
>
> On Tue, Jan 29, 2019 at 2:44 PM Selvaraj chennappan <
> selvarajchennap...@gmail.com> wrote:
>
>> Team,
>>
>> I have two Kafka consumers for the same topic and want to join the second
>> stream to the first after a couple of subtasks' computatio
Use case: We have a Kafka consumer that reads messages (JSON); it then applies a
flatMap for transformation based on the rules (the rules are complex) and
converts each message to a POJO.
We want to verify that the record (POJO) is valid by checking it field by
field. If a record is invalid due to transformation
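The tagging idea suggested in the thread can be sketched in plain Java: the transformation sets a status on each POJO, and a later step routes records by that status (in current Flink this would be a side output in a ProcessFunction rather than the deprecated split operator). The field names below are illustrative.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: route records into valid/invalid groups by a status field
// that the flatMap transformation sets during validation.
public class RecordRouter {
    public static class Record {
        public String customerId;
        public String status; // e.g. "OK" or "ERROR", set by the flatMap
        public Record(String customerId, String status) {
            this.customerId = customerId;
            this.status = status;
        }
    }

    // Keep only records whose status matches the wanted value.
    public static List<Record> route(List<Record> in, String wanted) {
        List<Record> out = new ArrayList<>();
        for (Record r : in) {
            if (wanted.equals(r.status)) out.add(r);
        }
        return out;
    }
}
```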
Team,
I have two Kafka consumers for the same topic and want to join the second
stream to the first after a couple of subtasks' computation in the first
stream, then validate the record. KT - C1, C2
KT - C1 - Transformation (FlatMap) - Dedup - Validate - if valid, save it to DB
   - C2 - Process
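The Dedup step in the pipeline above can be sketched in plain Java: remember which record keys have already been seen and let each key through only once. In a real Flink job this would typically be keyed state (e.g. a ValueState with a TTL) rather than an unbounded in-memory set; the key choice is illustrative.

```java
import java.util.HashSet;
import java.util.Set;

// Sketch of a dedup step: emit a record only the first time its key
// is seen. The set grows without bound here; Flink state with TTL
// would bound it in a real job.
public class Dedup {
    private final Set<String> seen = new HashSet<>();

    // Returns true on the first occurrence of a key, false afterwards.
    public boolean firstTime(String key) {
        return seen.add(key);
    }
}
```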