Hi,

I have an API that emits output that I want to use as a data source for
Flink.

I have written a custom source function, as follows:

import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

import org.apache.flink.streaming.api.functions.source.SourceFunction.SourceContext;
import org.apache.flink.streaming.api.watermark.Watermark;

public class DynamicRuleSource extends AlertingRuleSource {
    // Thread-safe queue: addRule() is called from the API thread while
    // run() drains it from the source thread, so a plain ArrayList is
    // not safe (updates may never become visible to the source thread).
    private final Queue<Rule> rules = new ConcurrentLinkedQueue<>();
    private volatile boolean running = true;

    @Override
    public void run(SourceContext<Rule> ctx) throws Exception {
        System.out.println("In run");
        while (running) {
            Rule rule;
            while ((rule = rules.poll()) != null) {
                ctx.collectWithTimestamp(rule, 0);
                ctx.emitWatermark(new Watermark(0));
            }
            Thread.sleep(1000);
        }
    }

    public void addRule(Rule rule) {
        rules.add(rule);
    }

    @Override
    public void cancel() {
        // Signal the run() loop to exit.
        running = false;
    }
}


When the API is invoked, it calls the addRule method on my
DynamicRuleSource instance.

The run method polls the queue for any new rules to ingest.

The same object instance is shared between the API and the Flink
execution environment; however, the rules added via the API do not get
ingested into the Flink DataStream.
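
For reference, the wiring looks roughly like this (simplified; the
apiServer callback is illustrative, not my actual API code):

// The same DynamicRuleSource instance is handed to both the API
// layer and the Flink job graph.
DynamicRuleSource ruleSource = new DynamicRuleSource();
apiServer.onRuleReceived(ruleSource::addRule);  // hypothetical API hook

StreamExecutionEnvironment env =
    StreamExecutionEnvironment.getExecutionEnvironment();
DataStream<Rule> ruleStream = env.addSource(ruleSource);
env.execute("dynamic-rules");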

Is this the right pattern to use, or is Kafka the recommended way of
streaming data into Flink?

--Aarti

-- 
Aarti Gupta <https://www.linkedin.com/company/qualys>
Director, Engineering, Correlation


aagu...@qualys.com