Could you check in the Flink UI whether the input count of the `sink` node is increasing? It is possible that the join actually produces no data.


If you have checked the above and everything looks OK, you can try again with the `print` connector as the sink table and check whether the .out file has data.
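For reference, a `print` sink can be declared like the following. This is a minimal sketch: the sink table name `print_sink` is made up here, and the column types are assumed from the query in the original mail (`delivery_priority` as INT is a guess, since its source table is not shown).

```sql
-- Hypothetical sink using the built-in 'print' connector; every row written
-- to it is printed to the TaskManager's stdout (the .out file).
CREATE TABLE print_sink (
  order_id          BIGINT,
  batch_id          BIGINT,
  customer_id       BIGINT,
  delivery_priority INT
) WITH (
  'connector' = 'print'
);

-- Route the joined result into the print sink instead of converting it
-- to a DataStream, to confirm whether the join emits any rows at all.
INSERT INTO print_sink
SELECT o.id, dbo.batch_id, o.customer_id, dbo.delivery_priority
FROM orders o
INNER JOIN delivery_batch_orders dbo ON o.id = dbo.order_id;
```

If the .out file stays empty with this sink as well, the join itself is producing no matches; if it fills up, the problem is in the Table-to-DataStream conversion.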




--

    Best!
    Xuyang




At 2022-06-02 18:10:28, "Zain Haider Nemati" <zain.hai...@retailo.co> wrote:

Hi,
We are using the Table API to integrate and transform data sources and then convert the result to a DataStream. We want to inspect the data format by adding a .print() sink to the DataStream, but the .out files do not show any output.
We do see records coming in from the metrics in the Flink UI, though. Any suggestions on where to look for potential issues?


code:
tEnv.executeSql("CREATE TABLE orders (\n" +
                "    id  BIGINT,\n" +
                "    customer_id BIGINT\n" +
                ") WITH (\n" +
                "    'connector' = 'kafka',\n" +
                "    'topic'     = 'orders',\n" +
                "    'properties.bootstrap.servers' = '...',\n" +
                "    'scan.startup.mode' = 'earliest-offset',\n" +
                "    'format'    = 'json'\n" +
                ")");
Table result = tEnv.sqlQuery("SELECT o.id AS order_id,\n" +
                "    dbo.batch_id AS batch_id,\n" +
                "    o.customer_id AS customer_id,\n" +
                "    dbo.delivery_priority AS delivery_priority\n" +
                "FROM orders o\n" +
                "INNER JOIN delivery_batch_orders dbo ON o.id = dbo.order_id");
               
tEnv.toAppendStream(result, StringValue.class).print();
env.execute();


Flink Version : 1.13.1
