Re: Conversion of Table (Blink/batch) to DataStream

2020-04-05 Thread Maciek Próchniak
Hi Jark, thanks for the quick answer - I strongly suspected there was a hack like that somewhere, but I couldn't find it easily in the maze of old and new Scala and Java APIs :D For my current experiments it's fine, and I'm sure everything will be cleaned up in the next releases :) best, maciek

Re: Conversion of Table (Blink/batch) to DataStream

2020-04-04 Thread Jark Wu
Hi Maciek, This will be supported in the future. Currently, you can create a `StreamTableEnvironmentImpl` yourself using its constructor (the constructor doesn't restrict batch mode). The SQL CLI does it the same way [1] (even though it's a hack). Best, Jark [1]:
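
For reference, a rough sketch of that constructor-based workaround, assuming Flink 1.10: it mirrors what StreamTableEnvironmentImpl.create does internally (and what the SQL CLI's ExecutionContext does), only without the streaming-mode check. The constructor arguments differ between 1.9 and 1.10 (ModuleManager is 1.10-only), so treat the parameter list as illustrative rather than a stable API:

import java.lang.reflect.Method;
import java.util.Map;

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableConfig;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.table.api.java.internal.StreamTableEnvironmentImpl;
import org.apache.flink.table.catalog.CatalogManager;
import org.apache.flink.table.catalog.FunctionCatalog;
import org.apache.flink.table.catalog.GenericInMemoryCatalog;
import org.apache.flink.table.delegation.Executor;
import org.apache.flink.table.delegation.ExecutorFactory;
import org.apache.flink.table.delegation.Planner;
import org.apache.flink.table.delegation.PlannerFactory;
import org.apache.flink.table.factories.ComponentFactoryService;
import org.apache.flink.table.module.ModuleManager;

public class BatchStreamTableEnvFactory {

    public static StreamTableEnvironment create(StreamExecutionEnvironment execEnv) {
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inBatchMode()
                .build();
        TableConfig tableConfig = new TableConfig();

        CatalogManager catalogManager = new CatalogManager(
                settings.getBuiltInCatalogName(),
                new GenericInMemoryCatalog(
                        settings.getBuiltInCatalogName(),
                        settings.getBuiltInDatabaseName()));
        ModuleManager moduleManager = new ModuleManager();
        FunctionCatalog functionCatalog =
                new FunctionCatalog(tableConfig, catalogManager, moduleManager);

        Map<String, String> executorProps = settings.toExecutorProperties();
        Executor executor = lookupExecutor(executorProps, execEnv);

        Map<String, String> plannerProps = settings.toPlannerProperties();
        Planner planner = ComponentFactoryService.find(PlannerFactory.class, plannerProps)
                .create(plannerProps, executor, tableConfig, functionCatalog, catalogManager);

        // The public factory would reject batch settings before getting here; the
        // constructor itself does not check the mode, which is the "hack" in question.
        return new StreamTableEnvironmentImpl(
                catalogManager,
                moduleManager,
                functionCatalog,
                tableConfig,
                execEnv,
                planner,
                executor,
                settings.isStreamingMode());
    }

    // Same reflective lookup the built-in create() uses: the Blink executor factory
    // exposes a create(Map, StreamExecutionEnvironment) variant.
    private static Executor lookupExecutor(
            Map<String, String> executorProps, StreamExecutionEnvironment execEnv) {
        try {
            ExecutorFactory factory =
                    ComponentFactoryService.find(ExecutorFactory.class, executorProps);
            Method create = factory.getClass()
                    .getMethod("create", Map.class, StreamExecutionEnvironment.class);
            return (Executor) create.invoke(factory, executorProps, execEnv);
        } catch (Exception e) {
            throw new RuntimeException("Could not instantiate the executor", e);
        }
    }
}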

Conversion of Table (Blink/batch) to DataStream

2020-04-04 Thread Maciek Próchniak
Hello, I'm playing around with the Table/SQL API (Flink 1.9/1.10) and I was wondering how I can do the following: 1. read batch data (e.g. from files) 2. sort it using the Table/SQL SortOperator 3. perform further operations using the "normal" DataStream API (treating my batch as a finite stream) - to
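
To make the question concrete, here is a minimal sketch of the intended pipeline, assuming Flink 1.10 with the Blink planner; the table name, schema and connector/format properties are purely illustrative. Steps 1 and 2 work in the unified TableEnvironment, but step 3 is where it stops, because the public factory refuses to build a StreamTableEnvironment in batch mode:

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;

public class BatchSortToDataStream {
    public static void main(String[] args) {
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inBatchMode()
                .build();

        // 1. + 2. Reading files and sorting work in the unified TableEnvironment
        // (connector/format properties below are illustrative, 1.10 legacy style).
        TableEnvironment tableEnv = TableEnvironment.create(settings);
        tableEnv.sqlUpdate(
                "CREATE TABLE events (id BIGINT, ts TIMESTAMP(3)) WITH (" +
                " 'connector.type' = 'filesystem'," +
                " 'connector.path' = '/tmp/events.csv'," +
                " 'format.type' = 'csv')");
        Table sorted = tableEnv.sqlQuery("SELECT * FROM events ORDER BY ts");

        // 3. ...but there is no Table -> DataStream bridge here: toAppendStream /
        // toRetractStream live on StreamTableEnvironment, and creating one with
        // batch-mode settings is rejected by the public factory:
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // StreamTableEnvironment.create(env, settings);  // throws TableException in batch mode
    }
}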