[Note: You're receiving this email because you are subscribed to one or
more project dev@ mailing lists at the Apache Software Foundation.]
This is your final reminder that the Call for Presentations for
Community Over Code (formerly known as ApacheCon) is closing soon, on
Thursday, 13 July 2023.
The call for presentations is now open at
https://apachecon.com/acasia2023/cfp.html, and will close on Sunday,
June 18th, 2023, 11:59 PM GMT.
The event will
The Call for Presentations (CFP) for Community Over Code (formerly
ApacheCon) 2023 is open at
https://communityovercode.org/call-for-presentations/, and will close
Thu,
Dear Apache Project Management Committee (PMC) members,
The Board wants to take just a moment of your time to communicate a few
things that seem to have been forgotten by a number of PMC members,
across the Foundation, over the past few years. Please note that this
is being sent to all projects
This is your final reminder that the Call for Presentations for
ApacheCon North America 2022 will close at 00:01 GMT on Monday, May
23rd, 2022. Please don't wait! Get your talk
to:
Community
Big Data
Search
IoT
Cloud
Fintech
Pulsar
Tomcat
You can submit your session proposals starting today at
https://cfp.apachecon.com/
Rich Bowen, on behalf of the ApacheCon Planners
apachecon.com
@apachecon
. Submit your talks today at
https://acna2020.jamhosted.net/
We hope to see you at the event!
Rich Bowen, VP Conferences, The Apache Software Foundation
Bowen Li created FLINK-17392:
Summary: enable configuring minicluster in Flink SQL in IDE
Key: FLINK-17392
URL: https://issues.apache.org/jira/browse/FLINK-17392
Project: Flink
Issue Type
Bowen Li created FLINK-17333:
Summary: add doc for "create ddl"
Key: FLINK-17333
URL: https://issues.apache.org/jira/browse/FLINK-17333
Project: Flink
Issue Type: I
Bowen Li created FLINK-17175:
Summary: StringUtils.arrayToString() should consider Object[]
lastly
Key: FLINK-17175
URL: https://issues.apache.org/jira/browse/FLINK-17175
Project: Flink
Issue
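The FLINK-17175 title only hints at the underlying issue, but "consider Object[] lastly" points at a well-known Java dispatch pitfall worth illustrating. This is a standalone sketch under my own assumptions, not the actual Flink patch: every reference-array type is an `instanceof Object[]`, so a check against `Object[]` placed before the more specific array checks would shadow them all.

```java
// Illustrative only: why an Object[] type check must come last.
public class ArrayDispatch {
    static String describe(Object o) {
        if (o instanceof int[]) return "int[]";
        if (o instanceof byte[][]) return "byte[][]";
        // Object[] matches EVERY array of reference type (including
        // byte[][] and String[]), so it must be the final array check.
        if (o instanceof Object[]) return "Object[]";
        return "other";
    }

    public static void main(String[] args) {
        System.out.println(describe(new byte[2][2]));      // byte[][]
        System.out.println(describe(new String[]{"a"}));   // Object[]
    }
}
```

If the `Object[]` branch were moved above the `byte[][]` branch, the nested array would silently take the generic path, which is consistent with the ordering concern the ticket title describes.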
Bowen Li created FLINK-17037:
Summary: add e2e tests for reading array data types from postgres
with JDBCTableSource and PostgresCatalog
Key: FLINK-17037
URL: https://issues.apache.org/jira/browse/FLINK-17037
Bowen Li created FLINK-16820:
Summary: support reading array of timestamp, date, and time in
JDBCTableSource
Key: FLINK-16820
URL: https://issues.apache.org/jira/browse/FLINK-16820
Project: Flink
Bowen Li created FLINK-16817:
Summary: StringUtils.arrayToString() doesn't convert byte[][]
correctly
Key: FLINK-16817
URL: https://issues.apache.org/jira/browse/FLINK-16817
Project: Flink
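The ticket gives only a title, but the `byte[][]` case it names maps onto a standard JDK behavior that can be shown in isolation (this sketch is mine, not the Flink fix): `Arrays.toString` renders nested primitive arrays as identity strings like `[B@1a2b3c`, whereas `Arrays.deepToString` recurses into them.

```java
import java.util.Arrays;

// Illustrative only: shallow vs. deep stringification of byte[][].
public class NestedArrayToString {
    static String shallow(byte[][] a) { return Arrays.toString(a); }
    static String deep(byte[][] a) { return Arrays.deepToString(a); }

    public static void main(String[] args) {
        byte[][] a = {{1, 2}, {3}};
        System.out.println(shallow(a)); // e.g. [[B@7852e922, [B@4e25154f]
        System.out.println(deep(a));    // [[1, 2], [3]]
    }
}
```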
Bowen Li created FLINK-16816:
Summary: planner doesn't parse timestamp array correctly
Key: FLINK-16816
URL: https://issues.apache.org/jira/browse/FLINK-16816
Project: Flink
Issue Type: Bug
Bowen Li created FLINK-16815:
Summary: add e2e tests for reading from postgres with
JDBCTableSource and PostgresCatalog
Key: FLINK-16815
URL: https://issues.apache.org/jira/browse/FLINK-16815
Project
Bowen Li created FLINK-16814:
Summary: StringUtils.arrayToString() doesn't convert byte[]
correctly
Key: FLINK-16814
URL: https://issues.apache.org/jira/browse/FLINK-16814
Project: Flink
Issue
Bowen Li created FLINK-16813:
Summary: JDBCInputFormat doesn't correctly map Short
Key: FLINK-16813
URL: https://issues.apache.org/jira/browse/FLINK-16813
Project: Flink
Issue Type: Sub-task
Bowen Li created FLINK-16812:
Summary: introduce PostgresRowConverter
Key: FLINK-16812
URL: https://issues.apache.org/jira/browse/FLINK-16812
Project: Flink
Issue Type: Sub-task
Bowen Li created FLINK-16811:
Summary: introduce JDBCRowConverter
Key: FLINK-16811
URL: https://issues.apache.org/jira/browse/FLINK-16811
Project: Flink
Issue Type: Improvement
Bowen Li created FLINK-16810:
Summary: add back PostgresCatalogITCase
Key: FLINK-16810
URL: https://issues.apache.org/jira/browse/FLINK-16810
Project: Flink
Issue Type: Sub-task
Bowen Li created FLINK-16781:
Summary: add built-in cache mechanism for LookupableTableSource in
lookup join
Key: FLINK-16781
URL: https://issues.apache.org/jira/browse/FLINK-16781
Project: Flink
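FLINK-16781 proposes a built-in cache for `LookupableTableSource` in lookup joins. As a sketch of the caching idea only (my own illustration with hypothetical names, not the Flink implementation), a size-bounded LRU cache falls out of `LinkedHashMap` in access order:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative only: a size-bounded LRU cache of the kind a lookup join
// could place in front of a dimension-table source. Not a Flink API.
public class LookupCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public LookupCache(int maxEntries) {
        super(16, 0.75f, true); // access-order => least-recently-used first
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries; // evict LRU entry once over capacity
    }

    public static void main(String[] args) {
        LookupCache<String, Integer> cache = new LookupCache<>(2);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.get("a");    // touch "a" so "b" becomes least recently used
        cache.put("c", 3); // capacity exceeded: "b" is evicted
        System.out.println(cache.keySet()); // [a, c]
    }
}
```

A production cache for lookup joins would also need a TTL, since dimension-table rows can change; the ticket's "built-in" framing suggests handling that inside the runtime rather than in each connector.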
Bowen Li created FLINK-16780:
Summary: improve Flink lookup join
Key: FLINK-16780
URL: https://issues.apache.org/jira/browse/FLINK-16780
Project: Flink
Issue Type: New Feature
or bring this up. Generally, it's a very good proposal.
> > >
> > > About data gen source, do you think we need to add more columns with
> > > various types?
> > >
> > > About print sink, do we need to specify the schema?
> > >
> > >
.
Cheers,
Bowen
On Thu, Mar 19, 2020 at 10:32 PM Jingsong Li wrote:
> Hi all,
>
> I heard some users complain that table is difficult to test. Now with SQL
> client, users are more and more inclined to use it to test rather than
> program.
> The most common example is Kafka sou
Bowen Li created FLINK-16702:
Summary: develop JDBCCatalogFactory for service discovery
Key: FLINK-16702
URL: https://issues.apache.org/jira/browse/FLINK-16702
Project: Flink
Issue Type: Sub
code,
such common implementations can be moved to AbstractCatalog to make APIs
look cleaner. I recall that there was an intention to refactor catalog API
signatures, but haven't kept up with it.
Bowen
On Sun, Mar 15, 2020 at 10:19 PM Jingsong Li wrote:
> Thanks Flavio for driving. Personally I
Bowen Li created FLINK-16575:
Summary: develop HBaseCatalog to integrate HBase metadata into
Flink
Key: FLINK-16575
URL: https://issues.apache.org/jira/browse/FLINK-16575
Project: Flink
Issue
gt; Best,
> > Danny Chan
> > On Mar 11, 2020 at 4:03 PM +0800, Timo Walther wrote:
> >> Hi Danny,
> >>
> >> it is true that our DDL is not standard compliant by using the WITH
> >> clause. Nevertheless, we aim for not diverging too much
y some rare type of catalog can store k-v property pair, so
table created this way often cannot be persisted. In the foreseeable
future, such catalog will only be HiveCatalog, and not everyone has a Hive
metastore. To be honest, without persistence, recreating tables every time
this way is still
Bowen Li created FLINK-16498:
Summary: make Postgres table work end-2-end in Flink SQL with
PostgresJDBCCatalog
Key: FLINK-16498
URL: https://issues.apache.org/jira/browse/FLINK-16498
Project: Flink
Bowen Li created FLINK-16474:
Summary: develop OracleJDBCCatalog to connect Flink with Oracle
databases and ecosystem
Key: FLINK-16474
URL: https://issues.apache.org/jira/browse/FLINK-16474
Project
Bowen Li created FLINK-16473:
Summary: add documentation for PostgresJDBCCatalog
Key: FLINK-16473
URL: https://issues.apache.org/jira/browse/FLINK-16473
Project: Flink
Issue Type: Sub-task
Bowen Li created FLINK-16472:
Summary: support precision of timestamp and time data types
Key: FLINK-16472
URL: https://issues.apache.org/jira/browse/FLINK-16472
Project: Flink
Issue Type: Sub
Bowen Li created FLINK-16471:
Summary: develop PostgresJDBCCatalog
Key: FLINK-16471
URL: https://issues.apache.org/jira/browse/FLINK-16471
Project: Flink
Issue Type: Sub-task
Hi Jingsong,
I think I misunderstood you. So your argument is that, to support hive
1.0.0 - 1.2.2, we are actually using Hive 1.2.2 and thus we name the flink
module as "flink-connector-hive-1.2", right? It makes sense to me now.
+1 for this change.
Cheers,
Bowen
On Thu, Mar 5, 2020
Bowen Li created FLINK-16448:
Summary: add documentation for Hive table sink parallelism setting
strategy
Key: FLINK-16448
URL: https://issues.apache.org/jira/browse/FLINK-16448
Project: Flink
name it "flink-connector-hive-1.0", a name including the
lowest Hive version it supports.
What do you think?
On Wed, Mar 4, 2020 at 11:14 PM Jingsong Li wrote:
> Hi Bowen, thanks for your reply.
>
> > will there be a base module like "flink-connector-hive-
e.g. for
Hive 1.0.0 - 1.2.2, the module name can be "flink-connector-hive-1.0"
rather than "flink-connector-hive-1.2"
On Wed, Mar 4, 2020 at 10:20 PM Jingsong Li wrote:
> Thanks Bowen for involving.
>
> > why you proposed segregating hive versions into the 5 ranges a
I'm glad to announce that the voting of FLIP-93 has passed, with 7 +1 (3
binding: Jingsong, Kurt, Jark, 4 non-binding: Benchao, zoudan, Terry,
Leonard) and no -1.
Thanks everyone for participating!
Cheers,
Bowen
On Mon, Mar 2, 2020 at 7:33 AM Leonard Xu wrote:
> +1 (non-binding).
>
>
please explain:
1) why you proposed segregating hive versions into the 5 ranges above?
2) what different Hive features are supported in the 5 ranges?
3) have you tested that whether the proposed corresponding Flink module
will be fully compatible with each Hive version range?
Thanks,
Bowen
On Wed
ula Fóra wrote:
>
> > You are right but still if the default catalog is something else and
> > that's the one containing the table then it still wont work currently.
> >
> > Gyula
> >
> > On Wed, Mar 4, 2020 at 5:08 AM Bowen Li wrote:
> >
>
you
described.
Bowen
On Tue, Mar 3, 2020 at 5:16 AM Gyula Fóra wrote:
> Hi all!
>
> I was testing the TemporalTable functionality in the SQL client while using
> the Hive Catalog and I ran into the following problem.
>
> I have a table created in the Hive catalog and I want to
Hi all,
I'd like to kick off the vote for FLIP-93 [1] to add JDBC catalog and
Postgres catalog.
The vote will last for at least 72 hours, following the consensus voting
protocol.
[1]
https://cwiki.apache.org/confluence/display/FLINK/FLIP-93%3A+JDBC+catalog+and+Postgres+catalog
Discussion
Congrats, Jingsong!
On Fri, Feb 21, 2020 at 7:28 AM Till Rohrmann wrote:
> Congratulations Jingsong!
>
> Cheers,
> Till
>
> On Fri, Feb 21, 2020 at 4:03 PM Yun Gao wrote:
>
>> Congratulations Jingsong!
>>
>> Best,
>> Yun
>>
>>
at 11:05 AM Bowen Li wrote:
> Hi Flavio,
>
> First, this is a generic question on how flink-jdbc is set up, not
> specific to jdbc catalog, thus is better to be on its own thread.
>
> But to just quickly answer your question, you need to see where the
> incompat
Bowen Li created FLINK-16107:
Summary: github link on statefun.io should point to
https://github.com/apache/flink-statefun
Key: FLINK-16107
URL: https://issues.apache.org/jira/browse/FLINK-16107
Project
Bowen Li created FLINK-16028:
Summary: hbase connector's 'connector.table-name' property should
be optional rather than required
Key: FLINK-16028
URL: https://issues.apache.org/jira/browse/FLINK-16028
Bowen Li created FLINK-16027:
Summary: kafka connector's 'connector.topic' property should be
optional rather than required
Key: FLINK-16027
URL: https://issues.apache.org/jira/browse/FLINK-16027
Project
Bowen Li created FLINK-16024:
Summary: support filter pushdown in jdbc connector
Key: FLINK-16024
URL: https://issues.apache.org/jira/browse/FLINK-16024
Project: Flink
Issue Type: Improvement
Bowen Li created FLINK-16023:
Summary: jdbc connector's 'connector.table' property should be
optional rather than required
Key: FLINK-16023
URL: https://issues.apache.org/jira/browse/FLINK-16023
Project
Bowen Li created FLINK-15986:
Summary: support setting or changing session properties in Flink
SQL
Key: FLINK-15986
URL: https://issues.apache.org/jira/browse/FLINK-15986
Project: Flink
Issue
Bowen Li created FLINK-15985:
Summary: offload runtime params from DDL to table hints in
DML/queries
Key: FLINK-15985
URL: https://issues.apache.org/jira/browse/FLINK-15985
Project: Flink
Issue
Bowen Li created FLINK-15984:
Summary: support hive stream table sink
Key: FLINK-15984
URL: https://issues.apache.org/jira/browse/FLINK-15984
Project: Flink
Issue Type: New Feature
Bowen Li created FLINK-15983:
Summary: add native reader for Hive parquet files
Key: FLINK-15983
URL: https://issues.apache.org/jira/browse/FLINK-15983
Project: Flink
Issue Type: New Feature
Bowen Li created FLINK-15960:
Summary: support creating Hive tables, views, functions within
Flink
Key: FLINK-15960
URL: https://issues.apache.org/jira/browse/FLINK-15960
Project: Flink
Issue
Bowen Li created FLINK-15933:
Summary: update content of how generic table schema is stored in
hive via HiveCatalog
Key: FLINK-15933
URL: https://issues.apache.org/jira/browse/FLINK-15933
Project: Flink
+1, LGTM
On Tue, Feb 4, 2020 at 11:28 PM Jark Wu wrote:
> +1 from my side.
> Thanks for driving this.
>
> Btw, could you also attach a JIRA issue with the changes described in it,
> so that users can find the issue through the mailing list in the future.
>
> Best,
> Jark
>
> On Wed, 5 Feb 2020
Bowen Li created FLINK-15809:
Summary: component stack page needs to be updated for blink planner
Key: FLINK-15809
URL: https://issues.apache.org/jira/browse/FLINK-15809
Project: Flink
Issue
congrats!
On Thu, Jan 23, 2020 at 07:49 Kostas Kloudas wrote:
> Congratulations Yu and welcome!
>
> On Thu, Jan 23, 2020 at 2:28 PM Till Rohrmann
> wrote:
> >
> > Congrats Yu :-)
> >
> > On Thu, Jan 23, 2020 at 2:02 PM Yang Wang wrote:
> >
> > > Congratulations, Yu.
> > >
> > >
> > > Best,
>
+1. Thanks Hequn
On Wed, Jan 22, 2020 at 8:39 AM Till Rohrmann wrote:
> Thanks for resuming the discussion Hequn. +1 for starting with the RC
> creation. Thanks for driving the release process!
>
> Cheers,
> Till
>
> On Tue, Jan 21, 2020 at 11:02 PM jincheng sun
> wrote:
>
> > Cool, looking
be a rare situation (not in my experience
> however..) but what if I have to connect to the same type of source (e.g.
> Mysql) with 2 incompatible version...? How can I load the 2 (or more)
> connectors jars without causing conflicts?
>
> Il Mar 14 Gen 2020, 23:32 Bowen Li h
Dear Apache enthusiast,
The call for presentations for ApacheCon North America 2020 is now open
at https://apachecon.com/acna2020/cfp
ApacheCon will be held at
Bowen Li created FLINK-15645:
Summary: enable COPY TO/FROM in Postgres JDBC source/sink for
faster batch processing
Key: FLINK-15645
URL: https://issues.apache.org/jira/browse/FLINK-15645
Project: Flink
Congrats!
On Thu, Jan 16, 2020 at 13:45 Peter Huang
wrote:
> Congratulations, Dian!
>
>
> Best Regards
> Peter Huang
>
> On Thu, Jan 16, 2020 at 11:04 AM Yun Tang wrote:
>
>> Congratulations, Dian!
>>
>> Best
>> Yun Tang
>> --
>> *From:* Benchao Li
>> *Sent:*
Hi Jingsong,
The 1st and 2nd pain points you described are very valid, as I'm more
familiar with them. I agree these are shortcomings of the current Flink SQL
design.
A couple comments on your 1st proposal:
1. is it better to have explicit APIs like "createBatchTableSource(...)"
and
Bowen Li created FLINK-15607:
Summary: throw exception when users trying to use Hive aggregate
functions in streaming mode
Key: FLINK-15607
URL: https://issues.apache.org/jira/browse/FLINK-15607
Project
Bowen Li created FLINK-15593:
Summary: add doc to remind users not using Hive aggregate
functions in streaming mode
Key: FLINK-15593
URL: https://issues.apache.org/jira/browse/FLINK-15593
Project: Flink
Bowen Li created FLINK-15591:
Summary: support CREATE TEMPORARY TABLE/VIEW in DDL
Key: FLINK-15591
URL: https://issues.apache.org/jira/browse/FLINK-15591
Project: Flink
Issue Type: Task
Bowen Li created FLINK-15590:
Summary: add section for current catalog and current database
Key: FLINK-15590
URL: https://issues.apache.org/jira/browse/FLINK-15590
Project: Flink
Issue Type
Bowen Li created FLINK-15589:
Summary: remove beta tag from catalog and hive doc
Key: FLINK-15589
URL: https://issues.apache.org/jira/browse/FLINK-15589
Project: Flink
Issue Type: Task
Hi devs,
I've updated the wiki according to feedbacks. Please take another look.
Thanks!
On Fri, Jan 10, 2020 at 2:24 PM Bowen Li wrote:
> Thanks everyone for the prompt feedback. Please see my response below.
>
> > In Postgress, the TIME/TIMESTAMP WITH TIME ZONE has the
> ja
Bowen Li created FLINK-15588:
Summary: check registered udf via catalog API cannot be a scala
inner class
Key: FLINK-15588
URL: https://issues.apache.org/jira/browse/FLINK-15588
Project: Flink
Bowen Li created FLINK-15576:
Summary: remove isTemporary property from CatalogFunction API
Key: FLINK-15576
URL: https://issues.apache.org/jira/browse/FLINK-15576
Project: Flink
Issue Type: Bug
Hi Zhenghua,
For external systems with schema, I think the schema information is
available most of the time and should be the single source of truth for
programmatically mapping column precision via Flink catalogs, to minimize
users' efforts in re-creating schemas and avoid any human
mple to avoid any possible confusion.
> 'default-database' is optional, then which database will be used or what
is the behavior when the default database is not selected.
This should be DBMS specific. For postgres, it will be the
database.
On Thu, Jan 9, 2020 at 9:48 PM Zhenghua Gao wr
Bowen Li created FLINK-15545:
Summary: Separate runtime params and semantics params from Flink
DDL for easier integration with catalogs and better user experience
Key: FLINK-15545
URL: https://issues.apache.org/jira
moment we can only use some default params for some cases, and the other
cases cannot take advantage of the JDBC catalog and users still have to
write DDL manually.
Thanks,
Bowen
On Wed, Jan 8, 2020 at 7:46 PM Jingsong Li wrote:
> Thanks Bowen for driving this,
>
> +1 for this, The
Hi Yijie,
There's just one more concern on the yaml configs. Otherwise, I think we
should be good to go.
Can you update your PR and ensure all tests pass? I can help review and
merge in the next couple weeks.
Thanks,
Bowen
On Mon, Dec 23, 2019 at 7:03 PM Yijie Shen
wrote:
> Hi Bo
databases like Postgres, MySQL, MariaDB, AWS Aurora, etc.
Note that the problem and solution are actually very general to Flink when
connecting to all kinds of external systems. We just focus on solving that
for relational databases in this FLIP.
Thanks,
Bowen
[1]
https://cwiki.apache.org
+1. It will improve user experience quite a bit.
On Thu, Jan 2, 2020 at 22:07 Yangze Guo wrote:
> Thanks for driving this, Xiaoling!
>
> +1 for supporting SQL client gateway.
>
> Best,
> Yangze Guo
>
>
> On Thu, Jan 2, 2020 at 9:58 AM 贺小令 wrote:
> >
> > Hey everyone,
> > FLIP-24
> >
Bowen Li created FLINK-15411:
Summary: HiveCatalog can't prune partition on DATE/TIMESTAMP
columns
Key: FLINK-15411
URL: https://issues.apache.org/jira/browse/FLINK-15411
Project: Flink
Issue
Bowen Li created FLINK-15376:
Summary: support "CREATE TABLE AS" in Flink SQL
Key: FLINK-15376
URL: https://issues.apache.org/jira/browse/FLINK-15376
Project: Flink
Issue Type: N
Bowen Li created FLINK-15351:
Summary: develop PostgresJDBCCatalog
Key: FLINK-15351
URL: https://issues.apache.org/jira/browse/FLINK-15351
Project: Flink
Issue Type: Sub-task
Bowen Li created FLINK-15353:
Summary: develop AbstractJDBCCatalog
Key: FLINK-15353
URL: https://issues.apache.org/jira/browse/FLINK-15353
Project: Flink
Issue Type: Sub-task
Bowen Li created FLINK-15352:
Summary: develop MySQLJDBCCatalog
Key: FLINK-15352
URL: https://issues.apache.org/jira/browse/FLINK-15352
Project: Flink
Issue Type: Sub-task
Components
Bowen Li created FLINK-15350:
Summary: develop JDBCCatalog to connect to relational databases
Key: FLINK-15350
URL: https://issues.apache.org/jira/browse/FLINK-15350
Project: Flink
Issue Type
Bowen Li created FLINK-15349:
Summary: add Catalog DDL support in FLIP-69
Key: FLINK-15349
URL: https://issues.apache.org/jira/browse/FLINK-15349
Project: Flink
Issue Type: New Feature
Bowen Li created FLINK-15348:
Summary: Fix orc optimization for version less than 2.3 by
introducing orc shim
Key: FLINK-15348
URL: https://issues.apache.org/jira/browse/FLINK-15348
Project: Flink
Really cool. I especially like the list of tags on "Ververica Platform"!
BTW, why is "Ververica Platform" placed at the last? I won't feel bothered
if we move it to the top.
On Thu, Dec 19, 2019 at 5:56 PM Seth Wiesman wrote:
> I'm not sure, I think most all the options other than EMR abstract
- integrate PyFlink with Jupyter notebook
- Description: users should be able to run PyFlink seamlessly in Jupyter
- Benefits: Jupyter is the industrial standard notebook for data
scientists. I’ve talked to a few companies in North America, they think
Jupyter is the #1 way to empower
Bowen Li created FLINK-15303:
Summary: support predicate pushdown for sources in hive connector
Key: FLINK-15303
URL: https://issues.apache.org/jira/browse/FLINK-15303
Project: Flink
Issue Type
Bowen Li created FLINK-15302:
Summary: properties in create table DDL need to be backward
compatible
Key: FLINK-15302
URL: https://issues.apache.org/jira/browse/FLINK-15302
Project: Flink
I'm not sure providing an uber jar would be possible.
Different from the kafka and elasticsearch connectors, which have
dependencies for a specific kafka/elastic version, or the universal kafka
connector that provides good compatibility, the hive connector needs to
deal with hive jars across all of 1.x, 2.x, 3.x
Bowen Li created FLINK-15263:
Summary: add dedicated page for HiveCatalog
Key: FLINK-15263
URL: https://issues.apache.org/jira/browse/FLINK-15263
Project: Flink
Issue Type: Task
Bowen Li created FLINK-15262:
Summary: kafka connector doesn't read from beginning immediately
when 'connector.startup-mode' = 'earliest-offset'
Key: FLINK-15262
URL: https://issues.apache.org/jira/browse/FLINK
Bowen Li created FLINK-15261:
Summary: add dedicated documentation for blink planner
Key: FLINK-15261
URL: https://issues.apache.org/jira/browse/FLINK-15261
Project: Flink
Issue Type: Task
Bowen Li created FLINK-15259:
Summary: HiveInspector.toInspectors() should convert Flink
constant to Hive constant
Key: FLINK-15259
URL: https://issues.apache.org/jira/browse/FLINK-15259
Project: Flink
Bowen Li created FLINK-15258:
Summary: HiveModuleFactory doesn't take hive-version
Key: FLINK-15258
URL: https://issues.apache.org/jira/browse/FLINK-15258
Project: Flink
Issue Type: Bug
Congrats!
On Fri, Dec 13, 2019 at 10:42 AM Xuefu Z wrote:
> Congratulations, Zhu Zhu!
>
> On Fri, Dec 13, 2019 at 10:37 AM Peter Huang
> wrote:
>
> > Congratulations!:)
> >
> > On Fri, Dec 13, 2019 at 9:45 AM Piotr Nowojski
> > wrote:
> >
> > > Congratulations! :)
> > >
> > > > On 13 Dec
Bowen Li created FLINK-15257:
Summary: convert HiveCatalogITCase.testCsvTableViaAPI() to use
blink planner
Key: FLINK-15257
URL: https://issues.apache.org/jira/browse/FLINK-15257
Project: Flink