Hi all,
If there's no more comments, I would like to kick off a vote for this FLIP
[1].
FYI, the FLIP number has been changed to 93, since there was a race condition in
taking 92.
[1]
https://cwiki.apache.org/confluence/display/FLINK/FLIP-93%3A+JDBC+catalog+and+Postgres+catalog
On Wed, Jan 22, 2020 at
Hi Flavio,
First, this is a generic question about how flink-jdbc is set up, not specific
to the jdbc catalog, so it's better discussed in its own thread.
But to quickly answer your question: you need to see where the
incompatibility is. There may be incompatibility in 1) the jdbc drivers and 2)
the databa
Hi all,
I'm happy to see a lot of interest in easing the integration with JDBC data
sources. Maybe this is a rare situation (not in my experience
however..), but what if I have to connect to the same type of source (e.g.
MySQL) with 2 incompatible versions? How can I load the 2 (or more)
con
Hi devs,
I've updated the wiki according to feedbacks. Please take another look.
Thanks!
On Fri, Jan 10, 2020 at 2:24 PM Bowen Li wrote:
> Thanks everyone for the prompt feedback. Please see my response below.
>
> > In Postgres, the TIME/TIMESTAMP WITH TIME ZONE has the
> java.time.Instant s
Thanks everyone for the prompt feedback. Please see my response below.
> In Postgres, the TIME/TIMESTAMP WITH TIME ZONE has the java.time.Instant
semantic, and should be mapped to Flink's TIME/TIMESTAMP WITH LOCAL TIME
ZONE
Zhenghua, you are right that pg's 'timestamp with timezone' should be
tr
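The temporal type mapping being debated here can be sketched as a small lookup. This is an illustrative sketch only: `PgTypeMapping` and `toFlinkType` are hypothetical names, not the FLIP's actual API, and the `timestamptz`/`timetz` rows simply follow the Instant-semantics argument made above.

```java
// Illustrative sketch: map a few Postgres type names to Flink SQL type
// strings. Names are hypothetical, not the FLIP's actual classes.
class PgTypeMapping {
    static String toFlinkType(String pgType) {
        switch (pgType.toLowerCase()) {
            case "time":        return "TIME";
            case "timetz":      return "TIME WITH LOCAL TIME ZONE";      // Instant semantics
            case "timestamp":   return "TIMESTAMP";                      // wall-clock, no zone
            case "timestamptz": return "TIMESTAMP WITH LOCAL TIME ZONE"; // Instant semantics
            default:
                throw new IllegalArgumentException("unmapped Postgres type: " + pgType);
        }
    }
}
```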
Hi Bowen, Thanks for driving this.
I think it would be very convenient to use tables in external DBs with a
JDBC Catalog.
I have one concern about "Flink-Postgres Data Type Mapping" part:
In Postgres, the TIME/TIMESTAMP WITH TIME ZONE has the java.time.Instant
semantic,
and should be mapped to Fl
Hi Bowen, thanks for the reply and the update.
> I don't see much value in providing a builder for jdbc catalogs, as they
only have 4 or 5 required params and no optional ones. I prefer that users just
provide a base url without the default db, username, or pwd, so we don't need to
parse the url all around, as I mentioned j
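The base-url idea above can be sketched as follows; `JdbcUrlUtil` and `dbUrl` are hypothetical names used only for illustration, not the FLIP's actual API.

```java
// Hypothetical sketch: keep a base url without a database name, and append
// the database only when building a concrete connection url, so the url
// never has to be parsed apart again.
class JdbcUrlUtil {
    static String dbUrl(String baseUrl, String database) {
        // tolerate a base url given with or without a trailing slash
        String normalized = baseUrl.endsWith("/") ? baseUrl : baseUrl + "/";
        return normalized + database;
    }
}
```

For example, `dbUrl("jdbc:postgresql://localhost:5432", "mydb")` yields `jdbc:postgresql://localhost:5432/mydb`.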
Thanks Bowen for bringing up this discussion ~
I think the JDBC catalog is a useful feature.
Just one question about the "Flink-Postgres Metaspace Mapping" part:
Since PostgreSQL does not have catalogs, but schemas under databases, why not
map the PG-database to a Flink catalog and the PG-schema
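One way to flatten Postgres's three-level `<database>.<schema>.<table>` namespace into Flink's `<catalog>.<database>.<table>` is to keep the PG database as the Flink database and fold the schema into the table name, omitting the default `public` schema. A hedged sketch, with illustrative names only:

```java
// Illustrative sketch: fold the Postgres schema into the Flink table name,
// leaving out the default "public" schema. Not the FLIP's actual API.
class PgNameMapping {
    static final String DEFAULT_SCHEMA = "public";

    static String toFlinkTableName(String pgSchema, String pgTable) {
        return DEFAULT_SCHEMA.equals(pgSchema) ? pgTable : pgSchema + "." + pgTable;
    }
}
```

Under this scheme a table in `public` is addressed by its bare name, while any other schema becomes a dotted prefix of the table name.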
Thanks Bowen for the reply,
A user-facing JDBCCatalog and 'catalog.type' = 'jdbc' sounds good to me.
I have some other minor comments when I went through the updated
documentation:
1) 'base_url' configuration: We are following the configuration format
guideline [1], which suggests using a dash (-)
Hi Jark and Jingsong,
Thanks for your review. Please see my reply in line.
> why introducing a `PostgresJDBCCatalog`, not a generic `JDBCCatalog`
(catalog.type = 'postgres' vs 'jdbc') ?
Thanks for the reminder; I looked at JDBCDialect. A generic,
user-facing JDBCCatalog with catalog.type = j
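The dialect-dispatch idea discussed here is analogous to how the jdbc connector resolves a dialect from the connection url. A minimal sketch, in which every name (`DialectCatalogSketch`, `forBaseUrl`, etc.) is illustrative rather than the FLIP's actual classes:

```java
// Hedged sketch: a single generic entry point that picks a dialect-specific
// implementation from the base url, analogous to JDBCDialect resolution.
interface DialectCatalogSketch {
    String dialect();
}

class PostgresCatalogSketch implements DialectCatalogSketch {
    public String dialect() { return "postgres"; }
}

class MySqlCatalogSketch implements DialectCatalogSketch {
    public String dialect() { return "mysql"; }
}

class JdbcCatalogSketch {
    static DialectCatalogSketch forBaseUrl(String baseUrl) {
        if (baseUrl.startsWith("jdbc:postgresql:")) return new PostgresCatalogSketch();
        if (baseUrl.startsWith("jdbc:mysql:"))      return new MySqlCatalogSketch();
        throw new IllegalArgumentException("no dialect for url: " + baseUrl);
    }
}
```

With this shape, users only ever see the generic entry point, and adding a new database means adding one dialect-specific class plus a url prefix check.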
Thanks Bowen for driving this,
+1 for this. The DDL schema definition is a headache for users, and a catalog
is a solution to this problem.
I have some questions and suggestions:
- We can provide a Builder for the Catalog. In my opinion, defaultDatabase,
username, and pwd can be included in the JDBC DB url.
Thanks Bowen for driving this.
+1 to this feature.
My concern is that why introducing a `PostgresJDBCCatalog`, not a generic
`JDBCCatalog` (catalog.type = 'postgres' vs 'jdbc') ?
From my understanding, JDBC catalog is similar to JDBC source/sink. For
JDBC source/sink, we have a generic
implement
Hi dev,
I'd like to kick off a discussion on adding JDBC catalogs, specifically
Postgres catalog in Flink [1].
Currently users have to manually create schemas in Flink source/sink
mirroring tables in their relational databases in use cases like JDBC
read/write and consuming CDC. Many users have c