Thanks Dian for kicking off the RC.
+1 from my side:
I heavily tested CDC use cases end-to-end and it works well.
- checked/verified signatures and hashes
- manually verified the diff pom and NOTICE files between 1.11.0 and 1.11.1
to check dependencies, looks good
- no missing artifacts in
>> > .property("group.id", "test-group")
>> > .format(JsonFormat.newInstance()) // shortcut for no parameters
>> > .schema(
>> > Schema.newBuilder()
>> > .column("user_id", DataTypes.BIGINT())
Jark Wu created FLINK-18605:
---
Summary: Refactor Descriptor API (TableEnvironment#connect)
Key: FLINK-18605
URL: https://issues.apache.org/jira/browse/FLINK-18605
Project: Flink
Issue Type: New Feature
FLINK-18573 and FLINK-18581 in 1.11.1.
>
> > On 13 Jul 2020, at 17:52, Chesnay Schepler wrote:
> >
> > It would be good to fix FLINK-18581 as well.
> >
> > On 13/07/2020 11:14, Jark Wu wrote:
> >> Hi Chesnay,
> >>
> >> As discussed in the thread, 1.11.1
> >>>>>> +1, thanks Jark for bringing this up and Dian for volunteering as
> our
> >>>>>> release manager.
> >>>>>>
> >>>>>> Best Regards,
> >>>>>> Yu
> >>>>>>
> >>>>>>
Jark Wu created FLINK-18579:
---
Summary: Remove deprecated classes in flink-connector-jdbc
Key: FLINK-18579
URL: https://issues.apache.org/jira/browse/FLINK-18579
Project: Flink
Issue Type: Task
Jark Wu created FLINK-18557:
---
Summary: Blink planner should only use ReadableConfig instead of
TableConfig
Key: FLINK-18557
URL: https://issues.apache.org/jira/browse/FLINK-18557
Project: Flink
Jark Wu created FLINK-18556:
---
Summary: Drop the unused options in TableConfig
Key: FLINK-18556
URL: https://issues.apache.org/jira/browse/FLINK-18556
Project: Flink
Issue Type: Sub-task
Jark Wu created FLINK-18555:
---
Summary: Make TableConfig options can be configured by
string-based Configuration
Key: FLINK-18555
URL: https://issues.apache.org/jira/browse/FLINK-18555
Project: Flink
> Nevertheless I'd still like
> to see a bit more explanation on the LikeOptions.
>
> On 07/07/2020 04:32, Jark Wu wrote:
> > Hi everyone,
> >
> > Leonard and I prepared a FLIP about refactoring current Descriptor API,
> > i.e. TableEnvironment#connect(). We would l
pare a quick bug-fix version from my side.
>> > >>
>> > >>
>> > >> Best,
>> > >> Leonard Xu
>> > >> [1] user:
>> >
>> http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Flip-105-can-the-
Jark Wu created FLINK-18541:
---
Summary: Verify the Elasticsearch authentication option in end to
end test
Key: FLINK-18541
URL: https://issues.apache.org/jira/browse/FLINK-18541
Project: Flink
Jark Wu created FLINK-18539:
---
Summary: StreamExecutionEnvironment#addSource(SourceFunction,
TypeInformation) doesn't use the user defined type information
Key: FLINK-18539
URL: https://issues.apache.org/jira/browse
Besides, it would be great if we can figure out the performance regression
Thomas reported before.
Do you know what the status is now? @zhijiang
@Thomas
Best,
Jark
On Thu, 9 Jul 2020 at 11:10, Jark Wu wrote:
> Hi everyone,
>
> As discussed in the voting thread of 1.11.0-RC4 [1],
Hi everyone,
As discussed in the voting thread of 1.11.0-RC4 [1], we found a blocker
issue about the CDC feature [1].
Considering this is a new kind of connector, we don't want to block the
ready-to-publish RC4 and prefer to have an immediate 1.11.1 release.
Therefore, I would like to start the
Congratulations!
Thanks Zhijiang and Piotr for the great work as release manager, and thanks
everyone who made the release possible!
Best,
Jark
On Wed, 8 Jul 2020 at 10:12, Paul Lam wrote:
> Finally! Thanks for Piotr and Zhijiang being the release managers, and
> everyone that contributed to
Congratulations Piotr!
Best,
Jark
On Tue, 7 Jul 2020 at 10:50, Yuan Mei wrote:
> Congratulations, Piotr!
>
> On Tue, Jul 7, 2020 at 1:07 AM Stephan Ewen wrote:
>
> > Hi all!
> >
> > It is my pleasure to announce that Piotr Nowojski joined the Flink PMC.
> >
> > Many of you may know Piotr from
Hi everyone,
Leonard and I prepared a FLIP about refactoring current Descriptor API,
i.e. TableEnvironment#connect(). We would like to propose a new descriptor
API to register connectors in Table API.
Since Flink 1.9, the community focused more on the new SQL DDL feature.
After a series of
+1
- started cluster and ran some examples, verified web ui and log output,
nothing unexpected, except ChangelogSocketExample which has been reported
[2] by Dawid.
- started cluster to run e2e SQL queries with millions of records with
Kafka, MySQL, Elasticsearch as sources/lookup/sinks. Works
Jark Wu created FLINK-18486:
---
Summary: Add documentation for the '%' modulus function
Key: FLINK-18486
URL: https://issues.apache.org/jira/browse/FLINK-18486
Project: Flink
Issue Type: Task
Jark Wu created FLINK-18466:
---
Summary: JDBC should fail-fast if the target table doesn't exist
Key: FLINK-18466
URL: https://issues.apache.org/jira/browse/FLINK-18466
Project: Flink
Issue Type
Jark Wu created FLINK-18465:
---
Summary: Support to integrate FLIP-24 source interface with Table
API
Key: FLINK-18465
URL: https://issues.apache.org/jira/browse/FLINK-18465
Project: Flink
Issue
Jark Wu created FLINK-18462:
---
Summary: Improve the exception message when INSERT INTO mismatch
types for empty char
Key: FLINK-18462
URL: https://issues.apache.org/jira/browse/FLINK-18462
Project: Flink
Hi,
I'm very sorry but we just found a blocker issue FLINK-18461 [1] in the new
feature of changelog source (CDC).
This bug results in queries on a changelog source not being insertable into
an upsert sink (e.g. ES, JDBC, HBase),
which is a common case in production. CDC is one of the important
Jark Wu created FLINK-18461:
---
Summary: Changelog source can't be inserted into upsert sink
Key: FLINK-18461
URL: https://issues.apache.org/jira/browse/FLINK-18461
Project: Flink
Issue Type: Bug
Thanks for the feedback Weike,
I have responded under the JIRA issue. Let's keep the discussion in the
issue.
Best,
Jark
On Tue, 30 Jun 2020 at 11:52, DONG, Weike wrote:
> Hi community,
>
> Recently we found out an issue with the Top-N feature in blink planner, and
> here is the JIRA ticket
ot;"" be written as """ src="{% link fig/stream_barriers.svg %} """ ?
>
>
> Thank you.
> Best Roc.
> At 2020-06-17 12:38:38, "Yangze Guo" wrote:
above) to identify the right version works in most cases,
> but probably not in all.
> * What if a table has more than one event-time attribute? (TableSchema is
> designed to support multiple watermarks; queries with interval joins
> produce tables with multiple event-time attributes
Jark Wu created FLINK-18416:
---
Summary: Deprecate TableEnvironment#connect API
Key: FLINK-18416
URL: https://issues.apache.org/jira/browse/FLINK-18416
Project: Flink
Issue Type: Task
> > >>> BEGIN /
> > >>> END.
> > >>>
> > >>> The only commonality is that all three group multiple statements.
> > >>> * BEGIN TRANSACTION / COMMIT creates a transactional context that
> > >>> guarantees atomicit
I'm also +1 for not adding the TEMPORAL keyword.
+1 to make the PRIMARY KEY semantic clear for sources.
From my point of view:
1) PRIMARY KEY on changelog source:
It means that when the changelogs (INSERT/UPDATE/DELETE) are materialized,
the materialized table should be unique on the primary
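Loosely sketched in plain Python (purely illustrative, not Flink code; the row values are made up), the materialization semantics described above amount to upserting by key:

```python
# Illustrative sketch only (plain Python, not Flink): materializing a
# changelog stream so the result stays unique per primary key.

def materialize(changelog):
    """Apply INSERT/UPDATE/DELETE entries to a table keyed by primary key."""
    table = {}
    for op, key, row in changelog:
        if op in ("INSERT", "UPDATE"):
            table[key] = row  # upsert: at most one row per primary key
        elif op == "DELETE":
            table.pop(key, None)
    return table

changes = [
    ("INSERT", 1, "Alice v1"),
    ("UPDATE", 1, "Alice v2"),
    ("INSERT", 2, "Bob"),
    ("DELETE", 2, None),
]
print(materialize(changes))  # {1: 'Alice v2'}
```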
Jark Wu created FLINK-18397:
---
Summary: Translate "Table & SQL Connectors Overview" page into
Chinese
Key: FLINK-18397
URL: https://issues.apache.org/jira/browse/FLINK-18397
Project: Flink
Jark Wu created FLINK-18396:
---
Summary: Translate "Formats Overview" page into Chinese
Key: FLINK-18396
URL: https://issues.apache.org/jira/browse/FLINK-18396
Project: Flink
Issue Type
Jark Wu created FLINK-18393:
---
Summary: Translate "Canal Format" page into Chinese
Key: FLINK-18393
URL: https://issues.apache.org/jira/browse/FLINK-18393
Project: Flink
Issue Type
Jark Wu created FLINK-18395:
---
Summary: Translate "ORC Format" page into Chinese
Key: FLINK-18395
URL: https://issues.apache.org/jira/browse/FLINK-18395
Project: Flink
Issue Type
Jark Wu created FLINK-18394:
---
Summary: Translate "Parquet Format" page into Chinese
Key: FLINK-18394
URL: https://issues.apache.org/jira/browse/FLINK-18394
Project: Flink
Issue Type
Jark Wu created FLINK-18391:
---
Summary: Translate "Avro Format" page into Chinese
Key: FLINK-18391
URL: https://issues.apache.org/jira/browse/FLINK-18391
Project: Flink
Issue Type
Jark Wu created FLINK-18392:
---
Summary: Translate "Debezium Format" page into Chinese
Key: FLINK-18392
URL: https://issues.apache.org/jira/browse/FLINK-18392
Project: Flink
Issue Type
Jark Wu created FLINK-18390:
---
Summary: Translate "JSON Format" page into Chinese
Key: FLINK-18390
URL: https://issues.apache.org/jira/browse/FLINK-18390
Project: Flink
Issue Type
Jark Wu created FLINK-18389:
---
Summary: Translate all SQL connector pages into Chinese
Key: FLINK-18389
URL: https://issues.apache.org/jira/browse/FLINK-18389
Project: Flink
Issue Type: Task
Jark Wu created FLINK-18388:
---
Summary: Translate "CSV Format" page into Chinese
Key: FLINK-18388
URL: https://issues.apache.org/jira/browse/FLINK-18388
Project: Flink
Issue Type
Jark Wu created FLINK-18386:
---
Summary: Translate "Print SQL Connector" page into Chinese
Key: FLINK-18386
URL: https://issues.apache.org/jira/browse/FLINK-18386
Project: Flink
Issue Type
Jark Wu created FLINK-18387:
---
Summary: Translate "BlackHole SQL Connector" page into Chinese
Key: FLINK-18387
URL: https://issues.apache.org/jira/browse/FLINK-18387
Project: Flink
Issue
Jark Wu created FLINK-18385:
---
Summary: Translate "DataGen SQL Connector" page into Chinese
Key: FLINK-18385
URL: https://issues.apache.org/jira/browse/FLINK-18385
Project: Flink
Issue
Jark Wu created FLINK-18384:
---
Summary: Translate "Elasticsearch SQL Connector" page into Chinese
Key: FLINK-18384
URL: https://issues.apache.org/jira/browse/FLINK-18384
Project: Flink
Jark Wu created FLINK-18383:
---
Summary: Translate "JDBC SQL Connector" page into Chinese
Key: FLINK-18383
URL: https://issues.apache.org/jira/browse/FLINK-18383
Project: Flink
Issue Type
Jark Wu created FLINK-18382:
---
Summary: Translate "Kafka SQL Connector" page into Chinese
Key: FLINK-18382
URL: https://issues.apache.org/jira/browse/FLINK-18382
Project: Flink
Issue Type
I'm fine with dropping support for es5.
forward to dev@.
Best,
Jark
On Fri, 19 Jun 2020 at 15:46, jackylau wrote:
> Hi all:
> when i coding the es source connector here
>
> https://github.com/liuyongvs/flink/commit/c397a759d05956629a27bf850458dd4e70330189
> for the elasticsearch source
Jark Wu created FLINK-18375:
---
Summary: Crashed tests at
org.apache.maven.plugin.surefire.booterclient.forkstarter.fork
Key: FLINK-18375
URL: https://issues.apache.org/jira/browse/FLINK-18375
Project: Flink
+1 to support HBase 2.x
But not sure about dropping support for 1.4.x
I cc'ed to user@ and user-zh@ to hear more feedback from users.
Best,
Jark
On Thu, 18 Jun 2020 at 21:25, Gyula Fóra wrote:
> Hi All!
>
> I would like to revive an old ticket
>
+1 for #1
I think this is what we are currently doing, that forward SQL statements to
TableEnv#executeSql, e.g. FLINK-17113, FLINK-18059.
But IMO the SQL CLI specific statements (EXIT, QUIT) should still stay only
in SQL CLI.
Another idea is that the reviewer/committer should check tests are
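The split described above can be illustrated with a toy sketch (hypothetical names, not the actual SQL CLI code): CLI-only commands are handled locally, and everything else is forwarded to TableEnvironment#executeSql:

```python
# Toy sketch (hypothetical names, not actual Flink SQL CLI code): route
# CLI-specific commands locally and forward everything else to executeSql.

CLI_ONLY = {"EXIT", "QUIT"}

def route(statement, execute_sql):
    keyword = statement.strip().rstrip(";").strip().upper()
    if keyword in CLI_ONLY:
        return "handled by CLI"
    return execute_sql(statement)

forwarded = []
route("INSERT INTO sink SELECT * FROM src", forwarded.append)
print(route("quit;", forwarded.append))  # handled by CLI
print(forwarded)  # ['INSERT INTO sink SELECT * FROM src']
```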
Jark Wu created FLINK-18357:
---
Summary: ContinuousFileReaderOperator checkpoint timeout when
files and directories get larger
Key: FLINK-18357
URL: https://issues.apache.org/jira/browse/FLINK-18357
Project
> > URLs
> > which contain the UDF classes.
> > Alternatively, you can try to inject another classloader into
> > PackagedProgram using Reflection (but that's a rather hacky approach).
> >
> > Hope this helps.
> >
> > Cheers, Fabian
> >
> > Am Mi., 17. Juni
.setSavepointRestoreSettings((descriptor.isRecoverFromSavepoint() &&
> descriptor.getSavepointPath() != null &&
> !descriptor.getSavepointPath().equals("")) ?
>
> SavepointRestoreSettings.forPath(descriptor.getSavepointPath(),
> descriptor.isAllowNonRestore
On 15 Jun 2020 at 12:48, Congxian Qiu wrote:
> +1 to use {% link %} tag and add a check during CI.
> for the Chinese docs, I will suggest that translation contributors use the
> {% link %} tag, and check this when reviewing translation PRs.
>
> Best,
> Congxian
>
>
> On Wed, 10 Jun 2020 at 10 AM, Jark Wu wrote:
Hi,
Which Flink version are you using? Are you using SQL CLI? Could you share
your table/sql program?
We did fix some classloading problems around SQL CLI, e.g. FLINK-18302
Best,
Jark
On Wed, 17 Jun 2020 at 10:31, 杜斌 wrote:
> add the full stack trace here:
>
>
> Caused by:
>
>
Congratulations Yu! Well deserved!
Best,
Jark
On Wed, 17 Jun 2020 at 10:18, Haibo Sun wrote:
> Congratulations Yu!
>
> Best,
> Haibo
>
>
> At 2020-06-17 09:15:02, "jincheng sun" wrote:
> >Hi all,
> >
> >On behalf of the Flink PMC, I'm happy to announce that Yu Li is now
> >part of the Apache
Jark Wu created FLINK-18333:
---
Summary: UnsignedTypeConversionITCase failed caused by MariaDB4j
"Asked to waitFor Program"
Key: FLINK-18333
URL: https://issues.apache.org/jira/browse/FLINK-18333
Hi Fabian,
Thanks for starting this discussion. I think this is a very important
syntax to support file mode and multi-statement for SQL Client.
I'm +1 to introduce a syntax to group SQL statements to execute together.
As a reference, traditional database systems also have similar syntax, such
Jark Wu created FLINK-18303:
---
Summary: Filesystem connector doesn't flush part files after
rolling interval
Key: FLINK-18303
URL: https://issues.apache.org/jira/browse/FLINK-18303
Project: Flink
Jark Wu created FLINK-18302:
---
Summary: Sql client uses wrong class loader when execute INSERT
statements
Key: FLINK-18302
URL: https://issues.apache.org/jira/browse/FLINK-18302
Project: Flink
Hi Jacky,
What's your username in wiki? So that I can give the permission to you.
Best,
Jark
On Fri, 12 Jun 2020 at 11:38, Jacky Lau wrote:
> hi all:
> After this simple discussion here
>
>
Jark Wu created FLINK-18254:
---
Summary: Add documentation for primary key syntax
Key: FLINK-18254
URL: https://issues.apache.org/jira/browse/FLINK-18254
Project: Flink
Issue Type: Sub-task
Jark Wu created FLINK-18245:
---
Summary: Support to parse -1 for MemorySize and Duration
ConfigOption
Key: FLINK-18245
URL: https://issues.apache.org/jira/browse/FLINK-18245
Project: Flink
Issue
Jark Wu created FLINK-18240:
---
Summary: Correct remainder function usage in documentation or
allow % operator
Key: FLINK-18240
URL: https://issues.apache.org/jira/browse/FLINK-18240
Project: Flink
+1 to use {% link %} tag and add check in CI.
Tip: if you want to link to a Chinese page, you should write: [CLI]({% link ops/cli.zh.md %})
Best,
Jark
On Wed, 10 Jun 2020 at 10:30, Yangze Guo wrote:
> Thanks for that reminder, Seth!
>
> +1 to add a check during CI if possible.
>
> Best,
> Yangze Guo
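The CI check proposed in the thread above could be as simple as flagging hard-coded `.html` links in the markdown sources (a rough sketch; Flink's actual build check may differ):

```python
# Rough sketch of the proposed CI check (the actual Flink build script may
# differ): flag markdown links that hard-code .html paths instead of using
# the Jekyll {% link %} tag.
import re

# Capture the target of markdown links whose target contains ".html".
HARD_CODED_LINK = re.compile(r"\]\(([^)]*\.html[^)]*)\)")

def find_hardcoded_links(markdown):
    return HARD_CODED_LINK.findall(markdown)

doc = (
    "See [CLI]({{ site.baseurl }}/ops/cli.html) "
    "and [CLI]({% link ops/cli.zh.md %})."
)
print(find_hardcoded_links(doc))  # ['{{ site.baseurl }}/ops/cli.html']
```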
I agree with Dawid and others' opinions.
We may not have enough resources to maintain more languages.
Maybe it's time to investigate better translation/synchronization tools
again.
I want to share some background about the current translation process. In
the initial proposal of Chinese
Hi everyone,
On behalf of the PMC, I'm very happy to announce Benchao Li as a new Apache
Flink committer.
Benchao started contributing to Flink in late 2018. He is very active in
the Flink SQL component, and has also participated in many discussions and
bug fixes. Over the past few months, he helped
Jark Wu created FLINK-18199:
---
Summary: Translate "Filesystem SQL Connector" page into Chinese
Key: FLINK-18199
URL: https://issues.apache.org/jira/browse/FLINK-18199
Project: Flink
Issue
Jark Wu created FLINK-18198:
---
Summary: Translate "HBase SQL Connector"page into Chinese
Key: FLINK-18198
URL: https://issues.apache.org/jira/browse/FLINK-18198
Project: Flink
Issue Type
Congratulations Xintong!
Best,
Jark
On Fri, 5 Jun 2020 at 14:32, Danny Chan wrote:
> Congratulations Xintong !
>
> Best,
> Danny Chan
> On 5 Jun 2020 at 2:20 PM +0800, dev@flink.apache.org wrote:
> >
> > Congratulations Xintong
>
Jark Wu created FLINK-18144:
---
Summary: State TTL configuration can't be set in SQL CLI
Key: FLINK-18144
URL: https://issues.apache.org/jira/browse/FLINK-18144
Project: Flink
Issue Type: Bug
11-1.10.0.jar
> >> >>25M flink-swift-fs-hadoop-1.10.0.jar
> >> >> 160M opt
> >> >>
> >> >> The "filesystem" connectors ar ethe heavy hitters, there.
> >> >>
> >> >> I downloaded most of the SQL c
Jark Wu created FLINK-18141:
---
Summary: Add documentation for Parquet format
Key: FLINK-18141
URL: https://issues.apache.org/jira/browse/FLINK-18141
Project: Flink
Issue Type: Sub-task
Jark Wu created FLINK-18140:
---
Summary: Add documentation for ORC format
Key: FLINK-18140
URL: https://issues.apache.org/jira/browse/FLINK-18140
Project: Flink
Issue Type: Sub-task
Thanks Jacky for starting this discussion.
An Elasticsearch source has been requested in the community many
times. +1 for the feature from my side.
Here are my thoughts:
1. streaming source
As we only support bounded sources for JDBC and HBase, I think it's fine
to have a bounded ES
Jark Wu created FLINK-18135:
---
Summary: Add documentation for the Canal format
Key: FLINK-18135
URL: https://issues.apache.org/jira/browse/FLINK-18135
Project: Flink
Issue Type: Sub-task
Jark Wu created FLINK-18134:
---
Summary: Add documentation for the Debezium format
Key: FLINK-18134
URL: https://issues.apache.org/jira/browse/FLINK-18134
Project: Flink
Issue Type: Sub-task
Jark Wu created FLINK-18131:
---
Summary: Add documentation for the new JSON format
Key: FLINK-18131
URL: https://issues.apache.org/jira/browse/FLINK-18131
Project: Flink
Issue Type: Sub-task
Jark Wu created FLINK-18133:
---
Summary: Add documentation for the new Avro format
Key: FLINK-18133
URL: https://issues.apache.org/jira/browse/FLINK-18133
Project: Flink
Issue Type: Sub-task
Jark Wu created FLINK-18132:
---
Summary: Add documentation for the new CSV format
Key: FLINK-18132
URL: https://issues.apache.org/jira/browse/FLINK-18132
Project: Flink
Issue Type: Sub-task
Jark Wu created FLINK-18118:
---
Summary: Some SQL Jobs with two input operators are losing data
with unaligned checkpoints
Key: FLINK-18118
URL: https://issues.apache.org/jira/browse/FLINK-18118
Project
Jark Wu created FLINK-18093:
---
Summary: E2E tests manually for Elasticsearch Sink
Key: FLINK-18093
URL: https://issues.apache.org/jira/browse/FLINK-18093
Project: Flink
Issue Type: Sub-task
Jark Wu created FLINK-18092:
---
Summary: E2E tests manually for HBase
Key: FLINK-18092
URL: https://issues.apache.org/jira/browse/FLINK-18092
Project: Flink
Issue Type: Sub-task
Components
Jark Wu created FLINK-18086:
---
Summary: Migrate SQLClientKafkaITCase to use DDL and new options
to create tables
Key: FLINK-18086
URL: https://issues.apache.org/jira/browse/FLINK-18086
Project: Flink
Jark Wu created FLINK-17995:
---
Summary: Add a new overview page for the new SQL connectors
Key: FLINK-17995
URL: https://issues.apache.org/jira/browse/FLINK-17995
Project: Flink
Issue Type: Sub
Jark Wu created FLINK-17939:
---
Summary: Translate "Python Table API Installation" page into
Chinese
Key: FLINK-17939
URL: https://issues.apache.org/jira/browse/FLINK-17939
Project: Flink
Jark Wu created FLINK-17909:
---
Summary: Make the GenericInMemoryCatalog to hold the serialized
meta data to uncover more potential bugs
Key: FLINK-17909
URL: https://issues.apache.org/jira/browse/FLINK-17909
Jark Wu created FLINK-17887:
---
Summary: Improve interface of ScanFormatFactory and
SinkFormatFactory
Key: FLINK-17887
URL: https://issues.apache.org/jira/browse/FLINK-17887
Project: Flink
Issue
Thanks Robert for bringing up this. +1 to the proposal.
From my perspective, I'd like to clarify one more thing about "fix
version/s" in this wiki.
IIRC, if a fix is targeted to be fixed in "1.11.0", then it obviously is
fixed in "1.12.0", so such a bug fix should only set "1.11.0", not
Jark Wu created FLINK-17831:
---
Summary: Add documentation for the new Kafka connector
Key: FLINK-17831
URL: https://issues.apache.org/jira/browse/FLINK-17831
Project: Flink
Issue Type: Sub-task
Jark Wu created FLINK-17832:
---
Summary: Add documentation for the new Elasticsearch connector
Key: FLINK-17832
URL: https://issues.apache.org/jira/browse/FLINK-17832
Project: Flink
Issue Type: Sub
Jark Wu created FLINK-17830:
---
Summary: Add documentation for the new HBase connector
Key: FLINK-17830
URL: https://issues.apache.org/jira/browse/FLINK-17830
Project: Flink
Issue Type: Sub-task
Jark Wu created FLINK-17829:
---
Summary: Add documentation for the new JDBC connector
Key: FLINK-17829
URL: https://issues.apache.org/jira/browse/FLINK-17829
Project: Flink
Issue Type: Sub-task
Hi Roopal,
I think you may not have subscribed to the dev mailing list; that's why you
can't see Seth's reply.
You can subscribe dev mailing list by sending email to
dev-subscr...@flink.apache.org
I copied Seth's reply here, hope it can help you:
Hi Divya,
I think you may not have subscribed to the dev mailing list; that's why you
can't see Seth's reply.
You can subscribe dev mailing list by sending email to
dev-subscr...@flink.apache.org
I copied Seth's reply here, hope it can help you:
Jark Wu created FLINK-17826:
---
Summary: Add missing custom query support on new jdbc connector
Key: FLINK-17826
URL: https://issues.apache.org/jira/browse/FLINK-17826
Project: Flink
Issue Type: Sub
Jark Wu created FLINK-17807:
---
Summary: Fix the broken link "/zh/ops/memory/mem_detail.html" in
documentation
Key: FLINK-17807
URL: https://issues.apache.org/jira/browse/FLINK-17807
Proj
Jark Wu created FLINK-17798:
---
Summary: Align the behavior between the new and legacy JDBC table
source
Key: FLINK-17798
URL: https://issues.apache.org/jira/browse/FLINK-17798
Project: Flink
Issue
Jark Wu created FLINK-17797:
---
Summary: Align the behavior between the new and legacy HBase table
source
Key: FLINK-17797
URL: https://issues.apache.org/jira/browse/FLINK-17797
Project: Flink