Feifan Wang created FLINK-29526:
---
Summary: Java doc mistake in SequenceNumberRange#contains()
Key: FLINK-29526
URL: https://issues.apache.org/jira/browse/FLINK-29526
Project: Flink
Issue Type:
zhangjingcun created FLINK-29525:
Summary: Support INSTR, LEFT, RIGHT built-in functions in Table API
Key: FLINK-29525
URL: https://issues.apache.org/jira/browse/FLINK-29525
Project: Flink
zhangjingcun created FLINK-29524:
Summary: Support DECODE, ENCODE built-in functions in Table API
Key: FLINK-29524
URL: https://issues.apache.org/jira/browse/FLINK-29524
Project: Flink
Issue
zhangjingcun created FLINK-29523:
Summary: Support STR_TO_MAP, SUBSTR built-in functions in Table API
Key: FLINK-29523
URL: https://issues.apache.org/jira/browse/FLINK-29523
Project: Flink
zhangjingcun created FLINK-29522:
Summary: Support SPLIT_INDEX built-in function in Table API
Key: FLINK-29522
URL: https://issues.apache.org/jira/browse/FLINK-29522
Project: Flink
Issue
zhangjingcun created FLINK-29521:
Summary: Support REVERSE built-in function in Table API
Key: FLINK-29521
URL: https://issues.apache.org/jira/browse/FLINK-29521
Project: Flink
Issue Type:
zhangjingcun created FLINK-29520:
Summary: Support PARSE_URL built-in function in Table API
Key: FLINK-29520
URL: https://issues.apache.org/jira/browse/FLINK-29520
Project: Flink
Issue Type:
zhangjingcun created FLINK-29519:
Summary: Support DAYOFYEAR, DAYOFMONTH built-in functions in Table API
Key: FLINK-29519
URL: https://issues.apache.org/jira/browse/FLINK-29519
Project: Flink
zhangjingcun created FLINK-29518:
Summary: Support YEAR, QUARTER, MONTH, WEEK, HOUR, MINUTE, SECOND
built-in functions in Table API
Key: FLINK-29518
URL: https://issues.apache.org/jira/browse/FLINK-29518
zhangjingcun created FLINK-29517:
Summary: Support DATE_FORMAT built-in function in Table API
Key: FLINK-29517
URL: https://issues.apache.org/jira/browse/FLINK-29517
Project: Flink
Issue
zhangjingcun created FLINK-29516:
Summary: Support TIMESTAMPADD built-in function in Table API
Key: FLINK-29516
URL: https://issues.apache.org/jira/browse/FLINK-29516
Project: Flink
Issue
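For context, the FLINK-29516 through FLINK-29525 issues above track exposing in the Table API string and date/time functions that already exist in Flink SQL. As a rough illustration of what a few of these functions compute, here is a plain-Python sketch of their semantics as paraphrased from the Flink SQL documentation (this is not Flink code, and edge-case behavior such as NULL handling is simplified):

```python
# Plain-Python sketches of the SQL semantics behind a few of the
# functions named in the issues above (paraphrased, simplified).

def split_index(s, sep, index):
    """SPLIT_INDEX: zero-based element of s split by sep; None if out of range."""
    parts = s.split(sep)
    return parts[index] if 0 <= index < len(parts) else None

def str_to_map(s, pair_sep=",", kv_sep="="):
    """STR_TO_MAP: 'k1=v1,k2=v2' -> {'k1': 'v1', 'k2': 'v2'} (default delimiters)."""
    return dict(pair.split(kv_sep, 1) for pair in s.split(pair_sep))

def instr(s, sub):
    """INSTR: 1-based position of the first occurrence of sub; 0 if absent."""
    return s.find(sub) + 1
```

Today these can typically be reached from the Table API only by falling back to SQL strings; the issues propose first-class expression-DSL support.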
Mason Chen created FLINK-29515:
--
Summary: Document KafkaSource behavior with deleted topics
Key: FLINK-29515
URL: https://issues.apache.org/jira/browse/FLINK-29515
Project: Flink
Issue Type:
Martijn Visser created FLINK-29514:
--
Summary: Bump Minikdc to v3.2.4
Key: FLINK-29514
URL: https://issues.apache.org/jira/browse/FLINK-29514
Project: Flink
Issue Type: Technical Debt
Martijn Visser created FLINK-29513:
--
Summary: Update Kafka version to 3.2.3
Key: FLINK-29513
URL: https://issues.apache.org/jira/browse/FLINK-29513
Project: Flink
Issue Type: Technical Debt
I also noticed that there were two replies in a separate thread on the User
mailing list, which can be found at
https://lists.apache.org/thread/m5ntl3cj81wg7frbfqg9v75c7hqnxtls.
I've included Clayton and David in this email, to at least centralize the
conversation once more :)
On Wed, Oct 5, 2022 at
+1 (non-binding)
- Verified checksums
- Built and ran all the tests
- Verified all pom files point to the same version
- Verified the Helm chart works as expected with the expected Docker image
- Tested basic application clusters (v1.13-v1.15) on EKS 1.21.
On 05/10/2022, 13:16, "Őrhidi Mátyás" wrote:
Thanks Jiabao.
+1 (binding)
On Fri, Sep 30, 2022 at 11:22 AM Martijn Visser
wrote:
> Thanks Jiabao!
> +1 (binding)
>
> Cheers, Martijn
>
> On Fri, Sep 30, 2022 at 11:04 AM jiabao.sun wrote:
>
> > Hi everyone,
> >
> >
> > Thanks for all your feedback for FLIP-262[1]: MongoDB
Hi everyone,
I noticed that we still have a Flink HCatalog connector in the Flink
codebase, which is ancient and undocumented. I don't think this is
necessary anymore, given that we have a Hive connector. I would propose to
remove it.
Looking forward to your feedback.
Thanks,
Martijn
@Maciek
I saw that I missed replying to your question:
> Could you please remind what was the conclusion of discussion on
upgrading Scala to 2.12.15/16?
> https://lists.apache.org/thread/hwksnsqyg7n3djymo7m1s7loymxxbc3t - I
couldn't find any follow-up vote?
There is a vote thread, but that
Fabian Paul created FLINK-29512:
---
Summary: Align SubtaskCommittableManager checkpointId with
CheckpointCommittableManagerImpl checkpointId during recovery
Key: FLINK-29512
URL:
Chesnay Schepler created FLINK-29511:
Summary: Sort properties in OpenAPI spec
Key: FLINK-29511
URL: https://issues.apache.org/jira/browse/FLINK-29511
Project: Flink
Issue Type:
Hi Martin,
Thanks for bringing this up! Lately I was thinking about bumping the Hadoop
version to at least 2.6.1 to clean up issues like this:
Chesnay Schepler created FLINK-29510:
Summary: Add NoticeFileChecker tests
Key: FLINK-29510
URL: https://issues.apache.org/jira/browse/FLINK-29510
Project: Flink
Issue Type: Technical
Fabian Paul created FLINK-29509:
---
Summary: Set correct subtaskId during recovery of committables
Key: FLINK-29509
URL: https://issues.apache.org/jira/browse/FLINK-29509
Project: Flink
Issue
+1 (non-binding)
- Verified source distributions (except the licenses and maven artifacts)
- Verified Helm chart and Docker image
- Verified basic examples
Everything seems okay to me.
Cheers,
Matyas
On Tue, Oct 4, 2022 at 10:27 PM Gyula Fóra wrote:
> +1 (binding)
>
> - Verified Helm repo
Chesnay Schepler created FLINK-29508:
Summary: Some NOTICE files are not checked for correctness
Key: FLINK-29508
URL: https://issues.apache.org/jira/browse/FLINK-29508
Project: Flink
> I'm curious what target Scala versions people are currently interested
in.
> I would've expected that everyone wants to migrate to Scala 3, for which
several wrapper projects around Flink already exist
The Scala 3 tooling is still subpar (we're using IntelliJ), so I'm not sure
I would move my
I had a similar use case.
What we did was decide that the data used for enrichment must be versioned:
for example, our enrichment data was "refreshed" once a day and we kept the
old data.
During the enrichment process we look up the data for a given version,
based on the record's metadata.
Regards.
Krzysztof
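The versioning approach described above can be sketched minimally in Python (all names here are illustrative, not from the original thread): each refresh of the enrichment data is stored under its own version instead of overwriting the previous one, and a record is enriched against the snapshot matching the version carried in its metadata.

```python
# Minimal sketch of versioned enrichment data (illustrative names only):
# each daily "refresh" is stored under its version, and a record is
# enriched from the snapshot matching the version in its metadata.

enrichment_store = {}  # version -> {key: enrichment value}

def refresh(version, data):
    """Store a new snapshot; old versions are kept, not overwritten."""
    enrichment_store[version] = data

def enrich(record):
    """Look up enrichment data for the version in the record's metadata."""
    snapshot = enrichment_store.get(record["version"], {})
    return {**record, "enriched": snapshot.get(record["key"])}

refresh("2022-10-04", {"user1": "gold"})
refresh("2022-10-05", {"user1": "silver"})  # previous day's data is kept
```

With this, records that still reference an older version are enriched consistently with the data that was current when they were produced.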
> It's possible that for the sake of the Scala API, we would
occasionally require some changes in the Java API. As long as those
changes are not detrimental to Java users, they should be considered.
That's exactly the model we're trying to get to. Don't fix
scala-specific issues with scala
Hello everyone,
I've already answered a bit on Twitter, I'll develop my thoughts a bit
here. For context, my company (DataDome) has a significant codebase on
Scala Flink (around 110K LOC), having been using it since 2017. I myself am
an enthusiastic Scala developer (I don't think I'd like moving
Hi everyone,
I started looking into the migration test data generation after I got
confused about it in the context of the 1.16.0-rc1 voting process [1]. The
release manager needs to do some manual work around adding test data for
certain tests after the release is finalized (see item 15 in the
I have a Flink job and the current flow looks like below:
Source_Kafka -> operator-1 (key by partition) -> operator-2 (enrich the
record) -> Sink1-Operator & Sink2-Operator
With this flow, the current problem is at operator-2; the core logic that
runs here is to look up some reference status data from
Gabor Somogyi created FLINK-29507:
-
Summary: Make e2e tests independent of the current directory
Key: FLINK-29507
URL: https://issues.apache.org/jira/browse/FLINK-29507
Project: Flink
Issue
Jane Chan created FLINK-29506:
-
Summary: ParquetInputFormatFactory fails to create format on Flink
1.14
Key: FLINK-29506
URL: https://issues.apache.org/jira/browse/FLINK-29506
Project: Flink