[jira] [Created] (FLINK-34446) regression: alias is not in lateral join

2024-02-15 Thread Jing Ge (Jira)
Jing Ge created FLINK-34446:
---

 Summary: regression: alias is not in lateral join
 Key: FLINK-34446
 URL: https://issues.apache.org/jira/browse/FLINK-34446
 Project: Flink
  Issue Type: Bug
Reporter: Jing Ge


Found a regression: the query below works on Flink 1.17.2 but fails with Flink
1.18+.

 
{code:sql}
-- Query works on Flink 1.17.2, but fails with Flink 1.18+
-- [ERROR] Could not execute SQL statement. Reason:
-- org.apache.calcite.sql.validate.SqlValidatorException: Table 's' not found

SELECT
a_or_b,
id, 
splits
FROM sample as s ,
LATERAL TABLE(split(s.id,'[01]')) lt(splits)
CROSS JOIN (VALUES ('A'), ('B')) AS cj(a_or_b); {code}
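For context, the {{split}} table function in the query is presumably a UDTF that splits {{s.id}} on the regex {{[01]}}. A rough shell illustration of the expected splits, using a hypothetical id value:

```shell
# Hypothetical id "a0b1c": replacing each 0/1 with a newline mimics a '[01]' regex split
echo "a0b1c" | tr '01' '\n\n'
# prints a, b and c on separate lines
```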



--
This message was sent by Atlassian Jira
(v8.20.10#820010)


[jira] [Created] (FLINK-34236) Evaluate strange unstable build after cleaning up CI machines

2024-01-25 Thread Jing Ge (Jira)
Jing Ge created FLINK-34236:
---

 Summary: Evaluate strange unstable build after cleaning up CI 
machines
 Key: FLINK-34236
 URL: https://issues.apache.org/jira/browse/FLINK-34236
 Project: Flink
  Issue Type: Improvement
  Components: Test Infrastructure
Reporter: Jing Ge


Check whether this is a one-time issue caused by the infra change or not.

 

https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=56601&view=logs&j=e9d3d34f-3d15-59f4-0e3e-35067d100dfe&t=5d91035e-8022-55f2-2d4f-ab121508bf7e





[jira] [Created] (FLINK-34165) It seems that Apache download link has been changed

2024-01-18 Thread Jing Ge (Jira)
Jing Ge created FLINK-34165:
---

 Summary: It seems that Apache download link has been changed
 Key: FLINK-34165
 URL: https://issues.apache.org/jira/browse/FLINK-34165
 Project: Flink
  Issue Type: Bug
  Components: flink-docker
Affects Versions: 1.18.1, 1.17.2, 1.16.3, 1.15.4
Reporter: Jing Ge
 Attachments: image-2024-01-19-07-55-07-775.png

The link
[https://www.apache.org/dist/flink/flink-1.17.2/flink-1.17.2-bin-scala_2.12.tgz.asc][1]
worked previously but now redirects to a listing page, which leads to a wrong
flink.tgz.asc containing HTML instead of the expected signature.

!image-2024-01-19-07-55-07-775.png!

The link should be replaced with
https://downloads.apache.org/flink/flink-1.17.2/flink-1.17.2-bin-scala_2.12.tgz.asc

 

[1] 
https://github.com/apache/flink-docker/blob/627987997ca7ec86bcc3d80b26df58aa595b91af/1.17/scala_2.12-java11-ubuntu/Dockerfile#L48C19-L48C101





[jira] [Created] (FLINK-34075) Mark version as released in Jira (need PMC role)

2024-01-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-34075:
---

 Summary: Mark version as released in Jira (need PMC role)
 Key: FLINK-34075
 URL: https://issues.apache.org/jira/browse/FLINK-34075
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge


In JIRA, inside [version 
management|https://issues.apache.org/jira/plugins/servlet/project-config/FLINK/versions],
 hover over the current release and a settings menu will appear. Click Release, 
and select today’s date.

(Note: Only PMC members have access to the project administration. If you do 
not have access, ask on the mailing list for assistance.)

 

h3. Expectations
 * Release tagged in the source code repository
 * Release version finalized in JIRA. (Note: Not all committers have 
administrator access to JIRA. If you end up getting permissions errors ask on 
the mailing list for assistance)





[jira] [Created] (FLINK-34074) Verify artifacts related expectations

2024-01-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-34074:
---

 Summary: Verify artifacts related expectations
 Key: FLINK-34074
 URL: https://issues.apache.org/jira/browse/FLINK-34074
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge


h3. Expectations
 * Maven artifacts released and indexed in the [Maven Central 
Repository|https://search.maven.org/#search%7Cga%7C1%7Cg%3A%22org.apache.flink%22]
 (usually takes about a day to show up)
 * Source & binary distributions available in the release repository of 
[https://dist.apache.org/repos/dist/release/flink/]
 * Dev repository [https://dist.apache.org/repos/dist/dev/flink/] is empty
 * Website contains links to new release binaries and sources in download page
 * (for minor version updates) the front page references the correct new major 
release version and directs to the correct link





[jira] [Created] (FLINK-34073) Remove old release candidates from dist.apache.org

2024-01-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-34073:
---

 Summary: Remove old release candidates from dist.apache.org
 Key: FLINK-34073
 URL: https://issues.apache.org/jira/browse/FLINK-34073
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge
Assignee: Jing Ge


h3. Remove old release candidates from [dist.apache.org|http://dist.apache.org/]

Remove the old release candidates from 
[https://dist.apache.org/repos/dist/dev/flink] using Subversion.
{code:bash}
$ svn checkout https://dist.apache.org/repos/dist/dev/flink --depth=immediates
$ cd flink
$ svn remove flink-${RELEASE_VERSION}-rc*
$ svn commit -m "Remove old release candidates for Apache Flink ${RELEASE_VERSION}"
{code}
 





[jira] [Created] (FLINK-33886) CLONE - Build and stage Java and Python artifacts

2023-12-19 Thread Jing Ge (Jira)
Jing Ge created FLINK-33886:
---

 Summary: CLONE - Build and stage Java and Python artifacts
 Key: FLINK-33886
 URL: https://issues.apache.org/jira/browse/FLINK-33886
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge
Assignee: Jing Ge


# Create a local release branch ((!) this step cannot be skipped for minor
releases):
{code:bash}
$ cd ./tools
tools/ $ OLD_VERSION=$CURRENT_SNAPSHOT_VERSION NEW_VERSION=$RELEASE_VERSION 
RELEASE_CANDIDATE=$RC_NUM releasing/create_release_branch.sh
{code}
 # Tag the release commit:
{code:bash}
$ git tag -s ${TAG} -m "${TAG}"
{code}
 # We now need to do several things:
 ## Create the source release archive
 ## Deploy jar artifacts to the [Apache Nexus
Repository|https://repository.apache.org/], which is the staging area for
deploying the jars to Maven Central
 ## Build PyFlink wheel packages
You might want to create a directory on your local machine for collecting the 
various source and binary releases before uploading them. Creating the binary 
releases is a lengthy process but you can do this on another machine (for 
example, in the "cloud"). When doing this, you can skip signing the release 
files on the remote machine, download them to your local machine and sign them 
there.
 # Build the source release:
{code:bash}
tools $ RELEASE_VERSION=$RELEASE_VERSION releasing/create_source_release.sh
{code}
 # Stage the maven artifacts:
{code:bash}
tools $ releasing/deploy_staging_jars.sh
{code}
Review all staged artifacts ([https://repository.apache.org/]). They should 
contain all relevant parts for each module, including pom.xml, jar, test jar, 
source, test source, javadoc, etc. Carefully review any new artifacts.
 # Close the staging repository on Apache Nexus. When prompted for a 
description, enter “Apache Flink, version X, release candidate Y”.
Then, you need to build the PyFlink wheel packages (since 1.11):
 # Set up an Azure pipeline in your own Azure account. You can refer to [Azure
Pipelines|https://cwiki.apache.org/confluence/display/FLINK/Azure+Pipelines#AzurePipelines-Tutorial:SettingupAzurePipelinesforaforkoftheFlinkrepository]
 for more details on how to set up an Azure pipeline for a fork of the Flink
repository. Note that a Google Cloud mirror in Europe is used for downloading
Maven artifacts; it is therefore recommended to set your [Azure organization
region|https://docs.microsoft.com/en-us/azure/devops/organizations/accounts/change-organization-location]
 to Europe to speed up the downloads.
 # Push the release candidate branch to your forked personal Flink repository, 
e.g.
{code:bash}
tools $ git push  
refs/heads/release-${RELEASE_VERSION}-rc${RC_NUM}:release-${RELEASE_VERSION}-rc${RC_NUM}
{code}
 # Trigger the Azure Pipelines manually to build the PyFlink wheel packages
 ## Go to your Azure Pipelines Flink project → Pipelines
 ## Click the "New pipeline" button on the top right
 ## Select "GitHub" → your GitHub Flink repository → "Existing Azure Pipelines 
YAML file"
 ## Select your branch → Set path to "/azure-pipelines.yaml" → click on 
"Continue" → click on "Variables"
 ## Then click "New Variable" button, fill the name with "MODE", and the value 
with "release". Click "OK" to set the variable and the "Save" button to save 
the variables, then back on the "Review your pipeline" screen click "Run" to 
trigger the build.
 ## You should now see a build where only the "CI build (release)" is running
 # Download the PyFlink wheel packages from the build result page after the 
jobs of "build_wheels mac" and "build_wheels linux" have finished.
 ## Download the PyFlink wheel packages
 ### Open the build result page of the pipeline
 ### Go to the {{Artifacts}} page (build_wheels linux -> 1 artifact)
 ### Click {{wheel_Darwin_build_wheels mac}} and {{wheel_Linux_build_wheels 
linux}} separately to download the zip files
 ## Unzip these two zip files
{code:bash}
$ cd /path/to/downloaded_wheel_packages
$ unzip wheel_Linux_build_wheels\ linux.zip
$ unzip wheel_Darwin_build_wheels\ mac.zip
{code}
 ## Create directory {{./dist}} under the {{flink-python}} directory:
{code:bash}
$ cd 
$ mkdir flink-python/dist
{code}
 ## Move the unzipped wheel packages to the {{flink-python/dist}} directory:
{code:bash}
$ mv /path/to/wheel_Darwin_build_wheels\ mac/* flink-python/dist/
$ mv /path/to/wheel_Linux_build_wheels\ linux/* flink-python/dist/
$ cd tools
{code}

Finally, we create the binary convenience release files:
{code:bash}
tools $ RELEASE_VERSION=$RELEASE_VERSION releasing/create_binary_release.sh
{code}
If you want to run this step in parallel on a remote machine you have to make 
the release commit available there (for example by pushing to a repository). 
*This is important: the commit inside the binary builds has to match the commit 
of the source builds and the tagged release commit.* 
When building remotely, you can skip gpg signing there, download the files to
your local machine, and sign them locally (as described above).

[jira] [Created] (FLINK-33889) CLONE - Vote on the release candidate

2023-12-19 Thread Jing Ge (Jira)
Jing Ge created FLINK-33889:
---

 Summary: CLONE - Vote on the release candidate
 Key: FLINK-33889
 URL: https://issues.apache.org/jira/browse/FLINK-33889
 Project: Flink
  Issue Type: Sub-task
Affects Versions: 1.18.0
Reporter: Jing Ge
Assignee: Jing Ge


Once you have built and individually reviewed the release candidate, please 
share it for the community-wide review. Please review foundation-wide [voting 
guidelines|http://www.apache.org/foundation/voting.html] for more information.

Start the review-and-vote thread on the dev@ mailing list. Here’s an email 
template; please adjust as you see fit.
{quote}From: Release Manager
To: dev@flink.apache.org
Subject: [VOTE] Release 1.2.3, release candidate #3

Hi everyone,
Please review and vote on the release candidate #3 for the version 1.2.3, as 
follows:
[ ] +1, Approve the release
[ ] -1, Do not approve the release (please provide specific comments)

The complete staging area is available for your review, which includes:
 * JIRA release notes [1],
 * the official Apache source release and binary convenience releases to be 
deployed to dist.apache.org [2], which are signed with the key with fingerprint 
 [3],
 * all artifacts to be deployed to the Maven Central Repository [4],
 * source code tag "release-1.2.3-rc3" [5],
 * website pull request listing the new release and adding announcement blog 
post [6].

The vote will be open for at least 72 hours. It is adopted by majority 
approval, with at least 3 PMC affirmative votes.

Thanks,
Release Manager

[1] link
[2] link
[3] [https://dist.apache.org/repos/dist/release/flink/KEYS]
[4] link
[5] link
[6] link
{quote}
*If there are any issues found in the release candidate, reply on the vote 
thread to cancel the vote.* There’s no need to wait 72 hours. Proceed to the 
Fix Issues step below and address the problem. However, some issues don’t 
require cancellation. For example, if an issue is found in the website pull 
request, just correct it on the spot and the vote can continue as-is.

For cancelling a release, the release manager needs to send an email to the 
release candidate thread, stating that the release candidate is officially 
cancelled. Next, all artifacts created specifically for the RC in the previous 
steps need to be removed:
 * Delete the staging repository in Nexus
 * Remove the source / binary RC files from dist.apache.org
 * Delete the source code tag in git

*If there are no issues, reply on the vote thread to close the voting.* Then, 
tally the votes in a separate email. Here’s an email template; please adjust as 
you see fit.
{quote}From: Release Manager
To: dev@flink.apache.org
Subject: [RESULT] [VOTE] Release 1.2.3, release candidate #3

I'm happy to announce that we have unanimously approved this release.

There are XXX approving votes, XXX of which are binding:
 * approver 1
 * approver 2
 * approver 3
 * approver 4

There are no disapproving votes.

Thanks everyone!
{quote}
 

h3. Expectations
 * Community votes to release the proposed candidate, with at least three 
approving PMC votes

Any issues raised before the vote closes should either be resolved or moved
into the next release (if applicable).





[jira] [Created] (FLINK-33888) CLONE - Propose a pull request for website updates

2023-12-19 Thread Jing Ge (Jira)
Jing Ge created FLINK-33888:
---

 Summary: CLONE - Propose a pull request for website updates
 Key: FLINK-33888
 URL: https://issues.apache.org/jira/browse/FLINK-33888
 Project: Flink
  Issue Type: Sub-task
Affects Versions: 1.18.0
Reporter: Jing Ge
Assignee: Jing Ge


The final step of building the candidate is to propose a website pull request 
containing the following changes:
 * update docs/data/flink.yml
 ** Add a new major version or update minor version as required
 * update docs/data/release_archive.yml
 * update version references in quickstarts ({{{}q/{}}} directory) as required 
(outdated?)
 * add a blog post announcing the release in {{docs/content/posts}}

(!) Don’t merge the PRs before finalizing the release.

 

h3. Expectations
 * Website pull request proposed to list the 
[release|http://flink.apache.org/downloads.html]





[jira] [Created] (FLINK-33887) CLONE - Stage source and binary releases on dist.apache.org

2023-12-19 Thread Jing Ge (Jira)
Jing Ge created FLINK-33887:
---

 Summary: CLONE - Stage source and binary releases on 
dist.apache.org
 Key: FLINK-33887
 URL: https://issues.apache.org/jira/browse/FLINK-33887
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge
Assignee: Jing Ge


Copy the source release to the dev repository of dist.apache.org:
# If you have not already, check out the Flink section of the dev repository on 
dist.apache.org via Subversion. In a fresh directory:
{code:bash}
$ svn checkout https://dist.apache.org/repos/dist/dev/flink --depth=immediates
{code}
# Make a directory for the new release and copy all the artifacts (Flink 
source/binary distributions, hashes, GPG signatures and the python 
subdirectory) into that newly created directory:
{code:bash}
$ mkdir flink/flink-${RELEASE_VERSION}-rc${RC_NUM}
$ mv /tools/releasing/release/* 
flink/flink-${RELEASE_VERSION}-rc${RC_NUM}
{code}
# Add and commit all the files.
{code:bash}
$ cd flink
flink $ svn add flink-${RELEASE_VERSION}-rc${RC_NUM}
flink $ svn commit -m "Add flink-${RELEASE_VERSION}-rc${RC_NUM}"
{code}
# Verify that files are present under 
[https://dist.apache.org/repos/dist/dev/flink|https://dist.apache.org/repos/dist/dev/flink].
# Push the release tag if not done already (the following command assumes it is
called from within the apache/flink checkout):
{code:bash}
$ git push  refs/tags/release-${RELEASE_VERSION}-rc${RC_NUM}
{code}

 

h3. Expectations
 * Maven artifacts deployed to the staging repository of 
[repository.apache.org|https://repository.apache.org/content/repositories/]
 * Source distribution deployed to the dev repository of 
[dist.apache.org|https://dist.apache.org/repos/dist/dev/flink/]
 * Check hashes (e.g. shasum -c *.sha512)
 * Check signatures (e.g. {{gpg --verify
flink-1.2.3-source-release.tar.gz.asc flink-1.2.3-source-release.tar.gz}})
 * {{grep}} for legal headers in each file.
 * If time allows check the NOTICE files of the modules whose dependencies have 
been changed in this release in advance, since the license issues from time to 
time pop up during voting. See [Verifying a Flink 
Release|https://cwiki.apache.org/confluence/display/FLINK/Verifying+a+Flink+Release]
 "Checking License" section.
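The hash check from the expectations above can be exercised end-to-end with a stand-in file ({{sha512sum}} is used here; {{shasum -a 512 -c}} is equivalent; the file name is hypothetical):

```shell
# Create a stand-in artifact, record its SHA-512, then verify it as in the release checks
tmp=$(mktemp -d) && cd "$tmp"
echo "stand-in artifact" > flink-1.2.3-src.tgz
sha512sum flink-1.2.3-src.tgz > flink-1.2.3-src.tgz.sha512
sha512sum -c flink-1.2.3-src.tgz.sha512
# prints: flink-1.2.3-src.tgz: OK
```

For the real release files, the same `-c` invocation is run once per downloaded `.sha512` file.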





[jira] [Created] (FLINK-33885) Build Release Candidate: 1.18.1-rc2

2023-12-19 Thread Jing Ge (Jira)
Jing Ge created FLINK-33885:
---

 Summary: Build Release Candidate: 1.18.1-rc2
 Key: FLINK-33885
 URL: https://issues.apache.org/jira/browse/FLINK-33885
 Project: Flink
  Issue Type: New Feature
Affects Versions: 1.18.0
Reporter: Jing Ge
Assignee: Jing Ge


The core of the release process is the build-vote-fix cycle. Each cycle 
produces one release candidate. The Release Manager repeats this cycle until 
the community approves one release candidate, which is then finalized.

h4. Prerequisites
Set up a few environment variables to simplify Maven commands that follow. This 
identifies the release candidate being built. Start with {{RC_NUM}} equal to 1 
and increment it for each candidate:
{code}
RC_NUM="1"
TAG="release-${RELEASE_VERSION}-rc${RC_NUM}"
{code}
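With hypothetical values plugged in, the variables expand to a tag name like:

```shell
# Hypothetical example values for a second release candidate of 1.18.1
RELEASE_VERSION="1.18.1"
RC_NUM="2"
TAG="release-${RELEASE_VERSION}-rc${RC_NUM}"
echo "$TAG"
# prints: release-1.18.1-rc2
```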





[jira] [Created] (FLINK-33882) UT/IT for checkpointing statistics

2023-12-19 Thread Jing Ge (Jira)
Jing Ge created FLINK-33882:
---

 Summary: UT/IT for checkpointing statistics
 Key: FLINK-33882
 URL: https://issues.apache.org/jira/browse/FLINK-33882
 Project: Flink
  Issue Type: Improvement
  Components: Runtime / Checkpointing
Reporter: Jing Ge


https://issues.apache.org/jira/browse/FLINK-33588

has been manually tested by [~zhutong66] as follows:
1. Package the modified code; the change ends up in the flink-dist-xxx.jar. Replace that jar in the production Flink client.
2. Submit a Flink SQL task to Yarn in application mode in the production environment and check the Yarn logs.
3. Check the Yarn logs for any further errors.
4. In the Flink web UI, check that the data displayed on the checkpoint statistics page is correct.

It would be great to write a UT or IT for this change.





[jira] [Created] (FLINK-33851) CLONE - Start End of Life discussion thread for now outdated Flink minor version

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33851:
---

 Summary: CLONE - Start End of Life discussion thread for now 
outdated Flink minor version
 Key: FLINK-33851
 URL: https://issues.apache.org/jira/browse/FLINK-33851
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge


The idea is to discuss in the community whether we should do a final release
for the now-unsupported minor version. Such a release shouldn't be covered by
the current minor version release managers; their only responsibility is to
trigger the discussion.

The intention of a final patch release for the now unsupported Flink minor 
version is to flush out all the fixes that didn't end up in the previous 
release.





[jira] [Created] (FLINK-33848) CLONE - Other announcements

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33848:
---

 Summary: CLONE - Other announcements
 Key: FLINK-33848
 URL: https://issues.apache.org/jira/browse/FLINK-33848
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge


h3. Recordkeeping

Use [reporter.apache.org|https://reporter.apache.org/addrelease.html?flink] to 
seed the information about the release into future project reports.

(Note: Only PMC members have access to report releases. If you do not have
access, ask on the mailing list for assistance.)
h3. Flink blog

Major or otherwise important releases should have a blog post. Write one if 
needed for this particular release. Minor releases that don’t introduce new 
major functionality don’t necessarily need to be blogged (see [flink-web PR 
#581 for Flink 1.15.3|https://github.com/apache/flink-web/pull/581] as an 
example for a minor release blog post).

Please make sure that the release notes of the documentation (see section 
"Review and update documentation") are linked from the blog post of a major 
release.
We usually include the names of all contributors in the announcement blog post. 
Use the following command to get the list of contributors:
{code}
# LC_ALL=C is required so that sort orders uppercase before lowercase
export LC_ALL=C
export FLINK_PREVIOUS_RELEASE_BRANCH=
export FLINK_CURRENT_RELEASE_BRANCH=
# e.g.
# export FLINK_PREVIOUS_RELEASE_BRANCH=release-1.17
# export FLINK_CURRENT_RELEASE_BRANCH=release-1.18
git log $(git merge-base master $FLINK_PREVIOUS_RELEASE_BRANCH)..$(git show-ref 
--hash ${FLINK_CURRENT_RELEASE_BRANCH}) --pretty=format:"%an%n%cn" | sort  -u | 
paste -sd, | sed "s/\,/\, /g"
{code}
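The sort/paste/sed tail of the pipeline above just dedupes, sorts, and comma-joins the names. For illustration, applied to a few made-up names:

```shell
# Dedupe + sort (LC_ALL=C), then join with ", " — same post-processing as above
export LC_ALL=C
printf 'Bob\nAlice\nBob\nCarol\n' | sort -u | paste -sd, - | sed 's/,/, /g'
# prints: Alice, Bob, Carol
```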
h3. Social media

Tweet, post on Facebook, LinkedIn, and other platforms. Ask other contributors 
to do the same.
h3. Flink Release Wiki page

Add a summary of things that went well or not so well during the release
process. This can include feedback from contributors as well as more general
observations, e.g. that the release took longer than initially anticipated (and
why), to give some context to the release process.





[jira] [Created] (FLINK-33845) CLONE - Merge website pull request

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33845:
---

 Summary: CLONE - Merge website pull request
 Key: FLINK-33845
 URL: https://issues.apache.org/jira/browse/FLINK-33845
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge


Merge the website pull request to [list the 
release|http://flink.apache.org/downloads.html]. Make sure to regenerate the 
website as well, as it isn't built automatically.





[jira] [Created] (FLINK-33844) CLONE - Update japicmp configuration

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33844:
---

 Summary: CLONE - Update japicmp configuration
 Key: FLINK-33844
 URL: https://issues.apache.org/jira/browse/FLINK-33844
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge
Assignee: Sergey Nuyanzin
 Fix For: 1.19.0, 1.18.1


Update the japicmp reference version and wipe exclusions / enable API 
compatibility checks for {{@PublicEvolving}} APIs on the corresponding SNAPSHOT 
branch with the {{update_japicmp_configuration.sh}} script (see below).

For a new major release (x.y.0), run the same command also on the master branch 
for updating the japicmp reference version and removing outdated exclusions in 
the japicmp configuration.

Make sure that all Maven artifacts are already pushed to Maven Central. 
Otherwise, there's a risk that CI fails due to missing reference artifacts.
{code:bash}
tools $ NEW_VERSION=$RELEASE_VERSION releasing/update_japicmp_configuration.sh
tools $ cd ..
$ git add *
$ git commit -m "Update japicmp configuration for $RELEASE_VERSION"
{code}





[jira] [Created] (FLINK-33847) CLONE - Apache mailing lists announcements

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33847:
---

 Summary: CLONE - Apache mailing lists announcements
 Key: FLINK-33847
 URL: https://issues.apache.org/jira/browse/FLINK-33847
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge


Announce on the {{dev@}} mailing list that the release has been finished.

Announce the release on the {{user@}} mailing list, listing major
improvements and contributions.

Announce the release on the [announce@apache.org|mailto:announce@apache.org]
mailing list.
|{{From: Release Manager}}
{{To: dev@flink.apache.org, user@flink.apache.org, user-zh@flink.apache.org,
announce@apache.org}}
{{Subject: [ANNOUNCE] Apache Flink 1.2.3 released}}
 
{{The Apache Flink community is very happy to announce the release of Apache 
Flink 1.2.3, which is the third bugfix release for the Apache Flink 1.2 
series.}}
 
{{Apache Flink® is an open-source stream processing framework for distributed, 
high-performing, always-available, and accurate data streaming applications.}}
 
{{The release is available for download at:}}
{{[https://flink.apache.org/downloads.html]}}
 
{{Please check out the release blog post for an overview of the improvements 
for this bugfix release:}}
{{}}
 
{{The full release notes are available in Jira:}}
{{}}
 
{{We would like to thank all contributors of the Apache Flink community who 
made this release possible!}}
 
{{Feel free to reach out to the release managers (or respond to this thread) 
with feedback on the release process. Our goal is to constantly improve the 
release process. Feedback on what could be improved or things that didn't go so 
well are appreciated.}}
 
{{Regards,}}
{{Release Manager}}|





[jira] [Created] (FLINK-33850) CLONE - Updates the docs stable version

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33850:
---

 Summary: CLONE - Updates the docs stable version
 Key: FLINK-33850
 URL: https://issues.apache.org/jira/browse/FLINK-33850
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge


Update docs to "stable" in {{docs/config.toml}} in the branch of the 
_just-released_ version:
 * Change {{Version}} from {{x.y-SNAPSHOT}} to {{x.y.z}}, e.g.
{{1.6-SNAPSHOT}} to {{1.6.0}}
 * Change {{VersionTitle}} from {{x.y-SNAPSHOT}} to {{x.y}}, e.g.
{{1.6-SNAPSHOT}} to {{1.6}}
 * Change {{Branch}} from {{master}} to {{release-x.y}}, e.g. {{master}} to
{{release-1.6}}
 * Change {{baseURL}} from
{{//[ci.apache.org/projects/flink/flink-docs-master|http://ci.apache.org/projects/flink/flink-docs-master]}}
 to
{{//[ci.apache.org/projects/flink/flink-docs-release-x.y|http://ci.apache.org/projects/flink/flink-docs-release-x.y]}}
 * Change {{javadocs_baseurl}} from
{{//[ci.apache.org/projects/flink/flink-docs-master|http://ci.apache.org/projects/flink/flink-docs-master]}}
 to
{{//[ci.apache.org/projects/flink/flink-docs-release-x.y|http://ci.apache.org/projects/flink/flink-docs-release-x.y]}}
 * Change {{IsStable}} to {{true}}
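After these edits, the relevant part of {{docs/config.toml}} would look roughly like this for a hypothetical 1.6.0 release (the key nesting is illustrative; check the actual file):

```toml
baseURL = '//ci.apache.org/projects/flink/flink-docs-release-1.6'

[params]
  Version = "1.6.0"
  VersionTitle = "1.6"
  Branch = "release-1.6"
  javadocs_baseurl = "//ci.apache.org/projects/flink/flink-docs-release-1.6"
  IsStable = true
```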





[jira] [Created] (FLINK-33849) CLONE - Update reference data for Migration Tests

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33849:
---

 Summary: CLONE - Update reference data for Migration Tests
 Key: FLINK-33849
 URL: https://issues.apache.org/jira/browse/FLINK-33849
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge
Assignee: Sergey Nuyanzin
 Fix For: 1.19.0, 1.18.1


Update migration tests in master to cover migration from the new version. Since
1.18, this step can be done automatically with the following steps. For more
information please refer to [this
page.|https://github.com/apache/flink/blob/master/flink-test-utils-parent/flink-migration-test-utils/README.md]
 # *On the published release tag (e.g., release-1.16.0)*, run:
{code:bash}
$ mvn clean package -Pgenerate-migration-test-data -Dgenerate.version=1.16 -nsu -Dfast -DskipTests
{code}
The version (1.16 in the command above) should be replaced with the target one.
 # Modify the content of the file
[apache/flink:flink-test-utils-parent/flink-migration-test-utils/src/main/resources/most_recently_published_version|https://github.com/apache/flink/blob/master/flink-test-utils-parent/flink-migration-test-utils/src/main/resources/most_recently_published_version]
 to the latest version (it would be "v1_16" if sticking to the example where
1.16.0 was released).
 # Commit the modifications from the two steps above with "{_}[release] Generate
reference data for state migration tests based on release-1.xx.0{_}" to the
corresponding release branch (e.g. {{release-1.16}} in our example), replacing
"xx" with the actual version (in this example "16"). If you have a dedicated
Jira issue for this task, use its ID instead of [release] as the commit
message's prefix.
 # Cherry-pick the commit to the master branch.
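The version string written to {{most_recently_published_version}} can be derived mechanically from the release version; a small sketch with hypothetical values:

```shell
# "1.16.0" -> "v1_16": keep major.minor, swap the dot for an underscore
RELEASE_VERSION="1.16.0"
echo "v$(echo "$RELEASE_VERSION" | cut -d. -f1,2 | tr . _)"
# prints: v1_16
```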





[jira] [Created] (FLINK-33843) Promote release 1.18.1

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33843:
---

 Summary: Promote release 1.18.1
 Key: FLINK-33843
 URL: https://issues.apache.org/jira/browse/FLINK-33843
 Project: Flink
  Issue Type: New Feature
Affects Versions: 1.18.0
Reporter: Jing Ge
Assignee: Jing Ge


Once the release has been finalized (FLINK-32920), the last step of the process 
is to promote the release within the project and beyond. Please wait for 24h 
after finalizing the release in accordance with the [ASF release 
policy|http://www.apache.org/legal/release-policy.html#release-announcements].

*Final checklist to declare this issue resolved:*
 # Website pull request to [list the 
release|http://flink.apache.org/downloads.html] merged
 # Release announced on the user@ mailing list.
 # Blog post published, if applicable.
 # Release recorded in 
[reporter.apache.org|https://reporter.apache.org/addrelease.html?flink].
 # Release announced on social media.
 # Completion declared on the dev@ mailing list.
 # Update Homebrew: [https://docs.brew.sh/How-To-Open-a-Homebrew-Pull-Request]
(seems to be done automatically, for both minor and major releases)
 # Updated the japicmp configuration
 ** corresponding SNAPSHOT branch japicmp reference version set to the just
released version, and API compatibility checks for {{@PublicEvolving}} enabled
 ** (minor version release only) master branch japicmp reference version set to 
the just released version
 ** (minor version release only) master branch japicmp exclusions have been 
cleared
 # Update the list of previous versions in {{docs/config.toml}} on the master
branch.
 # Set {{show_outdated_warning: true}} in {{docs/config.toml}} in the branch of 
the _now deprecated_ Flink version (i.e. 1.16 if 1.18.0 is released)
 # Update stable and master alias in 
[https://github.com/apache/flink/blob/master/.github/workflows/docs.yml]
 # Open discussion thread for End of Life for Unsupported version (i.e. 1.16)





[jira] [Created] (FLINK-33846) CLONE - Remove outdated versions

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33846:
---

 Summary: CLONE - Remove outdated versions
 Key: FLINK-33846
 URL: https://issues.apache.org/jira/browse/FLINK-33846
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge


h4. dist.apache.org

For a new major release remove all release files older than 2 versions, e.g., 
when releasing 1.7, remove all releases <= 1.5.

For a new bugfix version remove all release files for previous bugfix releases 
in the same series, e.g., when releasing 1.7.1, remove the 1.7.0 release.
# If you have not already, check out the Flink section of the {{release}} 
repository on {{[dist.apache.org|http://dist.apache.org/]}} via Subversion. In 
a fresh directory:
{code}
svn checkout https://dist.apache.org/repos/dist/release/flink --depth=immediates
cd flink
{code}
# Remove files for outdated releases and commit the changes.
{code}
svn remove flink-
svn commit
{code}
# Verify that files are [removed|https://dist.apache.org/repos/dist/release/flink]
(!) Remember to remove the corresponding download links from the website.
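The "remove all releases older than 2 versions" rule above can be sketched as a small loop; the directory names and version numbers below are illustrative stand-ins, not the actual repository contents:

```shell
# Illustrative sketch: when releasing 1.7, print the release dirs <= 1.5
# that should be removed. Names flink-X.Y.Z are assumed, not verified.
new_major=1 new_minor=7
for d in flink-1.4.2 flink-1.5.0 flink-1.6.1 flink-1.7.0; do
  v=${d#flink-}; major=${v%%.*}; rest=${v#*.}; minor=${rest%%.*}
  if [ "$major" -lt "$new_major" ] || { [ "$major" -eq "$new_major" ] \
       && [ "$minor" -le $((new_minor - 2)) ]; }; then
    echo "svn remove $d"   # prints the removals to perform
  fi
done
```

Review the printed list before actually running the {{svn remove}} and {{svn commit}} commands.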

h4. CI

Disable the cron job for the now-unsupported version from 
(tools/azure-pipelines/[build-apache-repo.yml|https://github.com/apache/flink/blob/master/tools/azure-pipelines/build-apache-repo.yml])
 in the respective branch.





[jira] [Created] (FLINK-33840) CLONE - Deploy artifacts to Maven Central Repository

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33840:
---

 Summary: CLONE - Deploy artifacts to Maven Central Repository
 Key: FLINK-33840
 URL: https://issues.apache.org/jira/browse/FLINK-33840
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge
Assignee: Qingsheng Ren


Use the [Apache Nexus repository|https://repository.apache.org/] to release the 
staged binary artifacts to the Maven Central repository. In the Staging 
Repositories section, find the relevant release candidate orgapacheflink-XXX 
entry and click Release. Drop all other release candidates that are not being 
released.
h3. Deploy source and binary releases to dist.apache.org

Copy the source and binary releases from the dev repository to the release 
repository at [dist.apache.org|http://dist.apache.org/] using Subversion.
{code:java}
$ svn move -m "Release Flink ${RELEASE_VERSION}" 
https://dist.apache.org/repos/dist/dev/flink/flink-${RELEASE_VERSION}-rc${RC_NUM}
 https://dist.apache.org/repos/dist/release/flink/flink-${RELEASE_VERSION}
{code}
(Note: Only PMC members have access to the release repository. If you do not 
have access, ask on the mailing list for assistance.)
h3. Remove old release candidates from [dist.apache.org|http://dist.apache.org/]

Remove the old release candidates from 
[https://dist.apache.org/repos/dist/dev/flink] using Subversion.
{code:java}
$ svn checkout https://dist.apache.org/repos/dist/dev/flink --depth=immediates
$ cd flink
$ svn remove flink-${RELEASE_VERSION}-rc*
$ svn commit -m "Remove old release candidates for Apache Flink ${RELEASE_VERSION}"
{code}
 

h3. Expectations
 * Maven artifacts released and indexed in the [Maven Central 
Repository|https://search.maven.org/#search%7Cga%7C1%7Cg%3A%22org.apache.flink%22]
 (usually takes about a day to show up)
 * Source & binary distributions available in the release repository of 
[https://dist.apache.org/repos/dist/release/flink/]
 * Dev repository [https://dist.apache.org/repos/dist/dev/flink/] is empty
 * Website contains links to new release binaries and sources in download page
 * (for minor version updates) the front page references the correct new major 
release version and directs to the correct link





[jira] [Created] (FLINK-33841) CLONE - Create Git tag and mark version as released in Jira

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33841:
---

 Summary: CLONE - Create Git tag and mark version as released in 
Jira
 Key: FLINK-33841
 URL: https://issues.apache.org/jira/browse/FLINK-33841
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge
Assignee: Qingsheng Ren


Create and push a new Git tag for the released version by copying the tag for 
the final release candidate, as follows:
{code:java}
$ git tag -s "release-${RELEASE_VERSION}" refs/tags/${TAG}^{} -m "Release Flink 
${RELEASE_VERSION}"
$ git push  refs/tags/release-${RELEASE_VERSION}
{code}
In JIRA, inside [version 
management|https://issues.apache.org/jira/plugins/servlet/project-config/FLINK/versions],
 hover over the current release and a settings menu will appear. Click Release, 
and select today’s date.

(Note: Only PMC members have access to the project administration. If you do 
not have access, ask on the mailing list for assistance.)

If PRs have been merged to the release branch after the last release candidate was tagged, make sure that the corresponding Jira tickets have the correct Fix Version set.

 

h3. Expectations
 * Release tagged in the source code repository
 * Release version finalized in JIRA. (Note: Not all committers have 
administrator access to JIRA. If you end up getting permissions errors ask on 
the mailing list for assistance)





[jira] [Created] (FLINK-33839) CLONE - Deploy Python artifacts to PyPI

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33839:
---

 Summary: CLONE - Deploy Python artifacts to PyPI
 Key: FLINK-33839
 URL: https://issues.apache.org/jira/browse/FLINK-33839
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge
Assignee: Qingsheng Ren


The release manager should create a PyPI account and ask the PMC to add this account to the pyflink collaborator list with the Maintainer role (the PyPI admin account info can be found here. NOTE: only visible to PMC members) in order to deploy the Python artifacts to PyPI. The artifacts can be uploaded using twine ([https://pypi.org/project/twine/]). To install twine, just run:
{code:java}
pip install --upgrade twine==1.12.0
{code}
Download the python artifacts from dist.apache.org and upload them to pypi.org:
{code:java}
svn checkout 
https://dist.apache.org/repos/dist/dev/flink/flink-${RELEASE_VERSION}-rc${RC_NUM}
cd flink-${RELEASE_VERSION}-rc${RC_NUM}
 
cd python
 
#uploads wheels
for f in *.whl; do twine upload --repository-url 
https://upload.pypi.org/legacy/ $f $f.asc; done
 
#upload source packages
twine upload --repository-url https://upload.pypi.org/legacy/ 
apache-flink-libraries-${RELEASE_VERSION}.tar.gz 
apache-flink-libraries-${RELEASE_VERSION}.tar.gz.asc
 
twine upload --repository-url https://upload.pypi.org/legacy/ 
apache-flink-${RELEASE_VERSION}.tar.gz 
apache-flink-${RELEASE_VERSION}.tar.gz.asc
{code}
If the upload failed or was incorrect for some reason (e.g. a network transmission problem), you need to delete the previously uploaded release package of the same version (if it exists), rename the artifact to {{{}apache-flink-${RELEASE_VERSION}.post0.tar.gz{}}}, and then re-upload.

(!) Note: re-uploading to pypi.org must be avoided as much as possible because 
it will cause some irreparable problems. If that happens, users cannot install 
the apache-flink package by explicitly specifying the package version, i.e. the 
following command "pip install apache-flink==${RELEASE_VERSION}" will fail. 
Instead they have to run "pip install apache-flink" or "pip install 
apache-flink==${RELEASE_VERSION}.post0" to install the apache-flink package.
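The {{.post0}} rename described above amounts to inserting {{.post0}} before the {{.tar.gz}} suffix of each artifact; a minimal sketch, with illustrative filenames following the pattern in this ticket:

```shell
# Sketch of the .post0 rename fallback (filenames are illustrative;
# set RELEASE_VERSION to the version being re-uploaded).
RELEASE_VERSION=1.18.1
for f in "apache-flink-${RELEASE_VERSION}.tar.gz" \
         "apache-flink-${RELEASE_VERSION}.tar.gz.asc"; do
  renamed=$(printf '%s\n' "$f" | sed 's/\.tar\.gz/.post0.tar.gz/')
  echo "$f -> $renamed"    # the real step would be: mv "$f" "$renamed"
done
```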

 

h3. Expectations
 * Python artifacts released and indexed in the 
[PyPI|https://pypi.org/project/apache-flink/] Repository





[jira] [Created] (FLINK-33838) Finalize release 1.18.1

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33838:
---

 Summary: Finalize release 1.18.1
 Key: FLINK-33838
 URL: https://issues.apache.org/jira/browse/FLINK-33838
 Project: Flink
  Issue Type: New Feature
Affects Versions: 1.17.0
Reporter: Jing Ge
Assignee: Qingsheng Ren


Once the release candidate has been reviewed and approved by the community, the 
release should be finalized. This involves the final deployment of the release 
candidate to the release repositories, merging of the website changes, etc.





[jira] [Created] (FLINK-33842) CLONE - Publish the Dockerfiles for the new release

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33842:
---

 Summary: CLONE - Publish the Dockerfiles for the new release
 Key: FLINK-33842
 URL: https://issues.apache.org/jira/browse/FLINK-33842
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge
Assignee: Matthias Pohl


Note: the official Dockerfiles fetch the binary distribution of the target 
Flink version from an Apache mirror. After publishing the binary release 
artifacts, mirrors can take some hours to start serving the new artifacts, so 
you may want to wait to do this step until you are ready to continue with the 
"Promote the release" steps in the follow-up Jira.

Follow the [release instructions in the flink-docker 
repo|https://github.com/apache/flink-docker#release-workflow] to build the new 
Dockerfiles and send an updated manifest to Docker Hub so the new images are 
built and published.

 

h3. Expectations
 * Dockerfiles in [flink-docker|https://github.com/apache/flink-docker] updated 
for the new Flink release and pull request opened on the Docker official-images 
with an updated manifest





[jira] [Created] (FLINK-33837) CLONE - Vote on the release candidate

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33837:
---

 Summary: CLONE - Vote on the release candidate
 Key: FLINK-33837
 URL: https://issues.apache.org/jira/browse/FLINK-33837
 Project: Flink
  Issue Type: Sub-task
Affects Versions: 1.17.0
Reporter: Jing Ge
Assignee: Qingsheng Ren
 Fix For: 1.17.0


Once you have built and individually reviewed the release candidate, please 
share it for the community-wide review. Please review foundation-wide [voting 
guidelines|http://www.apache.org/foundation/voting.html] for more information.

Start the review-and-vote thread on the dev@ mailing list. Here’s an email 
template; please adjust as you see fit.
{quote}From: Release Manager
To: dev@flink.apache.org
Subject: [VOTE] Release 1.2.3, release candidate #3

Hi everyone,
Please review and vote on the release candidate #3 for the version 1.2.3, as 
follows:
[ ] +1, Approve the release
[ ] -1, Do not approve the release (please provide specific comments)

The complete staging area is available for your review, which includes:
 * JIRA release notes [1],
 * the official Apache source release and binary convenience releases to be 
deployed to dist.apache.org [2], which are signed with the key with fingerprint 
 [3],
 * all artifacts to be deployed to the Maven Central Repository [4],
 * source code tag "release-1.2.3-rc3" [5],
 * website pull request listing the new release and adding announcement blog 
post [6].

The vote will be open for at least 72 hours. It is adopted by majority 
approval, with at least 3 PMC affirmative votes.

Thanks,
Release Manager

[1] link
[2] link
[3] [https://dist.apache.org/repos/dist/release/flink/KEYS]
[4] link
[5] link
[6] link
{quote}
*If there are any issues found in the release candidate, reply on the vote 
thread to cancel the vote.* There’s no need to wait 72 hours. Proceed to the 
Fix Issues step below and address the problem. However, some issues don’t 
require cancellation. For example, if an issue is found in the website pull 
request, just correct it on the spot and the vote can continue as-is.

For cancelling a release, the release manager needs to send an email to the 
release candidate thread, stating that the release candidate is officially 
cancelled. Next, all artifacts created specifically for the RC in the previous 
steps need to be removed:
 * Delete the staging repository in Nexus
 * Remove the source / binary RC files from dist.apache.org
 * Delete the source code tag in git

*If there are no issues, reply on the vote thread to close the voting.* Then, 
tally the votes in a separate email. Here’s an email template; please adjust as 
you see fit.
{quote}From: Release Manager
To: dev@flink.apache.org
Subject: [RESULT] [VOTE] Release 1.2.3, release candidate #3

I'm happy to announce that we have unanimously approved this release.

There are XXX approving votes, XXX of which are binding:
 * approver 1
 * approver 2
 * approver 3
 * approver 4

There are no disapproving votes.

Thanks everyone!
{quote}
 

h3. Expectations
 * Community votes to release the proposed candidate, with at least three 
approving PMC votes

Any issues raised before the vote closes should either be resolved or moved into the next release (if applicable).





[jira] [Created] (FLINK-33836) CLONE - Propose a pull request for website updates

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33836:
---

 Summary: CLONE - Propose a pull request for website updates
 Key: FLINK-33836
 URL: https://issues.apache.org/jira/browse/FLINK-33836
 Project: Flink
  Issue Type: Sub-task
Affects Versions: 1.17.0
Reporter: Jing Ge
Assignee: Qingsheng Ren
 Fix For: 1.17.0


The final step of building the candidate is to propose a website pull request 
containing the following changes:
 # update 
[apache/flink-web:_config.yml|https://github.com/apache/flink-web/blob/asf-site/_config.yml]
 ## update {{FLINK_VERSION_STABLE}} and {{FLINK_VERSION_STABLE_SHORT}} as 
required
 ## update version references in quickstarts ({{{}q/{}}} directory) as required
 ## (major only) add a new entry to {{flink_releases}} for the release binaries 
and sources
 ## (minor only) update the entry for the previous release in the series in 
{{flink_releases}}
 ### Please pay attention to the ids assigned to the download entries. They should be unique and reflect their corresponding version number.
 ## add a new entry to {{release_archive.flink}}
 # add a blog post announcing the release in _posts
 # add an organized release notes page under docs/content/release-notes and docs/content.zh/release-notes (like [https://nightlies.apache.org/flink/flink-docs-release-1.15/release-notes/flink-1.15/]). The page is based on the non-empty release notes collected from the issues, and only the issues that affect existing users should be included (e.g., behavior changes, rather than purely new functionality). It should be in a separate PR since it will be merged to the flink project.

(!) Don’t merge the PRs before finalizing the release.

 

h3. Expectations
 * Website pull request proposed to list the 
[release|http://flink.apache.org/downloads.html]
 * (major only) Check {{docs/config.toml}} to ensure that
 ** the version constants refer to the new version
 ** the {{baseurl}} does not point to {{flink-docs-master}} but to {{flink-docs-release-X.Y}} instead





[jira] [Created] (FLINK-33834) CLONE - Build and stage Java and Python artifacts

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33834:
---

 Summary: CLONE - Build and stage Java and Python artifacts
 Key: FLINK-33834
 URL: https://issues.apache.org/jira/browse/FLINK-33834
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge
Assignee: Qingsheng Ren


# Create a local release branch ((!) this step cannot be skipped for minor releases):
{code:bash}
$ cd ./tools
tools/ $ OLD_VERSION=$CURRENT_SNAPSHOT_VERSION NEW_VERSION=$RELEASE_VERSION 
RELEASE_CANDIDATE=$RC_NUM releasing/create_release_branch.sh
{code}
 # Tag the release commit:
{code:bash}
$ git tag -s ${TAG} -m "${TAG}"
{code}
 # We now need to do several things:
 ## Create the source release archive
 ## Deploy jar artefacts to the [Apache Nexus 
Repository|https://repository.apache.org/], which is the staging area for 
deploying the jars to Maven Central
 ## Build PyFlink wheel packages
You might want to create a directory on your local machine for collecting the 
various source and binary releases before uploading them. Creating the binary 
releases is a lengthy process but you can do this on another machine (for 
example, in the "cloud"). When doing this, you can skip signing the release 
files on the remote machine, download them to your local machine and sign them 
there.
 # Build the source release:
{code:bash}
tools $ RELEASE_VERSION=$RELEASE_VERSION releasing/create_source_release.sh
{code}
 # Stage the maven artifacts:
{code:bash}
tools $ releasing/deploy_staging_jars.sh
{code}
Review all staged artifacts ([https://repository.apache.org/]). They should 
contain all relevant parts for each module, including pom.xml, jar, test jar, 
source, test source, javadoc, etc. Carefully review any new artifacts.
 # Close the staging repository on Apache Nexus. When prompted for a 
description, enter “Apache Flink, version X, release candidate Y”.
Then, you need to build the PyFlink wheel packages (since 1.11):
 # Set up an azure pipeline in your own Azure account. You can refer to [Azure 
Pipelines|https://cwiki.apache.org/confluence/display/FLINK/Azure+Pipelines#AzurePipelines-Tutorial:SettingupAzurePipelinesforaforkoftheFlinkrepository]
 for more details on how to set up azure pipeline for a fork of the Flink 
repository. Note that a google cloud mirror in Europe is used for downloading 
maven artifacts, therefore it is recommended to set your [Azure organization 
region|https://docs.microsoft.com/en-us/azure/devops/organizations/accounts/change-organization-location]
 to Europe to speed up the downloads.
 # Push the release candidate branch to your forked personal Flink repository, 
e.g.
{code:bash}
tools $ git push  
refs/heads/release-${RELEASE_VERSION}-rc${RC_NUM}:release-${RELEASE_VERSION}-rc${RC_NUM}
{code}
 # Trigger the Azure Pipelines manually to build the PyFlink wheel packages
 ## Go to your Azure Pipelines Flink project → Pipelines
 ## Click the "New pipeline" button on the top right
 ## Select "GitHub" → your GitHub Flink repository → "Existing Azure Pipelines 
YAML file"
 ## Select your branch → Set path to "/azure-pipelines.yaml" → click on 
"Continue" → click on "Variables"
 ## Then click "New Variable" button, fill the name with "MODE", and the value 
with "release". Click "OK" to set the variable and the "Save" button to save 
the variables, then back on the "Review your pipeline" screen click "Run" to 
trigger the build.
 ## You should now see a build where only the "CI build (release)" is running
 # Download the PyFlink wheel packages from the build result page after the 
jobs of "build_wheels mac" and "build_wheels linux" have finished.
 ## Download the PyFlink wheel packages
 ### Open the build result page of the pipeline
 ### Go to the {{Artifacts}} page (build_wheels linux -> 1 artifact)
 ### Click {{wheel_Darwin_build_wheels mac}} and {{wheel_Linux_build_wheels 
linux}} separately to download the zip files
 ## Unzip these two zip files
{code:bash}
$ cd /path/to/downloaded_wheel_packages
$ unzip wheel_Linux_build_wheels\ linux.zip
$ unzip wheel_Darwin_build_wheels\ mac.zip{code}
 ## Create directory {{./dist}} under the directory of {{{}flink-python{}}}:
{code:bash}
$ cd 
$ mkdir flink-python/dist{code}
 ## Move the unzipped wheel packages to the directory of 
{{{}flink-python/dist{}}}:
{code:java}
$ mv /path/to/wheel_Darwin_build_wheels\ mac/* flink-python/dist/
$ mv /path/to/wheel_Linux_build_wheels\ linux/* flink-python/dist/
$ cd tools{code}
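The unzip/mkdir/mv steps above can be tried end-to-end on throwaway paths before touching the real build artifacts; the directory and wheel names below are stand-ins for the unzipped {{wheel_*}} directories from the Azure build:

```shell
# Dry run of the wheel-collection steps with stand-in files; the real
# sources are the unzipped wheel_Darwin_*/wheel_Linux_* directories.
work=$(mktemp -d) && cd "$work"
mkdir -p wheels_mac wheels_linux flink-python/dist
touch wheels_mac/apache_flink-1.18.1-cp39-macosx_x86_64.whl
touch wheels_linux/apache_flink-1.18.1-cp39-manylinux1_x86_64.whl
mv wheels_mac/* wheels_linux/* flink-python/dist/
ls flink-python/dist | wc -l    # both wheels end up in dist/
```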

Finally, we create the binary convenience release files:
{code:bash}
tools $ RELEASE_VERSION=$RELEASE_VERSION releasing/create_binary_release.sh
{code}
If you want to run this step in parallel on a remote machine you have to make 
the release commit available there (for example by pushing to a repository). 
*This is important: the commit inside the binary builds has to match the commit 
of the source builds and the tagged release commit.* 
When building remotely, you can skip signing the release files there, download them to your local machine and sign them locally (as noted above).

[jira] [Created] (FLINK-33835) CLONE - Stage source and binary releases on dist.apache.org

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33835:
---

 Summary: CLONE - Stage source and binary releases on 
dist.apache.org
 Key: FLINK-33835
 URL: https://issues.apache.org/jira/browse/FLINK-33835
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge
Assignee: Qingsheng Ren


Copy the source release to the dev repository of dist.apache.org:
# If you have not already, check out the Flink section of the dev repository on 
dist.apache.org via Subversion. In a fresh directory:
{code:bash}
$ svn checkout https://dist.apache.org/repos/dist/dev/flink --depth=immediates
{code}
# Make a directory for the new release and copy all the artifacts (Flink 
source/binary distributions, hashes, GPG signatures and the python 
subdirectory) into that newly created directory:
{code:bash}
$ mkdir flink/flink-${RELEASE_VERSION}-rc${RC_NUM}
$ mv /tools/releasing/release/* 
flink/flink-${RELEASE_VERSION}-rc${RC_NUM}
{code}
# Add and commit all the files.
{code:bash}
$ cd flink
flink $ svn add flink-${RELEASE_VERSION}-rc${RC_NUM}
flink $ svn commit -m "Add flink-${RELEASE_VERSION}-rc${RC_NUM}"
{code}
# Verify that files are present under 
[https://dist.apache.org/repos/dist/dev/flink|https://dist.apache.org/repos/dist/dev/flink].
# Push the release tag if not done already (the following command assumes to be 
called from within the apache/flink checkout):
{code:bash}
$ git push  refs/tags/release-${RELEASE_VERSION}-rc${RC_NUM}
{code}

 

h3. Expectations
 * Maven artifacts deployed to the staging repository of 
[repository.apache.org|https://repository.apache.org/content/repositories/]
 * Source distribution deployed to the dev repository of 
[dist.apache.org|https://dist.apache.org/repos/dist/dev/flink/]
 * Check hashes (e.g. shasum -c *.sha512)
 * Check signatures (e.g. {{{}gpg --verify 
flink-1.2.3-source-release.tar.gz.asc flink-1.2.3-source-release.tar.gz{}}})
 * {{grep}} for legal headers in each file.
 * If time allows check the NOTICE files of the modules whose dependencies have 
been changed in this release in advance, since the license issues from time to 
time pop up during voting. See [Verifying a Flink 
Release|https://cwiki.apache.org/confluence/display/FLINK/Verifying+a+Flink+Release]
 "Checking License" section.
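The hash and signature expectations above can be checked with a small loop. This sketch assumes GNU coreutils ({{sha512sum}}; on macOS use {{shasum -a 512}}) and uses a stand-in file so it is self-contained; in practice, run the loops inside the release-candidate directory:

```shell
# Hedged sketch of the hash check from the expectations above.
# A stand-in artifact is created here; the real input is the RC dir.
cd "$(mktemp -d)"
echo "stand-in artifact" > flink-1.2.3-src.tgz
sha512sum flink-1.2.3-src.tgz > flink-1.2.3-src.tgz.sha512
for f in *.sha512; do
  sha512sum -c "$f" || exit 1    # prints "<file>: OK" on success
done
```

The signature side is analogous: {{for sig in *.asc; do gpg --verify "$sig" "${sig%.asc}"; done}}.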





[jira] [Created] (FLINK-33833) Build Release Candidate: 1.18.1-rc1

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33833:
---

 Summary: Build Release Candidate: 1.18.1-rc1
 Key: FLINK-33833
 URL: https://issues.apache.org/jira/browse/FLINK-33833
 Project: Flink
  Issue Type: New Feature
Affects Versions: 1.17.0
Reporter: Jing Ge
Assignee: Jing Ge
 Fix For: 1.17.0


The core of the release process is the build-vote-fix cycle. Each cycle 
produces one release candidate. The Release Manager repeats this cycle until 
the community approves one release candidate, which is then finalized.

h4. Prerequisites
Set up a few environment variables to simplify Maven commands that follow. This 
identifies the release candidate being built. Start with {{RC_NUM}} equal to 1 
and increment it for each candidate:
{code}
RC_NUM="1"
TAG="release-${RELEASE_VERSION}-rc${RC_NUM}"
{code}
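With {{RELEASE_VERSION}} set as well (the value below is just an example), the derived tag name looks like this:

```shell
# Sanity check of the variables above; RELEASE_VERSION is normally
# exported beforehand, 1.18.1 is illustrative.
RELEASE_VERSION=1.18.1
RC_NUM="1"
TAG="release-${RELEASE_VERSION}-rc${RC_NUM}"
echo "$TAG"    # release-1.18.1-rc1
```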





[jira] [Created] (FLINK-33829) CLONE - Review Release Notes in JIRA

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33829:
---

 Summary: CLONE - Review Release Notes in JIRA
 Key: FLINK-33829
 URL: https://issues.apache.org/jira/browse/FLINK-33829
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge
Assignee: Qingsheng Ren


JIRA automatically generates Release Notes based on the {{Fix Version}} field 
applied to issues. Release Notes are intended for Flink users (not Flink 
committers/contributors). You should ensure that Release Notes are informative 
and useful.

Open the release notes from the version status page by choosing the release 
underway and clicking Release Notes.

You should verify that the issues listed automatically by JIRA are appropriate 
to appear in the Release Notes. Specifically, issues should:
 * Be appropriately classified as {{{}Bug{}}}, {{{}New Feature{}}}, 
{{{}Improvement{}}}, etc.
 * Represent noteworthy user-facing changes, such as new functionality, 
backward-incompatible API changes, or performance improvements.
 * Have occurred since the previous release; an issue that was introduced and 
fixed between releases should not appear in the Release Notes.
 * Have an issue title that makes sense when read on its own.

Adjust any of the above properties to improve the clarity and presentation of the Release Notes.

Ensure that the JIRA release notes are also included in the release notes of 
the documentation (see section "Review and update documentation").
h4. Content of Release Notes field from JIRA tickets 

To get the list of "release notes" fields from JIRA, you can run the following script using the JIRA REST API (note that maxResults limits the number of entries):
{code:bash}
curl -s "https://issues.apache.org/jira/rest/api/2/search?maxResults=200&jql=project%20%3D%20FLINK%20AND%20%22Release%20Note%22%20is%20not%20EMPTY%20and%20fixVersion%20%3D%20${RELEASE_VERSION}" | jq '.issues[]|.key,.fields.summary,.fields.customfield_12310192' | paste - - -
{code}
{{jq}} is present in most Linux distributions; on macOS it can be installed via brew.

 

h3. Expectations
 * Release Notes in JIRA have been audited and adjusted





[jira] [Created] (FLINK-33827) CLONE - Review and update documentation

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33827:
---

 Summary: CLONE - Review and update documentation
 Key: FLINK-33827
 URL: https://issues.apache.org/jira/browse/FLINK-33827
 Project: Flink
  Issue Type: Sub-task
Affects Versions: 1.17.0
Reporter: Jing Ge
Assignee: Qingsheng Ren
 Fix For: 1.17.0


There are a few pages in the documentation that need to be reviewed and updated 
for each release.
 * Ensure that there exists a release notes page for each non-bugfix release 
(e.g., 1.5.0) in {{{}./docs/release-notes/{}}}, that it is up-to-date, and 
linked from the start page of the documentation.
 * Upgrading Applications and Flink Versions: 
[https://ci.apache.org/projects/flink/flink-docs-master/ops/upgrading.html]
 * ...

 

h3. Expectations
 * Update upgrade compatibility table 
([apache-flink:./docs/content/docs/ops/upgrading.md|https://github.com/apache/flink/blob/master/docs/content/docs/ops/upgrading.md#compatibility-table]
 and 
[apache-flink:./docs/content.zh/docs/ops/upgrading.md|https://github.com/apache/flink/blob/master/docs/content.zh/docs/ops/upgrading.md#compatibility-table]).
 * Update [Release Overview in 
Confluence|https://cwiki.apache.org/confluence/display/FLINK/Release+Management+and+Feature+Plan]
 * (minor only) The documentation for the new major release is visible under 
[https://nightlies.apache.org/flink/flink-docs-release-$SHORT_RELEASE_VERSION] 
(after at least one [doc 
build|https://github.com/apache/flink/actions/workflows/docs.yml] succeeded).
 * (minor only) The documentation for the new major release does not contain 
"-SNAPSHOT" in its version title, and all links refer to the corresponding 
version docs instead of {{{}master{}}}.





[jira] [Created] (FLINK-33830) CLONE - Select executing Release Manager

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33830:
---

 Summary: CLONE - Select executing Release Manager
 Key: FLINK-33830
 URL: https://issues.apache.org/jira/browse/FLINK-33830
 Project: Flink
  Issue Type: Sub-task
  Components: Release System
Affects Versions: 1.17.0
Reporter: Jing Ge
Assignee: Qingsheng Ren
 Fix For: 1.17.0


h4. GPG Key

You need to have a GPG key to sign the release artifacts. Please be aware of 
the ASF-wide [release signing 
guidelines|https://www.apache.org/dev/release-signing.html]. If you don’t have 
a GPG key associated with your Apache account, please create one according to 
the guidelines.

Determine your Apache GPG Key and Key ID, as follows:
{code:java}
$ gpg --list-keys
{code}
This will list your GPG keys. One of these should reflect your Apache account, 
for example:
{code:java}
--
pub   2048R/845E6689 2016-02-23
uid  Nomen Nescio 
sub   2048R/BA4D50BE 2016-02-23
{code}
In the example above, the key ID is the 8-digit hex string in the {{pub}} line: 
{{{}845E6689{}}}.
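Extracting that ID mechanically can be sketched as follows; the {{sed}} pattern matches the sample layout shown above, and real {{gpg}} output varies across versions, so treat it as illustrative:

```shell
# Sketch: pull the 8-digit key ID out of a `pub` line like the sample
# above; adjust the pattern for your gpg version's output format.
pub_line='pub   2048R/845E6689 2016-02-23'
key_id=$(printf '%s\n' "$pub_line" | sed -n 's|^pub .*/\([0-9A-F]\{8\}\).*|\1|p')
echo "$key_id"    # 845E6689
```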

Now, add your Apache GPG key to Flink's [KEYS file|https://dist.apache.org/repos/dist/release/flink/KEYS] in the release repository at [dist.apache.org|http://dist.apache.org/]. Follow the instructions listed at the top of that file. (Note: Only PMC members have write access to the release repository. If you end up getting 403 errors, ask on the mailing list for assistance.)

Configure {{git}} to use this key when signing code by giving it your key ID, 
as follows:
{code:java}
$ git config --global user.signingkey 845E6689
{code}
You may drop the {{--global}} option if you’d prefer to use this key for the 
current repository only.

You may wish to start {{gpg-agent}} to unlock your GPG key only once using your 
passphrase. Otherwise, you may need to enter this passphrase hundreds of times. 
The setup for {{gpg-agent}} varies based on operating system, but may be 
something like this:
{code:bash}
$ eval $(gpg-agent --daemon --no-grab --write-env-file $HOME/.gpg-agent-info)
$ export GPG_TTY=$(tty)
$ export GPG_AGENT_INFO
{code}
h4. Access to Apache Nexus repository

Configure access to the [Apache Nexus 
repository|https://repository.apache.org/], which enables final deployment of 
releases to the Maven Central Repository.
 # Log in with your Apache account.
 # Confirm you have appropriate access by finding {{org.apache.flink}} under 
{{{}Staging Profiles{}}}.
 # Navigate to your {{Profile}} (top right drop-down menu of the page).
 # Choose {{User Token}} from the dropdown, then click {{{}Access User 
Token{}}}. Copy a snippet of the Maven XML configuration block.
 # Insert this snippet twice into your global Maven {{settings.xml}} file, 
typically {{{}${HOME}/.m2/settings.xml{}}}. The end result should look like 
this, where {{TOKEN_NAME}} and {{TOKEN_PASSWORD}} are your secret tokens:
{code:xml}
<settings>
   <servers>
     <server>
       <id>apache.releases.https</id>
       <username>TOKEN_NAME</username>
       <password>TOKEN_PASSWORD</password>
     </server>
     <server>
       <id>apache.snapshots.https</id>
       <username>TOKEN_NAME</username>
       <password>TOKEN_PASSWORD</password>
     </server>
   </servers>
</settings>
{code}

h4. Website development setup

Get ready for updating the Flink website by following the [website development 
instructions|https://flink.apache.org/contributing/improve-website.html].
h4. GNU Tar Setup for Mac (Skip this step if you are not using a Mac)

The default tar application on Mac does not support GNU archive format and 
defaults to Pax. This bloats the archive with unnecessary metadata that can 
result in additional files when decompressing (see [1.15.2-RC2 vote 
thread|https://lists.apache.org/thread/mzbgsb7y9vdp9bs00gsgscsjv2ygy58q]). 
Install gnu-tar and create a symbolic link to use in preference of the default 
tar program.
{code:bash}
$ brew install gnu-tar
$ ln -s /usr/local/bin/gtar /usr/local/bin/tar
$ which tar
{code}
 

h3. Expectations
 * Release Manager’s GPG key is published to 
[dist.apache.org|http://dist.apache.org/]
 * Release Manager’s GPG key is configured in git configuration
 * Release Manager's GPG key is configured as the default gpg key.
 * Release Manager has {{org.apache.flink}} listed under Staging Profiles in 
Nexus
 * Release Manager’s Nexus User Token is configured in settings.xml





[jira] [Created] (FLINK-33832) CLONE - Verify that no exclusions were erroneously added to the japicmp plugin

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33832:
---

 Summary: CLONE - Verify that no exclusions were erroneously added 
to the japicmp plugin
 Key: FLINK-33832
 URL: https://issues.apache.org/jira/browse/FLINK-33832
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge
Assignee: Matthias Pohl


Verify that no exclusions were erroneously added to the japicmp plugin that 
break compatibility guarantees. Check the exclusions for the 
japicmp-maven-plugin in the root pom (see 
[apache/flink:pom.xml:2175ff|https://github.com/apache/flink/blob/3856c49af77601cf7943a5072d8c932279ce46b4/pom.xml#L2175]
 for exclusions that:
* For minor releases: break source compatibility for {{@Public}} APIs
* For patch releases: break source/binary compatibility for 
{{@Public}}/{{@PublicEvolving}}  APIs
Any such exclusion must be properly justified, in advance.
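To aid this review, the exclusions can be pulled out of the root pom for inspection. A hedged sketch, assuming the exclusions are {{<exclude>}} elements inside the japicmp-maven-plugin section of the linked pom.xml:

```shell
# Print every <exclude> entry of the japicmp-maven-plugin section for manual review.
list_japicmp_excludes() {
  # $1 = path to the root pom.xml
  sed -n '/japicmp-maven-plugin/,/<\/plugin>/p' "$1" \
    | grep -o '<exclude>[^<]*</exclude>'
}
```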





[jira] [Created] (FLINK-33831) CLONE - Create a release branch

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33831:
---

 Summary: CLONE - Create a release branch
 Key: FLINK-33831
 URL: https://issues.apache.org/jira/browse/FLINK-33831
 Project: Flink
  Issue Type: Sub-task
Affects Versions: 1.17.0
Reporter: Jing Ge
Assignee: Leonard Xu
 Fix For: 1.17.0


If you are doing a new minor release, you need to update Flink version in the 
following repositories and the [AzureCI project 
configuration|https://dev.azure.com/apache-flink/apache-flink/]:
 * [apache/flink|https://github.com/apache/flink]
 * [apache/flink-docker|https://github.com/apache/flink-docker]
 * [apache/flink-benchmarks|https://github.com/apache/flink-benchmarks]

Patch releases don't require these repositories to be touched. Simply 
checkout the already existing branch for that version:
{code:java}
$ git checkout release-$SHORT_RELEASE_VERSION
{code}
h4. Flink repository

Create a branch for the new version that we want to release before updating the 
master branch to the next development version:
{code:bash}
$ cd ./tools
tools $ releasing/create_snapshot_branch.sh
tools $ git checkout master
tools $ OLD_VERSION=$CURRENT_SNAPSHOT_VERSION 
NEW_VERSION=$NEXT_SNAPSHOT_VERSION releasing/update_branch_version.sh
{code}
In the {{master}} branch, add a new value (e.g. {{v1_16("1.16")}}) to 
[apache-flink:flink-annotations/src/main/java/org/apache/flink/FlinkVersion|https://github.com/apache/flink/blob/master/flink-annotations/src/main/java/org/apache/flink/FlinkVersion.java]
 as the last entry:
{code:java}
// ...
v1_12("1.12"),
v1_13("1.13"),
v1_14("1.14"),
v1_15("1.15"),
v1_16("1.16");
{code}
The newly created branch and updated {{master}} branch need to be pushed to the 
official repository.
h4. Flink Docker Repository

Afterwards fork off from {{dev-master}} a {{dev-x.y}} branch in the 
[apache/flink-docker|https://github.com/apache/flink-docker] repository. Make 
sure that 
[apache/flink-docker:.github/workflows/ci.yml|https://github.com/apache/flink-docker/blob/dev-master/.github/workflows/ci.yml]
 points to the correct snapshot version; for {{dev-x.y}} it should point to 
{{{}x.y-SNAPSHOT{}}}, while for {{dev-master}} it should point to the most 
recent snapshot version ({{$NEXT_SNAPSHOT_VERSION}}).

After pushing the new minor release branch, as the last step you should also 
update the documentation workflow to also build the documentation for the new 
release branch. Check [Managing 
Documentation|https://cwiki.apache.org/confluence/display/FLINK/Managing+Documentation]
 on details on how to do that. You may also want to manually trigger a build to 
make the changes visible as soon as possible.

h4. Flink Benchmark Repository

First, create a {{dev-x.y}} branch off {{master}} in 
[apache/flink-benchmarks|https://github.com/apache/flink-benchmarks], so that 
there is a {{dev-x.y}} branch that builds against 
{{$CURRENT_SNAPSHOT_VERSION}}.

Then, inside the repository you need to manually update the {{flink.version}} 
property inside the parent *pom.xml* file. It should be pointing to the most 
recent snapshot version ($NEXT_SNAPSHOT_VERSION). For example:
{code:xml}
<flink.version>1.18-SNAPSHOT</flink.version>
{code}
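This edit can also be scripted. A sketch, assuming the property is a literal {{<flink.version>}} element in the pom.xml as shown above:

```shell
# Replace the <flink.version> property in a pom.xml in place.
# sed -i.bak works with both GNU and BSD sed; a .bak backup is left behind.
update_flink_version() {
  # $1 = path to pom.xml, $2 = new version, e.g. 1.18-SNAPSHOT
  sed -i.bak "s|<flink.version>[^<]*</flink.version>|<flink.version>$2</flink.version>|" "$1"
}
```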

h4. AzureCI Project Configuration
The new release branch needs to be configured within AzureCI to make Azure 
aware of it. This can only be handled by Ververica 
employees since they own the AzureCI setup.
 

h3. Expectations (Minor Version only if not stated otherwise)
 * Release branch has been created and pushed
 * Changes on the new release branch are picked up by [Azure 
CI|https://dev.azure.com/apache-flink/apache-flink/_build?definitionId=1&_a=summary]
 * {{master}} branch has the version information updated to the new version 
(check pom.xml files and the 
[apache-flink:flink-annotations/src/main/java/org/apache/flink/FlinkVersion|https://github.com/apache/flink/blob/master/flink-annotations/src/main/java/org/apache/flink/FlinkVersion.java]
 enum)
 * New version is added to the 
[apache-flink:flink-annotations/src/main/java/org/apache/flink/FlinkVersion|https://github.com/apache/flink/blob/master/flink-annotations/src/main/java/org/apache/flink/FlinkVersion.java]
 enum.
 * Make sure [flink-docker|https://github.com/apache/flink-docker/] has 
{{dev-x.y}} branch and docker e2e tests run against this branch in the 
corresponding Apache Flink release branch (see 
[apache/flink:flink-end-to-end-tests/test-scripts/common_docker.sh:51|https://github.com/apache/flink/blob/master/flink-end-to-end-tests/test-scripts/common_docker.sh#L51])
 * 
[apache-flink:docs/config.toml|https://github.com/apache/flink/blob/release-1.17/docs/config.toml]
 has been updated appropriately in the new Apache Flink release branch.
 * The {{flink.version}} property (see 
[apache/flink-benchmarks:pom.xml|https://github.com/apache/flink-benchmarks/blob/maste

[jira] [Created] (FLINK-33825) CLONE - Create a new version in JIRA

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33825:
---

 Summary: CLONE - Create a new version in JIRA
 Key: FLINK-33825
 URL: https://issues.apache.org/jira/browse/FLINK-33825
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge
Assignee: Martijn Visser


When contributors resolve an issue in JIRA, they are tagging it with a release 
that will contain their changes. With the release currently underway, new 
issues should be resolved against a subsequent future release. Therefore, you 
should create a release item for this subsequent release, as follows:
 # In JIRA, navigate to the [Flink > Administration > 
Versions|https://issues.apache.org/jira/plugins/servlet/project-config/FLINK/versions].
 # Add a new release: choose the next minor version number compared to the one 
currently underway, select today’s date as the Start Date, and choose Add.
(Note: Only PMC members have access to the project administration. If you do 
not have access, ask on the mailing list for assistance.)

 

h3. Expectations
 * The new version should be listed in the dropdown menu of {{fixVersion}} or 
{{affectedVersion}} under "unreleased versions" when creating a new Jira issue.





[jira] [Created] (FLINK-33824) Prepare Flink 1.18.1 Release

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33824:
---

 Summary: Prepare Flink 1.18.1 Release
 Key: FLINK-33824
 URL: https://issues.apache.org/jira/browse/FLINK-33824
 Project: Flink
  Issue Type: New Feature
  Components: Release System
Affects Versions: 1.17.0
Reporter: Jing Ge
Assignee: Leonard Xu
 Fix For: 1.17.0


This umbrella issue is meant as a test balloon for moving the [release 
documentation|https://cwiki.apache.org/confluence/display/FLINK/Creating+a+Flink+Release]
 into Jira.
h3. Prerequisites
h4. Environment Variables

Commands in the subtasks might expect some of the following environment 
variables to be set according to the version that is about to be released:
{code:bash}
RELEASE_VERSION="1.5.0"
SHORT_RELEASE_VERSION="1.5"
CURRENT_SNAPSHOT_VERSION="$SHORT_RELEASE_VERSION-SNAPSHOT"
NEXT_SNAPSHOT_VERSION="1.6-SNAPSHOT"
SHORT_NEXT_SNAPSHOT_VERSION="1.6"
{code}
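Since these variables depend on one another, the derived ones can be computed from {{RELEASE_VERSION}} instead of set by hand. A sketch, assuming the MAJOR.MINOR.PATCH scheme used above:

```shell
RELEASE_VERSION="1.5.0"
SHORT_RELEASE_VERSION="${RELEASE_VERSION%.*}"            # drop patch -> 1.5
CURRENT_SNAPSHOT_VERSION="${SHORT_RELEASE_VERSION}-SNAPSHOT"
MAJOR="${SHORT_RELEASE_VERSION%%.*}"                     # 1
MINOR="${SHORT_RELEASE_VERSION#*.}"                      # 5
SHORT_NEXT_SNAPSHOT_VERSION="${MAJOR}.$((MINOR + 1))"    # 1.6
NEXT_SNAPSHOT_VERSION="${SHORT_NEXT_SNAPSHOT_VERSION}-SNAPSHOT"
```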
h4. Build Tools

All of the following steps require Maven 3.2.5 and Java 8. Modify your 
PATH environment variable accordingly if needed.
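A fail-fast check of both tool versions can be sketched as follows (the required versions are those stated above; the parsing of the {{-version}} output format is an assumption):

```shell
# Extract version strings from the tools' own -version output.
mvn_version()  { mvn -version 2>/dev/null | sed -n 's/^Apache Maven \([0-9.]*\).*/\1/p'; }
java_version() { java -version 2>&1 | sed -n 's/.*version "\([^"]*\)".*/\1/p'; }

require_version() {
  # $1 = tool name, $2 = actual version, $3 = required version prefix (e.g. 3.2.5, 1.8)
  case "$2" in
    "$3"*) ;;  # prefix match, so 1.8.0_392 satisfies a required 1.8
    *) echo "ERROR: $1 version '$2' found, but $3 is required" >&2; return 1 ;;
  esac
}

# Example usage at the top of a release script:
# require_version "Maven" "$(mvn_version)" "3.2.5" || exit 1
# require_version "Java"  "$(java_version)" "1.8"   || exit 1
```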
h4. Flink Source
 * Create a new directory for this release and clone the Flink repository from 
Github to ensure you have a clean workspace (this step is optional).
 * Run {{mvn -Prelease clean install}} to ensure that the build processes that 
are specific to that profile are in good shape (this step is optional).

The rest of these instructions assumes that commands are run in the root (or 
{{./tools}} directory) of a repository checkout, on the branch of the release 
version, with the above environment variables set.





[jira] [Created] (FLINK-33828) CLONE - Cross team testing

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33828:
---

 Summary: CLONE - Cross team testing
 Key: FLINK-33828
 URL: https://issues.apache.org/jira/browse/FLINK-33828
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge
Assignee: Qingsheng Ren


For user facing features that go into the release we'd like to ensure they can 
actually _be used_ by Flink users. To achieve this the release managers ensure 
that an issue for cross team testing is created in the Apache Flink Jira. This 
can and should be picked up by other community members to verify the 
functionality and usability of the feature.
The issue should contain some entry points which enable other community 
members to test it. It should not contain documentation on how to use the 
feature as this should be part of the actual documentation. The cross team 
tests are performed after the feature freeze. Documentation should be in place 
before that. Those tests are manual tests, so do not confuse them with 
automated tests.
To sum that up:
 * User facing features should be tested by other contributors
 * The scope is usability and sanity of the feature
 * The feature needs to be already documented
 * The contributor creates an issue containing some pointers on how to get 
started (e.g. link to the documentation, suggested targets of verification)
 * Other community members pick those issues up and provide feedback
 * Cross team testing happens right after the feature freeze

 

h3. Expectations
 * Jira issues for each expected release task according to the release plan is 
created and labeled as {{{}release-testing{}}}.
 * All the created release-testing-related Jira issues are resolved and the 
corresponding blocker issues are fixed.





[jira] [Created] (FLINK-33826) CLONE - Triage release-blocking issues in JIRA

2023-12-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-33826:
---

 Summary: CLONE - Triage release-blocking issues in JIRA
 Key: FLINK-33826
 URL: https://issues.apache.org/jira/browse/FLINK-33826
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge
Assignee: Qingsheng Ren


There could be outstanding release-blocking issues, which should be triaged 
before proceeding to build a release candidate. We track them by assigning a 
specific Fix Version even before the issue is resolved.

The list of release-blocking issues is available at the version status page. 
Triage each unresolved issue with one of the following resolutions:
 * If the issue has been resolved and JIRA was not updated, resolve it 
accordingly.
 * If the issue has not been resolved and it is acceptable to defer this until 
the next release, update the Fix Version field to the new version you just 
created. Please consider discussing this with stakeholders and the dev@ mailing 
list, as appropriate.
 ** When using "Bulk Change" functionality of Jira
 *** First, add the newly created version to Fix Version for all unresolved 
tickets that have the old version among their Fix Versions.
 *** Afterwards, remove the old version from the Fix Version.
 * If the issue has not been resolved and it is not acceptable to release until 
it is fixed, the release cannot proceed. Instead, work with the Flink community 
to resolve the issue.

 

h3. Expectations
 * There are no release blocking JIRA issues





[jira] [Created] (FLINK-33730) Update the compatibility table to only include last three versions

2023-12-03 Thread Jing Ge (Jira)
Jing Ge created FLINK-33730:
---

 Summary: Update the compatibility table to only include last three 
versions
 Key: FLINK-33730
 URL: https://issues.apache.org/jira/browse/FLINK-33730
 Project: Flink
  Issue Type: Improvement
  Components: Documentation
Reporter: Jing Ge
Assignee: Jing Ge


Update the compatibility table 
([apache-flink:./docs/content/docs/ops/upgrading.md|https://github.com/apache/flink/blob/master/docs/content/docs/ops/upgrading.md#compatibility-table]
 and 
[apache-flink:./docs/content.zh/docs/ops/upgrading.md|https://github.com/apache/flink/blob/master/docs/content.zh/docs/ops/upgrading.md#compatibility-table])
 according to the discussion[1].

 

[1] https://lists.apache.org/thread/7yx396x5lmtws0s4t0sf9f2psgny11d6

 





[jira] [Created] (FLINK-33726) Print cost time for stream queries in SQL Client

2023-12-02 Thread Jing Ge (Jira)
Jing Ge created FLINK-33726:
---

 Summary: Print cost time for stream queries in SQL Client
 Key: FLINK-33726
 URL: https://issues.apache.org/jira/browse/FLINK-33726
 Project: Flink
  Issue Type: Improvement
  Components: Table SQL / Client
Reporter: Jing Ge
Assignee: Jing Ge








[jira] [Created] (FLINK-33654) deprecate DummyStreamExecutionEnvironment

2023-11-26 Thread Jing Ge (Jira)
Jing Ge created FLINK-33654:
---

 Summary: deprecate DummyStreamExecutionEnvironment
 Key: FLINK-33654
 URL: https://issues.apache.org/jira/browse/FLINK-33654
 Project: Flink
  Issue Type: Improvement
  Components: Table SQL / Planner
Reporter: Jing Ge


Deprecate DummyStreamExecutionEnvironment first since it will take time to 
remove it.





[jira] [Created] (FLINK-33651) Update the Chinese version checkpointing doc in fault tolerance

2023-11-25 Thread Jing Ge (Jira)
Jing Ge created FLINK-33651:
---

 Summary: Update the Chinese version checkpointing doc in fault 
tolerance
 Key: FLINK-33651
 URL: https://issues.apache.org/jira/browse/FLINK-33651
 Project: Flink
  Issue Type: Improvement
  Components: Documentation, Runtime / Checkpointing
Reporter: Jing Ge
Assignee: Jing Ge


The Chinese doc is missing some items that the English doc [1] covers; they 
should be added as well, such as:
 * checkpoint storage
 * unaligned checkpoints
 * checkpoints with finished tasks

[1] 
[https://nightlies.apache.org/flink/flink-docs-master/docs/dev/datastream/fault-tolerance/checkpointing/#enabling-and-configuring-checkpointing]

 

Reference: https://github.com/apache/flink/pull/23795#discussion_r1404855432





[jira] [Created] (FLINK-33648) table udfs doc in Chinese is out-of-date

2023-11-24 Thread Jing Ge (Jira)
Jing Ge created FLINK-33648:
---

 Summary: table udfs doc in Chinese is out-of-date
 Key: FLINK-33648
 URL: https://issues.apache.org/jira/browse/FLINK-33648
 Project: Flink
  Issue Type: Improvement
  Components: Documentation, Table SQL / API
Affects Versions: 1.18.0
Reporter: Jing Ge


The English version 
[https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/functions/udfs/]
 is very different from the Chinese version 
[https://nightlies.apache.org/flink/flink-docs-master/zh/docs/dev/table/functions/udfs/]

 

They differ not only in the content but also in the code examples.





[jira] [Created] (FLINK-33624) Bump Guava to 32.1.3-jre in flink-table

2023-11-22 Thread Jing Ge (Jira)
Jing Ge created FLINK-33624:
---

 Summary: Bump Guava to 32.1.3-jre in flink-table
 Key: FLINK-33624
 URL: https://issues.apache.org/jira/browse/FLINK-33624
 Project: Flink
  Issue Type: Improvement
  Components: Table SQL / API
Reporter: Jing Ge
Assignee: Jing Ge








[jira] [Created] (FLINK-33349) CLONE - CLONE - Stage source and binary releases on dist.apache.org

2023-10-24 Thread Jing Ge (Jira)
Jing Ge created FLINK-33349:
---

 Summary: CLONE - CLONE - Stage source and binary releases on 
dist.apache.org
 Key: FLINK-33349
 URL: https://issues.apache.org/jira/browse/FLINK-33349
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge
Assignee: Jing Ge
 Fix For: 1.18.0


Copy the source release to the dev repository of dist.apache.org:
# If you have not already, check out the Flink section of the dev repository on 
dist.apache.org via Subversion. In a fresh directory:
{code:bash}
$ svn checkout https://dist.apache.org/repos/dist/dev/flink --depth=immediates
{code}
# Make a directory for the new release and copy all the artifacts (Flink 
source/binary distributions, hashes, GPG signatures and the python 
subdirectory) into that newly created directory:
{code:bash}
$ mkdir flink/flink-${RELEASE_VERSION}-rc${RC_NUM}
$ mv <flink-dir>/tools/releasing/release/* 
flink/flink-${RELEASE_VERSION}-rc${RC_NUM}
{code}
# Add and commit all the files.
{code:bash}
$ cd flink
flink $ svn add flink-${RELEASE_VERSION}-rc${RC_NUM}
flink $ svn commit -m "Add flink-${RELEASE_VERSION}-rc${RC_NUM}"
{code}
# Verify that files are present under 
[https://dist.apache.org/repos/dist/dev/flink|https://dist.apache.org/repos/dist/dev/flink].
# Push the release tag if not done already (the following command assumes it is 
called from within the apache/flink checkout):
{code:bash}
$ git push <remote> refs/tags/release-${RELEASE_VERSION}-rc${RC_NUM}
{code}

 

h3. Expectations
 * Maven artifacts deployed to the staging repository of 
[repository.apache.org|https://repository.apache.org/content/repositories/]
 * Source distribution deployed to the dev repository of 
[dist.apache.org|https://dist.apache.org/repos/dist/dev/flink/]
 * Check hashes (e.g. shasum -c *.sha512)
 * Check signatures (e.g. {{{}gpg --verify 
flink-1.2.3-source-release.tar.gz.asc flink-1.2.3-source-release.tar.gz{}}})
 * {{grep}} for legal headers in each file.
 * If time allows check the NOTICE files of the modules whose dependencies have 
been changed in this release in advance, since the license issues from time to 
time pop up during voting. See [Verifying a Flink 
Release|https://cwiki.apache.org/confluence/display/FLINK/Verifying+a+Flink+Release]
 "Checking License" section.





[jira] [Created] (FLINK-33348) CLONE - CLONE - Build and stage Java and Python artifacts

2023-10-24 Thread Jing Ge (Jira)
Jing Ge created FLINK-33348:
---

 Summary: CLONE - CLONE - Build and stage Java and Python artifacts
 Key: FLINK-33348
 URL: https://issues.apache.org/jira/browse/FLINK-33348
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge
Assignee: Jing Ge
 Fix For: 1.18.0


# Create a local release branch ((!) this step can not be skipped for minor 
releases):
{code:bash}
$ cd ./tools
tools/ $ OLD_VERSION=$CURRENT_SNAPSHOT_VERSION NEW_VERSION=$RELEASE_VERSION 
RELEASE_CANDIDATE=$RC_NUM releasing/create_release_branch.sh
{code}
 # Tag the release commit:
{code:bash}
$ git tag -s ${TAG} -m "${TAG}"
{code}
 # We now need to do several things:
 ## Create the source release archive
 ## Deploy jar artifacts to the [Apache Nexus 
Repository|https://repository.apache.org/], which is the staging area for 
deploying the jars to Maven Central
 ## Build PyFlink wheel packages
You might want to create a directory on your local machine for collecting the 
various source and binary releases before uploading them. Creating the binary 
releases is a lengthy process but you can do this on another machine (for 
example, in the "cloud"). When doing this, you can skip signing the release 
files on the remote machine, download them to your local machine and sign them 
there.
 # Build the source release:
{code:bash}
tools $ RELEASE_VERSION=$RELEASE_VERSION releasing/create_source_release.sh
{code}
 # Stage the maven artifacts:
{code:bash}
tools $ releasing/deploy_staging_jars.sh
{code}
Review all staged artifacts ([https://repository.apache.org/]). They should 
contain all relevant parts for each module, including pom.xml, jar, test jar, 
source, test source, javadoc, etc. Carefully review any new artifacts.
 # Close the staging repository on Apache Nexus. When prompted for a 
description, enter “Apache Flink, version X, release candidate Y”.
Then, you need to build the PyFlink wheel packages (since 1.11):
 # Set up an azure pipeline in your own Azure account. You can refer to [Azure 
Pipelines|https://cwiki.apache.org/confluence/display/FLINK/Azure+Pipelines#AzurePipelines-Tutorial:SettingupAzurePipelinesforaforkoftheFlinkrepository]
 for more details on how to set up azure pipeline for a fork of the Flink 
repository. Note that a google cloud mirror in Europe is used for downloading 
maven artifacts, therefore it is recommended to set your [Azure organization 
region|https://docs.microsoft.com/en-us/azure/devops/organizations/accounts/change-organization-location]
 to Europe to speed up the downloads.
 # Push the release candidate branch to your forked personal Flink repository, 
e.g.
{code:bash}
tools $ git push <remote> 
refs/heads/release-${RELEASE_VERSION}-rc${RC_NUM}:release-${RELEASE_VERSION}-rc${RC_NUM}
{code}
 # Trigger the Azure Pipelines manually to build the PyFlink wheel packages
 ## Go to your Azure Pipelines Flink project → Pipelines
 ## Click the "New pipeline" button on the top right
 ## Select "GitHub" → your GitHub Flink repository → "Existing Azure Pipelines 
YAML file"
 ## Select your branch → Set path to "/azure-pipelines.yaml" → click on 
"Continue" → click on "Variables"
 ## Then click "New Variable" button, fill the name with "MODE", and the value 
with "release". Click "OK" to set the variable and the "Save" button to save 
the variables, then back on the "Review your pipeline" screen click "Run" to 
trigger the build.
 ## You should now see a build where only the "CI build (release)" is running
 # Download the PyFlink wheel packages from the build result page after the 
jobs of "build_wheels mac" and "build_wheels linux" have finished.
 ## Download the PyFlink wheel packages
 ### Open the build result page of the pipeline
 ### Go to the {{Artifacts}} page (build_wheels linux -> 1 artifact)
 ### Click {{wheel_Darwin_build_wheels mac}} and {{wheel_Linux_build_wheels 
linux}} separately to download the zip files
 ## Unzip these two zip files
{code:bash}
$ cd /path/to/downloaded_wheel_packages
$ unzip wheel_Linux_build_wheels\ linux.zip
$ unzip wheel_Darwin_build_wheels\ mac.zip{code}
 ## Create directory {{./dist}} under the directory of {{{}flink-python{}}}:
{code:bash}
$ cd <flink-dir>
$ mkdir flink-python/dist{code}
 ## Move the unzipped wheel packages to the directory of 
{{{}flink-python/dist{}}}:
{code:java}
$ mv /path/to/wheel_Darwin_build_wheels\ mac/* flink-python/dist/
$ mv /path/to/wheel_Linux_build_wheels\ linux/* flink-python/dist/
$ cd tools{code}

Finally, we create the binary convenience release files:
{code:bash}
tools $ RELEASE_VERSION=$RELEASE_VERSION releasing/create_binary_release.sh
{code}
If you want to run this step in parallel on a remote machine you have to make 
the release commit available there (for example by pushing to a repository). 
*This is important: the commit inside the binary builds has to match the commit 
of the source builds and the tagged release commit.* 
When

[jira] [Created] (FLINK-33351) CLONE - CLONE - Vote on the release candidate

2023-10-24 Thread Jing Ge (Jira)
Jing Ge created FLINK-33351:
---

 Summary: CLONE - CLONE - Vote on the release candidate
 Key: FLINK-33351
 URL: https://issues.apache.org/jira/browse/FLINK-33351
 Project: Flink
  Issue Type: Sub-task
Affects Versions: 1.18.0
Reporter: Jing Ge
Assignee: Jing Ge


Once you have built and individually reviewed the release candidate, please 
share it for the community-wide review. Please review foundation-wide [voting 
guidelines|http://www.apache.org/foundation/voting.html] for more information.

Start the review-and-vote thread on the dev@ mailing list. Here’s an email 
template; please adjust as you see fit.
{quote}From: Release Manager
To: dev@flink.apache.org
Subject: [VOTE] Release 1.2.3, release candidate #3

Hi everyone,
Please review and vote on the release candidate #3 for the version 1.2.3, as 
follows:
[ ] +1, Approve the release
[ ] -1, Do not approve the release (please provide specific comments)

The complete staging area is available for your review, which includes:
 * JIRA release notes [1],
 * the official Apache source release and binary convenience releases to be 
deployed to dist.apache.org [2], which are signed with the key with fingerprint 
<fingerprint> [3],
 * all artifacts to be deployed to the Maven Central Repository [4],
 * source code tag "release-1.2.3-rc3" [5],
 * website pull request listing the new release and adding announcement blog 
post [6].

The vote will be open for at least 72 hours. It is adopted by majority 
approval, with at least 3 PMC affirmative votes.

Thanks,
Release Manager

[1] link
[2] link
[3] [https://dist.apache.org/repos/dist/release/flink/KEYS]
[4] link
[5] link
[6] link
{quote}
*If there are any issues found in the release candidate, reply on the vote 
thread to cancel the vote.* There’s no need to wait 72 hours. Proceed to the 
Fix Issues step below and address the problem. However, some issues don’t 
require cancellation. For example, if an issue is found in the website pull 
request, just correct it on the spot and the vote can continue as-is.

For cancelling a release, the release manager needs to send an email to the 
release candidate thread, stating that the release candidate is officially 
cancelled. Next, all artifacts created specifically for the RC in the previous 
steps need to be removed:
 * Delete the staging repository in Nexus
 * Remove the source / binary RC files from dist.apache.org
 * Delete the source code tag in git
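These removal steps can be sketched as commands, printed rather than executed so they can be reviewed first (the remote name {{origin}} and the dist URL layout are assumptions; the Nexus repository is dropped through the web UI):

```shell
# Print the cleanup commands for a cancelled release candidate.
print_rc_cleanup() {
  # $1 = RELEASE_VERSION, $2 = RC_NUM
  echo "svn delete -m \"Remove flink-$1-rc$2\" https://dist.apache.org/repos/dist/dev/flink/flink-$1-rc$2"
  echo "git push origin :refs/tags/release-$1-rc$2"
  echo "# plus: drop the staging repository in the Nexus web UI"
}

# Example: print_rc_cleanup "$RELEASE_VERSION" "$RC_NUM"
```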

*If there are no issues, reply on the vote thread to close the voting.* Then, 
tally the votes in a separate email. Here’s an email template; please adjust as 
you see fit.
{quote}From: Release Manager
To: dev@flink.apache.org
Subject: [RESULT] [VOTE] Release 1.2.3, release candidate #3

I'm happy to announce that we have unanimously approved this release.

There are XXX approving votes, XXX of which are binding:
 * approver 1
 * approver 2
 * approver 3
 * approver 4

There are no disapproving votes.

Thanks everyone!
{quote}
 

h3. Expectations
 * Community votes to release the proposed candidate, with at least three 
approving PMC votes

Any issues that are raised till the vote is over should be either resolved or 
moved into the next release (if applicable).





[jira] [Created] (FLINK-33350) CLONE - CLONE - Propose a pull request for website updates

2023-10-24 Thread Jing Ge (Jira)
Jing Ge created FLINK-33350:
---

 Summary: CLONE - CLONE - Propose a pull request for website updates
 Key: FLINK-33350
 URL: https://issues.apache.org/jira/browse/FLINK-33350
 Project: Flink
  Issue Type: Sub-task
Affects Versions: 1.18.0
Reporter: Jing Ge
Assignee: Jing Ge


The final step of building the candidate is to propose a website pull request 
containing the following changes:
 # update 
[apache/flink-web:_config.yml|https://github.com/apache/flink-web/blob/asf-site/_config.yml]
 ## update {{FLINK_VERSION_STABLE}} and {{FLINK_VERSION_STABLE_SHORT}} as 
required
 ## update version references in quickstarts ({{{}q/{}}} directory) as required
 ## (major only) add a new entry to {{flink_releases}} for the release binaries 
and sources
 ## (minor only) update the entry for the previous release in the series in 
{{flink_releases}}
 ### Please pay attention to the ids assigned to the download entries. They should 
be unique and reflect their corresponding version number.
 ## add a new entry to {{release_archive.flink}}
 # add a blog post announcing the release in _posts
 # add a organized release notes page under docs/content/release-notes and 
docs/content.zh/release-notes (like 
[https://nightlies.apache.org/flink/flink-docs-release-1.15/release-notes/flink-1.15/]).
 The page is based on the non-empty release notes collected from the issues; 
only issues that affect existing users should be included (i.e., not new 
functionality). It should be a separate PR since it will be 
merged into the flink project.

(!) Don’t merge the PRs before finalizing the release.

 

h3. Expectations
 * Website pull request proposed to list the 
[release|http://flink.apache.org/downloads.html]
 * (major only) Check {{docs/config.toml}} to ensure that
 ** the version constants refer to the new version
 ** the {{baseurl}} does not point to {{flink-docs-master}}  but 
{{flink-docs-release-X.Y}} instead





[jira] [Created] (FLINK-33347) CLONE - CLONE - Build Release Candidate: 1.18.0-rc3

2023-10-24 Thread Jing Ge (Jira)
Jing Ge created FLINK-33347:
---

 Summary: CLONE - CLONE - Build Release Candidate: 1.18.0-rc3
 Key: FLINK-33347
 URL: https://issues.apache.org/jira/browse/FLINK-33347
 Project: Flink
  Issue Type: New Feature
Affects Versions: 1.18.0
Reporter: Jing Ge
Assignee: Jing Ge


The core of the release process is the build-vote-fix cycle. Each cycle 
produces one release candidate. The Release Manager repeats this cycle until 
the community approves one release candidate, which is then finalized.
h4. Prerequisites

Set up a few environment variables to simplify Maven commands that follow. This 
identifies the release candidate being built. Start with {{RC_NUM}} equal to 1 
and increment it for each candidate:
{code:java}
RC_NUM="2"
TAG="release-${RELEASE_VERSION}-rc${RC_NUM}"
{code}





[jira] [Created] (FLINK-33301) Add Java and Maven version checks in the bash script of Flink release process

2023-10-18 Thread Jing Ge (Jira)
Jing Ge created FLINK-33301:
---

 Summary: Add Java and Maven version checks in the bash script of 
Flink release process
 Key: FLINK-33301
 URL: https://issues.apache.org/jira/browse/FLINK-33301
 Project: Flink
  Issue Type: Bug
  Components: Release System
Affects Versions: 1.18.0
Reporter: Jing Ge
Assignee: Jing Ge
 Fix For: 1.19.0


During the release, Flink requires specific versions of Java and Maven. It 
makes sense to check those versions at the very beginning of the bash scripts 
so that they fail fast, which improves efficiency.





[jira] [Created] (FLINK-33275) CLONE - Vote on the release candidate

2023-10-13 Thread Jing Ge (Jira)
Jing Ge created FLINK-33275:
---

 Summary: CLONE - Vote on the release candidate
 Key: FLINK-33275
 URL: https://issues.apache.org/jira/browse/FLINK-33275
 Project: Flink
  Issue Type: Sub-task
Affects Versions: 1.18.0
Reporter: Jing Ge


Once you have built and individually reviewed the release candidate, please 
share it for the community-wide review. Please review foundation-wide [voting 
guidelines|http://www.apache.org/foundation/voting.html] for more information.

Start the review-and-vote thread on the dev@ mailing list. Here’s an email 
template; please adjust as you see fit.
{quote}From: Release Manager
To: dev@flink.apache.org
Subject: [VOTE] Release 1.2.3, release candidate #3

Hi everyone,
Please review and vote on the release candidate #3 for the version 1.2.3, as 
follows:
[ ] +1, Approve the release
[ ] -1, Do not approve the release (please provide specific comments)

The complete staging area is available for your review, which includes:
 * JIRA release notes [1],
 * the official Apache source release and binary convenience releases to be 
deployed to dist.apache.org [2], which are signed with the key with fingerprint 
<fingerprint> [3],
 * all artifacts to be deployed to the Maven Central Repository [4],
 * source code tag "release-1.2.3-rc3" [5],
 * website pull request listing the new release and adding announcement blog 
post [6].

The vote will be open for at least 72 hours. It is adopted by majority 
approval, with at least 3 PMC affirmative votes.

Thanks,
Release Manager

[1] link
[2] link
[3] [https://dist.apache.org/repos/dist/release/flink/KEYS]
[4] link
[5] link
[6] link
{quote}
*If there are any issues found in the release candidate, reply on the vote 
thread to cancel the vote.* There’s no need to wait 72 hours. Proceed to the 
Fix Issues step below and address the problem. However, some issues don’t 
require cancellation. For example, if an issue is found in the website pull 
request, just correct it on the spot and the vote can continue as-is.

For cancelling a release, the release manager needs to send an email to the 
release candidate thread, stating that the release candidate is officially 
cancelled. Next, all artifacts created specifically for the RC in the previous 
steps need to be removed:
 * Delete the staging repository in Nexus
 * Remove the source / binary RC files from dist.apache.org
 * Delete the source code tag in git
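The dist.apache.org and git parts of that cleanup can be scripted roughly as below. This is a sketch: the version values and the remote name `origin` are placeholders, and the destructive commands are left commented out (the Nexus staging repository is dropped through the web UI):

```shell
# Illustrative cleanup of a cancelled release candidate.
RELEASE_VERSION="1.2.3"
RC_NUM="3"
TAG="release-${RELEASE_VERSION}-rc${RC_NUM}"

# Remove the source/binary RC files from dist.apache.org:
#   svn delete "https://dist.apache.org/repos/dist/dev/flink/flink-${RELEASE_VERSION}-rc${RC_NUM}" \
#       -m "Remove flink-${RELEASE_VERSION}-rc${RC_NUM}"

# Delete the source code tag in git (locally and on the remote):
#   git tag -d "${TAG}"
#   git push origin ":refs/tags/${TAG}"

echo "would clean up ${TAG}"
```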

*If there are no issues, reply on the vote thread to close the voting.* Then, 
tally the votes in a separate email. Here’s an email template; please adjust as 
you see fit.
{quote}From: Release Manager
To: dev@flink.apache.org
Subject: [RESULT] [VOTE] Release 1.2.3, release candidate #3

I'm happy to announce that we have unanimously approved this release.

There are XXX approving votes, XXX of which are binding:
 * approver 1
 * approver 2
 * approver 3
 * approver 4

There are no disapproving votes.

Thanks everyone!
{quote}
 

h3. Expectations
 * Community votes to release the proposed candidate, with at least three 
approving PMC votes

Any issues that are raised till the vote is over should be either resolved or 
moved into the next release (if applicable).





[jira] [Created] (FLINK-33273) CLONE - Stage source and binary releases on dist.apache.org

2023-10-13 Thread Jing Ge (Jira)
Jing Ge created FLINK-33273:
---

 Summary: CLONE - Stage source and binary releases on 
dist.apache.org
 Key: FLINK-33273
 URL: https://issues.apache.org/jira/browse/FLINK-33273
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge


Copy the source release to the dev repository of dist.apache.org:
# If you have not already, check out the Flink section of the dev repository on 
dist.apache.org via Subversion. In a fresh directory:
{code:bash}
$ svn checkout https://dist.apache.org/repos/dist/dev/flink --depth=immediates
{code}
# Make a directory for the new release and copy all the artifacts (Flink 
source/binary distributions, hashes, GPG signatures and the python 
subdirectory) into that newly created directory:
{code:bash}
$ mkdir flink/flink-${RELEASE_VERSION}-rc${RC_NUM}
$ mv /tools/releasing/release/* 
flink/flink-${RELEASE_VERSION}-rc${RC_NUM}
{code}
# Add and commit all the files.
{code:bash}
$ cd flink
flink $ svn add flink-${RELEASE_VERSION}-rc${RC_NUM}
flink $ svn commit -m "Add flink-${RELEASE_VERSION}-rc${RC_NUM}"
{code}
# Verify that files are present under 
[https://dist.apache.org/repos/dist/dev/flink|https://dist.apache.org/repos/dist/dev/flink].
# Push the release tag if not done already (the following command assumes it is 
called from within the apache/flink checkout):
{code:bash}
$ git push  refs/tags/release-${RELEASE_VERSION}-rc${RC_NUM}
{code}

 

h3. Expectations
 * Maven artifacts deployed to the staging repository of 
[repository.apache.org|https://repository.apache.org/content/repositories/]
 * Source distribution deployed to the dev repository of 
[dist.apache.org|https://dist.apache.org/repos/dist/dev/flink/]
 * Check hashes (e.g. shasum -c *.sha512)
 * Check signatures (e.g. {{{}gpg --verify 
flink-1.2.3-source-release.tar.gz.asc flink-1.2.3-source-release.tar.gz{}}})
 * {{grep}} for legal headers in each file.
 * If time allows check the NOTICE files of the modules whose dependencies have 
been changed in this release in advance, since the license issues from time to 
time pop up during voting. See [Verifying a Flink 
Release|https://cwiki.apache.org/confluence/display/FLINK/Verifying+a+Flink+Release]
 "Checking License" section.
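The hash check above can be run over a whole RC directory in one go. A minimal sketch, assuming GNU `sha512sum` (the equivalent of the `shasum -c` call mentioned above) and a placeholder directory layout:

```shell
# Verify every .sha512 file in an RC directory; exits non-zero on any mismatch.
verify_checksums() {
  ( cd "$1" || exit 1
    status=0
    for f in *.sha512; do
      sha512sum -c "$f" >/dev/null || status=1
    done
    exit "$status" )
}

# Signatures would be checked similarly, e.g.:
#   for sig in *.asc; do gpg --verify "$sig" "${sig%.asc}"; done
```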





[jira] [Created] (FLINK-33274) CLONE - Propose a pull request for website updates

2023-10-13 Thread Jing Ge (Jira)
Jing Ge created FLINK-33274:
---

 Summary: CLONE - Propose a pull request for website updates
 Key: FLINK-33274
 URL: https://issues.apache.org/jira/browse/FLINK-33274
 Project: Flink
  Issue Type: Sub-task
Affects Versions: 1.18.0
Reporter: Jing Ge


The final step of building the candidate is to propose a website pull request 
containing the following changes:
 # update 
[apache/flink-web:_config.yml|https://github.com/apache/flink-web/blob/asf-site/_config.yml]
 ## update {{FLINK_VERSION_STABLE}} and {{FLINK_VERSION_STABLE_SHORT}} as 
required
 ## update version references in quickstarts ({{{}q/{}}} directory) as required
 ## (major only) add a new entry to {{flink_releases}} for the release binaries 
and sources
 ## (minor only) update the entry for the previous release in the series in 
{{flink_releases}}
 ### Please pay attention to the IDs assigned to the download entries. They 
should be unique and reflect their corresponding version number.
 ## add a new entry to {{release_archive.flink}}
 # add a blog post announcing the release in _posts
 # add an organized release notes page under docs/content/release-notes and 
docs/content.zh/release-notes (like 
[https://nightlies.apache.org/flink/flink-docs-release-1.15/release-notes/flink-1.15/]).
 The page is based on the non-empty release notes collected from the issues, 
and only issues that affect existing users should be included (e.g., behavior 
changes rather than new functionality). It should be in a separate PR, since it 
will be merged into the flink project.

(!) Don’t merge the PRs before finalizing the release.

 

h3. Expectations
 * Website pull request proposed to list the 
[release|http://flink.apache.org/downloads.html]
 * (major only) Check {{docs/config.toml}} to ensure that
 ** the version constants refer to the new version
 ** the {{baseurl}} does not point to {{flink-docs-master}}  but 
{{flink-docs-release-X.Y}} instead





[jira] [Created] (FLINK-33271) CLONE - Build Release Candidate: 1.18.0-rc2

2023-10-13 Thread Jing Ge (Jira)
Jing Ge created FLINK-33271:
---

 Summary: CLONE - Build Release Candidate: 1.18.0-rc2
 Key: FLINK-33271
 URL: https://issues.apache.org/jira/browse/FLINK-33271
 Project: Flink
  Issue Type: New Feature
Affects Versions: 1.18.0
Reporter: Jing Ge
Assignee: Jing Ge


The core of the release process is the build-vote-fix cycle. Each cycle 
produces one release candidate. The Release Manager repeats this cycle until 
the community approves one release candidate, which is then finalized.

h4. Prerequisites
Set up a few environment variables to simplify Maven commands that follow. This 
identifies the release candidate being built. Start with {{RC_NUM}} equal to 1 
and increment it for each candidate:
{code}
RC_NUM="1"
TAG="release-${RELEASE_VERSION}-rc${RC_NUM}"
{code}





[jira] [Created] (FLINK-33272) CLONE - Build and stage Java and Python artifacts

2023-10-13 Thread Jing Ge (Jira)
Jing Ge created FLINK-33272:
---

 Summary: CLONE - Build and stage Java and Python artifacts
 Key: FLINK-33272
 URL: https://issues.apache.org/jira/browse/FLINK-33272
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge


# Create a local release branch ((!) this step cannot be skipped for minor 
releases):
{code:bash}
$ cd ./tools
tools/ $ OLD_VERSION=$CURRENT_SNAPSHOT_VERSION NEW_VERSION=$RELEASE_VERSION 
RELEASE_CANDIDATE=$RC_NUM releasing/create_release_branch.sh
{code}
 # Tag the release commit:
{code:bash}
$ git tag -s ${TAG} -m "${TAG}"
{code}
 # We now need to do several things:
 ## Create the source release archive
 ## Deploy jar artefacts to the [Apache Nexus 
Repository|https://repository.apache.org/], which is the staging area for 
deploying the jars to Maven Central
 ## Build PyFlink wheel packages
You might want to create a directory on your local machine for collecting the 
various source and binary releases before uploading them. Creating the binary 
releases is a lengthy process but you can do this on another machine (for 
example, in the "cloud"). When doing this, you can skip signing the release 
files on the remote machine, download them to your local machine and sign them 
there.
 # Build the source release:
{code:bash}
tools $ RELEASE_VERSION=$RELEASE_VERSION releasing/create_source_release.sh
{code}
 # Stage the maven artifacts:
{code:bash}
tools $ releasing/deploy_staging_jars.sh
{code}
Review all staged artifacts ([https://repository.apache.org/]). They should 
contain all relevant parts for each module, including pom.xml, jar, test jar, 
source, test source, javadoc, etc. Carefully review any new artifacts.
 # Close the staging repository on Apache Nexus. When prompted for a 
description, enter “Apache Flink, version X, release candidate Y”.
Then, you need to build the PyFlink wheel packages (since 1.11):
 # Set up an azure pipeline in your own Azure account. You can refer to [Azure 
Pipelines|https://cwiki.apache.org/confluence/display/FLINK/Azure+Pipelines#AzurePipelines-Tutorial:SettingupAzurePipelinesforaforkoftheFlinkrepository]
 for more details on how to set up azure pipeline for a fork of the Flink 
repository. Note that a Google Cloud mirror in Europe is used for downloading 
Maven artifacts, therefore it is recommended to set your [Azure organization 
region|https://docs.microsoft.com/en-us/azure/devops/organizations/accounts/change-organization-location]
 to Europe to speed up the downloads.
 # Push the release candidate branch to your forked personal Flink repository, 
e.g.
{code:bash}
tools $ git push  
refs/heads/release-${RELEASE_VERSION}-rc${RC_NUM}:release-${RELEASE_VERSION}-rc${RC_NUM}
{code}
 # Trigger the Azure Pipelines manually to build the PyFlink wheel packages
 ## Go to your Azure Pipelines Flink project → Pipelines
 ## Click the "New pipeline" button on the top right
 ## Select "GitHub" → your GitHub Flink repository → "Existing Azure Pipelines 
YAML file"
 ## Select your branch → Set path to "/azure-pipelines.yaml" → click on 
"Continue" → click on "Variables"
 ## Then click "New Variable" button, fill the name with "MODE", and the value 
with "release". Click "OK" to set the variable and the "Save" button to save 
the variables, then back on the "Review your pipeline" screen click "Run" to 
trigger the build.
 ## You should now see a build where only the "CI build (release)" is running
 # Download the PyFlink wheel packages from the build result page after the 
jobs of "build_wheels mac" and "build_wheels linux" have finished.
 ## Download the PyFlink wheel packages
 ### Open the build result page of the pipeline
 ### Go to the {{Artifacts}} page (build_wheels linux -> 1 artifact)
 ### Click {{wheel_Darwin_build_wheels mac}} and {{wheel_Linux_build_wheels 
linux}} separately to download the zip files
 ## Unzip these two zip files
{code:bash}
$ cd /path/to/downloaded_wheel_packages
$ unzip wheel_Linux_build_wheels\ linux.zip
$ unzip wheel_Darwin_build_wheels\ mac.zip{code}
 ## Create directory {{./dist}} under the directory of {{{}flink-python{}}}:
{code:bash}
$ cd 
$ mkdir flink-python/dist{code}
 ## Move the unzipped wheel packages to the directory of 
{{{}flink-python/dist{}}}:
{code:java}
$ mv /path/to/wheel_Darwin_build_wheels\ mac/* flink-python/dist/
$ mv /path/to/wheel_Linux_build_wheels\ linux/* flink-python/dist/
$ cd tools{code}

Finally, we create the binary convenience release files:
{code:bash}
tools $ RELEASE_VERSION=$RELEASE_VERSION releasing/create_binary_release.sh
{code}
If you want to run this step in parallel on a remote machine you have to make 
the release commit available there (for example by pushing to a repository). 
*This is important: the commit inside the binary builds has to match the commit 
of the source builds and the tagged release commit.* 
When building remotely, you can skip gpg signing by setting 
{{{}SKIP_G

[jira] [Created] (FLINK-33216) turn off the public access of the s3 bucket flink-nightly

2023-10-09 Thread Jing Ge (Jira)
Jing Ge created FLINK-33216:
---

 Summary: turn off the public access of the s3 bucket flink-nightly
 Key: FLINK-33216
 URL: https://issues.apache.org/jira/browse/FLINK-33216
 Project: Flink
  Issue Type: Sub-task
  Components: Build System
Affects Versions: 1.18.0
Reporter: Jing Ge








[jira] [Created] (FLINK-33215) [Umbrella] use https://nightlies.apache.org/flink/ as the flink artifact server for connector nightly build

2023-10-09 Thread Jing Ge (Jira)
Jing Ge created FLINK-33215:
---

 Summary: [Umbrella] use https://nightlies.apache.org/flink/ as the 
flink artifact server for connector nightly build 
 Key: FLINK-33215
 URL: https://issues.apache.org/jira/browse/FLINK-33215
 Project: Flink
  Issue Type: Bug
  Components: Build System / CI
Affects Versions: 1.18.0
Reporter: Jing Ge


# Flink nightly build artifacts should be uploaded to 
[https://nightlies.apache.org/flink/]
 # all externalized connectors should download the Flink artifacts from 
[https://nightlies.apache.org/flink/] instead of the S3 bucket flink-nightly





[jira] [Created] (FLINK-33195) ElasticSearch Connector should directly depend on 3rd-party libs instead of flink-shaded repo

2023-10-05 Thread Jing Ge (Jira)
Jing Ge created FLINK-33195:
---

 Summary: ElasticSearch Connector should directly depend on 
3rd-party libs instead of flink-shaded repo
 Key: FLINK-33195
 URL: https://issues.apache.org/jira/browse/FLINK-33195
 Project: Flink
  Issue Type: Sub-task
  Components: Connectors / ElasticSearch
Affects Versions: 1.18.0
Reporter: Jing Ge








[jira] [Created] (FLINK-33194) AWS Connector should directly depend on 3rd-party libs instead of flink-shaded repo

2023-10-04 Thread Jing Ge (Jira)
Jing Ge created FLINK-33194:
---

 Summary: AWS Connector should directly depend on 3rd-party libs 
instead of flink-shaded repo
 Key: FLINK-33194
 URL: https://issues.apache.org/jira/browse/FLINK-33194
 Project: Flink
  Issue Type: Sub-task
  Components: Connectors / AWS
Affects Versions: 1.18.0
Reporter: Jing Ge








[jira] [Created] (FLINK-33193) JDBC Connector should directly depend on 3rd-party libs instead of flink-shaded repo

2023-10-04 Thread Jing Ge (Jira)
Jing Ge created FLINK-33193:
---

 Summary: JDBC Connector should directly depend on 3rd-party libs 
instead of flink-shaded repo
 Key: FLINK-33193
 URL: https://issues.apache.org/jira/browse/FLINK-33193
 Project: Flink
  Issue Type: Sub-task
  Components: Connectors / JDBC
Affects Versions: 1.18.0
Reporter: Jing Ge








[jira] [Created] (FLINK-33191) Kafka Connector should directly depend on 3rd-party libs instead of flink-shaded repo

2023-10-04 Thread Jing Ge (Jira)
Jing Ge created FLINK-33191:
---

 Summary: Kafka Connector should directly depend on 3rd-party libs 
instead of flink-shaded repo
 Key: FLINK-33191
 URL: https://issues.apache.org/jira/browse/FLINK-33191
 Project: Flink
  Issue Type: Sub-task
Reporter: Jing Ge








[jira] [Created] (FLINK-33190) Externalized Connectors should directly depend on 3rd-party libs instead of shaded repo

2023-10-04 Thread Jing Ge (Jira)
Jing Ge created FLINK-33190:
---

 Summary: Externalized Connectors should directly depend on 
3rd-party libs instead of shaded repo 
 Key: FLINK-33190
 URL: https://issues.apache.org/jira/browse/FLINK-33190
 Project: Flink
  Issue Type: Improvement
  Components: Build System
Affects Versions: 1.18.0
Reporter: Jing Ge


Connectors shouldn't depend on flink-shaded.
The overhead and/or risks of doing/supporting that right now far
outweigh the benefits.
( Because we either have to encode the full version for all dependencies
into the package, or accept the risk of minor/patch dependency clashes)
Connectors are small enough in scope that depending directly on
guava/jackson/etc. is a fine approach, and they have plenty of other
dependencies that they need to manage anyway; let's treat these the same
way.

 

https://lists.apache.org/thread/mtypmprz2b5p20gj064d0wsz3k0ofpco





[jira] [Created] (FLINK-33052) codespeed server is down

2023-09-06 Thread Jing Ge (Jira)
Jing Ge created FLINK-33052:
---

 Summary: codespeed server is down
 Key: FLINK-33052
 URL: https://issues.apache.org/jira/browse/FLINK-33052
 Project: Flink
  Issue Type: Bug
  Components: Test Infrastructure
Reporter: Jing Ge
Assignee: Jing Ge


No update in the #flink-dev-benchmarks Slack channel since 25th August.

It was an EC2 instance running in a legacy AWS account. Currently no one knows 
which account it is. 





[jira] [Created] (FLINK-33037) Bump Guava to 32.1.2-jre

2023-09-05 Thread Jing Ge (Jira)
Jing Ge created FLINK-33037:
---

 Summary: Bump Guava to 32.1.2-jre
 Key: FLINK-33037
 URL: https://issues.apache.org/jira/browse/FLINK-33037
 Project: Flink
  Issue Type: Improvement
Reporter: Jing Ge
Assignee: Jing Ge








[jira] [Created] (FLINK-32906) Release Testing: Verify FLINK-30025

2023-08-21 Thread Jing Ge (Jira)
Jing Ge created FLINK-32906:
---

 Summary: Release Testing: Verify FLINK-30025
 Key: FLINK-32906
 URL: https://issues.apache.org/jira/browse/FLINK-32906
 Project: Flink
  Issue Type: Sub-task
  Components: Table SQL / API
Reporter: Jing Ge
 Fix For: 1.18.0








[jira] [Created] (FLINK-32546) update Code Style Guide with Java properties naming convention

2023-07-05 Thread Jing Ge (Jira)
Jing Ge created FLINK-32546:
---

 Summary: update Code Style Guide with Java properties naming 
convention
 Key: FLINK-32546
 URL: https://issues.apache.org/jira/browse/FLINK-32546
 Project: Flink
  Issue Type: Improvement
  Components: Documentation
Reporter: Jing Ge


The class [properties|https://en.wikipedia.org/wiki/Property_(programming)] 
must be accessible using {_}get{_}, {_}set{_}, _is_ (can be used for boolean 
properties instead of get), _to_ and other methods (so-called [accessor 
methods|https://en.wikipedia.org/wiki/Accessor] and [mutator 
methods|https://en.wikipedia.org/wiki/Mutator_method]) according to a standard 
[naming 
convention|https://en.wikipedia.org/wiki/Naming_conventions_(programming)]. 

 

[https://en.wikipedia.org/wiki/JavaBeans]

[https://flink.apache.org/how-to-contribute/code-style-and-quality-preamble/]





[jira] [Created] (FLINK-31121) KafkaSink should be able to catch and ignore exp via config on/off

2023-02-17 Thread Jing Ge (Jira)
Jing Ge created FLINK-31121:
---

 Summary: KafkaSink should be able to catch and ignore exp via 
config on/off
 Key: FLINK-31121
 URL: https://issues.apache.org/jira/browse/FLINK-31121
 Project: Flink
  Issue Type: Improvement
  Components: Connectors / Kafka
Affects Versions: 1.17.0
Reporter: Jing Ge
 Fix For: 1.18.0


It is a common requirement for users to catch and ignore exceptions while 
sinking events to a downstream system like Kafka. It would be convenient for 
some use cases if the Flink sink provided built-in functionality and a config to 
turn it on and off, especially for cases where data consistency is not very 
important or the stream contains dirty events. [1][2]

First of all, consider doing it for KafkaSink. Long term, a common solution 
that can be used by any connector would be even better.

 

[1][https://lists.apache.org/thread/wy31s8wb9qnskq29wn03kp608z4vrwv8]

[2]https://stackoverflow.com/questions/52308911/how-to-handle-exceptions-in-kafka-sink

 

 





[jira] [Created] (FLINK-30862) Config doc generation should keep @deprecated ConfigOption

2023-02-01 Thread Jing Ge (Jira)
Jing Ge created FLINK-30862:
---

 Summary: Config doc generation should keep @deprecated ConfigOption
 Key: FLINK-30862
 URL: https://issues.apache.org/jira/browse/FLINK-30862
 Project: Flink
  Issue Type: Improvement
Reporter: Jing Ge


Currently the content is removed once the ConfigOption is marked as 
@deprecated. It should be kept, since the ConfigOption is only deprecated and 
still in use. The content should only be removed when the ConfigOption itself 
has been removed.

 





[jira] [Created] (FLINK-30758) Remove sql-client.display.max-column-width

2023-01-19 Thread Jing Ge (Jira)
Jing Ge created FLINK-30758:
---

 Summary: Remove sql-client.display.max-column-width
 Key: FLINK-30758
 URL: https://issues.apache.org/jira/browse/FLINK-30758
 Project: Flink
  Issue Type: Improvement
  Components: Table SQL / Client
Reporter: Jing Ge
Assignee: Jing Ge


sql-client.display.max-column-width is deprecated and should be removed in the 
release after the next release. 

 

Please consider removing PrintStyle.DEFAULT_MAX_COLUMN_WIDTH if there is no 
need to use it.





[jira] [Created] (FLINK-30025) table.execute().print() can only use the default max column width

2022-11-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-30025:
---

 Summary: table.execute().print() can only use the default max 
column width 
 Key: FLINK-30025
 URL: https://issues.apache.org/jira/browse/FLINK-30025
 Project: Flink
  Issue Type: Improvement
  Components: Table SQL / API
Affects Versions: 1.15.2, 1.16.0
Reporter: Jing Ge


Currently, the DEFAULT_MAX_COLUMN_WIDTH in PrintStyle is always used. It should 
be configurable. 





[jira] [Created] (FLINK-29999) Improve documentation on how a user should migrate from FlinkKafkaConsumer to KafkaSource

2022-11-11 Thread Jing Ge (Jira)
Jing Ge created FLINK-29999:
---

 Summary: Improve documentation on how a user should migrate from 
FlinkKafkaConsumer to KafkaSource
 Key: FLINK-29999
 URL: https://issues.apache.org/jira/browse/FLINK-29999
 Project: Flink
  Issue Type: Improvement
  Components: Documentation
Affects Versions: 1.16.0
Reporter: Jing Ge


We have described how to migrate a job from FlinkKafkaConsumer to KafkaSource at 
[https://nightlies.apache.org/flink/flink-docs-master/release-notes/flink-1.14/#flink-24055].
 But there are more things to take care of beyond that; one example is idleness 
handling. The related documentation should be improved.





[jira] [Created] (FLINK-29289) SequencefileInputFormat based on the new Source API

2022-09-13 Thread Jing Ge (Jira)
Jing Ge created FLINK-29289:
---

 Summary: SequencefileInputFormat based on the new Source API
 Key: FLINK-29289
 URL: https://issues.apache.org/jira/browse/FLINK-29289
 Project: Flink
  Issue Type: New Feature
  Components: Connectors / FileSystem
Affects Versions: 1.16.0
Reporter: Jing Ge








[jira] [Created] (FLINK-29224) flink-mirror does not pick up the new release branch 1.16

2022-09-07 Thread Jing Ge (Jira)
Jing Ge created FLINK-29224:
---

 Summary: flink-mirror does not pick up the new release branch 1.16
 Key: FLINK-29224
 URL: https://issues.apache.org/jira/browse/FLINK-29224
 Project: Flink
  Issue Type: Bug
  Components: Build System / CI
Reporter: Jing Ge


The {{release-1.16}} branch has been cut by the community, but it's not getting 
picked up by flink-ci/flink-mirror. 





[jira] [Created] (FLINK-29095) logging state file deletion

2022-08-24 Thread Jing Ge (Jira)
Jing Ge created FLINK-29095:
---

 Summary: logging state file deletion
 Key: FLINK-29095
 URL: https://issues.apache.org/jira/browse/FLINK-29095
 Project: Flink
  Issue Type: New Feature
  Components: Runtime / State Backends
Affects Versions: 1.16.0
Reporter: Jing Ge


With incremental checkpoints, conceptually, state files that are no longer used 
by any checkpoint will be deleted/garbage-collected. In practice, state files 
might be deleted while they are still required for failover, which will lead to 
Flink job failures.

We should add logging for troubleshooting.





[jira] [Created] (FLINK-29003) sqlServerDialect should support limit clause

2022-08-16 Thread Jing Ge (Jira)
Jing Ge created FLINK-29003:
---

 Summary: sqlServerDialect should support limit clause
 Key: FLINK-29003
 URL: https://issues.apache.org/jira/browse/FLINK-29003
 Project: Flink
  Issue Type: New Feature
  Components: Connectors / JDBC
Affects Versions: 1.16.0
Reporter: Jing Ge
 Fix For: 1.17.0


SQL Server does not support LIMIT natively; with the current structure it is 
supported via the OFFSET syntax, which requires ORDER BY to be valid.

This is not only related to the dialect; it also requires changing, or 
introducing a new, {{pushdownXXX}} method in the relevant pushdown mechanism.

Please see [https://github.com/apache/flink/pull/20235/files#r917362437] for 
details.





[jira] [Created] (FLINK-28778) Bulk fetch of table and column statistics for given partitions

2022-08-02 Thread Jing Ge (Jira)
Jing Ge created FLINK-28778:
---

 Summary: Bulk fetch of table and column statistics for given 
partitions
 Key: FLINK-28778
 URL: https://issues.apache.org/jira/browse/FLINK-28778
 Project: Flink
  Issue Type: New Feature
  Components: Table SQL / Runtime
Affects Versions: 1.15.1
Reporter: Jing Ge
 Fix For: 1.16.0








[jira] [Created] (FLINK-28432) HiveCatalogTable only support using the latest columns as partition keys

2022-07-06 Thread Jing Ge (Jira)
Jing Ge created FLINK-28432:
---

 Summary: HiveCatalogTable only support using the latest columns as 
partition keys
 Key: FLINK-28432
 URL: https://issues.apache.org/jira/browse/FLINK-28432
 Project: Flink
  Issue Type: Bug
  Components: Connectors / Hive
Affects Versions: 1.16.0
Reporter: Jing Ge


{code:java}
CatalogTable.of(
        schema,
        TEST_COMMENT,
        PartitionKeys,
        BatchTableProperties);
{code}

For example, the schema contains 5 columns: first, second, third, fourth, fifth. 
It is fine if the partition keys are fourth and fifth. But in case they are 
second and third, the created ResolvedCatalogTable will use fourth and fifth as 
the partition keys.

 





[jira] [Created] (FLINK-28117) More metrics for FileSource

2022-06-18 Thread Jing Ge (Jira)
Jing Ge created FLINK-28117:
---

 Summary: More metrics for FileSource
 Key: FLINK-28117
 URL: https://issues.apache.org/jira/browse/FLINK-28117
 Project: Flink
  Issue Type: Improvement
Reporter: Jing Ge


According to a user's requirements[1], the following metrics are required:
1. Number of splits the SplitAssigner is initialized with, and number of splits 
re-added back to the SplitAssigner
2. Readers created per unit time
3. Time taken to create a reader
4. Time taken for the reader to produce a single row
5. Readers closed per unit time
 
Some further nice-to-have metrics:
1. Number of rows emitted by the source per unit time
2. Time taken by the enumerator to discover the splits
3. Total splits discovered
 
Please check FLIP-33 first and extend it if the above-mentioned metrics are not 
included in the FLIP.
 
[1] https://lists.apache.org/thread/t70hhss6d9s65y1vygyytbm6sgl05yrl





[jira] [Created] (FLINK-28061) create new tech blog for connector development based on Source API

2022-06-14 Thread Jing Ge (Jira)
Jing Ge created FLINK-28061:
---

 Summary: create new tech blog for connector development based on 
Source API
 Key: FLINK-28061
 URL: https://issues.apache.org/jira/browse/FLINK-28061
 Project: Flink
  Issue Type: Improvement
  Components: Documentation
Reporter: Jing Ge


The most up-to-date blog introduces how to implement a SourceFunction, which 
will be deprecated soon: 
[https://flink.apache.org/2021/09/07/connector-table-sql-api-part1.html]

 





[jira] [Created] (FLINK-28028) Common secure credential protection mechanism in Flink SQL

2022-06-13 Thread Jing Ge (Jira)
Jing Ge created FLINK-28028:
---

 Summary: Common secure credential protection mechanism in Flink SQL
 Key: FLINK-28028
 URL: https://issues.apache.org/jira/browse/FLINK-28028
 Project: Flink
  Issue Type: Improvement
Reporter: Jing Ge


Currently, the most common way to pass credentials is:

{code:sql}
CREATE TABLE ...
WITH (
    ...
    'username' = 'user',
    'password' = 'pass'
)
{code}

The credentials can then be read by calling SHOW CREATE TABLE.

We should provide a stronger way to protect the credentials. 



--
This message was sent by Atlassian Jira
(v8.20.7#820007)


[jira] [Created] (FLINK-27977) SavePoints housekeeping API in Flink Cli, Rest API, SQL client

2022-06-09 Thread Jing Ge (Jira)
Jing Ge created FLINK-27977:
---

 Summary: SavePoints housekeeping API in Flink Cli, Rest API, SQL 
client
 Key: FLINK-27977
 URL: https://issues.apache.org/jira/browse/FLINK-27977
 Project: Flink
  Issue Type: Improvement
Reporter: Jing Ge


We ran into an issue where a lot of savepoints had been created by customers 
(via their apps). It takes extra (hacking) effort to clean them up.

We should support savepoint housekeeping to delete all savepoints:
 # REST API - /savepoints-disposal
 # Flink CLI - {{$ ./bin/flink savepoint --disposeAll}}
 # SQL client - DROP SAVEPOINTS (an alternative option could be DROP SAVEPOINT 
ALL, but ALL is a SQL keyword) 



--
This message was sent by Atlassian Jira
(v8.20.7#820007)


[jira] [Created] (FLINK-27842) Rename ndv to granularityNumber

2022-05-30 Thread Jing Ge (Jira)
Jing Ge created FLINK-27842:
---

 Summary: Rename ndv to granularityNumber
 Key: FLINK-27842
 URL: https://issues.apache.org/jira/browse/FLINK-27842
 Project: Flink
  Issue Type: Improvement
  Components: Table SQL / API
Reporter: Jing Ge


Currently ndv, which stands for "number of distinct values", is used in 
ColumnStats. The meaning is difficult to understand, and a more descriptive name 
should be used instead.

 

Suggestion: replace ndv with granularityNumber.

The good news is that the method getNdv() is only used within Flink, which means 
the renaming will have very limited impact.

{code:java}
public class ColumnStats {

    /** Number of distinct values. */
    @Deprecated
    private final Long ndv;

    /**
     * Granularity refers to the level of detail used to sort and separate data
     * at the column level. Highly granular data is categorized or separated
     * very precisely. For example, the granularity number of a gender column is
     * normally 2. In the SQL world, it means the number of distinct values.
     */
    private final Long granularityNumber;

    @Deprecated
    public Long getNdv() {
        return ndv;
    }

    public Long getGranularityNumber() {
        return granularityNumber;
    }
}
{code}

 





[jira] [Created] (FLINK-27476) Build new import option that only focus on maven main classes

2022-05-02 Thread Jing Ge (Jira)
Jing Ge created FLINK-27476:
---

 Summary: Build new import option that only focus on  maven main 
classes
 Key: FLINK-27476
 URL: https://issues.apache.org/jira/browse/FLINK-27476
 Project: Flink
  Issue Type: Improvement
Reporter: Jing Ge


The ImportOption.DoNotIncludeTests.class that is currently used has some issues
when running tests with Testcontainers. It would be good to define the target
class path precisely.
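The path filter such an import option would apply can be sketched in plain Java (the real implementation would implement ArchUnit's ImportOption interface; the class and predicate names below are illustrative). Maven compiles production classes to target/classes/ and test classes to target/test-classes/, so the filter can key on the path.

```java
import java.util.function.Predicate;

/**
 * Sketch: accept only classes from Maven's main output directory.
 * target/test-classes/ (and Testcontainers temp paths) fall outside it.
 */
public class MainClassesFilter {
    static final Predicate<String> MAIN_CLASSES_ONLY =
            path -> path.contains("/target/classes/");

    public static void main(String[] args) {
        System.out.println(MAIN_CLASSES_ONLY.test("/proj/target/classes/Foo.class"));          // true
        System.out.println(MAIN_CLASSES_ONLY.test("/proj/target/test-classes/FooTest.class")); // false
    }
}
```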





[jira] [Created] (FLINK-27112) Optimize the term usage on the job overview WebUI

2022-04-07 Thread Jing Ge (Jira)
Jing Ge created FLINK-27112:
---

 Summary: Optimize the term usage on the job overview WebUI
 Key: FLINK-27112
 URL: https://issues.apache.org/jira/browse/FLINK-27112
 Project: Flink
  Issue Type: Improvement
  Components: Runtime / Metrics
Affects Versions: 1.15.0
Reporter: Jing Ge


The naming convention between the WebUI and the metrics is not consistent, e.g.
"Record send" for NumRecordsOut. Furthermore, Flink 1.15 introduced a new feature
to support different pre- and post-sink topologies[1]. New metrics, e.g.
NumRecordsSend, have been introduced to measure the number of records sent to
the external system.

 

Screenshot: https://lists.apache.org/api/email.lua?attachment=true&id=4blc716dj0f1odkxvonh1k4ndmfhhltq&file=2635f32b5825620cb950cbc1cc43be0d0177378d76a8aa6303060ca46db70b76





[jira] [Created] (FLINK-27064) Centralize ArchUnit rules for production code

2022-04-05 Thread Jing Ge (Jira)
Jing Ge created FLINK-27064:
---

 Summary: Centralize ArchUnit rules for production code
 Key: FLINK-27064
 URL: https://issues.apache.org/jira/browse/FLINK-27064
 Project: Flink
  Issue Type: Improvement
Reporter: Jing Ge
Assignee: Jing Ge


It is better to centralize ArchUnit rules so that external repos are able to 
use them. 





[jira] [Created] (FLINK-27061) Improve the ArchUnit test infra in the external elasticsearch connector

2022-04-05 Thread Jing Ge (Jira)
Jing Ge created FLINK-27061:
---

 Summary: Improve the ArchUnit test infra in the external 
elasticsearch connector
 Key: FLINK-27061
 URL: https://issues.apache.org/jira/browse/FLINK-27061
 Project: Flink
  Issue Type: Improvement
Reporter: Jing Ge
Assignee: Jing Ge


After the elasticsearch connector has been migrated to the external repo, the 
ArchUnit test infra developed for the Flink project should be adapted 
appropriately.





[jira] [Created] (FLINK-27057) Connector to external repo migration - Elasticsearch Connector

2022-04-05 Thread Jing Ge (Jira)
Jing Ge created FLINK-27057:
---

 Summary: Connector to external repo migration - Elasticsearch 
Connector
 Key: FLINK-27057
 URL: https://issues.apache.org/jira/browse/FLINK-27057
 Project: Flink
  Issue Type: Improvement
Reporter: Jing Ge


Umbrella task for Elasticsearch Connector migration.

Please put all Elasticsearch connector migration related tasks under this task. 
We will review the umbrella task to develop the migration guide and best 
practices for further connector migrations.





[jira] [Created] (FLINK-26884) Move Elasticsearch connector to external connector repository

2022-03-28 Thread Jing Ge (Jira)
Jing Ge created FLINK-26884:
---

 Summary: Move Elasticsearch connector to external connector 
repository
 Key: FLINK-26884
 URL: https://issues.apache.org/jira/browse/FLINK-26884
 Project: Flink
  Issue Type: Improvement
Reporter: Jing Ge
Assignee: Jing Ge








[jira] [Created] (FLINK-26828) build flink-ml example module

2022-03-23 Thread Jing Ge (Jira)
Jing Ge created FLINK-26828:
---

 Summary: build flink-ml example module
 Key: FLINK-26828
 URL: https://issues.apache.org/jira/browse/FLINK-26828
 Project: Flink
  Issue Type: Improvement
Reporter: Jing Ge
Assignee: Jing Ge


first example is the KMeans described in Flink-ML official doc with some 
modification.





[jira] [Created] (FLINK-26604) update AvroParquet format user-facing document

2022-03-11 Thread Jing Ge (Jira)
Jing Ge created FLINK-26604:
---

 Summary: update AvroParquet format user-facing document
 Key: FLINK-26604
 URL: https://issues.apache.org/jira/browse/FLINK-26604
 Project: Flink
  Issue Type: Improvement
Reporter: Jing Ge
Assignee: Jing Ge


* add a minimal Maven dependency setup
 * describe the namespace use case in the Avro schema
 * reduce the redundant information w.r.t. bounded/unbounded data
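For the first bullet, a minimal dependency setup might look like the following sketch (the version properties are placeholders and should match the Flink and Parquet releases in use):

```xml
<!-- Sketch of a minimal Maven dependency setup for the AvroParquet format;
     versions are placeholders, not pinned recommendations. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-parquet</artifactId>
    <version>${flink.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.parquet</groupId>
    <artifactId>parquet-avro</artifactId>
    <version>${parquet.version}</version>
</dependency>
```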





[jira] [Created] (FLINK-26525) new ArchUnit rules for SourceTestSuiteBase and SinkTestSuiteBase

2022-03-07 Thread Jing Ge (Jira)
Jing Ge created FLINK-26525:
---

 Summary: new ArchUnit rules for SourceTestSuiteBase and 
SinkTestSuiteBase
 Key: FLINK-26525
 URL: https://issues.apache.org/jira/browse/FLINK-26525
 Project: Flink
  Issue Type: Improvement
  Components: Tests
Reporter: Jing Ge


Rule 1: Tests inheriting from SourceTestSuiteBase or SinkTestSuiteBase should
have names ending with ITCase or E2ECase.

Rule 2: ITCase tests inheriting from SourceTestSuiteBase or SinkTestSuiteBase
should use a MiniClusterTestEnvironment.

Rule 3: E2ECase tests inheriting from SourceTestSuiteBase or SinkTestSuiteBase
should use a FlinkContainerTestEnvironment.
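The naming check from Rule 1 boils down to a suffix match, sketched here in plain Java (the real rule would be expressed with ArchUnit's fluent API; class and field names are illustrative):

```java
import java.util.regex.Pattern;

/** Sketch of Rule 1: suite test classes must end with ITCase or E2ECase. */
public class SuiteNamingRule {
    static final Pattern VALID_SUITE_NAME = Pattern.compile(".*(ITCase|E2ECase)$");

    static boolean isValidName(String className) {
        return VALID_SUITE_NAME.matcher(className).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValidName("KafkaSourceITCase")); // true
        System.out.println(isValidName("KafkaSinkE2ECase"));  // true
        System.out.println(isValidName("KafkaSourceTest"));   // false
    }
}
```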








[jira] [Created] (FLINK-26492) rename numRecordsOutErrors to numRecordsSendErrors in SinkWriterMetricGroup

2022-03-04 Thread Jing Ge (Jira)
Jing Ge created FLINK-26492:
---

 Summary: rename numRecordsOutErrors to numRecordsSendErrors  in 
SinkWriterMetricGroup 
 Key: FLINK-26492
 URL: https://issues.apache.org/jira/browse/FLINK-26492
 Project: Flink
  Issue Type: Improvement
Reporter: Jing Ge


After introducing new metrics in SinkWriterMetricGroup that all include the
word "Send", it makes sense to also rename numRecordsOutErrors to
numRecordsSendErrors.

The best approach would probably be to deprecate numRecordsOutErrors and
introduce numRecordsSendErrors, so that numRecordsOutErrors can be removed
in 1.16.
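The deprecate-and-delegate path can be sketched as follows (the class and method names here are illustrative, not the actual SinkWriterMetricGroup API):

```java
/** Sketch: old getter stays for one release cycle and delegates to the new counter. */
public class WriterErrorMetrics {
    private long numRecordsSendErrors;

    public void recordSendError() {
        numRecordsSendErrors++;
    }

    /** @deprecated kept for compatibility; delegates to the new counter, removed in 1.16. */
    @Deprecated
    public long getNumRecordsOutErrors() {
        return getNumRecordsSendErrors();
    }

    public long getNumRecordsSendErrors() {
        return numRecordsSendErrors;
    }
}
```

Because the deprecated getter delegates, both names report the same value during the transition.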





[jira] [Created] (FLINK-26449) Enable tests in KafkaSourceLegacyITCase for test stability issues

2022-03-02 Thread Jing Ge (Jira)
Jing Ge created FLINK-26449:
---

 Summary: Enable tests in KafkaSourceLegacyITCase for test 
stability issues
 Key: FLINK-26449
 URL: https://issues.apache.org/jira/browse/FLINK-26449
 Project: Flink
  Issue Type: Bug
  Components: Tests
Reporter: Jing Ge


This is the follow-up ticket of FLINK-26448.





[jira] [Created] (FLINK-26448) Disable tests in KafkaSourceLegacyITCase for test stability issues

2022-03-02 Thread Jing Ge (Jira)
Jing Ge created FLINK-26448:
---

 Summary: Disable tests in KafkaSourceLegacyITCase for test 
stability issues
 Key: FLINK-26448
 URL: https://issues.apache.org/jira/browse/FLINK-26448
 Project: Flink
  Issue Type: Bug
  Components: Tests
Reporter: Jing Ge


Currently, there are some issues with the Kafka test environment with one broker.

Disable testBrokerFailure() and testMultipleTopicsWithKafkaSerializer() in
KafkaSourceLegacyITCase. After disabling them, "Test - kafka/gelly" takes
29m 42s, whereas it currently hangs for 3h 52m with those tests enabled.

They will be enabled again after the Kafka test environment has been improved.





[jira] [Created] (FLINK-26420) error numRecordsOut metric in file connector

2022-03-01 Thread Jing Ge (Jira)
Jing Ge created FLINK-26420:
---

 Summary: error numRecordsOut metric in file connector
 Key: FLINK-26420
 URL: https://issues.apache.org/jira/browse/FLINK-26420
 Project: Flink
  Issue Type: Improvement
Reporter: Jing Ge


FileWriter should use the numRecordsSend metric to count outgoing records.





[jira] [Created] (FLINK-26357) add @PublicEvolving to AvroParquetRecordFormat

2022-02-24 Thread Jing Ge (Jira)
Jing Ge created FLINK-26357:
---

 Summary: add @PublicEvolving to AvroParquetRecordFormat
 Key: FLINK-26357
 URL: https://issues.apache.org/jira/browse/FLINK-26357
 Project: Flink
  Issue Type: Improvement
Reporter: Jing Ge








[jira] [Created] (FLINK-26301) Test AvroParquet format

2022-02-22 Thread Jing Ge (Jira)
Jing Ge created FLINK-26301:
---

 Summary: Test AvroParquet format
 Key: FLINK-26301
 URL: https://issues.apache.org/jira/browse/FLINK-26301
 Project: Flink
  Issue Type: Improvement
  Components: Formats (JSON, Avro, Parquet, ORC, SequenceFile)
Reporter: Jing Ge
 Fix For: 1.15.0


The following scenarios are worthwhile to test:
 * Start a simple job with the None/At-least-once/Exactly-once delivery
guarantee, read Avro Generic/Specific/Reflect records, and write them to an
arbitrary sink.
 * Start a simple job with bounded/unbounded data.
 * Start a simple job with streaming/batch execution mode.

 

Reference: 
https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/datastream/formats/parquet/





[jira] [Created] (FLINK-26294) Using fixed description for ArchUnit rules

2022-02-22 Thread Jing Ge (Jira)
Jing Ge created FLINK-26294:
---

 Summary: Using fixed description for ArchUnit rules
 Key: FLINK-26294
 URL: https://issues.apache.org/jira/browse/FLINK-26294
 Project: Flink
  Issue Type: Improvement
Reporter: Jing Ge


By default, ArchUnit uses the dynamically generated rule description as the key
for the violation store. One issue is that each time the rule is changed, the
refreezing process creates a new store, because the rule description used as
the key has changed. After that, the old stores have to be deleted manually.
In this PR, constant custom descriptions will be used for frozen rules to
reduce the maintenance effort.
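The mechanics can be illustrated with a plain-Java stand-in for the violation store (no ArchUnit dependency; rule descriptions below are made up for the demo):

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Stand-in for ArchUnit's frozen-rule violation store: entries are keyed by
 * the rule description, so a dynamically generated description that changes
 * whenever the rule changes orphans the old entry.
 */
public class ViolationStoreDemo {
    static final Map<String, Integer> store = new HashMap<>();

    static boolean hasEntry(String ruleDescription) {
        return store.containsKey(ruleDescription);
    }

    public static void main(String[] args) {
        // First freeze: key is the description generated from the rule definition.
        store.put("classes in ..connector.. should not depend on ..runtime..", 17);

        // The rule is later tweaked, so the generated description changes:
        // the lookup misses, a new store is created, and the old one must be
        // deleted by hand.
        System.out.println(hasEntry("classes in ..connectors.. should not depend on ..runtime..")); // false

        // With a constant custom description (e.g. rule.as("connector-runtime-rule")),
        // the key survives rule edits and the existing store is reused.
        store.put("connector-runtime-rule", 17);
        System.out.println(hasEntry("connector-runtime-rule")); // true
    }
}
```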




