[GitHub] [flink] flinkbot edited a comment on issue #9835: [FLINK-14309] [test-stability] Add retries and acks config in producer test
flinkbot edited a comment on issue #9835: [FLINK-14309] [test-stability] Add retries and acks config in producer test URL: https://github.com/apache/flink/pull/9835#issuecomment-537804870 ## CI report: * 49f80fc9c378d0884dd6b411dbe2966ee634f057 : CANCELED [Build](https://travis-ci.com/flink-ci/flink/builds/130147375) * 1226ddc5236ed9f1dc9788ece59a492b500fcbaa : PENDING [Build](https://travis-ci.com/flink-ci/flink/builds/130364791) This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #9836: [FLINK-14317] make hadoop history server logs accessible through flink ui
flinkbot edited a comment on issue #9836: [FLINK-14317] make hadoop history server logs accessible through flink ui URL: https://github.com/apache/flink/pull/9836#issuecomment-538253983 ## CI report: * 552207b7205bd0a39aba4c920cb9655fdce29781 : PENDING [Build](https://travis-ci.com/flink-ci/flink/builds/130363192)
[GitHub] [flink] flinkbot edited a comment on issue #9835: [FLINK-14309] [test-stability] Add retries and acks config in producer test
flinkbot edited a comment on issue #9835: [FLINK-14309] [test-stability] Add retries and acks config in producer test URL: https://github.com/apache/flink/pull/9835#issuecomment-537804870 ## CI report: * 49f80fc9c378d0884dd6b411dbe2966ee634f057 : PENDING [Build](https://travis-ci.com/flink-ci/flink/builds/130147375) * 1226ddc5236ed9f1dc9788ece59a492b500fcbaa : UNKNOWN
[jira] [Commented] (FLINK-13417) Bump Zookeeper to 3.5.5
[ https://issues.apache.org/jira/browse/FLINK-13417?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16944254#comment-16944254 ] Zili Chen commented on FLINK-13417: --- Another option, which avoids having to bump Curator, is to exclude ZooKeeper from the Curator dependencies, as done in the PR for FLINK-10052. I am starting a new branch to verify it. > Bump Zookeeper to 3.5.5 > --- > > Key: FLINK-13417 > URL: https://issues.apache.org/jira/browse/FLINK-13417 > Project: Flink > Issue Type: Improvement > Components: Runtime / Coordination >Affects Versions: 1.9.0 >Reporter: Konstantin Knauf >Priority: Major > Fix For: 1.10.0 > > > Users might want to secure their ZooKeeper connection via SSL. > This requires a ZooKeeper version >= 3.5.1. We might as well try to bump it > to 3.5.5, which is the latest version. -- This message was sent by Atlassian Jira (v8.3.4#803005)
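A minimal sketch of the exclusion approach described in the comment above, as a Maven dependency fragment. The Curator artifact coordinates and the `curator.version` property here are illustrative only, not Flink's actual pom entries:

```xml
<!-- Illustrative: pull in Curator while excluding its transitive
     ZooKeeper dependency, so a different ZooKeeper version (e.g. 3.5.5)
     can be supplied explicitly elsewhere in the pom. -->
<dependency>
  <groupId>org.apache.curator</groupId>
  <artifactId>curator-framework</artifactId>
  <version>${curator.version}</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.zookeeper</groupId>
      <artifactId>zookeeper</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

With the exclusion in place, the desired ZooKeeper version would be declared as a direct dependency, keeping the Curator upgrade decoupled from the ZooKeeper bump.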
[GitHub] [flink] buptljy commented on issue #9835: [FLINK-14309] [test-stability] Add retries and acks config in producer test
buptljy commented on issue #9835: [FLINK-14309] [test-stability] Add retries and acks config in producer test URL: https://github.com/apache/flink/pull/9835#issuecomment-538256190 @flinkbot run travis
[GitHub] [flink] buptljy commented on issue #9835: [FLINK-14309] [test-stability] Add retries and acks config in producer test
buptljy commented on issue #9835: [FLINK-14309] [test-stability] Add retries and acks config in producer test URL: https://github.com/apache/flink/pull/9835#issuecomment-538256123 Removed the retries config in the exactly-once test because the exactly-once producer sets it internally. @flinkbot run travis
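The idea discussed in this PR can be sketched with plain `java.util.Properties`. This is a hedged illustration, not Flink's actual test code: the helper names are hypothetical, and only the two Kafka producer settings under discussion (`retries`, `acks`) are shown. The key point from the comment above is that the exactly-once (transactional) producer configures `retries` internally, so the test must not override it.

```java
import java.util.Properties;

public class ProducerTestConfig {

    // At-least-once test path: make the producer resilient to transient
    // broker failures by retrying sends and waiting for all in-sync replicas.
    public static Properties atLeastOnceProps() {
        Properties props = new Properties();
        props.setProperty("retries", "3");
        props.setProperty("acks", "all");
        return props;
    }

    // Exactly-once test path: deliberately leave "retries" unset, because
    // the exactly-once producer sets it internally; overriding it can
    // conflict with the transactional guarantees.
    public static Properties exactlyOnceProps() {
        Properties props = new Properties();
        props.setProperty("acks", "all");
        return props;
    }

    public static void main(String[] args) {
        System.out.println("at-least-once retries: "
            + atLeastOnceProps().getProperty("retries"));
        System.out.println("exactly-once retries:  "
            + exactlyOnceProps().getProperty("retries"));
    }
}
```

In a real test these properties would be passed to the Kafka producer under test; here they only demonstrate which keys each path sets.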
[GitHub] [flink] flinkbot commented on issue #9836: [FLINK-14317] make hadoop history server logs accessible through flink ui
flinkbot commented on issue #9836: [FLINK-14317] make hadoop history server logs accessible through flink ui URL: https://github.com/apache/flink/pull/9836#issuecomment-538253983 ## CI report: * 552207b7205bd0a39aba4c920cb9655fdce29781 : UNKNOWN
[GitHub] [flink] flinkbot commented on issue #9836: [FLINK-14317] make hadoop history server logs accessible through flink ui
flinkbot commented on issue #9836: [FLINK-14317] make hadoop history server logs accessible through flink ui URL: https://github.com/apache/flink/pull/9836#issuecomment-538250647 Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community to review your pull request. We will use this comment to track the progress of the review. ## Automated Checks Last check on commit 552207b7205bd0a39aba4c920cb9655fdce29781 (Fri Oct 04 06:00:58 UTC 2019) **Warnings:** * No documentation files were touched! Remember to keep the Flink docs up to date! * **This pull request references an unassigned [Jira ticket](https://issues.apache.org/jira/browse/FLINK-14317).** According to the [code contribution guide](https://flink.apache.org/contributing/contribute-code.html), tickets need to be assigned before starting with the implementation work. Mention the bot in a comment to re-run the automated checks. ## Review Progress * ❓ 1. The [description] looks good. * ❓ 2. There is [consensus] that the contribution should go into Flink. * ❓ 3. Needs [attention] from. * ❓ 4. The change fits into the overall [architecture]. * ❓ 5. Overall code [quality] is good. Please see the [Pull Request Review Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full explanation of the review process. The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items.
For consensus, approval by a Flink committer or PMC member is required Bot commands The @flinkbot bot supports the following commands: - `@flinkbot approve description` to approve one or more aspects (aspects: `description`, `consensus`, `architecture` and `quality`) - `@flinkbot approve all` to approve all aspects - `@flinkbot approve-until architecture` to approve everything until `architecture` - `@flinkbot attention @username1 [@username2 ..]` to require somebody's attention - `@flinkbot disapprove architecture` to remove an approval you gave earlier
[GitHub] [flink] yuyang08 opened a new pull request #9836: [FLINK-14317] make hadoop history server logs accessible through flink ui
yuyang08 opened a new pull request #9836: [FLINK-14317] make hadoop history server logs accessible through flink ui URL: https://github.com/apache/flink/pull/9836 ## What is the purpose of the change This change adds the hadoop history server url to the flink runtime web ui. This will simplify task manager log access, reducing the time that developers spend on finding logs for specific subtasks, and ultimately improve developer productivity. ## Brief change log - add `JobExecutionHistory` to keep track of the cluster id (yarn application id), and the taskManagerId (yarn container id) --> taskManagerLocation mapping. We can use the information stored in `JobExecutionHistory` to reconstruct the hadoop history server url for taskmanager logs, and store the info in the archived execution graph. The flink history server reads the archived execution graph and renders the hadoop history server url. - updated `SubtaskExecutionAttemptDetailsInfo.create` to render the proper hadoop history server url ## Verifying this change This change is already covered by existing tests, such as *JobMasterTests*. ## Does this pull request potentially affect one of the following parts: - Dependencies (does it add or upgrade a dependency): no - The public API, i.e., is any changed class annotated with `@Public(Evolving)`: no - The serializers: no - The runtime per-record code paths (performance sensitive): no - Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Yarn/Mesos, ZooKeeper: no - The S3 file system connector: no ## Documentation - Does this pull request introduce a new feature? yes - If yes, how is the feature documented? JavaDocs
[jira] [Updated] (FLINK-14317) make hadoop history server logs accessible through flink ui
[ https://issues.apache.org/jira/browse/FLINK-14317?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] ASF GitHub Bot updated FLINK-14317: --- Labels: pull-request-available (was: ) > make hadoop history server logs accessible through flink ui > --- > > Key: FLINK-14317 > URL: https://issues.apache.org/jira/browse/FLINK-14317 > Project: Flink > Issue Type: Improvement > Components: Runtime / Task, Runtime / Web Frontend >Reporter: Yu Yang >Priority: Major > Labels: pull-request-available > > Currently if Flink jobs run on Yarn, the task manager logs are not accessible > through the flink history server ui. And there is no straightforward way for > users to find the yarn logs of specific tasks from completed jobs. Making task > manager logs accessible through the flink UI will help to improve developer > productivity.
[jira] [Commented] (FLINK-14225) Travis is unable to parse one of the secure environment variables
[ https://issues.apache.org/jira/browse/FLINK-14225?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16944243#comment-16944243 ] Gary Yao commented on FLINK-14225: -- I think it is either {{IT_CASE_GCS_BUCKET}} or {{IT_CASE_GCS_TOKEN}} (tests against S3 are running, and artifacts are being uploaded). I cannot find usages of these anywhere in the code. > Travis is unable to parse one of the secure environment variables > - > > Key: FLINK-14225 > URL: https://issues.apache.org/jira/browse/FLINK-14225 > Project: Flink > Issue Type: Bug > Components: Build System >Affects Versions: 1.10.0 >Reporter: Gary Yao >Priority: Blocker > > Example: https://travis-ci.org/apache/flink/jobs/589531009 > {noformat} > We were unable to parse one of your secure environment variables. > Please make sure to escape special characters such as ' ' (white space) and $ > (dollar symbol) with \ (backslash) . > For example, thi$isanexample would be typed as thi\$isanexample. See > https://docs.travis-ci.com/user/encryption-keys. > {noformat}
[jira] [Commented] (FLINK-14318) JDK11 build stalls during shading
[ https://issues.apache.org/jira/browse/FLINK-14318?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16944231#comment-16944231 ] Gary Yao commented on FLINK-14318: -- cc: [~chesnay] > JDK11 build stalls during shading > - > > Key: FLINK-14318 > URL: https://issues.apache.org/jira/browse/FLINK-14318 > Project: Flink > Issue Type: Bug > Components: Build System >Reporter: Gary Yao >Priority: Critical > > JDK11 build stalls during shading. > Travis stage: e2d - misc - jdk11 > https://travis-ci.org/apache/flink/builds/593022581?utm_source=slack&utm_medium=notification > https://api.travis-ci.org/v3/job/593022629/log.txt > Relevant excerpt from logs: > {noformat} > 01:53:43.889 [INFO] > > 01:53:43.889 [INFO] Building flink-metrics-reporter-prometheus-test > 1.10-SNAPSHOT > 01:53:43.889 [INFO] > > ... > 01:53:44.508 [INFO] Including > org.apache.flink:force-shading:jar:1.10-SNAPSHOT in the shaded jar. > 01:53:44.508 [INFO] Excluding org.slf4j:slf4j-api:jar:1.7.15 from the shaded > jar. > 01:53:44.508 [INFO] Excluding com.google.code.findbugs:jsr305:jar:1.3.9 from > the shaded jar. > 01:53:44.508 [INFO] No artifact matching filter io.netty:netty > 01:53:44.522 [INFO] Replacing original artifact with shaded artifact. > 01:53:44.523 [INFO] Replacing > /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT.jar > with > /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT-shaded.jar > 01:53:44.524 [INFO] Replacing original test artifact with shaded test > artifact. 
> 01:53:44.524 [INFO] Replacing > /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT-tests.jar > with > /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT-shaded-tests.jar > 01:53:44.524 [INFO] Dependency-reduced POM written at: > /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/dependency-reduced-pom.xml > No output has been received in the last 10m0s, this potentially indicates a > stalled build or something wrong with the build itself. > Check the details on how to adjust your build configuration on: > https://docs.travis-ci.com/user/common-build-problems/#build-times-out-because-no-output-was-received > The build has been terminated > {noformat} -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (FLINK-14318) JDK11 build stalls during shading
[ https://issues.apache.org/jira/browse/FLINK-14318?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Gary Yao updated FLINK-14318: - Priority: Critical (was: Major) > JDK11 build stalls during shading > - > > Key: FLINK-14318 > URL: https://issues.apache.org/jira/browse/FLINK-14318 > Project: Flink > Issue Type: Bug > Components: Build System >Reporter: Gary Yao >Priority: Critical > > JDK11 build stalls during shading. > Travis stage: e2d - misc - jdk11 > https://travis-ci.org/apache/flink/builds/593022581?utm_source=slack&utm_medium=notification > https://api.travis-ci.org/v3/job/593022629/log.txt > Relevant excerpt from logs: > {noformat} > 01:53:43.889 [INFO] > > 01:53:43.889 [INFO] Building flink-metrics-reporter-prometheus-test > 1.10-SNAPSHOT > 01:53:43.889 [INFO] > > ... > 01:53:44.508 [INFO] Including > org.apache.flink:force-shading:jar:1.10-SNAPSHOT in the shaded jar. > 01:53:44.508 [INFO] Excluding org.slf4j:slf4j-api:jar:1.7.15 from the shaded > jar. > 01:53:44.508 [INFO] Excluding com.google.code.findbugs:jsr305:jar:1.3.9 from > the shaded jar. > 01:53:44.508 [INFO] No artifact matching filter io.netty:netty > 01:53:44.522 [INFO] Replacing original artifact with shaded artifact. > 01:53:44.523 [INFO] Replacing > /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT.jar > with > /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT-shaded.jar > 01:53:44.524 [INFO] Replacing original test artifact with shaded test > artifact. 
> 01:53:44.524 [INFO] Replacing > /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT-tests.jar > with > /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT-shaded-tests.jar > 01:53:44.524 [INFO] Dependency-reduced POM written at: > /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/dependency-reduced-pom.xml > {noformat} > No output has been received in the last 10m0s, this potentially indicates a > stalled build or something wrong with the build itself. > Check the details on how to adjust your build configuration on: > https://docs.travis-ci.com/user/common-build-problems/#build-times-out-because-no-output-was-received > The build has been terminated -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Created] (FLINK-14318) JDK11 build stalls during shading
Gary Yao created FLINK-14318: Summary: JDK11 build stalls during shading Key: FLINK-14318 URL: https://issues.apache.org/jira/browse/FLINK-14318 Project: Flink Issue Type: Bug Components: Build System Reporter: Gary Yao JDK11 build stalls during shading. Travis stage: e2d - misc - jdk11 https://travis-ci.org/apache/flink/builds/593022581?utm_source=slack&utm_medium=notification https://api.travis-ci.org/v3/job/593022629/log.txt Relevant excerpt from logs: {noformat} 01:53:43.889 [INFO] 01:53:43.889 [INFO] Building flink-metrics-reporter-prometheus-test 1.10-SNAPSHOT 01:53:43.889 [INFO] ... 01:53:44.508 [INFO] Including org.apache.flink:force-shading:jar:1.10-SNAPSHOT in the shaded jar. 01:53:44.508 [INFO] Excluding org.slf4j:slf4j-api:jar:1.7.15 from the shaded jar. 01:53:44.508 [INFO] Excluding com.google.code.findbugs:jsr305:jar:1.3.9 from the shaded jar. 01:53:44.508 [INFO] No artifact matching filter io.netty:netty 01:53:44.522 [INFO] Replacing original artifact with shaded artifact. 01:53:44.523 [INFO] Replacing /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT.jar with /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT-shaded.jar 01:53:44.524 [INFO] Replacing original test artifact with shaded test artifact. 
01:53:44.524 [INFO] Replacing /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT-tests.jar with /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT-shaded-tests.jar 01:53:44.524 [INFO] Dependency-reduced POM written at: /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/dependency-reduced-pom.xml {noformat} No output has been received in the last 10m0s, this potentially indicates a stalled build or something wrong with the build itself. Check the details on how to adjust your build configuration on: https://docs.travis-ci.com/user/common-build-problems/#build-times-out-because-no-output-was-received The build has been terminated -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Updated] (FLINK-14318) JDK11 build stalls during shading
[ https://issues.apache.org/jira/browse/FLINK-14318?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Gary Yao updated FLINK-14318: - Description: JDK11 build stalls during shading. Travis stage: e2d - misc - jdk11 https://travis-ci.org/apache/flink/builds/593022581?utm_source=slack&utm_medium=notification https://api.travis-ci.org/v3/job/593022629/log.txt Relevant excerpt from logs: {noformat} 01:53:43.889 [INFO] 01:53:43.889 [INFO] Building flink-metrics-reporter-prometheus-test 1.10-SNAPSHOT 01:53:43.889 [INFO] ... 01:53:44.508 [INFO] Including org.apache.flink:force-shading:jar:1.10-SNAPSHOT in the shaded jar. 01:53:44.508 [INFO] Excluding org.slf4j:slf4j-api:jar:1.7.15 from the shaded jar. 01:53:44.508 [INFO] Excluding com.google.code.findbugs:jsr305:jar:1.3.9 from the shaded jar. 01:53:44.508 [INFO] No artifact matching filter io.netty:netty 01:53:44.522 [INFO] Replacing original artifact with shaded artifact. 01:53:44.523 [INFO] Replacing /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT.jar with /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT-shaded.jar 01:53:44.524 [INFO] Replacing original test artifact with shaded test artifact. 
01:53:44.524 [INFO] Replacing /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT-tests.jar with /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT-shaded-tests.jar 01:53:44.524 [INFO] Dependency-reduced POM written at: /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/dependency-reduced-pom.xml No output has been received in the last 10m0s, this potentially indicates a stalled build or something wrong with the build itself. Check the details on how to adjust your build configuration on: https://docs.travis-ci.com/user/common-build-problems/#build-times-out-because-no-output-was-received The build has been terminated {noformat} was: JDK11 build stalls during shading. Travis stage: e2d - misc - jdk11 https://travis-ci.org/apache/flink/builds/593022581?utm_source=slack&utm_medium=notification https://api.travis-ci.org/v3/job/593022629/log.txt Relevant excerpt from logs: {noformat} 01:53:43.889 [INFO] 01:53:43.889 [INFO] Building flink-metrics-reporter-prometheus-test 1.10-SNAPSHOT 01:53:43.889 [INFO] ... 01:53:44.508 [INFO] Including org.apache.flink:force-shading:jar:1.10-SNAPSHOT in the shaded jar. 01:53:44.508 [INFO] Excluding org.slf4j:slf4j-api:jar:1.7.15 from the shaded jar. 01:53:44.508 [INFO] Excluding com.google.code.findbugs:jsr305:jar:1.3.9 from the shaded jar. 01:53:44.508 [INFO] No artifact matching filter io.netty:netty 01:53:44.522 [INFO] Replacing original artifact with shaded artifact. 
01:53:44.523 [INFO] Replacing /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT.jar with /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT-shaded.jar 01:53:44.524 [INFO] Replacing original test artifact with shaded test artifact. 01:53:44.524 [INFO] Replacing /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT-tests.jar with /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/flink-metrics-reporter-prometheus-test-1.10-SNAPSHOT-shaded-tests.jar 01:53:44.524 [INFO] Dependency-reduced POM written at: /home/travis/build/apache/flink/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/target/dependency-reduced-pom.xml {noformat} No output has been received in the last 10m0s, this potentially indicates a stalled build or something wrong with the build itself. Check the details on how to adjust your build configuration on: https://docs.travis-ci.com/user/common-build-problems/#build-times-out-because-no-output-was-received The build has been terminated > JDK11 build stalls during shading > - > > Key: FLINK-14318 > URL: https://issues.apache.org/jira/browse/FLINK-14318 > Project: Flink >
[GitHub] [flink] GJL commented on a change in pull request #9791: [FLINK-14248][runtime] Let LazyFromSourcesSchedulingStrategy restart terminated tasks
GJL commented on a change in pull request #9791: [FLINK-14248][runtime] Let LazyFromSourcesSchedulingStrategy restart terminated tasks URL: https://github.com/apache/flink/pull/9791#discussion_r331339404 ## File path: flink-runtime/src/main/java/org/apache/flink/runtime/scheduler/strategy/LazyFromSourcesSchedulingStrategy.java ## @@ -78,19 +81,22 @@ public void startScheduling() { deploymentOptions.put(schedulingVertex.getId(), option); } - allocateSlotsAndDeployExecutionVertexIds(getAllVerticesFromTopology()); + allocateSlotsAndDeployExecutionVertices( + getSchedulingExecutionVertices(getAllVerticesFromTopology()), + IS_IN_CREATED_STATE); } @Override - public void restartTasks(Set verticesToRestart) { + public void restartTasks(final Set verticesToRestart) { + final Set verticesToSchedule = getSchedulingExecutionVertices(verticesToRestart); + // increase counter of the dataset first - verticesToRestart + verticesToSchedule .stream() - .map(schedulingTopology::getVertexOrThrow) .flatMap(vertex -> vertex.getProducedResultPartitions().stream()) .forEach(inputConstraintChecker::resetSchedulingResultPartition); - allocateSlotsAndDeployExecutionVertexIds(verticesToRestart); + allocateSlotsAndDeployExecutionVertices(verticesToSchedule, IS_IN_TERMINAL_STATE); Review comment: > And only A1 will be re-scheduled on `restartTasks()` since the inputs of B1 are not ready. B1 should be scheduled later on the partition consumable event from restarted A1. But the terminal state of B1 will prevent B1 from being scheduled. Nice catch.
[GitHub] [flink] GJL commented on a change in pull request #9791: [FLINK-14248][runtime] Let LazyFromSourcesSchedulingStrategy restart terminated tasks
GJL commented on a change in pull request #9791: [FLINK-14248][runtime] Let LazyFromSourcesSchedulingStrategy restart terminated tasks URL: https://github.com/apache/flink/pull/9791#discussion_r331339226 ## File path: flink-runtime/src/main/java/org/apache/flink/runtime/scheduler/strategy/LazyFromSourcesSchedulingStrategy.java ## @@ -78,19 +81,22 @@ public void startScheduling() { deploymentOptions.put(schedulingVertex.getId(), option); } - allocateSlotsAndDeployExecutionVertexIds(getAllVerticesFromTopology()); + allocateSlotsAndDeployExecutionVertices( + getSchedulingExecutionVertices(getAllVerticesFromTopology()), + IS_IN_CREATED_STATE); } @Override - public void restartTasks(Set verticesToRestart) { + public void restartTasks(final Set verticesToRestart) { + final Set verticesToSchedule = getSchedulingExecutionVertices(verticesToRestart); + // increase counter of the dataset first - verticesToRestart + verticesToSchedule .stream() - .map(schedulingTopology::getVertexOrThrow) .flatMap(vertex -> vertex.getProducedResultPartitions().stream()) .forEach(inputConstraintChecker::resetSchedulingResultPartition); - allocateSlotsAndDeployExecutionVertexIds(verticesToRestart); + allocateSlotsAndDeployExecutionVertices(verticesToSchedule, IS_IN_TERMINAL_STATE); Review comment: Ok let's do it.
[GitHub] [flink] xuefuz commented on a change in pull request #9822: [FLINK-14216][table] introduce temp system functions to FunctionCatalog
xuefuz commented on a change in pull request #9822: [FLINK-14216][table] introduce temp system functions to FunctionCatalog URL: https://github.com/apache/flink/pull/9822#discussion_r331333915 ## File path: flink-table/flink-table-api-java/src/main/java/org/apache/flink/table/catalog/FunctionCatalog.java ## @@ -128,12 +127,71 @@ public void registerScalarFunction(String name, ScalarFunction function) { throw new TableException("Unknown function class: " + function.getClass()); } - registerFunction( + registerTempSystemFunction( name, definition ); } + public void registerTempCatalogScalarFunction(ObjectIdentifier oi, ScalarFunction function) { + UserFunctionsTypeHelper.validateInstantiation(function.getClass()); + registerTempCatalogFunction( + oi, + new ScalarFunctionDefinition(oi.getObjectName(), function) + ); + } + + public void registerTempCatalogTableFunction( + ObjectIdentifier oi, + TableFunction function, + TypeInformation resultType) { + // check if class not Scala object + UserFunctionsTypeHelper.validateNotSingleton(function.getClass()); + // check if class could be instantiated + UserFunctionsTypeHelper.validateInstantiation(function.getClass()); + + registerTempCatalogFunction( + oi, + new TableFunctionDefinition( + oi.getObjectName(), + function, + resultType) + ); + } + + public void registerTempCatalogAggregateFunction( + ObjectIdentifier oi, + UserDefinedAggregateFunction function, + TypeInformation resultType, + TypeInformation accType) { Review comment: Same as above.
[GitHub] [flink] xuefuz commented on a change in pull request #9822: [FLINK-14216][table] introduce temp system functions to FunctionCatalog
xuefuz commented on a change in pull request #9822: [FLINK-14216][table] introduce temp system functions to FunctionCatalog URL: https://github.com/apache/flink/pull/9822#discussion_r331333067 ## File path: flink-table/flink-table-api-java/src/main/java/org/apache/flink/table/catalog/FunctionCatalog.java ## @@ -240,13 +298,24 @@ public PlannerTypeInferenceUtil getPlannerTypeInferenceUtil() { return plannerTypeInferenceUtil; } - private void registerFunction(String name, FunctionDefinition functionDefinition) { - // TODO: should register to catalog - userFunctions.put(normalizeName(name), functionDefinition); + private void registerTempSystemFunction(String name, FunctionDefinition functionDefinition) { + tempSystemFunctions.put(normalizeName(name), functionDefinition); + } + + private void registerTempCatalogFunction(ObjectIdentifier oi, FunctionDefinition functionDefinition) { + tempCatalogFunctions.put(normalizeObjectIdentifier(oi), functionDefinition); } @VisibleForTesting static String normalizeName(String name) { return name.toUpperCase(); } + + @VisibleForTesting + static ObjectIdentifier normalizeObjectIdentifier(ObjectIdentifier oi) { + return ObjectIdentifier.of( + oi.getCatalogName(), + oi.getDatabaseName(), Review comment: Don't we need to normalize the cat/db names as well, if they are case-insensitive?
[GitHub] [flink] xuefuz commented on a change in pull request #9822: [FLINK-14216][table] introduce temp system functions to FunctionCatalog
xuefuz commented on a change in pull request #9822: [FLINK-14216][table] introduce temp system functions to FunctionCatalog URL: https://github.com/apache/flink/pull/9822#discussion_r33163 ## File path: flink-table/flink-table-api-java/src/main/java/org/apache/flink/table/catalog/FunctionCatalog.java ## @@ -128,12 +127,71 @@ public void registerScalarFunction(String name, ScalarFunction function) { throw new TableException("Unknown function class: " + function.getClass()); } - registerFunction( + registerTempSystemFunction( name, definition ); } + public void registerTempCatalogScalarFunction(ObjectIdentifier oi, ScalarFunction function) { + UserFunctionsTypeHelper.validateInstantiation(function.getClass()); + registerTempCatalogFunction( + oi, + new ScalarFunctionDefinition(oi.getObjectName(), function) + ); + } + + public void registerTempCatalogTableFunction( + ObjectIdentifier oi, + TableFunction function, + TypeInformation resultType) { Review comment: One more tab indention? This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] xintongsong commented on issue #9760: [FLINK-13982][runtime] Implement memory calculation logics
xintongsong commented on issue #9760: [FLINK-13982][runtime] Implement memory calculation logics URL: https://github.com/apache/flink/pull/9760#issuecomment-538212636 @tillrohrmann, I agree that the memory configurations can be modeled as a linear programming problem, but I'm not sure whether we should do this. I'm in favor of deriving the memory sizes manually, as is currently done with if-else branches. I have some concerns about using known linear programming solvers. - We are limited by the precision of the double type and cannot use BigDecimal, so rounding errors are possible. - If there is a `NoFeasibleSolutionException`, we may not be able to provide helpful information to help users fix their configurations, e.g., which values do not add up, and which values could be increased/decreased to fix the problem. On the other hand, manually calculating the memory sizes (one after another) may not solve exactly the same linear programming problem (it may not explore the entire solution space, because some of the memory sizes are decided early). But maybe this is not a bad thing. I think it would be easier for users to understand how a particular memory size is calculated (fewer dependencies on other sizes), and how to change/fix it in case of conflicts. I think it would be good if, given the explicitly configured values, we try to find a feasible solution by tuning the unspecified values. But I don't think we must guarantee to find a feasible solution whenever one exists, especially at the cost of the two concerns mentioned above. What do you think? This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
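The trade-off described above can be illustrated with a toy sketch of "deriving memory sizes one after another" with if-else branches, as opposed to handing all constraints to a solver. The two-component memory model, the fraction value, and all names here are illustrative assumptions, not Flink's real configuration logic.

```java
// Toy illustration of sequential memory derivation. When the derivation fails,
// the error can name exactly which values do not add up, unlike a generic
// NoFeasibleSolutionException from a linear programming solver.
public class MemoryDerivation {

    // Returns {managedMb, heapMb}; explicitHeapMb < 0 means "not configured".
    static long[] derive(long totalMb, long explicitHeapMb, double managedFraction) {
        // Managed memory is decided first, from the configured fraction.
        long managedMb = (long) (totalMb * managedFraction);
        long heapMb;
        if (explicitHeapMb >= 0) {
            // The user fixed the heap: the sizes must still add up.
            heapMb = explicitHeapMb;
            if (heapMb + managedMb > totalMb) {
                throw new IllegalArgumentException("heap " + heapMb + "MB + managed "
                        + managedMb + "MB exceeds total " + totalMb + "MB");
            }
        } else {
            // Heap takes the remainder, so each size has a single clear origin.
            heapMb = totalMb - managedMb;
        }
        return new long[] {managedMb, heapMb};
    }

    public static void main(String[] args) {
        long[] sizes = derive(4096, -1, 0.4);
        System.out.println("managed=" + sizes[0] + "MB heap=" + sizes[1] + "MB");
    }
}
```

Deciding `managedMb` before `heapMb` is exactly the "early deciding" mentioned above: it narrows the solution space, but every derived value has one obvious explanation for the user.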
[jira] [Created] (FLINK-14317) make hadoop history server logs accessible through flink ui
Yu Yang created FLINK-14317: --- Summary: make hadoop history server logs accessible through flink ui Key: FLINK-14317 URL: https://issues.apache.org/jira/browse/FLINK-14317 Project: Flink Issue Type: Improvement Components: Runtime / Task, Runtime / Web Frontend Reporter: Yu Yang Currently, if Flink jobs run on YARN, the task manager logs are not accessible through the Flink history server UI, and there is no straightforward way for users to find the YARN logs of specific tasks of completed jobs. Making task manager logs accessible through the Flink UI will help improve developer productivity. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [flink] flinkbot edited a comment on issue #9822: [FLINK-14216][table] introduce temp system functions to FunctionCatalog
flinkbot edited a comment on issue #9822: [FLINK-14216][table] introduce temp system functions to FunctionCatalog URL: https://github.com/apache/flink/pull/9822#issuecomment-536774896 ## CI report: * 611ecdea461d9c202e0fd3ad0b33f24e8a4db061 : CANCELED [Build](https://travis-ci.com/flink-ci/flink/builds/129772949) * cc357e4d5365aa98db6ebb8a3c1a821080bb202e : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/129775325) * a06e7bd3f0853e637a7b1cbac52bc70d1ebe3923 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/130117578) * 01f01b31477debaf52158e9878d356d0f51e868b : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/130321929) This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #9822: [FLINK-14216][table] introduce temp system functions to FunctionCatalog
flinkbot edited a comment on issue #9822: [FLINK-14216][table] introduce temp system functions to FunctionCatalog URL: https://github.com/apache/flink/pull/9822#issuecomment-536774896 ## CI report: * 611ecdea461d9c202e0fd3ad0b33f24e8a4db061 : CANCELED [Build](https://travis-ci.com/flink-ci/flink/builds/129772949) * cc357e4d5365aa98db6ebb8a3c1a821080bb202e : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/129775325) * a06e7bd3f0853e637a7b1cbac52bc70d1ebe3923 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/130117578) * 01f01b31477debaf52158e9878d356d0f51e868b : PENDING [Build](https://travis-ci.com/flink-ci/flink/builds/130321929) This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[jira] [Created] (FLINK-14316) stuck in "Job leader ... lost leadership" error
Steven Zhen Wu created FLINK-14316: -- Summary: stuck in "Job leader ... lost leadership" error Key: FLINK-14316 URL: https://issues.apache.org/jira/browse/FLINK-14316 Project: Flink Issue Type: Bug Components: Runtime / Coordination Affects Versions: 1.7.2 Reporter: Steven Zhen Wu This is the first exception that caused the restart loop. The later exceptions are the same. The job seems to be stuck in this permanent failure state. {code} 2019-10-03 21:42:46,159 INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: clpevents -> device_filter -> processed_imps -> ios_processed_impression -> imps_ts_assigner (449/1360) (d237f5e99b6a4a580498821473763edb) switched from SCHEDULED to FAILED. java.lang.Exception: Job leader for job id ecb9ad9be934edf7b1a4f7b9dd6df365 lost leadership. at org.apache.flink.runtime.taskexecutor.TaskExecutor$JobLeaderListenerImpl.lambda$jobManagerLostLeadership$1(TaskExecutor.java:1526) at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:332) at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:158) at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.onReceive(AkkaRpcActor.java:142) at akka.actor.UntypedActor$$anonfun$receive$1.applyOrElse(UntypedActor.scala:165) at akka.actor.Actor$class.aroundReceive(Actor.scala:502) at akka.actor.UntypedActor.aroundReceive(UntypedActor.scala:95) at akka.actor.ActorCell.receiveMessage(ActorCell.scala:526) at akka.actor.ActorCell.invoke(ActorCell.scala:495) at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:257) at akka.dispatch.Mailbox.run(Mailbox.scala:224) at akka.dispatch.Mailbox.exec(Mailbox.scala:234) at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) {code} -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [flink] flinkbot edited a comment on issue #9822: [FLINK-14216][table] introduce temp system functions to FunctionCatalog
flinkbot edited a comment on issue #9822: [FLINK-14216][table] introduce temp system functions to FunctionCatalog URL: https://github.com/apache/flink/pull/9822#issuecomment-536774896 ## CI report: * 611ecdea461d9c202e0fd3ad0b33f24e8a4db061 : CANCELED [Build](https://travis-ci.com/flink-ci/flink/builds/129772949) * cc357e4d5365aa98db6ebb8a3c1a821080bb202e : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/129775325) * a06e7bd3f0853e637a7b1cbac52bc70d1ebe3923 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/130117578) * 01f01b31477debaf52158e9878d356d0f51e868b : UNKNOWN This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] bowenli86 commented on issue #9822: [FLINK-14216][table] introduce temp system functions to FunctionCatalog
bowenli86 commented on issue #9822: [FLINK-14216][table] introduce temp system functions to FunctionCatalog URL: https://github.com/apache/flink/pull/9822#issuecomment-538133259 @xuefuz can you take a look? This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] HuangZhenQiu commented on a change in pull request #9689: [FLINK-7151] add a basic function ddl
HuangZhenQiu commented on a change in pull request #9689: [FLINK-7151] add a basic function ddl URL: https://github.com/apache/flink/pull/9689#discussion_r331084612 ## File path: flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/FlinkSqlParserImplTest.java ## @@ -529,6 +529,24 @@ public void testDropIfExists() { check(sql, "DROP TABLE IF EXISTS `CATALOG1`.`DB1`.`TBL1`"); } + @Test + public void testCreateFunction() { Review comment: Added. This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] HuangZhenQiu commented on a change in pull request #9689: [FLINK-7151] add a basic function ddl
HuangZhenQiu commented on a change in pull request #9689: [FLINK-7151] add a basic function ddl URL: https://github.com/apache/flink/pull/9689#discussion_r331084550 ## File path: flink-table/flink-sql-parser/src/main/codegen/config.fmpp ## @@ -10,7 +10,7 @@ # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and +# See the License for the specific language governing permissions andFlinkSqlParserImpl Review comment: Good catch This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[jira] [Commented] (FLINK-13417) Bump Zookeeper to 3.5.5
[ https://issues.apache.org/jira/browse/FLINK-13417?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16943594#comment-16943594 ] Zili Chen commented on FLINK-13417: --- Hello Stephan, the situation is that flink-runtime uses ZooKeeper directly (flink-mesos uses a wrapped version of Curator's shared value, accessed through a utility), so I think the way you proposed is worth trying. However, there is one prerequisite: we need to bump curator-test to 4.x (related: FLINK-10052), because curator-test 2.x uses ZooKeeper 3.4 for starting the test server, which cannot recognize the create-container message from a 3.5 client. > Bump Zookeeper to 3.5.5 > --- > > Key: FLINK-13417 > URL: https://issues.apache.org/jira/browse/FLINK-13417 > Project: Flink > Issue Type: Improvement > Components: Runtime / Coordination >Affects Versions: 1.9.0 >Reporter: Konstantin Knauf >Priority: Major > Fix For: 1.10.0 > > > User might want to secure their Zookeeper connection via SSL. > This requires a Zookeeper version >= 3.5.1. We might as well try to bump it > to 3.5.5, which is the latest version. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (FLINK-14118) Reduce the unnecessary flushing when there is no data available for flush
[ https://issues.apache.org/jira/browse/FLINK-14118?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16943577#comment-16943577 ] Stephan Ewen commented on FLINK-14118: -- Did we have significant changes in that part of the network stack since the 1.9 release? So that we would expect subtle issues (only detected after many runs) in 1.9 that would not apply to 1.10? If it is a serious performance issue, users would appreciate the fix in 1.9 as well. > Reduce the unnecessary flushing when there is no data available for flush > - > > Key: FLINK-14118 > URL: https://issues.apache.org/jira/browse/FLINK-14118 > Project: Flink > Issue Type: Improvement > Components: Runtime / Network >Reporter: Yingjie Cao >Priority: Critical > Labels: pull-request-available > Fix For: 1.10.0 > > Time Spent: 10m > Remaining Estimate: 0h > > The new flush implementation which works by triggering a netty user event may > cause performance regression compared to the old synchronization-based one. > More specifically, when there is exactly one BufferConsumer in the buffer > queue of subpartition and no new data will be added for a while in the future > (may because of just no input or the logic of the operator is to collect some > data for processing and will not emit records immediately), that is, there is > no data to send, the OutputFlusher will continuously notify data available > and wake up the netty thread, though no data will be returned by the > pollBuffer method. > For some of our production jobs, this will incur 20% to 40% CPU overhead > compared to the old implementation. We tried to fix the problem by checking > if there is new data available when flushing, if there is no new data, the > netty thread will not be notified. It works for our jobs and the cpu usage > falls to previous level. -- This message was sent by Atlassian Jira (v8.3.4#803005)
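The fix described in the issue, checking whether any data is buffered before waking the downstream thread, can be sketched as follows. The class and method names are illustrative stand-ins, not Flink's real network-stack types.

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Minimal sketch of the FLINK-14118 idea: the periodic flusher only fires a
// wake-up (standing in for the netty user event) when data is actually queued.
public class FlushGate {
    private final Queue<byte[]> buffers = new ArrayDeque<>();
    private int wakeUps;

    synchronized void add(byte[] data) {
        buffers.add(data);
    }

    // The old behavior notified unconditionally; here an empty queue is a
    // no-op, avoiding spurious wake-ups while the operator holds data back.
    synchronized void flush() {
        if (!buffers.isEmpty()) {
            wakeUps++; // stand-in for notifying the netty thread
        }
    }

    synchronized int wakeUps() {
        return wakeUps;
    }

    public static void main(String[] args) {
        FlushGate gate = new FlushGate();
        gate.flush();            // nothing buffered: no notification
        gate.add(new byte[8]);
        gate.flush();            // data present: one notification
        System.out.println(gate.wakeUps());
    }
}
```

The guard is what removes the reported 20% to 40% CPU overhead in the no-new-data case: the downstream thread is never woken when `pollBuffer` would return nothing anyway.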
[jira] [Commented] (FLINK-14225) Travis is unable to parse one of the secure environment variables
[ https://issues.apache.org/jira/browse/FLINK-14225?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16943574#comment-16943574 ] Stephan Ewen commented on FLINK-14225: -- Do you know which variable that is? I am not aware of any changes to the secret environment variables in a while. Has that error been there all along, or is it recent? > Travis is unable to parse one of the secure environment variables > - > > Key: FLINK-14225 > URL: https://issues.apache.org/jira/browse/FLINK-14225 > Project: Flink > Issue Type: Bug > Components: Build System >Affects Versions: 1.10.0 >Reporter: Gary Yao >Priority: Blocker > > Example: https://travis-ci.org/apache/flink/jobs/589531009 > {noformat} > We were unable to parse one of your secure environment variables. > Please make sure to escape special characters such as ' ' (white space) and $ > (dollar symbol) with \ (backslash) . > For example, thi$isanexample would be typed as thi\$isanexample. See > https://docs.travis-ci.com/user/encryption-keys. > {noformat} -- This message was sent by Atlassian Jira (v8.3.4#803005)
[jira] [Commented] (FLINK-13417) Bump Zookeeper to 3.5.5
[ https://issues.apache.org/jira/browse/FLINK-13417?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16943572#comment-16943572 ] Stephan Ewen commented on FLINK-13417: -- Sorry for having been out of this thread for a while. If there is a way to avoid explicitly shaded modules (especially inside Flink), I believe that is preferable. We are also trying hard to get other statically relocated libraries moved to flink-shaded. Is ZK used in more than one module (also outside flink-runtime)? If not, then it should be sufficient to just shade it inside the runtime package. If it is used in other places, is it used directly, or is it used through a runtime utility? If we can get to the latter case, we could have ZK shaded/relocated only in flink-runtime. > Bump Zookeeper to 3.5.5 > --- > > Key: FLINK-13417 > URL: https://issues.apache.org/jira/browse/FLINK-13417 > Project: Flink > Issue Type: Improvement > Components: Runtime / Coordination >Affects Versions: 1.9.0 >Reporter: Konstantin Knauf >Priority: Major > Fix For: 1.10.0 > > > User might want to secure their Zookeeper connection via SSL. > This requires a Zookeeper version >= 3.5.1. We might as well try to bump it > to 3.5.5, which is the latest version. -- This message was sent by Atlassian Jira (v8.3.4#803005)
[GitHub] [flink] flinkbot edited a comment on issue #9815: [FLINK-14117][docs-zh] Translate changes on index page to Chinese
flinkbot edited a comment on issue #9815: [FLINK-14117][docs-zh] Translate changes on index page to Chinese URL: https://github.com/apache/flink/pull/9815#issuecomment-536343016 ## CI report: * 615acdb2511760c55f8831934f710678a8962acc : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/129610971) * 38894aa30a82d5763ad8137e176ac7d78ee13178 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/129652562) * 4b35c7d8ce5d86f030037b924a943f857d7f8a01 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/129684998) * 9099b3a2a220c0d289899f05cea6714fc281d800 : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/129802867) * 90d7fce7f9b27fe52e84c65545c701609b9a3ea9 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/129858575) * 7d25238fda6dbf1029d3d315d3b1077227ca05ce : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/130010236) * befea58ae5adddcd82d3e8cf45793488fb4ee5f5 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/130163856) This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #9815: [FLINK-14117][docs-zh] Translate changes on index page to Chinese
flinkbot edited a comment on issue #9815: [FLINK-14117][docs-zh] Translate changes on index page to Chinese URL: https://github.com/apache/flink/pull/9815#issuecomment-536343016 ## CI report: * 615acdb2511760c55f8831934f710678a8962acc : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/129610971) * 38894aa30a82d5763ad8137e176ac7d78ee13178 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/129652562) * 4b35c7d8ce5d86f030037b924a943f857d7f8a01 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/129684998) * 9099b3a2a220c0d289899f05cea6714fc281d800 : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/129802867) * 90d7fce7f9b27fe52e84c65545c701609b9a3ea9 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/129858575) * 7d25238fda6dbf1029d3d315d3b1077227ca05ce : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/130010236) * befea58ae5adddcd82d3e8cf45793488fb4ee5f5 : PENDING [Build](https://travis-ci.com/flink-ci/flink/builds/130163856) This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #9815: [FLINK-14117][docs-zh] Translate changes on index page to Chinese
flinkbot edited a comment on issue #9815: [FLINK-14117][docs-zh] Translate changes on index page to Chinese URL: https://github.com/apache/flink/pull/9815#issuecomment-536343016 ## CI report: * 615acdb2511760c55f8831934f710678a8962acc : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/129610971) * 38894aa30a82d5763ad8137e176ac7d78ee13178 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/129652562) * 4b35c7d8ce5d86f030037b924a943f857d7f8a01 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/129684998) * 9099b3a2a220c0d289899f05cea6714fc281d800 : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/129802867) * 90d7fce7f9b27fe52e84c65545c701609b9a3ea9 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/129858575) * 7d25238fda6dbf1029d3d315d3b1077227ca05ce : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/130010236) * befea58ae5adddcd82d3e8cf45793488fb4ee5f5 : UNKNOWN This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] koonchen commented on issue #9815: [FLINK-14117][docs-zh] Translate changes on index page to Chinese
koonchen commented on issue #9815: [FLINK-14117][docs-zh] Translate changes on index page to Chinese URL: https://github.com/apache/flink/pull/9815#issuecomment-537854709 @wuchong Thank you very much, can you review this pr again when you are free? This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #9689: [FLINK-7151] add a basic function ddl
flinkbot edited a comment on issue #9689: [FLINK-7151] add a basic function ddl URL: https://github.com/apache/flink/pull/9689#issuecomment-531747685 ## CI report: * bd2624914db1147588ea838ae542333c310290cc : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/127790175) * b5523d10152123f45cf883e446872b90532879c3 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/128113059) * 7c4c25b26aa9549fe83628315b816f16327000b1 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/129622336) * 9e6b35cb0586839b893211b555192101ffce1a95 : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/129975363) * 1ce63dfde01ba87a2d8e2d75ab400f47c3632f86 : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/130150709) This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #9706: [FLINK-14118][runtime]Reduce the unnecessary flushing when there is no data available for flush.
flinkbot edited a comment on issue #9706: [FLINK-14118][runtime]Reduce the unnecessary flushing when there is no data available for flush. URL: https://github.com/apache/flink/pull/9706#issuecomment-532638785 ## CI report: * 053ed30d568a00ba42d75d1a9534843c59a068af : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/128156924) * eff90c955453ab20c1dea96bb44f28ec354c8405 : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/129408495) * 2dcb4e9de3810a282e8534290d65e1a0ec153fd3 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/129428351) * 08ea47ddaecc322a4e55a596d75e13b6706d3fc0 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/130150727) This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #9706: [FLINK-14118][runtime]Reduce the unnecessary flushing when there is no data available for flush.
flinkbot edited a comment on issue #9706: [FLINK-14118][runtime]Reduce the unnecessary flushing when there is no data available for flush. URL: https://github.com/apache/flink/pull/9706#issuecomment-532638785 ## CI report: * 053ed30d568a00ba42d75d1a9534843c59a068af : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/128156924) * eff90c955453ab20c1dea96bb44f28ec354c8405 : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/129408495) * 2dcb4e9de3810a282e8534290d65e1a0ec153fd3 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/129428351) * 08ea47ddaecc322a4e55a596d75e13b6706d3fc0 : PENDING [Build](https://travis-ci.com/flink-ci/flink/builds/130150727) This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #9689: [FLINK-7151] add a basic function ddl
flinkbot edited a comment on issue #9689: [FLINK-7151] add a basic function ddl URL: https://github.com/apache/flink/pull/9689#issuecomment-531747685 ## CI report: * bd2624914db1147588ea838ae542333c310290cc : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/127790175) * b5523d10152123f45cf883e446872b90532879c3 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/128113059) * 7c4c25b26aa9549fe83628315b816f16327000b1 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/129622336) * 9e6b35cb0586839b893211b555192101ffce1a95 : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/129975363) * 1ce63dfde01ba87a2d8e2d75ab400f47c3632f86 : PENDING [Build](https://travis-ci.com/flink-ci/flink/builds/130150709) This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #9706: [FLINK-14118][runtime]Reduce the unnecessary flushing when there is no data available for flush.
flinkbot edited a comment on issue #9706: [FLINK-14118][runtime]Reduce the unnecessary flushing when there is no data available for flush. URL: https://github.com/apache/flink/pull/9706#issuecomment-532638785 ## CI report: * 053ed30d568a00ba42d75d1a9534843c59a068af : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/128156924) * eff90c955453ab20c1dea96bb44f28ec354c8405 : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/129408495) * 2dcb4e9de3810a282e8534290d65e1a0ec153fd3 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/129428351) * 08ea47ddaecc322a4e55a596d75e13b6706d3fc0 : UNKNOWN This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #9689: [FLINK-7151] add a basic function ddl
flinkbot edited a comment on issue #9689: [FLINK-7151] add a basic function ddl URL: https://github.com/apache/flink/pull/9689#issuecomment-531747685 ## CI report: * bd2624914db1147588ea838ae542333c310290cc : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/127790175) * b5523d10152123f45cf883e446872b90532879c3 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/128113059) * 7c4c25b26aa9549fe83628315b816f16327000b1 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/129622336) * 9e6b35cb0586839b893211b555192101ffce1a95 : SUCCESS [Build](https://travis-ci.com/flink-ci/flink/builds/129975363) * 1ce63dfde01ba87a2d8e2d75ab400f47c3632f86 : UNKNOWN This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] flinkbot edited a comment on issue #9835: [FLINK-14309] [test-stability] Add retries and acks config in producer test
flinkbot edited a comment on issue #9835: [FLINK-14309] [test-stability] Add retries and acks config in producer test URL: https://github.com/apache/flink/pull/9835#issuecomment-537804870 ## CI report: * 49f80fc9c378d0884dd6b411dbe2966ee634f057 : FAILURE [Build](https://travis-ci.com/flink-ci/flink/builds/130147375) This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services
[GitHub] [flink] wsry commented on issue #9706: [FLINK-14118][runtime]Reduce the unnecessary flushing when there is no data available for flush.
wsry commented on issue #9706: [FLINK-14118][runtime]Reduce the unnecessary flushing when there is no data available for flush. URL: https://github.com/apache/flink/pull/9706#issuecomment-537817526 @pnowojski I have updated the PR and I also opened a PR for the flink-benchmark project. https://github.com/dataArtisans/flink-benchmarks/pull/31 This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org With regards, Apache Git Services