Re: [DISCUSS] Improve Tutorials section of documentation

2018-08-09 Thread vino yang
+1

I have previously fixed outdated API usage in the documentation's sample
code.
In addition, users have reported problems on the mailing list where the
sample programs could not be compiled or did not produce the correct
results.
We should come up with a solution that keeps the sample programs consistent
with changes to the API.

Thanks, vino.

陈梓立 wrote on Friday, August 10, 2018 at 10:28 AM:

> Hi Fabian,
>
> +1 for improving the tutorials. It's a nice idea to distinguish users by
> their goals.
> One suggestion: we could list out the newly designed content, which would
> make the discussion clearer.
>
> FYI, the current structure is (as I see it):
>
> ```
> > HOME
>
> - Concepts
>   - Programming Model
>   - Distributed Runtime
> - *Quickstart*
> - *Examples*
>   - Overview
>   - Monitoring the Wikipedia Edit Stream
>   - Batch Example
>
> - *Project Setup*
>   - Project Template for Java
>   - Project Template for Scala
>   - Configuring Dependencies, Connectors, Libraries
>   - IDE Setup
>   - Scala REPL
>   - Running Flink on Windows
>   - Building Flink from Source
> - Application Development
>   - ...
> - Deployment & Operations
>   - ...
> - Debugging & Monitoring
>   - ...
>
> - Internals
>   - ...
> ```
>
> Aljoscha Krettek wrote on Thursday, August 9, 2018 at 11:29 PM:
>
> > +1
> >
> > I think this moves us in the direction of having a more hands-on
> > tutorials section where we don't explain all the details and a reference
> > section where we provide details but don't necessarily spell out a full
> > step-by-step case.
> >
> > > On 9. Aug 2018, at 14:44, Fabian Hueske  wrote:
> > >
> > > Hi everyone,
> > >
> > > I'd like to discuss a proposal to improve the tutorials / quickstart
> > > guides of Flink's documentation.
> > > I think the current tutorials have a few issues that should be fixed in
> > > order to help our (future) users get started with Flink.
> > >
> > > I propose to add a single "Tutorials" section to the documentation
> > > where users find step-by-step guides. The tutorials section helps users
> > > with different goals:
> > >
> > >  * Get a quick idea of the overall system
> > >  * Implement a DataStream/DataSet/Table API/SQL job
> > >  * Set up Flink on a local machine (or run a Docker container)
> > >
> > > For some of these goals, we do not offer tutorials yet. Our existing
> > > tutorials are mixed with instructions for how to set up an environment
> > > to develop Flink itself ("IDE setup", "Building Flink from Source"), and
> > > reference information that is required to implement applications
> > > ("Configuring Dependencies, Connectors, Libraries", Project Templates
> > > Java/Scala).
> > >
> > > As a first step, I would like to reorganize the content of the
> > > "Quickstart", "Examples", and "Project Setup" sections of the
> > > documentation depending on the goals of the users (getting started,
> > > reference lookup, developing Flink). So, this would be mostly moving
> > > content around.
> > >
> > > In a second step, I would improve existing tutorials (Implementing
> > > DataStream applications, Local Setup) and add missing tutorials (Local
> > > Docker setup, Implementing DataSet / Table API / SQL applications,
> > > etc.).
> > >
> > > What do you think?
> > >
> > > Cheers, Fabian
> >
> >
>


[jira] [Created] (FLINK-10119) With non-JSON records in the data, a job using KafkaJsonTableSource fails and cannot be restarted

2018-08-09 Thread sean.miao (JIRA)
sean.miao created FLINK-10119:
-

 Summary: With non-JSON records in the data, a job using KafkaJsonTableSource fails and cannot be restarted
 Key: FLINK-10119
 URL: https://issues.apache.org/jira/browse/FLINK-10119
 Project: Flink
  Issue Type: Bug
  Components: Kafka Connector
Affects Versions: 1.5.1
 Environment: None
Reporter: sean.miao


Checkpointing and savepoints are enabled, as well as automatic job restart.

Flink consumes data from Kafka using Kafka010JsonTableSource. We found that a
single record that is not valid JSON causes the application to fail and makes
it impossible to bring the job back up.

This preserves the processing semantics, but rendering the whole application
unusable is not great. Could this be changed to work like Spark SQL, where
records that do not match the expected format are collected into a dedicated
column for unparseable data?

Our current workaround modifies JsonRowDeserializationSchema:

{code:java}
@Override
public Row deserialize(byte[] message) throws IOException {
    try {
        final JsonNode root = objectMapper.readTree(message);
        return convertRow(root, (RowTypeInfo) typeInfo);
    } catch (Throwable t) {
        // A single malformed record makes deserialize() throw, failing the job.
        throw new IOException("Failed to deserialize JSON object.", t);
    }
}
{code}

In the catch block, instead of rethrowing the exception, we parse "{}", which
makes every column of the resulting Row null for records that cannot be parsed.
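
A minimal sketch of the modified method (assuming it is applied inside
JsonRowDeserializationSchema, where objectMapper, convertRow, and typeInfo
are in scope):

{code:java}
@Override
public Row deserialize(byte[] message) throws IOException {
    try {
        final JsonNode root = objectMapper.readTree(message);
        return convertRow(root, (RowTypeInfo) typeInfo);
    } catch (Throwable t) {
        // Workaround: instead of failing the job on a malformed record,
        // fall back to an empty JSON object so that every column of the
        // resulting Row becomes null.
        final JsonNode emptyRoot = objectMapper.readTree("{}");
        return convertRow(emptyRoot, (RowTypeInfo) typeInfo);
    }
}
{code}

Note that this silently swallows malformed payloads; a production version
would probably also log or count such records.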



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Re: [DISCUSS] Improve Tutorials section of documentation

2018-08-09 Thread 陈梓立
Hi Fabian,

+1 for improving the tutorials. It's a nice idea to distinguish users by
their goals.
One suggestion: we could list out the newly designed content, which would
make the discussion clearer.

FYI, the current structure is (as I see it):

```
> HOME

- Concepts
  - Programming Model
  - Distributed Runtime
- *Quickstart*
- *Examples*
  - Overview
  - Monitoring the Wikipedia Edit Stream
  - Batch Example

- *Project Setup*
  - Project Template for Java
  - Project Template for Scala
  - Configuring Dependencies, Connectors, Libraries
  - IDE Setup
  - Scala REPL
  - Running Flink on Windows
  - Building Flink from Source
- Application Development
  - ...
- Deployment & Operations
  - ...
- Debugging & Monitoring
  - ...

- Internals
  - ...
```

Aljoscha Krettek wrote on Thursday, August 9, 2018 at 11:29 PM:

> +1
>
> I think this moves us in the direction of having a more hands-on
> tutorials section where we don't explain all the details and a reference
> section where we provide details but don't necessarily spell out a full
> step-by-step case.
>
> > On 9. Aug 2018, at 14:44, Fabian Hueske  wrote:
> >
> > Hi everyone,
> >
> > I'd like to discuss a proposal to improve the tutorials / quickstart
> > guides of Flink's documentation.
> > I think the current tutorials have a few issues that should be fixed in
> > order to help our (future) users get started with Flink.
> >
> > I propose to add a single "Tutorials" section to the documentation where
> > users find step-by-step guides. The tutorials section helps users with
> > different goals:
> >
> >  * Get a quick idea of the overall system
> >  * Implement a DataStream/DataSet/Table API/SQL job
> >  * Set up Flink on a local machine (or run a Docker container)
> >
> > For some of these goals, we do not offer tutorials yet. Our existing
> > tutorials are mixed with instructions for how to set up an environment to
> > develop Flink itself ("IDE setup", "Building Flink from Source"), and
> > reference information that is required to implement applications
> > ("Configuring Dependencies, Connectors, Libraries", Project Templates
> > Java/Scala).
> >
> > As a first step, I would like to reorganize the content of the
> > "Quickstart", "Examples", and "Project Setup" sections of the
> > documentation depending on the goals of the users (getting started,
> > reference lookup, developing Flink). So, this would be mostly moving
> > content around.
> >
> > In a second step, I would improve existing tutorials (Implementing
> > DataStream applications, Local Setup) and add missing tutorials (Local
> > Docker setup, Implementing DataSet / Table API / SQL applications, etc.).
> >
> > What do you think?
> >
> > Cheers, Fabian
>
>


[jira] [Created] (FLINK-10118) Queryable state MapState entry query

2018-08-09 Thread Elias Levy (JIRA)
Elias Levy created FLINK-10118:
--

 Summary: Queryable state MapState entry query
 Key: FLINK-10118
 URL: https://issues.apache.org/jira/browse/FLINK-10118
 Project: Flink
  Issue Type: Improvement
  Components: Queryable State
Affects Versions: 1.6.0
Reporter: Elias Levy


Queryable state allows querying of keyed MapState, but such a query returns all
MapState entries for the given key.  In some cases, the MapState may include a
substantial number of entries (in the millions), while the user may only be
interested in one entry.

I propose we allow queries for MapState to provide one or more map entry keys, 
in addition to the state key, and to only return entries for the given map keys.
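
For context, a minimal sketch of such a lookup against the current client API
(host, port, and all names are made up for illustration); the entire map for
the state key is shipped to the client before a single entry can be read:

{code:java}
import java.util.concurrent.CompletableFuture;

import org.apache.flink.api.common.JobID;
import org.apache.flink.api.common.state.MapState;
import org.apache.flink.api.common.state.MapStateDescriptor;
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.queryablestate.client.QueryableStateClient;

public class MapStateEntryLookup {
    public static void main(String[] args) throws Exception {
        // Host and port of a task manager's queryable state proxy.
        QueryableStateClient client = new QueryableStateClient("localhost", 9069);
        JobID jobId = JobID.fromHexString(args[0]);

        MapStateDescriptor<String, Long> descriptor = new MapStateDescriptor<>(
                "entries", BasicTypeInfo.STRING_TYPE_INFO, BasicTypeInfo.LONG_TYPE_INFO);

        // The query is addressed by the state key only; the full MapState
        // for that key is serialized and returned.
        CompletableFuture<MapState<String, Long>> future = client.getKvState(
                jobId, "query-name", "state-key",
                BasicTypeInfo.STRING_TYPE_INFO, descriptor);

        // Even if only one map entry is needed, the client must fetch all
        // entries (potentially millions) and pick out the one of interest.
        Long value = future.get().get("map-key");
        System.out.println(value);
    }
}
{code}

With the proposed change, the map key ("map-key" above) would travel with the
request and the server would return only the matching entry.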



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-10117) REST API for Queryable State

2018-08-09 Thread Elias Levy (JIRA)
Elias Levy created FLINK-10117:
--

 Summary: REST API for Queryable State
 Key: FLINK-10117
 URL: https://issues.apache.org/jira/browse/FLINK-10117
 Project: Flink
  Issue Type: Improvement
  Components: Queryable State, REST
Affects Versions: 1.6.0
Reporter: Elias Levy


At the moment, queryable state requires a JVM-based client that can make use of
the Java queryable state client API in the flink-queryable-state-client
artifact.  In addition, the client requires a state descriptor matching the
queried state, which tightly couples the Flink job and its query clients.

I propose that queryable state become accessible via a REST API.  FLINK-7040 
mentions this possibility, but does not specify work towards that goal.

I suggest that to enable queryable state over REST, users define JSON 
serializers via the state descriptors.  

This would allow queryable state clients to be developed in any language, not 
require them to use a Flink client library, and permit them to be loosely 
coupled with the job, as they could generically parse the returned JSON.
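
To make this concrete, a purely hypothetical sketch of a loosely coupled
lookup; the endpoint path and the JSON response shape are invented for
illustration and do not exist in Flink today:

{code:java}
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Scanner;

public class RestStateQuerySketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical REST endpoint addressing queryable state by job id,
        // registered query name, and state key.
        String jobId = args[0];
        URL url = new URL("http://jobmanager:8081/jobs/" + jobId
                + "/queryable-state/query-name/keys/state-key");

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        try (Scanner body = new Scanner(conn.getInputStream(), "UTF-8")
                .useDelimiter("\\A")) {
            // With user-defined JSON serializers, the response could be plain
            // JSON such as {"sum": 42, "count": 7}; no state descriptor and no
            // Flink client library would be needed on the querying side.
            System.out.println(body.hasNext() ? body.next() : "");
        }
    }
}
{code}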

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-10116) createComparator fails on case class with Unit type fields prior to the join-key

2018-08-09 Thread Will (JIRA)
Will created FLINK-10116:


 Summary: createComparator fails on case class with Unit type 
fields prior to the join-key
 Key: FLINK-10116
 URL: https://issues.apache.org/jira/browse/FLINK-10116
 Project: Flink
  Issue Type: Bug
Affects Versions: 1.6.0, 1.3.3
Reporter: Will
 Attachments: JobFail.scala, JobPass.scala

h1. Overview

When joining between case classes, if the attribute representing the join key
comes after fields of type Unit (that are not being used), the join will fail
with the error
{quote}{{Exception in thread "main" java.lang.IllegalArgumentException: Could 
not add a comparator for the logicalkey field index 0.}}
{{ at 
org.apache.flink.api.common.typeutils.CompositeType.createComparator(CompositeType.java:162)}}
{{ at 
org.apache.flink.optimizer.postpass.JavaApiPostPass.createComparator(JavaApiPostPass.java:293)}}
{{ at 
org.apache.flink.optimizer.postpass.JavaApiPostPass.traverse(JavaApiPostPass.java:193)}}
{quote}
Using TypeInformation keys does not exhibit the same issue. Initial debugging
suggests that calculating the key index for string key expressions does not
count Unit elements; however, they are included during iteration in
CompositeType.createComparator, which leads to the search failing because the
key appears to be a Unit type.
h1. Code Examples to Reproduce

[^JobFail.scala]

[^JobPass.scala]

 

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-10115) Upload Project to WEB UI has an issue

2018-08-09 Thread Yazdan Shirvany (JIRA)
Yazdan Shirvany created FLINK-10115:
---

 Summary: Upload Project to WEB UI has an issue
 Key: FLINK-10115
 URL: https://issues.apache.org/jira/browse/FLINK-10115
 Project: Flink
  Issue Type: Bug
  Components: Webfrontend
Reporter: Yazdan Shirvany
 Fix For: 1.6.0


Uploading jar files via the web UI is not working. After {{initializing upload...}}
it only shows {{saving...}} and the file never shows up in the UI, so it cannot
be submitted.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Re: [DISCUSS] Improve Tutorials section of documentation

2018-08-09 Thread Aljoscha Krettek
+1

I think this moves us in the direction of having a more hands-on tutorials 
section where we don't explain all the details and a reference section where we 
provide details but don't necessarily spell out a full step-by-step case.

> On 9. Aug 2018, at 14:44, Fabian Hueske  wrote:
> 
> Hi everyone,
> 
> I'd like to discuss a proposal to improve the tutorials / quickstart guides
> of Flink's documentation.
> I think the current tutorials have a few issues that should be fixed in order
> to help our (future) users get started with Flink.
> 
> I propose to add a single "Tutorials" section to the documentation where
> users find step-by-step guides. The tutorials section helps users with
> different goals:
> 
>  * Get a quick idea of the overall system
>  * Implement a DataStream/DataSet/Table API/SQL job
>  * Set up Flink on a local machine (or run a Docker container)
> 
> For some of these goals, we do not offer tutorials yet. Our existing
> tutorials are mixed with instructions for how to set up an environment to
> develop Flink itself ("IDE setup", "Building Flink from Source"), and
> reference information that is required to implement applications
> ("Configuring Dependencies, Connectors, Libraries", Project Templates
> Java/Scala).
> 
> As a first step, I would like to reorganize the content of the
> "Quickstart", "Examples", and "Project Setup" sections of the documentation
> depending on the goals of the users (getting started, reference lookup,
> developing Flink). So, this would be mostly moving content around.
> 
> In a second step, I would improve existing tutorials (Implementing
> DataStream applications, Local Setup) and add missing tutorials (Local
> Docker setup, Implementing DataSet / Table API / SQL applications, etc.).
> 
> What do you think?
> 
> Cheers, Fabian



[jira] [Created] (FLINK-10114) Support Orc for StreamingFileSink

2018-08-09 Thread zhangminglei (JIRA)
zhangminglei created FLINK-10114:


 Summary: Support Orc for StreamingFileSink
 Key: FLINK-10114
 URL: https://issues.apache.org/jira/browse/FLINK-10114
 Project: Flink
  Issue Type: Sub-task
  Components: Streaming Connectors
Reporter: zhangminglei
Assignee: zhangminglei






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-10113) Drop support for pre 1.6 shared buffer state

2018-08-09 Thread Dawid Wysakowicz (JIRA)
Dawid Wysakowicz created FLINK-10113:


 Summary: Drop support for pre 1.6 shared buffer state
 Key: FLINK-10113
 URL: https://issues.apache.org/jira/browse/FLINK-10113
 Project: Flink
  Issue Type: Improvement
  Components: CEP
Reporter: Dawid Wysakowicz
Assignee: Dawid Wysakowicz
 Fix For: 1.7.0


We could drop the migration code that transforms old pre-1.6 state to 1.6 state.
This still leaves the possibility to migrate from 1.5 to 1.7 via 1.6.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Re: [VOTE] Release 1.6.0, release candidate #4

2018-08-09 Thread Yaz Sh
Thanks for the fix!

/Yaz

On Thu, Aug 9, 2018 at 9:10 AM Till Rohrmann  wrote:

> Thanks for reporting this problem Yaz. I just pushed a commit which should
> update the links accordingly once the Flink documentation gets rebuilt
> (over night). Tomorrow it should be fixed.
>
> Cheers,
> Till
>
> On Thu, Aug 9, 2018 at 2:53 PM Yaz Sh  wrote:
>
> > Great Job on Release 1.6!
> >
> > I just checked it out, and I can still see v.1.6-SNAPSHOT in the title of
> > https://ci.apache.org/projects/flink/flink-docs-release-1.6/
> >
> > and when I click on any options, it redirects me to master docs
> > 1.7-SNAPSHOT.
> >
> > I opened this ticket https://issues.apache.org/jira/browse/FLINK-10112
> >
> > Also, I don’t see v1.6 in the "Pick Docs Version" drop-down.
> >
> > Cheers,
> > Yaz
> >
> > > On Aug 8, 2018, at 3:24 PM, Timo Walther  wrote:
> > >
> > > +1
> > >
> > > - successfully run `mvn clean verify` locally
> > > - successfully run end-to-end tests locally (except for SQL Client
> > end-to-end test)
> > >
> > > Found a bug in the class loading of SQL JAR files. This is not a blocker
> > > but a bug that we should fix soon. As an easy workaround, users should
> > > not use different Kafka versions as SQL Client dependencies.
> > >
> > > Regards,
> > > Timo
> > >
> > > Am 08.08.18 um 18:10 schrieb Dawid Wysakowicz:
> > >> +1
> > >>
> > >> - verified compilation, tests
> > >> - verified checksum and gpg files
> > >> - verified sbt templates (g8, quickstart) - run assemblies on local
> > cluster
> > >>
> > >> - I could not execute the nightly-tests.sh though. The tests that were
> > >> failing most often are:
> > >> - test_streaming_file_sink.sh
> > >> - test_streaming_elasticsearch.sh
> > >>
> > >> Those are connectors though, and it might be only test flakiness, so I
> > >> think it should not block the release.
> > >>
> > >> On 08/08/18 16:36, Chesnay Schepler wrote:
> > >>> I did not use the tools/list_deps.py script as I wasn't aware that it
> > >>> existed.
> > >>>
> > >>> Even if I were I wouldn't have used it and in fact would advocate for
> > >>> removing it.
> > >>> It manually parses and constructs dependency information which is
> > >>> utterly unnecessary as maven already provides this functionality,
> with
> > >>> the added bonus of also accounting for dependencyManagement and
> > >>> transitive dependencies which we obviously have to take into account.
> > >>>
> > >>> I used this one-liner instead:
> > >>> mvn dependency:list | grep ":.*:.*:.*" | grep -v -e "Finished at" -e "Some problems" | cut -d] -f2- | sed 's/:[a-z]*$//g' | sort -u
> > >>>
> > >>> which I have documented here:
> > >>> https://cwiki.apache.org/confluence/display/FLINK/Dependencies
> > >>>
> > >>> On 08.08.2018 15:06, Aljoscha Krettek wrote:
> >  +1
> > 
> >  - verified checksum and gpg files
> >  - verified LICENSE and NOTICE: NOTICE didn't change from 1.5,
> LICENSE
> >  had one unnecessary part removed
> > 
> >  Side comment: I'm not sure whether the "Verify that the LICENSE and
> >  NOTICE file is correct for the binary and source releases" part is
> >  valid anymore because we only have one LICENSE and NOTICE file. also
> >  "The LICENSE and NOTICE files in flink-dist/src/main/flink-bin refer
> >  to the binary distribution and mention all of Flink's Maven
> >  dependencies as well" can be dropped because we don't have them
> > anymore.
> > 
> >  I came to the same conclusion on dependencies. I used
> >  tools/list_deps.py and diff'ed the output for 1.5 and 1.6, that's
> >  probably what Chesnay also did ... :-)
> > 
> > > On 8. Aug 2018, at 14:43, Chesnay Schepler 
> > wrote:
> > >
> > > +1
> > >
> > > - verified source release contains no binaries
> > > - verified correct versions in source release
> > > - verified compilation, tests and E2E-tests pass (on travis)
> > > - verified checksum and gpg files
> > >
> > > New dependencies (excluding dependencies where we simply depend on
> a
> > > different version now):
> > > Apache licensed:
> > > io.confluent:common-utils:jar:3.3.1
> > > io.confluent:kafka-schema-registry-client:jar:3.3.1
> > > io.prometheus:simpleclient_pushgateway:jar:0.3.0
> > > various Apache Nifi dependencies
> > > various Apache Parquet dependencies
> > > various ElasticSearch dependencies
> > > CDDL:
> > > javax.ws.rs:javax.ws.rs-api:jar:2.1
> > > Bouncycastle (MIT-like):
> > > org.bouncycastle:bcpkix-jdk15on:jar:1.59
> > > org.bouncycastle:bcprov-jdk15on:jar:1.59
> > > MIT:
> > > org.projectlombok:lombok:jar:1.16.20
> > >
> > > On 08.08.2018 13:28, Till Rohrmann w

Re: [VOTE] Release 1.6.0, release candidate #4

2018-08-09 Thread Till Rohrmann
Thanks for reporting this problem Yaz. I just pushed a commit which should
update the links accordingly once the Flink documentation gets rebuilt
(over night). Tomorrow it should be fixed.

Cheers,
Till

On Thu, Aug 9, 2018 at 2:53 PM Yaz Sh  wrote:

> Great Job on Release 1.6!
>
> I just checked it out, and I can still see v.1.6-SNAPSHOT in the title of
> https://ci.apache.org/projects/flink/flink-docs-release-1.6/
>
> and when I click on any options, it redirects me to master docs
> 1.7-SNAPSHOT.
>
> I opened this ticket https://issues.apache.org/jira/browse/FLINK-10112
>
> Also, I don’t see v1.6 in the "Pick Docs Version" drop-down.
>
> Cheers,
> Yaz
>
> > On Aug 8, 2018, at 3:24 PM, Timo Walther  wrote:
> >
> > +1
> >
> > - successfully run `mvn clean verify` locally
> > - successfully run end-to-end tests locally (except for SQL Client
> end-to-end test)
> >
> > Found a bug in the class loading of SQL JAR files. This is not a blocker
> > but a bug that we should fix soon. As an easy workaround, users should not
> > use different Kafka versions as SQL Client dependencies.
> >
> > Regards,
> > Timo
> >
> > Am 08.08.18 um 18:10 schrieb Dawid Wysakowicz:
> >> +1
> >>
> >> - verified compilation, tests
> >> - verified checksum and gpg files
> >> - verified sbt templates (g8, quickstart) - run assemblies on local
> cluster
> >>
> >> - I could not execute the nightly-tests.sh though. The tests that were
> >> failing most often are:
> >> - test_streaming_file_sink.sh
> >> - test_streaming_elasticsearch.sh
> >>
> >> Those are connectors though, and it might be only test flakiness, so I
> >> think it should not block the release.
> >>
> >> On 08/08/18 16:36, Chesnay Schepler wrote:
> >>> I did not use the tools/list_deps.py script as I wasn't aware that it
> >>> existed.
> >>>
> >>> Even if I were I wouldn't have used it and in fact would advocate for
> >>> removing it.
> >>> It manually parses and constructs dependency information which is
> >>> utterly unnecessary as maven already provides this functionality, with
> >>> the added bonus of also accounting for dependencyManagement and
> >>> transitive dependencies which we obviously have to take into account.
> >>>
> >>> I used this one-liner instead:
> >>> mvn dependency:list | grep ":.*:.*:.*" | grep -v -e "Finished at" -e "Some problems" | cut -d] -f2- | sed 's/:[a-z]*$//g' | sort -u
> >>>
> >>> which I have documented here:
> >>> https://cwiki.apache.org/confluence/display/FLINK/Dependencies
> >>>
> >>> On 08.08.2018 15:06, Aljoscha Krettek wrote:
>  +1
> 
>  - verified checksum and gpg files
>  - verified LICENSE and NOTICE: NOTICE didn't change from 1.5, LICENSE
>  had one unnecessary part removed
> 
>  Side comment: I'm not sure whether the "Verify that the LICENSE and
>  NOTICE file is correct for the binary and source releases" part is
>  valid anymore because we only have one LICENSE and NOTICE file. also
>  "The LICENSE and NOTICE files in flink-dist/src/main/flink-bin refer
>  to the binary distribution and mention all of Flink's Maven
>  dependencies as well" can be dropped because we don't have them
> anymore.
> 
>  I came to the same conclusion on dependencies. I used
>  tools/list_deps.py and diff'ed the output for 1.5 and 1.6, that's
>  probably what Chesnay also did ... :-)
> 
> > On 8. Aug 2018, at 14:43, Chesnay Schepler 
> wrote:
> >
> > +1
> >
> > - verified source release contains no binaries
> > - verified correct versions in source release
> > - verified compilation, tests and E2E-tests pass (on travis)
> > - verified checksum and gpg files
> >
> > New dependencies (excluding dependencies where we simply depend on a
> > different version now):
> > Apache licensed:
> > io.confluent:common-utils:jar:3.3.1
> > io.confluent:kafka-schema-registry-client:jar:3.3.1
> > io.prometheus:simpleclient_pushgateway:jar:0.3.0
> > various Apache Nifi dependencies
> > various Apache Parquet dependencies
> > various ElasticSearch dependencies
> > CDDL:
> > javax.ws.rs:javax.ws.rs-api:jar:2.1
> > Bouncycastle (MIT-like):
> > org.bouncycastle:bcpkix-jdk15on:jar:1.59
> > org.bouncycastle:bcprov-jdk15on:jar:1.59
> > MIT:
> > org.projectlombok:lombok:jar:1.16.20
> >
> > On 08.08.2018 13:28, Till Rohrmann wrote:
> >> Thanks for reporting these problems Chesnay. The usage string in
> >> `standalone-job.sh` is outdated and should be updated. The same
> >> applies to
> >> the typo.
> >>
> >> When calling `standalone-job.sh start --job-classname foobar.Job`
> >> please
> >> make sure that the user code j

Re: [VOTE] Release 1.6.0, release candidate #4

2018-08-09 Thread Yaz Sh
Great Job on Release 1.6!

I just checked it out, and I can still see v.1.6-SNAPSHOT in the title of
https://ci.apache.org/projects/flink/flink-docs-release-1.6/ 


and when I click on any options, it redirects me to master docs 1.7-SNAPSHOT.

I opened this ticket https://issues.apache.org/jira/browse/FLINK-10112 


Also, I don’t see v1.6 in the "Pick Docs Version" drop-down.

Cheers,
Yaz

> On Aug 8, 2018, at 3:24 PM, Timo Walther  wrote:
> 
> +1
> 
> - successfully run `mvn clean verify` locally
> - successfully run end-to-end tests locally (except for SQL Client end-to-end 
> test)
> 
> Found a bug in the class loading of SQL JAR files. This is not a blocker but 
> a bug that we should fix soon. As an easy workaround, users should not use 
> different Kafka versions as SQL Client dependencies.
> 
> Regards,
> Timo
> 
> Am 08.08.18 um 18:10 schrieb Dawid Wysakowicz:
>> +1
>> 
>> - verified compilation, tests
>> - verified checksum and gpg files
>> - verified sbt templates (g8, quickstart) - run assemblies on local cluster
>> 
>> - I could not execute the nightly-tests.sh though. The tests that were
>> failing most often are:
>> - test_streaming_file_sink.sh
>> - test_streaming_elasticsearch.sh
>> 
>> Those are connectors though, and it might be only test flakiness, so I
>> think it should not block the release.
>> 
>> On 08/08/18 16:36, Chesnay Schepler wrote:
>>> I did not use the tools/list_deps.py script as I wasn't aware that it
>>> existed.
>>> 
>>> Even if I were I wouldn't have used it and in fact would advocate for
>>> removing it.
>>> It manually parses and constructs dependency information which is
>>> utterly unnecessary as maven already provides this functionality, with
>>> the added bonus of also accounting for dependencyManagement and
>>> transitive dependencies which we obviously have to take into account.
>>> 
>>> I used this one-liner instead:
>>> mvn dependency:list | grep ":.*:.*:.*" | grep -v -e "Finished at" -e "Some problems" | cut -d] -f2- | sed 's/:[a-z]*$//g' | sort -u
>>>
>>> which I have documented here:
>>> https://cwiki.apache.org/confluence/display/FLINK/Dependencies
>>> 
>>> On 08.08.2018 15:06, Aljoscha Krettek wrote:
 +1
 
 - verified checksum and gpg files
 - verified LICENSE and NOTICE: NOTICE didn't change from 1.5, LICENSE
 had one unnecessary part removed
 
 Side comment: I'm not sure whether the "Verify that the LICENSE and
 NOTICE file is correct for the binary and source releases" part is
 valid anymore because we only have one LICENSE and NOTICE file. also
 "The LICENSE and NOTICE files in flink-dist/src/main/flink-bin refer
 to the binary distribution and mention all of Flink's Maven
 dependencies as well" can be dropped because we don't have them anymore.
 
 I came to the same conclusion on dependencies. I used
 tools/list_deps.py and diff'ed the output for 1.5 and 1.6, that's
 probably what Chesnay also did ... :-)
 
> On 8. Aug 2018, at 14:43, Chesnay Schepler  wrote:
> 
> +1
> 
> - verified source release contains no binaries
> - verified correct versions in source release
> - verified compilation, tests and E2E-tests pass (on travis)
> - verified checksum and gpg files
> 
> New dependencies (excluding dependencies where we simply depend on a
> different version now):
> Apache licensed:
> io.confluent:common-utils:jar:3.3.1
> io.confluent:kafka-schema-registry-client:jar:3.3.1
> io.prometheus:simpleclient_pushgateway:jar:0.3.0
> various Apache Nifi dependencies
> various Apache Parquet dependencies
> various ElasticSearch dependencies
> CDDL:
> javax.ws.rs:javax.ws.rs-api:jar:2.1
> Bouncycastle (MIT-like):
> org.bouncycastle:bcpkix-jdk15on:jar:1.59
> org.bouncycastle:bcprov-jdk15on:jar:1.59
> MIT:
> org.projectlombok:lombok:jar:1.16.20
> 
> On 08.08.2018 13:28, Till Rohrmann wrote:
>> Thanks for reporting these problems Chesnay. The usage string in
>> `standalone-job.sh` is outdated and should be updated. The same
>> applies to
>> the typo.
>> 
>> When calling `standalone-job.sh start --job-classname foobar.Job`
>> please
>> make sure that the user code jar is contained in the classpath (e.g.
>> putting the jar in the lib directory). Documenting this behaviour
>> is part
>> of the pending issue FLINK-10001.
>> 
>> We should fix all of these issues. They are, however, no release
>> blockers.
>> 
>> Cheers,
>> Till
>> 
>> On Wed, Aug 8, 2018 at 11:31 AM Chesnay Schepler
>>  wrote:
>> 
>>> I found some issues with the standalone-job.sh script.
>>>

[jira] [Created] (FLINK-10112) https://ci.apache.org title for release 1.6 shows 1.6-SNAPSHOT

2018-08-09 Thread Yazdan Shirvany (JIRA)
Yazdan Shirvany created FLINK-10112:
---

 Summary: https://ci.apache.org title for release 1.6 shows 
1.6-SNAPSHOT
 Key: FLINK-10112
 URL: https://issues.apache.org/jira/browse/FLINK-10112
 Project: Flink
  Issue Type: Bug
Affects Versions: 1.6.0
Reporter: Yazdan Shirvany


Flink 1.6 has been released, but I can still see 1.6-SNAPSHOT on
[https://ci.apache.org/], and when I click any of the options it redirects me
to the master docs with a 1.7-SNAPSHOT title.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[DISCUSS] Improve Tutorials section of documentation

2018-08-09 Thread Fabian Hueske
Hi everyone,

I'd like to discuss a proposal to improve the tutorials / quickstart guides
of Flink's documentation.
I think the current tutorials have a few issues that should be fix in order
to help our (future) users getting started with Flink.

I propose to add a single "Tutorials" section to the documentation where
users find step-by-step guides. The tutorials section help users with
different goals:

  * Get a quick idea of the overall system
  * Implement a DataStream/DataSet/Table API/SQL job
  * Set up Flink on a local machine (or run a Docker container)

For some of these goals, we do not offer tutorials yet. Our existing
tutorials are mixed with instructions for how to setup an environment to
develop Flink itself ("IDE setup", "Building Flink from Source"), and
reference information that is required to implement applications
("Configuring Dependencies, Connectors, Libraries", Project Templates
Java/Scala).

As a first step, I would like to reorganize this content of the
"Quickstart", "Examples", and "Project Setup" sections of the documentation
depending on the goals of the users (getting started, reference lookup,
developing Flink). So, this would be mostly moving content around.

In a second step, I would improve existing tutorials (Implementing
DataStream applications, Local Setup) and add missing tutorials (Local
Docker setup, Implementing DataSet / Table API / SQL applications, etc.).

What do you think?

Cheers, Fabian


Re: [ANNOUNCE] Apache Flink 1.6.0 released

2018-08-09 Thread vino yang
Congratulations!

Great work! Till, thank you for driving the smooth release of Flink 1.6.

Vino.

Till Rohrmann wrote on Thursday, August 9, 2018 at 7:21 PM:

> The Apache Flink community is very happy to announce the release of Apache
> Flink 1.6.0.
>
> Apache Flink® is an open-source stream processing framework for
> distributed, high-performing, always-available, and accurate data streaming
> applications.
>
> The release is available for download at:
> https://flink.apache.org/downloads.html
>
> Please check out the release blog post for an overview of the improvements
> for this release:
> https://flink.apache.org/news/2018/08/09/release-1.6.0.html
>
> The full release notes are available in Jira:
>
> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12342760
>
> We would like to thank all contributors of the Apache Flink community who
> made this release possible!
>
> Regards,
> Till
>


[ANNOUNCE] Apache Flink 1.6.0 released

2018-08-09 Thread Till Rohrmann
The Apache Flink community is very happy to announce the release of Apache
Flink 1.6.0.

Apache Flink® is an open-source stream processing framework for
distributed, high-performing, always-available, and accurate data streaming
applications.

The release is available for download at:
https://flink.apache.org/downloads.html

Please check out the release blog post for an overview of the improvements
for this release:
https://flink.apache.org/news/2018/08/09/release-1.6.0.html

The full release notes are available in Jira:
https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12342760

We would like to thank all contributors of the Apache Flink community who
made this release possible!

Regards,
Till


[jira] [Created] (FLINK-10111) Test failure because of HAQueryableStateFsBackendITCase#testMapState throws NPE

2018-08-09 Thread vinoyang (JIRA)
vinoyang created FLINK-10111:


 Summary: Test failure because of 
HAQueryableStateFsBackendITCase#testMapState throws NPE
 Key: FLINK-10111
 URL: https://issues.apache.org/jira/browse/FLINK-10111
 Project: Flink
  Issue Type: Bug
  Components: Tests
Reporter: vinoyang


Stack trace:
{code:java}
testMapState(org.apache.flink.queryablestate.itcases.HAQueryableStateFsBackendITCase)
  Time elapsed: 0.528 sec  <<< ERROR!
java.lang.NullPointerException: null
at 
org.apache.flink.queryablestate.itcases.AbstractQueryableStateTestBase.testMapState(AbstractQueryableStateTestBase.java:840)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at 
org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at 
org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at 
org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at 
org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at 
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at 
org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at 
org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at 
org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at 
org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:283)
at 
org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:173)
at 
org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
at 
org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:128)
at 
org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:203)
at 
org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:155)
at 
org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
{code}
Travis log: https://travis-ci.org/apache/flink/jobs/412636761



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[RESULT] [VOTE] Release 1.6.0, release candidate #4

2018-08-09 Thread Till Rohrmann
I'm happy to announce that we have unanimously approved the 1.6.0 release.

There are 7 approving votes, 4 of which are binding:
- Piotrek (non-binding)
- Chesnay (binding)
- Aljoscha (binding)
- Dawid (non-binding)
- Timo (binding)
- Vino (non-binding)
- Till (binding)

There are no disapproving votes.

Thanks everyone for the hard work and help making this release possible!


Re: [VOTE] Release 1.6.0, release candidate #4

2018-08-09 Thread Till Rohrmann
I hereby close the vote. The result will be announced in a separate thread.

On Thu, Aug 9, 2018 at 11:47 AM Till Rohrmann  wrote:

> +1
>
> - Checked checksums and signatures
> - Verified that no unwanted binaries are contained in source release
> - Checked LICENSE and NOTICE file
> - Checked that all newly added dependencies have a compatible license
> - Checked that a local cluster can be started and stopped without
> exceptions in the log
> - Verified that SBT quickstarts are up to date
> - Verified that Java quickstarts work with IntelliJ
> - Verified that all Jepsen tests pass
> - Verified that e2e tests modulo test_sql_client.sh (see
> https://issues.apache.org/jira/browse/FLINK-10107) pass
>
> Cheers,
> Till
>
> On Thu, Aug 9, 2018 at 8:18 AM vino yang  wrote:
>
>> +1
>>
>> - checkout 1.6 source code and successfully run `mvn clean package
>> -DskipTests`
>> - searched '1.5' and '1.5.2' in all modules pom file and successfully
>> verified flink version was changed
>> - successfully run table and sql test locally
>>
>> Thanks, vino.
>>
>>
Timo Walther wrote on Thursday, August 9, 2018 at 3:24 AM:
>>
>> > +1
>> >
>> > - successfully run `mvn clean verify` locally
>> > - successfully run end-to-end tests locally (except for SQL Client
>> > end-to-end test)
>> >
>> > Found a bug in the class loading of SQL JAR files. This is not a blocker
>> > but a bug that we should fix soon. As an easy workaround, users should not
>> > use different Kafka versions as SQL Client dependencies.
>> >
>> > Regards,
>> > Timo
>> >
>> > Am 08.08.18 um 18:10 schrieb Dawid Wysakowicz:
>> > > +1
>> > >
>> > > - verified compilation, tests
>> > > - verified checksum and gpg files
>> > > - verified sbt templates (g8, quickstart) - run assemblies on local
>> > cluster
>> > >
>> > > - I could not execute the nightly-tests.sh though. The tests that were
>> > > failing most often are:
>> > >  - test_streaming_file_sink.sh
>> > >  - test_streaming_elasticsearch.sh
>> > >
>> > > Those are connectors though, and it might be only test flakiness, so I
>> > > think it should not block the release.
>> > >
>> > > On 08/08/18 16:36, Chesnay Schepler wrote:
>> > >> I did not use the tools/list_deps.py script as I wasn't aware that it
>> > >> existed.
>> > >>
>> > >> Even if I were I wouldn't have used it and in fact would advocate for
>> > >> removing it.
>> > >> It manually parses and constructs dependency information which is
>> > >> utterly unnecessary as maven already provides this functionality,
>> with
>> > >> the added bonus of also accounting for dependencyManagement and
>> > >> transitive dependencies which we obviously have to take into account.
>> > >>
>> > >> I used this one-liner instead:
>> > >> mvn dependency:list | grep ":.*:.*:.*" | grep -v -e "Finished at" -e "Some problems" | cut -d] -f2- | sed 's/:[a-z]*$//g' | sort -u
>> > >>
>> > >> which I have documented here:
>> > >> https://cwiki.apache.org/confluence/display/FLINK/Dependencies
>> > >>
>> > >> On 08.08.2018 15:06, Aljoscha Krettek wrote:
>> > >>> +1
>> > >>>
>> > >>> - verified checksum and gpg files
>> > >>> - verified LICENSE and NOTICE: NOTICE didn't change from 1.5,
>> LICENSE
>> > >>> had one unnecessary part removed
>> > >>>
>> > >>> Side comment: I'm not sure whether the "Verify that the LICENSE and
>> > >>> NOTICE file is correct for the binary and source releases" part is
>> > >>> valid anymore because we only have one LICENSE and NOTICE file. also
>> > >>> "The LICENSE and NOTICE files in flink-dist/src/main/flink-bin refer
>> > >>> to the binary distribution and mention all of Flink's Maven
>> > >>> dependencies as well" can be dropped because we don't have them
>> > anymore.
>> > >>>
>> > >>> I came to the same conclusion on dependencies. I used
>> > >>> tools/list_deps.py and diff'ed the output for 1.5 and 1.6, that's
>> > >>> probably what Chesnay also did ... :-)
>> > >>>
>> >  On 8. Aug 2018, at 14:43, Chesnay Schepler 
>> > wrote:
>> > 
>> >  +1
>> > 
>> >  - verified source release contains no binaries
>> >  - verified correct versions in source release
>> >  - verified compilation, tests and E2E-tests pass (on travis)
>> >  - verified checksum and gpg files
>> > 
>> >  New dependencies (excluding dependencies where we simply depend on
>> a
>> >  different version now):
>> >   Apache licensed:
>> >   io.confluent:common-utils:jar:3.3.1
>> >   io.confluent:kafka-schema-registry-client:jar:3.3.1
>> >   io.prometheus:simpleclient_pushgateway:jar:0.3.0
>> >   various Apache Nifi dependencies
>> >   various Apache Parquet dependencies
>> >   various ElasticSearch dependencies
>> >   CDDL:
>> >   javax.ws.rs:javax.ws.rs-api:jar:2.1
>> >   Bouncycastle (MIT-like):
>> >   org.bouncycastle:bcpkix-jdk15on:jar:1.59
>> >   org.bouncycastle:bcprov-j

Re: [VOTE] Release 1.6.0, release candidate #4

2018-08-09 Thread Till Rohrmann
+1

- Checked checksums and signatures
- Verified that no unwanted binaries are contained in source release
- Checked LICENSE and NOTICE file
- Checked that all newly added dependencies have a compatible license
- Checked that a local cluster can be started and stopped without
exceptions in the log
- Verified that SBT quickstarts are up to date
- Verified that Java quickstarts work with IntelliJ
- Verified that all Jepsen tests pass
- Verified that e2e tests modulo test_sql_client.sh (see
https://issues.apache.org/jira/browse/FLINK-10107) pass

Cheers,
Till

On Thu, Aug 9, 2018 at 8:18 AM vino yang  wrote:

> +1
>
> - checkout 1.6 source code and successfully run `mvn clean package
> -DskipTests`
> - searched '1.5' and '1.5.2' in all modules pom file and successfully
> verified flink version was changed
> - successfully run table and sql test locally
>
> Thanks, vino.
>
>
Timo Walther wrote on Thursday, August 9, 2018 at 3:24 AM:
>
> > +1
> >
> > - successfully run `mvn clean verify` locally
> > - successfully run end-to-end tests locally (except for SQL Client
> > end-to-end test)
> >
> > Found a bug in the class loading of SQL JAR files. This is not a blocker
> > but a bug that we should fix soon. As an easy workaround, users should not
> > use different Kafka versions as SQL Client dependencies.
> >
> > Regards,
> > Timo
> >
> > Am 08.08.18 um 18:10 schrieb Dawid Wysakowicz:
> > > +1
> > >
> > > - verified compilation, tests
> > > - verified checksum and gpg files
> > > - verified sbt templates (g8, quickstart) - run assemblies on local
> > cluster
> > >
> > > - I could not execute the nightly-tests.sh though. The tests that were
> > > failing most often are:
> > >  - test_streaming_file_sink.sh
> > >  - test_streaming_elasticsearch.sh
> > >
> > > Those are connectors though, and it might be only test flakiness, so I
> > > think it should not block the release.
> > >
> > > On 08/08/18 16:36, Chesnay Schepler wrote:
> > >> I did not use the tools/list_deps.py script as I wasn't aware that it
> > >> existed.
> > >>
> > >> Even if I were I wouldn't have used it and in fact would advocate for
> > >> removing it.
> > >> It manually parses and constructs dependency information which is
> > >> utterly unnecessary as maven already provides this functionality, with
> > >> the added bonus of also accounting for dependencyManagement and
> > >> transitive dependencies which we obviously have to take into account.
> > >>
> > >> I used this one-liner instead:
> > >> mvn dependency:list | grep ":.*:.*:.*" | grep -v -e "Finished at" -e "Some problems" | cut -d] -f2- | sed 's/:[a-z]*$//g' | sort -u
> > >>
> > >> which I have documented here:
> > >> https://cwiki.apache.org/confluence/display/FLINK/Dependencies
> > >>
> > >> On 08.08.2018 15:06, Aljoscha Krettek wrote:
> > >>> +1
> > >>>
> > >>> - verified checksum and gpg files
> > >>> - verified LICENSE and NOTICE: NOTICE didn't change from 1.5, LICENSE
> > >>> had one unnecessary part removed
> > >>>
> > >>> Side comment: I'm not sure whether the "Verify that the LICENSE and
> > >>> NOTICE file is correct for the binary and source releases" part is
> > >>> valid anymore because we only have one LICENSE and NOTICE file. also
> > >>> "The LICENSE and NOTICE files in flink-dist/src/main/flink-bin refer
> > >>> to the binary distribution and mention all of Flink's Maven
> > >>> dependencies as well" can be dropped because we don't have them
> > anymore.
> > >>>
> > >>> I came to the same conclusion on dependencies. I used
> > >>> tools/list_deps.py and diff'ed the output for 1.5 and 1.6, that's
> > >>> probably what Chesnay also did ... :-)
> > >>>
> >  On 8. Aug 2018, at 14:43, Chesnay Schepler 
> > wrote:
> > 
> >  +1
> > 
> >  - verified source release contains no binaries
> >  - verified correct versions in source release
> >  - verified compilation, tests and E2E-tests pass (on travis)
> >  - verified checksum and gpg files
> > 
> >  New dependencies (excluding dependencies where we simply depend on a
> >  different version now):
> >   Apache licensed:
> >   io.confluent:common-utils:jar:3.3.1
> >   io.confluent:kafka-schema-registry-client:jar:3.3.1
> >   io.prometheus:simpleclient_pushgateway:jar:0.3.0
> >   various Apache Nifi dependencies
> >   various Apache Parquet dependencies
> >   various ElasticSearch dependencies
> >   CDDL:
> >   javax.ws.rs:javax.ws.rs-api:jar:2.1
> >   Bouncycastle (MIT-like):
> >   org.bouncycastle:bcpkix-jdk15on:jar:1.59
> >   org.bouncycastle:bcprov-jdk15on:jar:1.59
> >   MIT:
> >   org.projectlombok:lombok:jar:1.16.20
> > 
> >  On 08.08.2018 13:28, Till Rohrmann wrote:
> > > Thanks for reporting these problems Chesnay. The usage string in
> > > `standalone-job.sh` is outdated and should 

[jira] [Created] (FLINK-10110) Harden e2e Kafka shutdown

2018-08-09 Thread Till Rohrmann (JIRA)
Till Rohrmann created FLINK-10110:
-

 Summary: Harden e2e Kafka shutdown
 Key: FLINK-10110
 URL: https://issues.apache.org/jira/browse/FLINK-10110
 Project: Flink
  Issue Type: Improvement
  Components: Tests
Affects Versions: 1.6.0
Reporter: Till Rohrmann
Assignee: Till Rohrmann
 Fix For: 1.6.1, 1.7.0


Due to KAFKA-4931, the shutdown of Kafka components can fail if the output of 
{{ps}} is limited to 4096 characters. Therefore, it can happen that e2e tests 
which start Kafka don't properly shut it down. I suggest fixing this problem
by hardening our {{stop_kafka_cluster}} in {{kafka-common.sh}}.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-10109) Add documentation for StreamingFileSink

2018-08-09 Thread Aljoscha Krettek (JIRA)
Aljoscha Krettek created FLINK-10109:


 Summary: Add documentation for StreamingFileSink
 Key: FLINK-10109
 URL: https://issues.apache.org/jira/browse/FLINK-10109
 Project: Flink
  Issue Type: Sub-task
  Components: Streaming Connectors
Reporter: Aljoscha Krettek
Assignee: Aljoscha Krettek






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-10108) DATE_FORMAT function in sql test throws a NumberFormatException

2018-08-09 Thread Xingcan Cui (JIRA)
Xingcan Cui created FLINK-10108:
---

 Summary: DATE_FORMAT function in sql test throws a 
NumberFormatException
 Key: FLINK-10108
 URL: https://issues.apache.org/jira/browse/FLINK-10108
 Project: Flink
  Issue Type: Bug
  Components: Table API & SQL
Reporter: Xingcan Cui


{{testSqlApi("DATE_FORMAT(TIMESTAMP '1991-01-02 03:04:06', '%m/%d/%Y')", 
"01/02/1991")}} will throw a {{NumberFormatException}}, whereas the function 
works fine in {{testAllApis()}}.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-10107) SQL Client end-to-end test fails for releases

2018-08-09 Thread Timo Walther (JIRA)
Timo Walther created FLINK-10107:


 Summary: SQL Client end-to-end test fails for releases
 Key: FLINK-10107
 URL: https://issues.apache.org/jira/browse/FLINK-10107
 Project: Flink
  Issue Type: Bug
  Components: Table API & SQL
Reporter: Timo Walther
Assignee: Timo Walther


It seems that the SQL JARs for Kafka 0.10 and Kafka 0.9 have conflicts that only
occur for releases and not for SNAPSHOT builds. This might be due to their file
names. Depending on the file name, either 0.9 is loaded before 0.10 or vice
versa.

One of the following errors occurred:
{code}
2018-08-08 18:28:51,636 ERROR 
org.apache.flink.kafka09.shaded.org.apache.kafka.clients.ClientUtils  - Failed 
to close coordinator
java.lang.NoClassDefFoundError: 
org/apache/flink/kafka09/shaded/org/apache/kafka/common/requests/OffsetCommitResponse
at 
org.apache.flink.kafka09.shaded.org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.sendOffsetCommitRequest(ConsumerCoordinator.java:473)
at 
org.apache.flink.kafka09.shaded.org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.commitOffsetsSync(ConsumerCoordinator.java:357)
at 
org.apache.flink.kafka09.shaded.org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.maybeAutoCommitOffsetsSync(ConsumerCoordinator.java:439)
at 
org.apache.flink.kafka09.shaded.org.apache.kafka.clients.consumer.internals.ConsumerCoordinator.close(ConsumerCoordinator.java:319)
at 
org.apache.flink.kafka09.shaded.org.apache.kafka.clients.ClientUtils.closeQuietly(ClientUtils.java:63)
at 
org.apache.flink.kafka09.shaded.org.apache.kafka.clients.consumer.KafkaConsumer.close(KafkaConsumer.java:1277)
at 
org.apache.flink.kafka09.shaded.org.apache.kafka.clients.consumer.KafkaConsumer.close(KafkaConsumer.java:1258)
at 
org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread.run(KafkaConsumerThread.java:286)
Caused by: java.lang.ClassNotFoundException: 
org.apache.flink.kafka09.shaded.org.apache.kafka.common.requests.OffsetCommitResponse
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at 
org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader.loadClass(FlinkUserCodeClassLoaders.java:120)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 8 more
{code}
{code}
java.lang.NoSuchFieldError: producer
at 
org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer010.invoke(FlinkKafkaProducer010.java:369)
at 
org.apache.flink.streaming.api.operators.StreamSink.processElement(StreamSink.java:56)
at 
org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.pushToOperator(OperatorChain.java:579)
at 
org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:554)
at 
org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:534)
at 
org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:689)
at 
org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:667)
at 
org.apache.flink.streaming.api.operators.StreamMap.processElement(StreamMap.java:41)
at 
org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.pushToOperator(OperatorChain.java:579)
at 
org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:554)
at 
org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:534)
at 
org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:689)
at 
org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:667)
at 
org.apache.flink.streaming.api.operators.TimestampedCollector.collect(TimestampedCollector.java:51)
at 
org.apache.flink.table.runtime.CRowWrappingCollector.collect(CRowWrappingCollector.scala:37)
at 
org.apache.flink.table.runtime.CRowWrappingCollector.collect(CRowWrappingCollector.scala:28)
{code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-10106) Include test name in temp directory of e2e test

2018-08-09 Thread Till Rohrmann (JIRA)
Till Rohrmann created FLINK-10106:
-

 Summary: Include test name in temp directory of e2e test
 Key: FLINK-10106
 URL: https://issues.apache.org/jira/browse/FLINK-10106
 Project: Flink
  Issue Type: Improvement
  Components: Tests
Affects Versions: 1.6.0
Reporter: Till Rohrmann
 Fix For: 1.7.0


For better debuggability, it would help to include the name of the e2e test in 
the created temporary testing directory 
{{temp-test-directory--UUID}}.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)