[jira] [Closed] (FLINK-6807) Elasticsearch 5 connector artifact not published to maven

2017-06-02 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6807?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6807.
-
Resolution: Duplicate

[~fassisr...@hotmail.com] thanks for reporting this. I've marked this as a 
duplicate since [~rmetzger] is working on a fix for the release process.

> Elasticsearch 5 connector artifact not published to maven 
> --
>
> Key: FLINK-6807
> URL: https://issues.apache.org/jira/browse/FLINK-6807
> Project: Flink
>  Issue Type: Bug
>  Components: ElasticSearch Connector
>Affects Versions: 1.3.0
>Reporter: Francisco Rosa
>Priority: Blocker
>  Labels: maven
> Fix For: 1.3.1
>
>
> I downloaded Flink 1.3.0 and want to use the Elasticsearch 5.x connector.
> The Maven artifact does not seem to be published to the Maven Central 
> repositories yet. Looking for:
>   
> org.apache.flink:flink-connector-elasticsearch5_2.10:1.3.0



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6466) Build Hadoop 2.8.0 convenience binaries

2017-06-01 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6466?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6466.
-
Resolution: Implemented

master: 2438824a8f39c07783e80adf902c72eecab31347

> Build Hadoop 2.8.0 convenience binaries
> ---
>
> Key: FLINK-6466
> URL: https://issues.apache.org/jira/browse/FLINK-6466
> Project: Flink
>  Issue Type: New Feature
>  Components: Build System
>Affects Versions: 1.3.0
>Reporter: Greg Hogan
>Assignee: Greg Hogan
> Fix For: 1.4.0
>
>
> As discussed on the dev mailing list, add Hadoop 2.8 to the 
> {{create_release_files.sh}} script and TravisCI test matrix.
> If there is consensus then references to binaries for old versions of Hadoop 
> could be removed.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6038) Add deep links to Apache Bahir Flink streaming connector documentations

2017-05-31 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6038?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6038.
-
Resolution: Implemented

master: f12c591aa7e9cf72edbeaf04b53eca71fa3681ca
release-1.3: 4ff1f439ee4529cd47ca1e9d37084da6fd2298b8

> Add deep links to Apache Bahir Flink streaming connector documentations
> ---
>
> Key: FLINK-6038
> URL: https://issues.apache.org/jira/browse/FLINK-6038
> Project: Flink
>  Issue Type: Improvement
>  Components: Documentation, Streaming Connectors
>Reporter: Tzu-Li (Gordon) Tai
>Assignee: David Anderson
> Fix For: 1.3.1, 1.4.0
>
>
> Recently, the documentation for the Flink streaming connectors in Bahir was 
> added to Bahir's website: BAHIR-90.
> We should add deep links to the individual Bahir connector docs under 
> {{/dev/connectors/overview}}, instead of just shallow links to the source 
> {{README.md}}s on the community ecosystem page.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (FLINK-6038) Add deep links to Apache Bahir Flink streaming connector documentations

2017-05-31 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6038?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan updated FLINK-6038:
--
Fix Version/s: 1.4.0
   1.3.1

> Add deep links to Apache Bahir Flink streaming connector documentations
> ---
>
> Key: FLINK-6038
> URL: https://issues.apache.org/jira/browse/FLINK-6038
> Project: Flink
>  Issue Type: Improvement
>  Components: Documentation, Streaming Connectors
>Reporter: Tzu-Li (Gordon) Tai
>Assignee: David Anderson
> Fix For: 1.3.1, 1.4.0
>
>
> Recently, the documentation for the Flink streaming connectors in Bahir was 
> added to Bahir's website: BAHIR-90.
> We should add deep links to the individual Bahir connector docs under 
> {{/dev/connectors/overview}}, instead of just shallow links to the source 
> {{README.md}}s on the community ecosystem page.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6778) Activate strict checkstyle for flink-dist

2017-05-31 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6778?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6778.
-
   Resolution: Implemented
Fix Version/s: 1.4.0

master: b5eac06d7cca8317a0f10e08bd5aae654a587a13

> Activate strict checkstyle for flink-dist
> -
>
> Key: FLINK-6778
> URL: https://issues.apache.org/jira/browse/FLINK-6778
> Project: Flink
>  Issue Type: Sub-task
>Affects Versions: 1.4.0
>Reporter: Greg Hogan
>Assignee: Greg Hogan
>Priority: Trivial
> Fix For: 1.4.0
>
>




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6777) Activate strict checkstyle for flink-scala-shell

2017-05-31 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6777?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6777.
-
   Resolution: Implemented
Fix Version/s: 1.4.0

master: 9d9c53ec142c173693985b52ce31f85bacd9b2d8

> Activate strict checkstyle for flink-scala-shell
> 
>
> Key: FLINK-6777
> URL: https://issues.apache.org/jira/browse/FLINK-6777
> Project: Flink
>  Issue Type: Sub-task
>  Components: Scala Shell
>Affects Versions: 1.4.0
>Reporter: Greg Hogan
>Assignee: Greg Hogan
>Priority: Trivial
> Fix For: 1.4.0
>
>




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6779) Activate strict checkstyle in flink-scala

2017-05-31 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6779?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6779.
-
   Resolution: Implemented
Fix Version/s: 1.4.0

master: 6ab7719b565934a0c127dcca3f46d71a71ecc0ef

> Activate strict checkstyle in flink-scala
> -
>
> Key: FLINK-6779
> URL: https://issues.apache.org/jira/browse/FLINK-6779
> Project: Flink
>  Issue Type: Sub-task
>  Components: Scala API
>Affects Versions: 1.4.0
>Reporter: Greg Hogan
>Assignee: Greg Hogan
>Priority: Trivial
> Fix For: 1.4.0
>
>




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6560) Restore maven parallelism in flink-tests

2017-05-31 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6560?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6560.
-
Resolution: Fixed

master: 52efdcb7dc2cfa8a79852f06178db5824eb83fb8

> Restore maven parallelism in flink-tests
> 
>
> Key: FLINK-6560
> URL: https://issues.apache.org/jira/browse/FLINK-6560
> Project: Flink
>  Issue Type: Bug
>  Components: Build System
>Affects Versions: 1.3.0, 1.4.0
>Reporter: Greg Hogan
>Assignee: Greg Hogan
>Priority: Critical
> Fix For: 1.4.0
>
>
> FLINK-6506 added the maven variable {{flink.forkCountTestPackage}} which is 
> used by the TravisCI script but no default value is set.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6332) Upgrade Scala version to 2.11.11

2017-05-31 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6332?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6332.
-
   Resolution: Implemented
Fix Version/s: 1.4.0

master: 49a58fdbb4ac80a0c0d34e42803ee6d1ca26b77a

> Upgrade Scala version to 2.11.11
> 
>
> Key: FLINK-6332
> URL: https://issues.apache.org/jira/browse/FLINK-6332
> Project: Flink
>  Issue Type: Improvement
>  Components: Build System, Scala API
>Reporter: Ted Yu
>Assignee: Greg Hogan
>Priority: Minor
> Fix For: 1.4.0
>
>
> Currently the scala-2.11 profile uses Scala 2.11.7.
> The most recent version is 2.11.11.
> This issue is to upgrade to Scala 2.11.11.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Assigned] (FLINK-6731) Activate strict checkstyle for flink-tests

2017-05-30 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6731?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan reassigned FLINK-6731:
-

Assignee: Greg Hogan

> Activate strict checkstyle for flink-tests
> --
>
> Key: FLINK-6731
> URL: https://issues.apache.org/jira/browse/FLINK-6731
> Project: Flink
>  Issue Type: Sub-task
>  Components: Tests
>Reporter: Chesnay Schepler
>Assignee: Greg Hogan
>
> Long term issue for incrementally introducing the strict checkstyle to 
> flink-tests.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Created] (FLINK-6779) Activate strict checkstyle in flink-scala

2017-05-30 Thread Greg Hogan (JIRA)
Greg Hogan created FLINK-6779:
-

 Summary: Activate strict checkstyle in flink-scala
 Key: FLINK-6779
 URL: https://issues.apache.org/jira/browse/FLINK-6779
 Project: Flink
  Issue Type: Sub-task
  Components: Scala API
Affects Versions: 1.4.0
Reporter: Greg Hogan
Assignee: Greg Hogan
Priority: Trivial






--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Created] (FLINK-6777) Activate strict checkstyle for flink-scala-shell

2017-05-30 Thread Greg Hogan (JIRA)
Greg Hogan created FLINK-6777:
-

 Summary: Activate strict checkstyle for flink-scala-shell
 Key: FLINK-6777
 URL: https://issues.apache.org/jira/browse/FLINK-6777
 Project: Flink
  Issue Type: Sub-task
  Components: Scala Shell
Affects Versions: 1.4.0
Reporter: Greg Hogan
Assignee: Greg Hogan
Priority: Trivial






--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Created] (FLINK-6778) Activate strict checkstyle for flink-dist

2017-05-30 Thread Greg Hogan (JIRA)
Greg Hogan created FLINK-6778:
-

 Summary: Activate strict checkstyle for flink-dist
 Key: FLINK-6778
 URL: https://issues.apache.org/jira/browse/FLINK-6778
 Project: Flink
  Issue Type: Sub-task
Affects Versions: 1.4.0
Reporter: Greg Hogan
Assignee: Greg Hogan
Priority: Trivial






--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6699) Activate strict-checkstyle in flink-yarn-tests

2017-05-28 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6699?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16027882#comment-16027882
 ] 

Greg Hogan commented on FLINK-6699:
---

[~Zentol] it looks like we forgot to activate the checkstyle in this module 
({{CliFrontendYarnAddressConfigurationTest}} has import errors).

> Activate strict-checkstyle in flink-yarn-tests
> --
>
> Key: FLINK-6699
> URL: https://issues.apache.org/jira/browse/FLINK-6699
> Project: Flink
>  Issue Type: Sub-task
>  Components: Tests, YARN
>Reporter: Chesnay Schepler
>Assignee: Chesnay Schepler
> Fix For: 1.4.0
>
>




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6732) Activate strict checkstyle for flink-java

2017-05-26 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6732?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16026473#comment-16026473
 ] 

Greg Hogan commented on FLINK-6732:
---

[~Zentol] what do you mean by "long term issue"? Is the intent to create 
multiple pull requests?

> Activate strict checkstyle for flink-java
> -
>
> Key: FLINK-6732
> URL: https://issues.apache.org/jira/browse/FLINK-6732
> Project: Flink
>  Issue Type: Sub-task
>  Components: Java API
>Reporter: Chesnay Schepler
>Assignee: Dawid Wysakowicz
>
> Long term issue for incrementally introducing the strict checkstyle to 
> flink-java.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6727) Remove duplicate java8 Join test

2017-05-26 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6727?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16026339#comment-16026339
 ] 

Greg Hogan commented on FLINK-6727:
---

These are not identical: {{JoinITCase}} returns a {{Tuple}} whereas 
{{FlatJoinITCase}} collects the {{Tuple}}.
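
For reference, a minimal sketch of the distinction in the DataSet API (the data and key positions are made up, not taken from the test classes):

{code}
import org.apache.flink.api.common.functions.FlatJoinFunction;
import org.apache.flink.api.common.functions.JoinFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.util.Collector;

public class JoinVsFlatJoin {
	public static void main(String[] args) throws Exception {
		ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
		DataSet<Tuple2<Integer, String>> left = env.fromElements(Tuple2.of(1, "a"), Tuple2.of(2, "b"));
		DataSet<Tuple2<Integer, String>> right = env.fromElements(Tuple2.of(1, "x"), Tuple2.of(2, "y"));

		// A JoinFunction returns exactly one result per matching pair.
		left.join(right).where(0).equalTo(0)
			.with(new JoinFunction<Tuple2<Integer, String>, Tuple2<Integer, String>, Tuple2<String, String>>() {
				@Override
				public Tuple2<String, String> join(Tuple2<Integer, String> l, Tuple2<Integer, String> r) {
					return Tuple2.of(l.f1, r.f1);
				}
			})
			.print();

		// A FlatJoinFunction emits zero or more results through a Collector.
		left.join(right).where(0).equalTo(0)
			.with(new FlatJoinFunction<Tuple2<Integer, String>, Tuple2<Integer, String>, Tuple2<String, String>>() {
				@Override
				public void join(Tuple2<Integer, String> l, Tuple2<Integer, String> r, Collector<Tuple2<String, String>> out) {
					out.collect(Tuple2.of(l.f1, r.f1));
				}
			})
			.print();
	}
}
{code}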

> Remove duplicate java8 Join test
> 
>
> Key: FLINK-6727
> URL: https://issues.apache.org/jira/browse/FLINK-6727
> Project: Flink
>  Issue Type: Improvement
>  Components: Tests
>Affects Versions: 1.4.0
>Reporter: Chesnay Schepler
>Priority: Minor
> Fix For: 1.4.0
>
>
> The java8 {{JoinITCase}} and {{FlatJoinITCase}} are completely identical; one 
> should be removed.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6734) Exclude org.apache.flink.api.java.tuple from checkstyle AvoidStarImport

2017-05-26 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6734?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16026327#comment-16026327
 ] 

Greg Hogan commented on FLINK-6734:
---

If we make this change we also need to consider how it will be configured in 
IDEs. In IntelliJ I only see the option {{Packages to Use Import with '*'}}.

> Exclude org.apache.flink.api.java.tuple from checkstyle AvoidStarImport
> ---
>
> Key: FLINK-6734
> URL: https://issues.apache.org/jira/browse/FLINK-6734
> Project: Flink
>  Issue Type: Sub-task
>  Components: Build System
>Reporter: Dawid Wysakowicz
>Assignee: Dawid Wysakowicz
>




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6698) Activate strict checkstyle

2017-05-26 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6698?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16026178#comment-16026178
 ] 

Greg Hogan commented on FLINK-6698:
---

It's been fine for the smaller modules, but for the remaining large modules we 
should, as much as possible, split {{main}} / {{test}} and imports / whitespace / 
other changes into separate commits or tickets.

> Activate strict checkstyle
> --
>
> Key: FLINK-6698
> URL: https://issues.apache.org/jira/browse/FLINK-6698
> Project: Flink
>  Issue Type: Improvement
>Reporter: Chesnay Schepler
>
> Umbrella issue for introducing the strict checkstyle, to keep track of which 
> modules are already covered.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6722) Activate strict checkstyle for flink-table

2017-05-25 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6722?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16025524#comment-16025524
 ] 

Greg Hogan commented on FLINK-6722:
---

[~fhueske] [~Zentol] the only modifications to Scala code would be the removal of 
unused imports, import reordering (if desired), and updates to references to 
renamed Java packages and types (as we now disallow non-alphanumeric characters).

> Activate strict checkstyle for flink-table
> --
>
> Key: FLINK-6722
> URL: https://issues.apache.org/jira/browse/FLINK-6722
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API & SQL
>Reporter: Chesnay Schepler
>Assignee: Greg Hogan
> Fix For: 1.4.0
>
>




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Assigned] (FLINK-6722) Activate strict checkstyle for flink-table

2017-05-25 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6722?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan reassigned FLINK-6722:
-

Assignee: Greg Hogan

> Activate strict checkstyle for flink-table
> --
>
> Key: FLINK-6722
> URL: https://issues.apache.org/jira/browse/FLINK-6722
> Project: Flink
>  Issue Type: Sub-task
>  Components: Table API & SQL
>Reporter: Chesnay Schepler
>Assignee: Greg Hogan
> Fix For: 1.4.0
>
>




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6712) Bump Kafka010 version to 0.10.2.0

2017-05-25 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6712?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16024662#comment-16024662
 ] 

Greg Hogan commented on FLINK-6712:
---

+1, and a version bump is also required to build for Scala 2.12.

> Bump Kafka010 version to 0.10.2.0
> -
>
> Key: FLINK-6712
> URL: https://issues.apache.org/jira/browse/FLINK-6712
> Project: Flink
>  Issue Type: Improvement
>  Components: Kafka Connector
>Reporter: Tzu-Li (Gordon) Tai
>
> The main reason for the default version bump is to allow dynamic JAAS 
> configurations using client properties (see 
> https://cwiki.apache.org/confluence/display/KAFKA/KIP-85%3A+Dynamic+JAAS+configuration+for+Kafka+clients),
>  which will make it possible for multiple Kafka consumers / producers in 
> the same job to have different authentication settings.
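
As a hedged illustration of what this enables (assuming the KIP-85 {{sasl.jaas.config}} client property and the {{FlinkKafkaConsumer010}} constructor taking a topic, deserialization schema, and {{Properties}}; brokers and topics below are made up):

{code}
import java.util.Properties;

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class TwoClusterJob {
	public static void main(String[] args) throws Exception {
		StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

		Properties propsA = new Properties();
		propsA.setProperty("bootstrap.servers", "broker-a:9092");
		propsA.setProperty("group.id", "job-a");
		// KIP-85: per-client JAAS configuration instead of a JVM-wide file.
		propsA.setProperty("sasl.jaas.config",
			"org.apache.kafka.common.security.plain.PlainLoginModule required username=\"userA\" password=\"...\";");

		Properties propsB = new Properties();
		propsB.setProperty("bootstrap.servers", "broker-b:9092");
		propsB.setProperty("group.id", "job-b");
		propsB.setProperty("sasl.jaas.config",
			"org.apache.kafka.common.security.plain.PlainLoginModule required username=\"userB\" password=\"...\";");

		// Two consumers in the same job with different authentication settings.
		env.addSource(new FlinkKafkaConsumer010<>("topic-a", new SimpleStringSchema(), propsA)).print();
		env.addSource(new FlinkKafkaConsumer010<>("topic-b", new SimpleStringSchema(), propsB)).print();

		env.execute("two Kafka clusters, different JAAS configs");
	}
}
{code}

Today the same effect requires a single JVM-wide JAAS file set via {{java.security.auth.login.config}}, shared by all clients.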



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Comment Edited] (FLINK-6698) Activate strict checkstyle

2017-05-24 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6698?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16023154#comment-16023154
 ] 

Greg Hogan edited comment on FLINK-6698 at 5/24/17 7:18 PM:


Rebasing to master and running the strict checkstyle analysis across the whole 
project yields ~50,000 total violations. By module:
- runtime ~16,700
- core ~11,000
- java ~5700
- optimizer ~5400
- tests ~3500
- gelly/gelly-examples ~1800
- all others ~5900

Edit: updated totals to include test packages ... the IntelliJ inspection does 
not honor the "include test sources" setting but instead depends on the checkstyle 
preference "Only Java sources (including tests)".


was (Author: greghogan):
Rebasing to master and running the strict checkstyle analysis across the whole 
project yields ~27,000 total violations. By module:
- runtime ~8000
- core ~7300
- java ~3200
- optimizer ~2800
- examples-batch ~1000
- gelly/gelly-examples ~1000
- all others ~3700

> Activate strict checkstyle
> --
>
> Key: FLINK-6698
> URL: https://issues.apache.org/jira/browse/FLINK-6698
> Project: Flink
>  Issue Type: Improvement
>Reporter: Chesnay Schepler
>
> Umbrella issue for introducing the strict checkstyle, to keep track of which 
> modules are already covered.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Created] (FLINK-6709) Activate strict checkstyle for flink-gellies

2017-05-24 Thread Greg Hogan (JIRA)
Greg Hogan created FLINK-6709:
-

 Summary: Activate strict checkstyle for flink-gellies
 Key: FLINK-6709
 URL: https://issues.apache.org/jira/browse/FLINK-6709
 Project: Flink
  Issue Type: Sub-task
  Components: Gelly
Affects Versions: 1.4.0
Reporter: Greg Hogan
Assignee: Greg Hogan
Priority: Trivial
 Fix For: 1.4.0






--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Created] (FLINK-6707) Activate strict checkstyle for flink-examples

2017-05-24 Thread Greg Hogan (JIRA)
Greg Hogan created FLINK-6707:
-

 Summary: Activate strict checkstyle for flink-examples
 Key: FLINK-6707
 URL: https://issues.apache.org/jira/browse/FLINK-6707
 Project: Flink
  Issue Type: Sub-task
  Components: Examples
Affects Versions: 1.4.0
Reporter: Greg Hogan
Assignee: Greg Hogan
Priority: Trivial
 Fix For: 1.4.0






--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6698) Activate strict checkstyle

2017-05-24 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6698?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16023154#comment-16023154
 ] 

Greg Hogan commented on FLINK-6698:
---

Rebasing to master and running the strict checkstyle analysis across the whole 
project yields ~27,000 total violations. By module:
- runtime ~8000
- core ~7300
- java ~3200
- optimizer ~2800
- examples-batch ~1000
- gelly/gelly-examples ~1000
- all others ~3700

> Activate strict checkstyle
> --
>
> Key: FLINK-6698
> URL: https://issues.apache.org/jira/browse/FLINK-6698
> Project: Flink
>  Issue Type: Improvement
>Reporter: Chesnay Schepler
>
> Umbrella issue for introducing the strict checkstyle, to keep track of which 
> modules are already covered.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6696) Test checkstyle is also applied to .properties files under /src/test/resources

2017-05-24 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6696?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16022773#comment-16022773
 ] 

Greg Hogan commented on FLINK-6696:
---

Text files should end with a newline for consistency and so that, for example, 
when cat'ing a file the prompt is not appended to the last line of the file. 
There may be conflicts between the configured checkstyle and test resources, but 
I don't think we should exclude the test resources until such a conflict arises.

> Test checkstyle is also applied to .properties files under /src/test/resources
> --
>
> Key: FLINK-6696
> URL: https://issues.apache.org/jira/browse/FLINK-6696
> Project: Flink
>  Issue Type: Bug
>  Components: Build System
>Affects Versions: 1.4.0
>Reporter: Chesnay Schepler
>Priority: Trivial
> Fix For: 1.4.0
>
>
> While working on FLINK-6695 I stumbled upon this curious checkstyle error in 
> the flink-storm module:
> {code}
> [ERROR] src\test\resources\log4j-test.properties:[0] (misc) 
> NewlineAtEndOfFile: File does not end with a newline.
> {code}
> I don't think we meant to include the resources directory. [~greghogan]



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (FLINK-6603) Enable checkstyle on test sources

2017-05-24 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6603?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan updated FLINK-6603:
--
Issue Type: Sub-task  (was: Improvement)
Parent: FLINK-6698

> Enable checkstyle on test sources
> -
>
> Key: FLINK-6603
> URL: https://issues.apache.org/jira/browse/FLINK-6603
> Project: Flink
>  Issue Type: Sub-task
>  Components: DataStream API
>Affects Versions: 1.4.0
>Reporter: Greg Hogan
>Assignee: Greg Hogan
>Priority: Trivial
> Fix For: 1.4.0
>
>
> With the addition of strict checkstyle to select modules (currently limited 
> to {{flink-streaming-java}}) we can enable the checkstyle flag 
> {{includeTestSourceDirectory}} to perform the same unused imports, 
> whitespace, and other checks on test sources.
> Should first resolve the import grouping as discussed in FLINK-6107. Also, 
> several tests exceed the 2500 line limit.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6603) Enable checkstyle on test sources

2017-05-23 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6603?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6603.
-
Resolution: Implemented

master: 12b4185c6c09101b64e12a84c33dc4d28f95cff9

> Enable checkstyle on test sources
> -
>
> Key: FLINK-6603
> URL: https://issues.apache.org/jira/browse/FLINK-6603
> Project: Flink
>  Issue Type: Improvement
>  Components: DataStream API
>Affects Versions: 1.4.0
>Reporter: Greg Hogan
>Assignee: Greg Hogan
>Priority: Trivial
> Fix For: 1.4.0
>
>
> With the addition of strict checkstyle to select modules (currently limited 
> to {{flink-streaming-java}}) we can enable the checkstyle flag 
> {{includeTestSourceDirectory}} to perform the same unused imports, 
> whitespace, and other checks on test sources.
> Should first resolve the import grouping as discussed in FLINK-6107. Also, 
> several tests exceed the 2500 line limit.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Created] (FLINK-6648) Transforms for Gelly examples

2017-05-19 Thread Greg Hogan (JIRA)
Greg Hogan created FLINK-6648:
-

 Summary: Transforms for Gelly examples
 Key: FLINK-6648
 URL: https://issues.apache.org/jira/browse/FLINK-6648
 Project: Flink
  Issue Type: Improvement
  Components: Gelly
Affects Versions: 1.4.0
Reporter: Greg Hogan
Assignee: Greg Hogan
 Fix For: 1.4.0


A primary objective of the Gelly examples {{Runner}} is to make adding new 
inputs and algorithms as simple and powerful as possible. A recent feature made 
it possible to translate the key ID of generated graphs to alternative numeric 
or string representations. For floating point and {{LongValue}} it is desirable 
to translate the key ID of the algorithm results.

Currently a {{Runner}} job consists of an input, an algorithm, and an output. A 
{{Transform}} will translate the input {{Graph}} and the algorithm output 
{{DataSet}}. The {{Input}} and algorithm {{Driver}} will return an ordered list 
of {{Transform}}s, which will be executed in that order (and processed in reverse 
order for the algorithm output). A {{Transform}} can be configured just as inputs 
and drivers are.
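
A minimal sketch of what such a {{Transform}} could look like (these names are hypothetical illustrations of the proposal, not existing Gelly classes):

{code}
import org.apache.flink.api.java.DataSet;
import org.apache.flink.graph.Graph;

/**
 * Hypothetical Runner transform: translates the input Graph before the
 * algorithm runs and the result DataSet after it completes.
 */
public interface Transform<K, VV, EV, T> {

	/** Applied to the input Graph, in the order returned by the Input / Driver. */
	Graph<K, VV, EV> transformInput(Graph<K, VV, EV> input);

	/** Applied to the algorithm output, processed in reverse order. */
	DataSet<T> transformResult(DataSet<T> result);

	/** Transforms are configurable, just as inputs and drivers are. */
	void configure(String[] args);
}
{code}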

Example transforms:
- the aforementioned translation of key ID types
- surrogate types (String -> Long or Int) for user data
- FLINK-4481 Maximum results for pairwise algorithms
- FLINK-3625 Graph algorithms to permute graph labels and edges



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Assigned] (FLINK-6332) Upgrade Scala version to 2.11.11

2017-05-19 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6332?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan reassigned FLINK-6332:
-

Assignee: Greg Hogan

> Upgrade Scala version to 2.11.11
> 
>
> Key: FLINK-6332
> URL: https://issues.apache.org/jira/browse/FLINK-6332
> Project: Flink
>  Issue Type: Improvement
>  Components: Build System, Scala API
>Reporter: Ted Yu
>Assignee: Greg Hogan
>Priority: Minor
>
> Currently the scala-2.11 profile uses Scala 2.11.7.
> The most recent version is 2.11.11.
> This issue is to upgrade to Scala 2.11.11.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6009) Deprecate DataSetUtils#checksumHashCode

2017-05-19 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6009?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6009.
-
Resolution: Implemented

> Deprecate DataSetUtils#checksumHashCode
> ---
>
> Key: FLINK-6009
> URL: https://issues.apache.org/jira/browse/FLINK-6009
> Project: Flink
>  Issue Type: Improvement
>  Components: Java API
>Affects Versions: 1.3.0
>Reporter: Greg Hogan
>Assignee: Greg Hogan
>Priority: Trivial
> Fix For: 1.3.0
>
>
> This is likely only used by Gelly and we have a more featureful 
> implementation allowing for multiple outputs and setting the job name. 
> Deprecation will allow this to be removed in Flink 2.0.
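
For context, a minimal sketch of the call being deprecated (assuming the {{DataSetUtils}} API as of Flink 1.3):

{code}
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.Utils;
import org.apache.flink.api.java.utils.DataSetUtils;

public class ChecksumExample {
	public static void main(String[] args) throws Exception {
		ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
		DataSet<Long> data = env.generateSequence(1, 1000);

		// Triggers a job execution and reduces the elements to a count and checksum.
		Utils.ChecksumHashCode checksum = DataSetUtils.checksumHashCode(data);
		System.out.println(checksum);
	}
}
{code}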



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Reopened] (FLINK-6009) Deprecate DataSetUtils#checksumHashCode

2017-05-19 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6009?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan reopened FLINK-6009:
---

> Deprecate DataSetUtils#checksumHashCode
> ---
>
> Key: FLINK-6009
> URL: https://issues.apache.org/jira/browse/FLINK-6009
> Project: Flink
>  Issue Type: Improvement
>  Components: Java API
>Affects Versions: 1.3.0
>Reporter: Greg Hogan
>Assignee: Greg Hogan
>Priority: Trivial
> Fix For: 1.3.0
>
>
> This is likely only used by Gelly and we have a more featureful 
> implementation allowing for multiple outputs and setting the job name. 
> Deprecation will allow this to be removed in Flink 2.0.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Issue Comment Deleted] (FLINK-6576) Allow ConfigOptions to validate the configured value

2017-05-19 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6576?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan updated FLINK-6576:
--
Comment: was deleted

(was: I happened to also look at this; in addition to {{taskmanager.sh}}, the 
{{flink-daemon.sh}} log rotation needs escaping: {{rotateLogFilesWithPrefix 
"$FLINK_LOG_DIR" "$FLINK_LOG_PREFIX"}}.)

> Allow ConfigOptions to validate the configured value
> 
>
> Key: FLINK-6576
> URL: https://issues.apache.org/jira/browse/FLINK-6576
> Project: Flink
>  Issue Type: Wish
>  Components: Configuration
>Reporter: Chesnay Schepler
>Priority: Trivial
>
> It is not unusual for a config parameter to only accept certain values. Ports 
> may not be negative, modes may essentially be enums, and file paths must not 
> be gibberish.
> Currently these validations happen manually after the value was extracted 
> from the {{Configuration}}.
> I want to start a discussion on whether we want to move this validation into 
> the ConfigOption itself.
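
One possible shape for this, as a hedged sketch in the style of the existing {{ConfigOptions}} builder (the {{Validator}} interface and {{withValidator}} method are hypothetical, not existing Flink API):

{code}
// Hypothetical validation hook; all names below are illustrative only.
public interface Validator<T> {
	/** Throws IllegalArgumentException when the configured value is not acceptable. */
	void validate(String key, T value);
}

// A port option that rejects out-of-range values when the Configuration is read.
ConfigOption<Integer> PORT =
	key("web.port")
	.defaultValue(8081)
	.withValidator(new Validator<Integer>() {
		@Override
		public void validate(String key, Integer value) {
			if (value < 0 || value > 65535) {
				throw new IllegalArgumentException(key + " must be a valid port, was " + value);
			}
		}
	});
{code}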



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Reopened] (FLINK-6577) Expand supported types for ConfigOptions

2017-05-19 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6577?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan reopened FLINK-6577:
---

> Expand supported types for ConfigOptions
> 
>
> Key: FLINK-6577
> URL: https://issues.apache.org/jira/browse/FLINK-6577
> Project: Flink
>  Issue Type: Wish
>  Components: Configuration
>Reporter: Chesnay Schepler
>Priority: Trivial
>
> The type of a {{ConfigOption}} is currently limited to the types that 
> {{Configuration}} supports, which boils down to basic types and byte arrays.
> It would be useful if they could also return things like enums, or the 
> recently added {{MemorySize}}.
> I propose adding a {{fromConfiguration(Configuration)}} method to the 
> {{ConfigOption}} class.
> {code}
> // ConfigOption definition
> ConfigOption<MemorySize> MEMORY =
> key("memory")
> .defaultValue(new MemorySize(12345))
> .from(new ExtractorFunction<MemorySize>() {
> MemorySize extract(Configuration config) {
> // add check for unconfigured option
> return MemorySize.parse(config.getString("memory"));}
> });
> // usage
> MemorySize memory = MEMORY.fromConfiguration(config);
> // with default
> MemorySize memory = MEMORY.fromConfiguration(config, new MemorySize(12345));
> // internals of ConfigOption#fromConfiguration
> T fromConfiguration(Configuration config) {
> if (this.extractor == null) { /* throw error or something */ }
> T value = this.extractor.extract(config);
> return value == null ? defaultValue : value;
> }
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6577) Expand supported types for ConfigOptions

2017-05-19 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6577?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6577.
-
Resolution: Fixed

> Expand supported types for ConfigOptions
> 
>
> Key: FLINK-6577
> URL: https://issues.apache.org/jira/browse/FLINK-6577
> Project: Flink
>  Issue Type: Wish
>  Components: Configuration
>Reporter: Chesnay Schepler
>Priority: Trivial
>
> The type of a {{ConfigOption}} is currently limited to the types that 
> {{Configuration}} supports, which boils down to basic types and byte arrays.
> It would be useful if they could also return things like enums, or the 
> recently added {{MemorySize}}.
> I propose adding a {{fromConfiguration(Configuration)}} method to the 
> {{ConfigOption}} class.
> {code}
> // ConfigOption definition
> ConfigOption<MemorySize> MEMORY =
> key("memory")
> .defaultValue(new MemorySize(12345))
> .from(new ExtractorFunction<MemorySize>() {
> MemorySize extract(Configuration config) {
> // add check for unconfigured option
> return MemorySize.parse(config.getString("memory"));}
> });
> // usage
> MemorySize memory = MEMORY.fromConfiguration(config);
> // with default
> MemorySize memory = MEMORY.fromConfiguration(config, new MemorySize(12345));
> // internals of ConfigOption#fromConfiguration
> T fromConfiguration(Configuration config) {
> if (this.extractor == null) { /* throw error or something */ }
> T value = this.extractor.extract(config);
> return value == null ? defaultValue : value;
> }
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Issue Comment Deleted] (FLINK-6577) Expand supported types for ConfigOptions

2017-05-19 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6577?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan updated FLINK-6577:
--
Comment: was deleted

(was: Sorry about that. Too much tab hopping.)

> Expand supported types for ConfigOptions
> 
>
> Key: FLINK-6577
> URL: https://issues.apache.org/jira/browse/FLINK-6577
> Project: Flink
>  Issue Type: Wish
>  Components: Configuration
>Reporter: Chesnay Schepler
>Priority: Trivial
>
> The type of a {{ConfigOption}} is currently limited to the types that 
> {{Configuration}} supports, which boils down to basic types and byte arrays.
> It would be useful if they could also return things like enums, or the 
> recently added {{MemorySize}}.
> I propose adding a {{fromConfiguration(Configuration)}} method to the 
> {{ConfigOption}} class.
> {code}
> // ConfigOption definition
> ConfigOption<MemorySize> MEMORY =
> key("memory")
> .defaultValue(new MemorySize(12345))
> .from(new ExtractorFunction<MemorySize>() {
> MemorySize extract(Configuration config) {
> // add check for unconfigured option
> return MemorySize.parse(config.getString("memory"));}
> });
> // usage
> MemorySize memory = MEMORY.fromConfiguration(config);
> // with default
> MemorySize memory = MEMORY.fromConfiguration(config, new MemorySize(12345));
> // internals of ConfigOption#fromConfiguration
> T fromConfiguration(Configuration config) {
> if (this.extractor == null) { /* throw error or something */ }
> T value = this.extractor.extract(config);
> return value == null ? defaultValue : value;
> }
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6577) Expand supported types for ConfigOptions

2017-05-19 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6577?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16017545#comment-16017545
 ] 

Greg Hogan commented on FLINK-6577:
---

Sorry about that. Too much tab hopping.

> Expand supported types for ConfigOptions
> 
>
> Key: FLINK-6577
> URL: https://issues.apache.org/jira/browse/FLINK-6577
> Project: Flink
>  Issue Type: Wish
>  Components: Configuration
>Reporter: Chesnay Schepler
>Priority: Trivial
>
> The type of a {{ConfigOption}} is currently limited to the types that 
> {{Configuration}} supports, which boils down to basic types and byte arrays.
> It would be useful if they could also return things like enums, or the 
> recently added {{MemorySize}}.
> I propose adding a {{fromConfiguration(Configuration)}} method to the 
> {{ConfigOption}} class.
> {code}
> // ConfigOption definition
> ConfigOption<MemorySize> MEMORY =
> key("memory")
> .defaultValue(new MemorySize(12345))
> .from(new ExtractorFunction<MemorySize>() {
> MemorySize extract(Configuration config) {
> // add check for unconfigured option
> return MemorySize.parse(config.getString("memory"));}
> });
> // usage
> MemorySize memory = MEMORY.fromConfiguration(config);
> // with default
> MemorySize memory = MEMORY.fromConfiguration(config, new MemorySize(12345));
> // internals of ConfigOption#fromConfiguration
> T fromConfiguration(Configuration config) {
> if (this.extractor == null) { /* throw error or something */ }
> T value = this.extractor.extract(config);
> return value == null ? defaultValue : value;
> }
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-3155) Update Flink docker version to latest stable Flink version

2017-05-19 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3155?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-3155.
-
Resolution: Fixed

> Update Flink docker version to latest stable Flink version
> --
>
> Key: FLINK-3155
> URL: https://issues.apache.org/jira/browse/FLINK-3155
> Project: Flink
>  Issue Type: Task
>  Components: Docker
>Affects Versions: 1.0.0, 1.1.0
>Reporter: Maximilian Michels
>Priority: Minor
>
> It would be nice to always set the Docker Flink binary URL to point to the 
> latest Flink version. Until then, this JIRA keeps track of the updates for 
> releases.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6576) Allow ConfigOptions to validate the configured value

2017-05-19 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6576?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16017493#comment-16017493
 ] 

Greg Hogan commented on FLINK-6576:
---

I happened to also look at this; in addition to {{taskmanager.sh}}, the 
{{flink-daemon.sh}} log rotation needs escaping: {{rotateLogFilesWithPrefix 
"$FLINK_LOG_DIR" "$FLINK_LOG_PREFIX"}}.

> Allow ConfigOptions to validate the configured value
> 
>
> Key: FLINK-6576
> URL: https://issues.apache.org/jira/browse/FLINK-6576
> Project: Flink
>  Issue Type: Wish
>  Components: Configuration
>Reporter: Chesnay Schepler
>Priority: Trivial
>
> It is not unusual for a config parameter to only accept certain values. Ports 
> may not be negative, modes may essentially be enums, and file paths must not 
> be gibberish.
> Currently these validations happen manually after the value was extracted 
> from the {{Configuration}}.
> I want to start a discussion on whether we want to move this validation into 
> the ConfigOption itself.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6616) Clarify provenance of official Docker images

2017-05-18 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6616?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6616.
-
Resolution: Implemented

master: 9a64d50f07c03102dfcb44656ec01852fdb56ac1
1.3.0: 45923ffb8e5fcc98c569e55137e52a495cc7635f

> Clarify provenance of official Docker images
> 
>
> Key: FLINK-6616
> URL: https://issues.apache.org/jira/browse/FLINK-6616
> Project: Flink
>  Issue Type: Improvement
>  Components: Documentation
>Affects Versions: 1.3.0
>Reporter: Greg Hogan
>Assignee: Greg Hogan
>Priority: Critical
> Fix For: 1.3.0, 1.4.0
>
>
> Note that the official Docker images for Flink are community supported and 
> not an official release of the Apache Flink PMC.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (FLINK-6560) Restore maven parallelism in flink-tests

2017-05-18 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6560?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan updated FLINK-6560:
--
Fix Version/s: (was: 1.3.0)

> Restore maven parallelism in flink-tests
> 
>
> Key: FLINK-6560
> URL: https://issues.apache.org/jira/browse/FLINK-6560
> Project: Flink
>  Issue Type: Bug
>  Components: Build System
>Affects Versions: 1.3.0, 1.4.0
>Reporter: Greg Hogan
>Assignee: Greg Hogan
>Priority: Critical
> Fix For: 1.4.0
>
>
> FLINK-6506 added the maven variable {{flink.forkCountTestPackage}} which is 
> used by the TravisCI script but no default value is set.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (FLINK-6616) Clarify provenance of official Docker images

2017-05-18 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6616?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan updated FLINK-6616:
--
Fix Version/s: 1.4.0

> Clarify provenance of official Docker images
> 
>
> Key: FLINK-6616
> URL: https://issues.apache.org/jira/browse/FLINK-6616
> Project: Flink
>  Issue Type: Improvement
>  Components: Documentation
>Affects Versions: 1.3.0
>Reporter: Greg Hogan
>Assignee: Greg Hogan
>Priority: Critical
> Fix For: 1.3.0, 1.4.0
>
>
> Note that the official Docker images for Flink are community supported and 
> not an official release of the Apache Flink PMC.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (FLINK-6466) Build Hadoop 2.8.0 convenience binaries

2017-05-18 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6466?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan updated FLINK-6466:
--
Fix Version/s: (was: 1.3.0)
   1.4.0

> Build Hadoop 2.8.0 convenience binaries
> ---
>
> Key: FLINK-6466
> URL: https://issues.apache.org/jira/browse/FLINK-6466
> Project: Flink
>  Issue Type: New Feature
>  Components: Build System
>Affects Versions: 1.3.0
>Reporter: Greg Hogan
>Assignee: Greg Hogan
> Fix For: 1.4.0
>
>
> As discussed on the dev mailing list, add Hadoop 2.8 to the 
> {{create_release_files.sh}} script and TravisCI test matrix.
> If there is consensus then references to binaries for old versions of Hadoop 
> could be removed.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Created] (FLINK-6616) Clarify provenance of official Docker images

2017-05-17 Thread Greg Hogan (JIRA)
Greg Hogan created FLINK-6616:
-

 Summary: Clarify provenance of official Docker images
 Key: FLINK-6616
 URL: https://issues.apache.org/jira/browse/FLINK-6616
 Project: Flink
  Issue Type: Improvement
  Components: Documentation
Affects Versions: 1.3.0
Reporter: Greg Hogan
Assignee: Greg Hogan
Priority: Critical
 Fix For: 1.3.0


Note that the official Docker images for Flink are community supported and not 
an official release of the Apache Flink PMC.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6611) When TaskManager or JobManager restart after crash the PID file contain also the old PID

2017-05-17 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6611?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16013951#comment-16013951
 ] 

Greg Hogan commented on FLINK-6611:
---

It is sometimes desirable to run multiple TaskManagers on the same node, so we 
cannot simply overwrite the PID file. A stale PID in the file cannot be 
prevented after, for example, a hard reboot.

> When TaskManager or JobManager restart after crash the PID file contain also 
> the old PID
> 
>
> Key: FLINK-6611
> URL: https://issues.apache.org/jira/browse/FLINK-6611
> Project: Flink
>  Issue Type: Task
>  Components: Startup Shell Scripts
>Reporter: Mauro Cortellazzi
>Assignee: Mauro Cortellazzi
>Priority: Trivial
>
> When the TaskManager or JobManager restarts after a crash, the PID file also 
> contains the old PID.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Created] (FLINK-6603) Enable checkstyle on test sources

2017-05-16 Thread Greg Hogan (JIRA)
Greg Hogan created FLINK-6603:
-

 Summary: Enable checkstyle on test sources
 Key: FLINK-6603
 URL: https://issues.apache.org/jira/browse/FLINK-6603
 Project: Flink
  Issue Type: Improvement
  Components: Streaming
Affects Versions: 1.4.0
Reporter: Greg Hogan
Assignee: Greg Hogan
Priority: Trivial
 Fix For: 1.4.0


With the addition of strict checkstyle to select modules (currently limited to 
{{flink-streaming-java}}) we can enable the checkstyle flag 
{{includeTestSourceDirectory}} to perform the same unused imports, whitespace, 
and other checks on test sources.

Should first resolve the import grouping as discussed in FLINK-6107. Also, 
several tests exceed the 2500 line limit.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6582) Project from maven archetype is not buildable by default due to ${scala.binary.version}

2017-05-15 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6582?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16010531#comment-16010531
 ] 

Greg Hogan commented on FLINK-6582:
---

Interesting that the newer version of the plugin is working for your setup. 
Thanks for the clarification; I had originally thought that the issue was in 
creating the project but now see that the new project's pom has the placeholder 
variable.

> Project from maven archetype is not buildable by default due to 
> ${scala.binary.version}
> ---
>
> Key: FLINK-6582
> URL: https://issues.apache.org/jira/browse/FLINK-6582
> Project: Flink
>  Issue Type: Bug
>  Components: Build System, Quickstarts
>Affects Versions: 1.3.0
>Reporter: Dawid Wysakowicz
>Assignee: Greg Hogan
> Fix For: 1.3.0, 1.4.0
>
>
> When creating a Java project from the Maven archetype, the dependencies on Flink 
> are unresolvable due to the $\{scala.binary.version\} placeholder.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (FLINK-6582) Project from maven archetype is not buildable by default due to ${scala.binary.version}

2017-05-15 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6582?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan updated FLINK-6582:
--
Fix Version/s: 1.4.0

> Project from maven archetype is not buildable by default due to 
> ${scala.binary.version}
> ---
>
> Key: FLINK-6582
> URL: https://issues.apache.org/jira/browse/FLINK-6582
> Project: Flink
>  Issue Type: Bug
>  Components: Build System, Quickstarts
>Affects Versions: 1.3.0
>Reporter: Dawid Wysakowicz
>Assignee: Greg Hogan
> Fix For: 1.3.0, 1.4.0
>
>
> When creating a Java project from the Maven archetype, the dependencies on Flink 
> are unresolvable due to the $\{scala.binary.version\} placeholder.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Issue Comment Deleted] (FLINK-6166) Broken quickstart for snapshots

2017-05-15 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6166?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan updated FLINK-6166:
--
Comment: was deleted

(was: [~aljoscha] when setting the plugin version the user is requested to 
specify values for the defined parameters (groupID, artifactID, etc.):

{code}
$ mvn org.apache.maven.plugins:maven-archetype-plugin:2.4:generate 
-DarchetypeGroupId=org.apache.flink -DarchetypeArtifactId=flink-quickstart-java 
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 -DarchetypeVersion=1.4-SNAPSHOT
[INFO] Scanning for projects...
[INFO] 
[INFO] 
[INFO] Building Maven Stub Project (No POM) 1
[INFO] 
[INFO] 
[INFO] >>> maven-archetype-plugin:2.4:generate (default-cli) > generate-sources 
@ standalone-pom >>>
[INFO] 
[INFO] <<< maven-archetype-plugin:2.4:generate (default-cli) < generate-sources 
@ standalone-pom <<<
[INFO] 
[INFO] 
[INFO] --- maven-archetype-plugin:2.4:generate (default-cli) @ standalone-pom 
---
[INFO] Generating project in Interactive mode
[INFO] Archetype repository not defined. Using the one from 
[org.apache.flink:flink-quickstart-java:1.1-SNAPSHOT -> 
https://repository.apache.org/content/repositories/snapshots] found in catalog 
https://repository.apache.org/content/repositories/snapshots/
Downloading: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-quickstart-java/1.4-SNAPSHOT/maven-metadata.xml
Downloaded: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-quickstart-java/1.4-SNAPSHOT/maven-metadata.xml
 (1.2 kB at 5.4 kB/s)
Define value for property 'groupId': : org.apache.flink
Define value for property 'artifactId': : flink-quickstart-java
Define value for property 'version':  1.0-SNAPSHOT: : 
Define value for property 'package':  org.apache.flink: : 
Confirm properties configuration:
groupId: org.apache.flink
artifactId: flink-quickstart-java
version: 1.0-SNAPSHOT
package: org.apache.flink
 Y: : Y
[INFO] 

[INFO] Using following parameters for creating project from Archetype: 
flink-quickstart-java:1.4-SNAPSHOT
[INFO] 

[INFO] Parameter: groupId, Value: org.apache.flink
[INFO] Parameter: artifactId, Value: flink-quickstart-java
[INFO] Parameter: version, Value: 1.0-SNAPSHOT
[INFO] Parameter: package, Value: org.apache.flink
[INFO] Parameter: packageInPathFormat, Value: org/apache/flink
[INFO] Parameter: package, Value: org.apache.flink
[INFO] Parameter: version, Value: 1.0-SNAPSHOT
[INFO] Parameter: groupId, Value: org.apache.flink
[INFO] Parameter: artifactId, Value: flink-quickstart-java
[WARNING] CP Don't override file 
/Users/greg/Software/temp/flink-quickstart-java/src/main/resources
[INFO] project created from Archetype in dir: 
/Users/greg/Software/temp/flink-quickstart-java
[INFO] 
[INFO] BUILD SUCCESS
[INFO] 
[INFO] Total time: 14.482 s
[INFO] Finished at: 2017-05-15T08:53:18-04:00
[INFO] Final Memory: 18M/307M
[INFO] 
{code})

> Broken quickstart for snapshots
> ---
>
> Key: FLINK-6166
> URL: https://issues.apache.org/jira/browse/FLINK-6166
> Project: Flink
>  Issue Type: Bug
>  Components: Documentation
>Affects Versions: 1.3.0
>Reporter: Greg Hogan
>Priority: Critical
>
> I am unable to [run the 
> quickstart|https://ci.apache.org/projects/flink/flink-docs-release-1.3/quickstart/java_api_quickstart.html]
>  for 1.3-SNAPSHOT. This looks to be due to the 
> [3.0.0|http://maven.apache.org/archetype/maven-archetype-plugin/generate-mojo.html]
>  release of the Archetype plugin, which now uses {{archetype-catalog.xml}}.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Assigned] (FLINK-6582) Project from maven archetype is not buildable by default due to ${scala.binary.version}

2017-05-15 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6582?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan reassigned FLINK-6582:
-

Assignee: Greg Hogan

> Project from maven archetype is not buildable by default due to 
> ${scala.binary.version}
> ---
>
> Key: FLINK-6582
> URL: https://issues.apache.org/jira/browse/FLINK-6582
> Project: Flink
>  Issue Type: Bug
>  Components: Build System, Quickstarts
>Affects Versions: 1.3.0
>Reporter: Dawid Wysakowicz
>Assignee: Greg Hogan
> Fix For: 1.3.0
>
>
> When creating a Java project from the Maven archetype, the dependencies on Flink 
> are unresolvable due to the $\{scala.binary.version\} placeholder.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6582) Project from maven archetype is not buildable by default due to ${scala.binary.version}

2017-05-15 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6582?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16010473#comment-16010473
 ] 

Greg Hogan commented on FLINK-6582:
---

[~dawidwys] how are you attributing the error to $\{scala.binary.version\}? Are 
you seeing

{code}
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-archetype-plugin:3.0.1:generate (default-cli) on 
project standalone-pom: archetypeCatalog 
'https://repository.apache.org/content/repositories/snapshots/' is not 
supported anymore. Please read the plugin documentation for details. -> [Help 1]
{code}

> Project from maven archetype is not buildable by default due to 
> ${scala.binary.version}
> ---
>
> Key: FLINK-6582
> URL: https://issues.apache.org/jira/browse/FLINK-6582
> Project: Flink
>  Issue Type: Bug
>  Components: Build System, Quickstarts
>Affects Versions: 1.3.0
>Reporter: Dawid Wysakowicz
> Fix For: 1.3.0
>
>
> When creating a Java project from the Maven archetype, the dependencies on Flink 
> are unresolvable due to the $\{scala.binary.version\} placeholder.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6582) Project from maven archetype is not buildable by default due to ${scala.binary.version}

2017-05-15 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6582?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16010466#comment-16010466
 ] 

Greg Hogan commented on FLINK-6582:
---

I now see that this was filed as {{major}} and [~aljoscha] had bumped it to 
{{blocker}}. I had not meant to override that decision.

> Project from maven archetype is not buildable by default due to 
> ${scala.binary.version}
> ---
>
> Key: FLINK-6582
> URL: https://issues.apache.org/jira/browse/FLINK-6582
> Project: Flink
>  Issue Type: Bug
>  Components: Build System, Quickstarts
>Affects Versions: 1.3.0
>Reporter: Dawid Wysakowicz
> Fix For: 1.3.0
>
>
> When creating a Java project from the Maven archetype, the dependencies on Flink 
> are unresolvable due to the $\{scala.binary.version\} placeholder.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6582) Project from maven archetype is not buildable by default due to ${scala.binary.version}

2017-05-15 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6582?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16010439#comment-16010439
 ] 

Greg Hogan commented on FLINK-6582:
---

[~dawidwys] thanks for reporting this. What command are you using? I have not 
been able to reproduce this. Is this related to FLINK-6166?

> Project from maven archetype is not buildable by default due to 
> ${scala.binary.version}
> ---
>
> Key: FLINK-6582
> URL: https://issues.apache.org/jira/browse/FLINK-6582
> Project: Flink
>  Issue Type: Bug
>  Components: Build System, Quickstarts
>Affects Versions: 1.3.0
>Reporter: Dawid Wysakowicz
> Fix For: 1.3.0
>
>
> When creating a Java project from the Maven archetype, the dependencies on Flink 
> are unresolvable due to the $\{scala.binary.version\} placeholder.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6166) Broken quickstart for snapshots

2017-05-15 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6166?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16010438#comment-16010438
 ] 

Greg Hogan commented on FLINK-6166:
---

[~aljoscha] when setting the plugin version the user is requested to specify 
values for the defined parameters (groupID, artifactID, etc.):

{code}
$ mvn org.apache.maven.plugins:maven-archetype-plugin:2.4:generate 
-DarchetypeGroupId=org.apache.flink -DarchetypeArtifactId=flink-quickstart-java 
-DarchetypeCatalog=https://repository.apache.org/content/repositories/snapshots/
 -DarchetypeVersion=1.4-SNAPSHOT
[INFO] Scanning for projects...
[INFO] 
[INFO] 
[INFO] Building Maven Stub Project (No POM) 1
[INFO] 
[INFO] 
[INFO] >>> maven-archetype-plugin:2.4:generate (default-cli) > generate-sources 
@ standalone-pom >>>
[INFO] 
[INFO] <<< maven-archetype-plugin:2.4:generate (default-cli) < generate-sources 
@ standalone-pom <<<
[INFO] 
[INFO] 
[INFO] --- maven-archetype-plugin:2.4:generate (default-cli) @ standalone-pom 
---
[INFO] Generating project in Interactive mode
[INFO] Archetype repository not defined. Using the one from 
[org.apache.flink:flink-quickstart-java:1.1-SNAPSHOT -> 
https://repository.apache.org/content/repositories/snapshots] found in catalog 
https://repository.apache.org/content/repositories/snapshots/
Downloading: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-quickstart-java/1.4-SNAPSHOT/maven-metadata.xml
Downloaded: 
https://repository.apache.org/content/repositories/snapshots/org/apache/flink/flink-quickstart-java/1.4-SNAPSHOT/maven-metadata.xml
 (1.2 kB at 5.4 kB/s)
Define value for property 'groupId': : org.apache.flink
Define value for property 'artifactId': : flink-quickstart-java
Define value for property 'version':  1.0-SNAPSHOT: : 
Define value for property 'package':  org.apache.flink: : 
Confirm properties configuration:
groupId: org.apache.flink
artifactId: flink-quickstart-java
version: 1.0-SNAPSHOT
package: org.apache.flink
 Y: : Y
[INFO] 

[INFO] Using following parameters for creating project from Archetype: 
flink-quickstart-java:1.4-SNAPSHOT
[INFO] 

[INFO] Parameter: groupId, Value: org.apache.flink
[INFO] Parameter: artifactId, Value: flink-quickstart-java
[INFO] Parameter: version, Value: 1.0-SNAPSHOT
[INFO] Parameter: package, Value: org.apache.flink
[INFO] Parameter: packageInPathFormat, Value: org/apache/flink
[INFO] Parameter: package, Value: org.apache.flink
[INFO] Parameter: version, Value: 1.0-SNAPSHOT
[INFO] Parameter: groupId, Value: org.apache.flink
[INFO] Parameter: artifactId, Value: flink-quickstart-java
[WARNING] CP Don't override file 
/Users/greg/Software/temp/flink-quickstart-java/src/main/resources
[INFO] project created from Archetype in dir: 
/Users/greg/Software/temp/flink-quickstart-java
[INFO] 
[INFO] BUILD SUCCESS
[INFO] 
[INFO] Total time: 14.482 s
[INFO] Finished at: 2017-05-15T08:53:18-04:00
[INFO] Final Memory: 18M/307M
[INFO] 
{code}

> Broken quickstart for snapshots
> ---
>
> Key: FLINK-6166
> URL: https://issues.apache.org/jira/browse/FLINK-6166
> Project: Flink
>  Issue Type: Bug
>  Components: Documentation
>Affects Versions: 1.3.0
>Reporter: Greg Hogan
>Priority: Critical
>
> I am unable to [run the 
> quickstart|https://ci.apache.org/projects/flink/flink-docs-release-1.3/quickstart/java_api_quickstart.html]
>  for 1.3-SNAPSHOT. This looks to be due to the 
> [3.0.0|http://maven.apache.org/archetype/maven-archetype-plugin/generate-mojo.html]
>  release of the Archetype plugin, which now uses {{archetype-catalog.xml}}.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (FLINK-6582) Project from maven archetype is not buildable by default due to ${scala.binary.version}

2017-05-15 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6582?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan updated FLINK-6582:
--
Priority: Major  (was: Blocker)

> Project from maven archetype is not buildable by default due to 
> ${scala.binary.version}
> ---
>
> Key: FLINK-6582
> URL: https://issues.apache.org/jira/browse/FLINK-6582
> Project: Flink
>  Issue Type: Bug
>  Components: Build System, Quickstarts
>Affects Versions: 1.3.0
>Reporter: Dawid Wysakowicz
> Fix For: 1.3.0
>
>
> When creating a Java project from the Maven archetype, the Flink dependencies 
> are unresolvable due to the $\{scala.binary.version\} placeholder.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-5966) Document Running Flink on Kubernetes

2017-05-13 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-5966?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16009364#comment-16009364
 ] 

Greg Hogan commented on FLINK-5966:
---

Has this been resolved with FLINK-6330?

> Document Running Flink on Kubernetes
> 
>
> Key: FLINK-5966
> URL: https://issues.apache.org/jira/browse/FLINK-5966
> Project: Flink
>  Issue Type: Improvement
>  Components: Documentation
>Affects Versions: 1.2.0, 1.3.0
>Reporter: StephenWithPH
>Priority: Minor
>  Labels: documentation
>
> There are several good sources of information regarding running prior 
> versions of Flink in Kubernetes. I was able to follow those and fill in the 
> gaps to get Flink 1.2 up in K8s.
> I plan to document my steps in detail in order to submit a PR. There are 
> several existing PRs that may improve how Flink runs in containers. (See 
> [FLINK-5635 Improve Docker tooling|https://github.com/apache/flink/pull/3205] 
> and [FLINK-5634 Flink should not always redirect 
> stdout|https://github.com/apache/flink/pull/3204])
> Depending on the timing of those PRs, I may tailor my docs towards Flink 1.3 
> in order to reflect those changes.
> I'm opening this JIRA issue to begin the process.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6573) Flink MongoDB Connector

2017-05-12 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6573?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16008853#comment-16008853
 ] 

Greg Hogan commented on FLINK-6573:
---

[~naagachowdary] glad to hear of your success and thank you for this 
contribution! The Apache Flink community is adding new connectors to the 
[bahir-flink|https://github.com/apache/bahir-flink] repo but please continue to 
make use of the flink-dev mailing list for any questions or discussion.

> Flink MongoDB Connector
> ---
>
> Key: FLINK-6573
> URL: https://issues.apache.org/jira/browse/FLINK-6573
> Project: Flink
>  Issue Type: New Feature
>  Components: Batch Connectors and Input/Output Formats
>Affects Versions: 1.2.0
> Environment: Linux Operating System, Mongo DB
>Reporter: Nagamallikarjuna
>Priority: Minor
> Fix For: 2.0.0
>
>   Original Estimate: 672h
>  Remaining Estimate: 672h
>
> Hi Community,
> We are currently using Flink in our project. We have a huge amount of data 
> residing in MongoDB that we need to process with Flink. We have a requirement 
> for parallel data connectivity between Flink and MongoDB for both reads and 
> writes. We are planning to create this connector and contribute it to the 
> community.
> I will provide further details once I receive your feedback.
> Please let us know if you have any concerns.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6559) Rename 'slaves' to 'workers'

2017-05-12 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6559?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16008060#comment-16008060
 ] 

Greg Hogan commented on FLINK-6559:
---

Should either filename be accepted, but not both, with "workers" shipped as the default?

> Rename 'slaves' to 'workers'
> 
>
> Key: FLINK-6559
> URL: https://issues.apache.org/jira/browse/FLINK-6559
> Project: Flink
>  Issue Type: Improvement
>  Components: Startup Shell Scripts
>Reporter: Stephan Ewen
>Priority: Minor
>
> It just feels a little nicer to call them "workers".



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6560) Restore maven parallelism in flink-tests

2017-05-11 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6560?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16007026#comment-16007026
 ] 

Greg Hogan commented on FLINK-6560:
---

I had thought {{flink-tests}} was taking longer, but I noticed a build with 
{{forkCount=1}} where top was at 100% and the build was only a few seconds slower.

> Restore maven parallelism in flink-tests
> 
>
> Key: FLINK-6560
> URL: https://issues.apache.org/jira/browse/FLINK-6560
> Project: Flink
>  Issue Type: Bug
>  Components: Build System
>Affects Versions: 1.3.0, 1.4.0
>Reporter: Greg Hogan
>Assignee: Greg Hogan
>Priority: Critical
> Fix For: 1.3.0, 1.4.0
>
>
> FLINK-6506 added the maven variable {{flink.forkCountTestPackage}} which is 
> used by the TravisCI script but no default value is set.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Created] (FLINK-6560) Restore maven parallelism in flink-tests

2017-05-11 Thread Greg Hogan (JIRA)
Greg Hogan created FLINK-6560:
-

 Summary: Restore maven parallelism in flink-tests
 Key: FLINK-6560
 URL: https://issues.apache.org/jira/browse/FLINK-6560
 Project: Flink
  Issue Type: Bug
  Components: Build System
Affects Versions: 1.3.0, 1.4.0
Reporter: Greg Hogan
Assignee: Greg Hogan
Priority: Minor
 Fix For: 1.3.0, 1.4.0


FLINK-6506 added the maven variable {{flink.forkCountTestPackage}} which is 
used by the TravisCI script but no default value is set.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6393) Add Evenly Graph Generator to Flink Gelly

2017-05-11 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6393?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6393.
-
   Resolution: Implemented
Fix Version/s: 1.4.0
   1.3.0

master: 3ee8c69aa4390a8d51b33f262f719fb1a5474d51
release-1.3: cfaecda3ac9a736213f8bfe643b8f57ce492e243

> Add Evenly Graph Generator to Flink Gelly
> -
>
> Key: FLINK-6393
> URL: https://issues.apache.org/jira/browse/FLINK-6393
> Project: Flink
>  Issue Type: New Feature
>  Components: Gelly
>Reporter: FlorianFan
>Assignee: FlorianFan
>Priority: Minor
> Fix For: 1.3.0, 1.4.0
>
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> An evenly graph means every vertex in the graph has the same degree, so the 
> graph can be treated as even because all of its edges are distributed evenly. 
> When the vertex degree is 0, an empty graph will be generated. When the 
> vertex degree is vertex count - 1, a complete graph will be generated.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6332) Upgrade Scala version to 2.11.11

2017-05-11 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6332?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16006228#comment-16006228
 ] 

Greg Hogan commented on FLINK-6332:
---

[2.11.11 concludes the 2.11 
series|https://www.scala-lang.org/news/releases-1Q17.html]. It seems the safest 
time to upgrade would be at the start of a new release cycle.

> Upgrade Scala version to 2.11.11
> 
>
> Key: FLINK-6332
> URL: https://issues.apache.org/jira/browse/FLINK-6332
> Project: Flink
>  Issue Type: Improvement
>  Components: Build System, Scala API
>Reporter: Ted Yu
>Priority: Minor
>
> Currently scala-2.11 profile uses Scala 2.11.7
> 2.11.11 is the most recent version.
> This issue is to upgrade to Scala 2.11.11



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6529) Rework the shading model in Flink

2017-05-10 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6529?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16005710#comment-16005710
 ] 

Greg Hogan commented on FLINK-6529:
---

[~wheat9] appreciate your enthusiasm and just wanted to make sure you have 
followed the discussion on the mailing list. I believe these modules will go in 
a separate Flink repo so you'll want to coordinate with [~StephanEwen] 
especially if the goal is to have this in for 1.3.

> Rework the shading model in Flink
> -
>
> Key: FLINK-6529
> URL: https://issues.apache.org/jira/browse/FLINK-6529
> Project: Flink
>  Issue Type: Bug
>  Components: Build System
>Affects Versions: 1.3.0, 1.2.1
>Reporter: Stephan Ewen
>Assignee: Haohui Mai
>Priority: Critical
>
> h2. Problem
> Currently, Flink shades dependencies like ASM and Guava into all jars of 
> projects that reference them and relocates the classes.
> There are some drawbacks to that approach; let's discuss them using ASM as an 
> example:
>   - The ASM classes are, for example, in {{flink-core}}, {{flink-java}}, 
> {{flink-scala}}, {{flink-runtime}}, etc.
>   - Users that reference these dependencies have the classes multiple times 
> in the classpath. That is unclean (it works, though, because the classes are 
> identical). The same happens when building the final dist jar.
>   - Some of these dependencies require including license files in the shaded 
> jar. It is hard, if not impossible, to build a good automatic solution for 
> that, partly due to Maven's very poor cross-project path support.
>   - Scala does not support shading really well. Scala classes have references 
> to classes in more places than just the class names (apparently for Scala 
> reflect support). Referencing a Scala project with shaded ASM still requires 
> adding a reference to unshaded ASM (at least as a compile dependency).
> h2. Proposal
> I propose that we build and deploy an {{asm-flink-shaded}} version of ASM and 
> directly program against the relocated namespaces. Since we never use classes 
> that we relocate in public interfaces, Flink users will never see the 
> relocated class names. Internally, it does not hurt to use them.
>   - Proper maven dependency management, no hidden (shaded) dependencies
>   - one copy of each dependency
>   - proper Scala interoperability
>   - no clumsy license management (license is in the deployed 
> {{asm-flink-shaded}})
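
For illustration, a minimal sketch of what "programming against the relocated namespace" could look like from a Flink module. The shaded package prefix below is a placeholder assumption; the real coordinates would be whatever {{asm-flink-shaded}} ends up deploying:

{code:java}
// Hypothetical relocated package; the real prefix comes from the asm-flink-shaded artifact.
import org.apache.flink.shaded.asm.org.objectweb.asm.ClassReader;

public final class RelocatedAsmExample {
    public static void main(String[] args) throws Exception {
        // The ASM API itself is unchanged; only the package (and Maven coordinate) differs.
        ClassReader reader = new ClassReader("java.lang.String");
        System.out.println(reader.getClassName()); // prints java/lang/String
    }
}
{code}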



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6330) Improve Docker documentation

2017-05-10 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6330?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6330.
-
   Resolution: Fixed
Fix Version/s: 1.4.0
   1.3.0

master: 91f376589b717d46b124d7f8e181950926f2ca1e
release-1.3: 560db53f91792aa21fa21c09604de6e08dfcde2e
release-1.2: 048c0a3e7390e9a8c991b32a8bdaf2648a4b564d

> Improve Docker documentation
> 
>
> Key: FLINK-6330
> URL: https://issues.apache.org/jira/browse/FLINK-6330
> Project: Flink
>  Issue Type: Bug
>  Components: Docker
>Affects Versions: 1.2.0
>Reporter: Patrick Lucas
>Assignee: Patrick Lucas
> Fix For: 1.3.0, 1.2.2, 1.4.0
>
>
> The "Docker" page in the docs exists but is blank.
> Add something useful here, including references to the official images that 
> should exist once 1.2.1 is released, and add a brief "Kubernetes" page as 
> well, referencing the [helm chart|https://github.com/docker-flink/examples]. 



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6512) some code examples are poorly formatted

2017-05-10 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6512?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6512.
-
   Resolution: Implemented
Fix Version/s: 1.4.0
   1.2.2
   1.3.0

master: 8a8d95e31132889ea5cc3423ea50280fbaf47062
release-1.3: 81b6c82142ad1c4a8c1288bff754840c65ec4059
release-1.2: 9fbd08b58f4ead2dddcc283f99385fa5be94eecf

> some code examples are poorly formatted
> ---
>
> Key: FLINK-6512
> URL: https://issues.apache.org/jira/browse/FLINK-6512
> Project: Flink
>  Issue Type: Improvement
>  Components: Documentation
>Affects Versions: 1.3.0, 1.2.1
>Reporter: David Anderson
>Assignee: David Anderson
> Fix For: 1.3.0, 1.2.2, 1.4.0
>
>
> Some code examples in the docs are hard to read, mostly because the code 
> highlighting plugin was overlooked.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6513) various typos and grammatical flaws

2017-05-10 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6513?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6513.
-
   Resolution: Implemented
Fix Version/s: 1.4.0
   1.3.0

master: 71d76731dc6f611a6f8772cb06a59f5c642ec6cc
release-1.3: 81f58bae1a6730e9e6e06185e1dc6cac36e82892

> various typos and grammatical flaws
> ---
>
> Key: FLINK-6513
> URL: https://issues.apache.org/jira/browse/FLINK-6513
> Project: Flink
>  Issue Type: Improvement
>  Components: Documentation
>Affects Versions: 1.3.0
>Reporter: David Anderson
> Fix For: 1.3.0, 1.4.0
>
>
> I want to propose small changes to several pages to fix some typos and 
> grammatical flaws.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-5819) Improve metrics reporting

2017-05-10 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-5819?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-5819.
-
   Resolution: Implemented
Fix Version/s: 1.4.0
   1.3.0

master: 3dd5c991bab9474b40ee20d2b63ac532bffa34f7
release-1.3: 4882f366686e4c97a3442962ca2e9b5b0f0b8d19

> Improve metrics reporting
> -
>
> Key: FLINK-5819
> URL: https://issues.apache.org/jira/browse/FLINK-5819
> Project: Flink
>  Issue Type: Improvement
>  Components: Webfrontend
>Affects Versions: 1.3.0
>Reporter: Paul Nelligan
>  Labels: web-ui
> Fix For: 1.3.0, 1.4.0
>
>   Original Estimate: 48h
>  Remaining Estimate: 48h
>
> When displaying individual metrics for a vertex / node of a job in the webui, 
> it is desirable to add an option to display metrics as a numeric value or as 
> a chart.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-5831) Sort metrics in metric selector and add search box

2017-05-10 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-5831?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-5831.
-
Resolution: Implemented

master: 3642c5a607a640dac118e2c87f60e41bd3b5f651
release-1.3: 140c4eea17fa40c496179318f378960342362da7

> Sort metrics in metric selector and add search box
> --
>
> Key: FLINK-5831
> URL: https://issues.apache.org/jira/browse/FLINK-5831
> Project: Flink
>  Issue Type: Improvement
>  Components: Webfrontend
>Reporter: Robert Metzger
> Attachments: dropDown.png
>
>
> The JobManager UI makes it hard to select metrics using the drop down menu.
> First of all, it would be nice to sort all entries. Also, a search box on top 
> of the drop-down would make it much easier to find the metrics.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (FLINK-6157) Make TypeInformation fully serializable

2017-05-09 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6157?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan updated FLINK-6157:
--
Fix Version/s: 1.4.0
   1.3.0

> Make TypeInformation fully serializable
> ---
>
> Key: FLINK-6157
> URL: https://issues.apache.org/jira/browse/FLINK-6157
> Project: Flink
>  Issue Type: Improvement
>  Components: Core
>Reporter: Timo Walther
>Assignee: Timo Walther
> Fix For: 1.3.0, 1.4.0
>
>
> TypeInformation is already declared serializable; however, there are no tests 
> that verify that all classes are really serializable.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6157) Make TypeInformation fully serializable

2017-05-09 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6157?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6157.
-
Resolution: Implemented

master: 6cf4a93ebbf36f6c1dd4cc2c9b17dd58880c17c2
release-1.3: 2e044d468157576fa6dd8946590b4ab684c2c043

> Make TypeInformation fully serializable
> ---
>
> Key: FLINK-6157
> URL: https://issues.apache.org/jira/browse/FLINK-6157
> Project: Flink
>  Issue Type: Improvement
>  Components: Core
>Reporter: Timo Walther
>Assignee: Timo Walther
> Fix For: 1.3.0, 1.4.0
>
>
> TypeInformation is already declared serializable; however, there are no tests 
> that verify that all classes are really serializable.
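
A hedged sketch of the kind of round-trip check such a test could perform (the class and method names here are illustrative, not the test that was actually merged; it only assumes {{flink-core}} on the classpath):

{code:java}
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;

import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeInformation;

public final class TypeInfoSerializationCheck {

    // Serializes and deserializes a TypeInformation instance with plain Java serialization.
    static TypeInformation<?> roundTrip(TypeInformation<?> typeInfo) throws Exception {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(typeInfo);
        }
        try (ObjectInputStream in =
                new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray()))) {
            return (TypeInformation<?>) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        TypeInformation<?> original = BasicTypeInfo.INT_TYPE_INFO;
        TypeInformation<?> copy = roundTrip(original);
        // TypeInformation implementations define equals(), so the copy should compare equal.
        System.out.println(original.equals(copy));
    }
}
{code}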



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-5070) Unable to use Scala's BeanProperty with classes

2017-05-09 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-5070?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-5070.
-
Resolution: Fixed

master: c90b6da5ff845e55c52150a3c6f0b7192959a40e
release-1.3: 949d16e8a00656fa1ee1b235c8ca704331b09f55

> Unable to use Scala's BeanProperty with classes
> ---
>
> Key: FLINK-5070
> URL: https://issues.apache.org/jira/browse/FLINK-5070
> Project: Flink
>  Issue Type: Bug
>  Components: Core, Scala API, Type Serialization System
>Affects Versions: 1.1.3
>Reporter: Jakub Nowacki
>Assignee: Timo Walther
> Fix For: 1.3.0, 1.4.0
>
>
> When using a Scala class with a property (both var and val) annotated as 
> BeanProperty, Flink throws an exception: {{java.lang.IllegalStateException: 
> Detected more than one getter}}.
> The simple code which follows throws that exception:
> {code:java}
> class SomeClass(@BeanProperty var prop: Int)
> object SampleBeanProperty {
> def main(args: Array[String]): Unit = {
> val env = StreamExecutionEnvironment.createLocalEnvironment()
> // Create a DataSet from a list of elements
> env.fromElements(1,2)
> .map(new SomeClass(_))
> .print
> env.execute()
> }
> }
> {code}
> Full exception:
> {code}
> Exception in thread "main" java.lang.IllegalStateException: Detected more 
> than one setter
>   at 
> org.apache.flink.api.java.typeutils.TypeExtractor.isValidPojoField(TypeExtractor.java:1646)
>   at 
> org.apache.flink.api.java.typeutils.TypeExtractor.analyzePojo(TypeExtractor.java:1692)
>   at 
> org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1580)
>   at 
> org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1479)
>   at 
> org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoWithTypeHierarchy(TypeExtractor.java:737)
>   at 
> org.apache.flink.api.java.typeutils.TypeExtractor.privateCreateTypeInfo(TypeExtractor.java:543)
>   at 
> org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfo(TypeExtractor.java:497)
>   at 
> org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfo(TypeExtractor.java:493)
>   at SampleBeanProperty$.main(SampleBeanProperty.scala:18)
>   at SampleBeanProperty.main(SampleBeanProperty.scala)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:498)
>   at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
> {code}
> If the class is changed into case class, code with BeanProperty works fine.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (FLINK-5070) Unable to use Scala's BeanProperty with classes

2017-05-09 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-5070?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan updated FLINK-5070:
--
Fix Version/s: 1.4.0
   1.3.0

> Unable to use Scala's BeanProperty with classes
> ---
>
> Key: FLINK-5070
> URL: https://issues.apache.org/jira/browse/FLINK-5070
> Project: Flink
>  Issue Type: Bug
>  Components: Core, Scala API, Type Serialization System
>Affects Versions: 1.1.3
>Reporter: Jakub Nowacki
>Assignee: Timo Walther
> Fix For: 1.3.0, 1.4.0
>
>
> When using a Scala class with a property (both var and val) annotated as 
> BeanProperty, Flink throws an exception: {{java.lang.IllegalStateException: 
> Detected more than one getter}}.
> The simple code which follows throws that exception:
> {code:java}
> class SomeClass(@BeanProperty var prop: Int)
> object SampleBeanProperty {
> def main(args: Array[String]): Unit = {
> val env = StreamExecutionEnvironment.createLocalEnvironment()
> // Create a DataSet from a list of elements
> env.fromElements(1,2)
> .map(new SomeClass(_))
> .print
> env.execute()
> }
> }
> {code}
> Full exception:
> {code}
> Exception in thread "main" java.lang.IllegalStateException: Detected more 
> than one setter
>   at 
> org.apache.flink.api.java.typeutils.TypeExtractor.isValidPojoField(TypeExtractor.java:1646)
>   at 
> org.apache.flink.api.java.typeutils.TypeExtractor.analyzePojo(TypeExtractor.java:1692)
>   at 
> org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1580)
>   at 
> org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1479)
>   at 
> org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoWithTypeHierarchy(TypeExtractor.java:737)
>   at 
> org.apache.flink.api.java.typeutils.TypeExtractor.privateCreateTypeInfo(TypeExtractor.java:543)
>   at 
> org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfo(TypeExtractor.java:497)
>   at 
> org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfo(TypeExtractor.java:493)
>   at SampleBeanProperty$.main(SampleBeanProperty.scala:18)
>   at SampleBeanProperty.main(SampleBeanProperty.scala)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:498)
>   at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
> {code}
> If the class is changed into case class, code with BeanProperty works fine.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6514) Cannot start Flink Cluster in standalone mode

2017-05-09 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6514?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16002964#comment-16002964
 ] 

Greg Hogan commented on FLINK-6514:
---

[~aljoscha] are you running a newer version of Maven? I was seeing this issue 
yesterday until reverting to 3.2.5.

> Cannot start Flink Cluster in standalone mode
> -
>
> Key: FLINK-6514
> URL: https://issues.apache.org/jira/browse/FLINK-6514
> Project: Flink
>  Issue Type: Bug
>  Components: Build System, Cluster Management
>Reporter: Aljoscha Krettek
>Priority: Blocker
> Fix For: 1.3.0
>
>
> The changes introduced for FLINK-5998 change what goes into the 
> {{flink-dist}} fat jar. As it is, this means that trying to start a cluster 
> results in a {{ClassNotFoundException}} for {{LogFactory}} from 
> {{commons-logging}}.
> One solution is to now make the shaded Hadoop jar a proper fat-jar, so that 
> we again have all the dependencies.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-5742) Breakpoints on documentation website

2017-05-09 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-5742?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-5742.
-
Resolution: Implemented

master: a1ec761628051ad5dd04ea469f1a23b07b99ea70
release-1.3: 21961481e6fd53f001158b1348f9ff8ea7eab4a3

> Breakpoints on documentation website
> 
>
> Key: FLINK-5742
> URL: https://issues.apache.org/jira/browse/FLINK-5742
> Project: Flink
>  Issue Type: Improvement
>  Components: Documentation
>Affects Versions: 1.2.0
>Reporter: Colin Breame
>Assignee: David Anderson
>Priority: Trivial
> Fix For: 1.3.0, 1.4.0
>
>
> When reading the documentation website, I'm finding that unless I have the 
> browser window fully maximised, the content is formatted in the narrow layout.
> See https://ci.apache.org/projects/flink/flink-docs-release-1.2/
> *Expected behaviour:*
> The content should flow on the right-hand-side of the menu.
> *Actual behaviour*
> The menu is the full width of the window and the content flows below the menu.
> *Notes*
> Any width below 1200px causes the content to be placed below the full width 
> menu. This is 2/3 the width of my laptop screen.
> I would suggest making this smaller and consistent with the main project page 
> (http://flink.apache.org/) which has the breakpoint set to about 840px.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (FLINK-5742) Breakpoints on documentation website

2017-05-09 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-5742?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan updated FLINK-5742:
--
Fix Version/s: 1.4.0
   1.3.0

> Breakpoints on documentation website
> 
>
> Key: FLINK-5742
> URL: https://issues.apache.org/jira/browse/FLINK-5742
> Project: Flink
>  Issue Type: Improvement
>  Components: Documentation
>Affects Versions: 1.2.0
>Reporter: Colin Breame
>Assignee: David Anderson
>Priority: Trivial
> Fix For: 1.3.0, 1.4.0
>
>
> When reading the documentation website, I'm finding that unless I have the 
> browser window fully maximised, the content is formatted in the narrow layout.
> See https://ci.apache.org/projects/flink/flink-docs-release-1.2/
> *Expected behaviour:*
> The content should flow on the right-hand-side of the menu.
> *Actual behaviour*
> The menu is the full width of the window and the content flows below the menu.
> *Notes*
> Any width below 1200px causes the content to be placed below the full width 
> menu. This is 2/3 the width of my laptop screen.
> I would suggest making this smaller and consistent with the main project page 
> (http://flink.apache.org/) which has the breakpoint set to about 840px.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6438) Expand docs home page a little

2017-05-09 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6438?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6438.
-
Resolution: Implemented

master: e0ab5f5249d285921f400cc4794673380e7aa8d4
release-1.3: fd910f2c969b6c8ef555e68532ffad2fbc674828

> Expand docs home page a little
> --
>
> Key: FLINK-6438
> URL: https://issues.apache.org/jira/browse/FLINK-6438
> Project: Flink
>  Issue Type: Improvement
>  Components: Documentation
>Reporter: David Anderson
>Assignee: David Anderson
>Priority: Minor
> Fix For: 1.3.0, 1.4.0
>
>
> The idea is to improve the documentation home page by adding a few links to 
> valuable items that are too easily overlooked.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (FLINK-6438) Expand docs home page a little

2017-05-09 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6438?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan updated FLINK-6438:
--
Fix Version/s: 1.4.0
   1.3.0

> Expand docs home page a little
> --
>
> Key: FLINK-6438
> URL: https://issues.apache.org/jira/browse/FLINK-6438
> Project: Flink
>  Issue Type: Improvement
>  Components: Documentation
>Reporter: David Anderson
>Assignee: David Anderson
>Priority: Minor
> Fix For: 1.3.0, 1.4.0
>
>
> The idea is to improve the documentation home page by adding a few links to 
> valuable items that are too easily overlooked.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Created] (FLINK-6466) Build Hadoop 2.8.0 convenience binaries

2017-05-05 Thread Greg Hogan (JIRA)
Greg Hogan created FLINK-6466:
-

 Summary: Build Hadoop 2.8.0 convenience binaries
 Key: FLINK-6466
 URL: https://issues.apache.org/jira/browse/FLINK-6466
 Project: Flink
  Issue Type: New Feature
  Components: Build System
Affects Versions: 1.3.0
Reporter: Greg Hogan
Assignee: Greg Hogan
 Fix For: 1.3.0


As discussed on the dev mailing list, add Hadoop 2.8 to the 
{{create_release_files.sh}} script and TravisCI test matrix.

If there is consensus then references to binaries for old versions of Hadoop 
could be removed.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6375) Fix LongValue hashCode

2017-05-05 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6375?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15998151#comment-15998151
 ] 

Greg Hogan commented on FLINK-6375:
---

[~apsaltis] appreciate your looking into this but it's filed as an API-breaking 
"2.0" change which will need to be first discussed and deliberated in the 
indefinite future.

> Fix LongValue hashCode
> --
>
> Key: FLINK-6375
> URL: https://issues.apache.org/jira/browse/FLINK-6375
> Project: Flink
>  Issue Type: Improvement
>  Components: Core
>Affects Versions: 2.0.0
>Reporter: Greg Hogan
>Assignee: Andrew Psaltis
>Priority: Trivial
>
> Match {{LongValue.hashCode}} to {{Long.hashCode}} (and the other numeric 
> types) by simply adding the high and low words rather than shifting the hash 
> by adding 43.
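
For reference, a minimal sketch of the proposed alignment, assuming the change simply mirrors {{Long.hashCode}} and drops the constant offset (the class below is illustrative, not the actual {{LongValue}} source):

{code:java}
public final class LongHashDemo {

    // Current style as described in the issue: the combined words are shifted by adding 43.
    static int shiftedHash(long value) {
        return 43 + (int) (value ^ (value >>> 32));
    }

    // Proposed style: combine the high and low words exactly as Long.hashCode does.
    static int alignedHash(long value) {
        return (int) (value ^ (value >>> 32));
    }

    public static void main(String[] args) {
        long v = 123_456_789_012L;
        System.out.println(Long.hashCode(v)); // reference value from the JDK
        System.out.println(alignedHash(v));   // matches Long.hashCode
        System.out.println(shiftedHash(v));   // differs by the constant 43
    }
}
{code}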



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6399) Update default Hadoop download version

2017-05-04 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6399?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15996660#comment-15996660
 ] 

Greg Hogan commented on FLINK-6399:
---

[~plucas] thanks for noting that! Still waiting on resolution from the mailing 
list.

> Update default Hadoop download version
> --
>
> Key: FLINK-6399
> URL: https://issues.apache.org/jira/browse/FLINK-6399
> Project: Flink
>  Issue Type: Bug
>  Components: Project Website
>Reporter: Greg Hogan
>Assignee: mingleizhang
>
> [Update|http://flink.apache.org/downloads.html] the sentence "If you don’t 
> want to do this, pick the Hadoop 1 version." since Hadoop 1 versions are no 
> longer provided.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6437) Move history server configuration to a separate file

2017-05-03 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6437?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15994976#comment-15994976
 ] 

Greg Hogan commented on FLINK-6437:
---

+1 to splitting default configuration. Will properties be restricted to a 
single specific file or will this effectively concatenate a normalized file 
listing?

> Move history server configuration to a separate file
> 
>
> Key: FLINK-6437
> URL: https://issues.apache.org/jira/browse/FLINK-6437
> Project: Flink
>  Issue Type: Improvement
>  Components: History Server
>Affects Versions: 1.3.0
>Reporter: Stephan Ewen
> Fix For: 1.3.0
>
>
> I suggest to keep the {{flink-conf.yaml}} leaner by moving configuration of 
> the History Server to a different file.
> In general, I would propose to move configurations of separate, independent 
> and optional components to individual config files.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6107) Add custom checkstyle for flink-streaming-java

2017-05-03 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6107?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15994936#comment-15994936
 ] 

Greg Hogan commented on FLINK-6107:
---

We justify pragmatic changes to earlier decisions quite readily, such as the 
postponed "time-based" code freeze and release. Now is the best time to 
reevaluate any checkstyle change before developers adapt and additional modules 
are updated.

I'd volunteer but I think FLINK-4370 should be resolved first so that we can 
pair enforced checkstyle with importable IDE configurations.

> Add custom checkstyle for flink-streaming-java
> --
>
> Key: FLINK-6107
> URL: https://issues.apache.org/jira/browse/FLINK-6107
> Project: Flink
>  Issue Type: Improvement
>  Components: DataStream API
>Reporter: Aljoscha Krettek
>Assignee: Aljoscha Krettek
> Fix For: 1.3.0
>
>
> There was some consensus on the ML 
> (https://lists.apache.org/thread.html/94c8c5186b315c58c3f8aaf536501b99e8b92ee97b0034dee295ff6a@%3Cdev.flink.apache.org%3E)
>  that we want to have a more uniform code style. We should start 
> module by module, introducing increasingly stricter rules. We have to 
> be aware of the PR situation and ensure that we have minimal breakage for 
> contributors.
> This issue aims at adding a custom checkstyle.xml for 
> {{flink-streaming-java}} that is based on our current checkstyle.xml but adds 
> these checks for Javadocs:
> (checkstyle XML module definitions for the Javadoc checks; the markup was not 
> preserved in this archive copy)
> This checks:
>  - Every type has a type-level Javadoc
>  - Proper use of {{}} in Javadocs
>  - First sentence must end with a proper punctuation mark
>  - Proper use (including closing) of HTML tags



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6431) Activate strict checkstyle for flink-metrics

2017-05-02 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6431?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15993109#comment-15993109
 ] 

Greg Hogan commented on FLINK-6431:
---

Is the import order still under discussion?

> Activate strict checkstyle for flink-metrics
> 
>
> Key: FLINK-6431
> URL: https://issues.apache.org/jira/browse/FLINK-6431
> Project: Flink
>  Issue Type: Improvement
>  Components: Metrics
>Reporter: Chesnay Schepler
>Assignee: Chesnay Schepler
>




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6382) Support additional types for generated graphs in Gelly examples

2017-05-01 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6382?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6382.
-
Resolution: Implemented

Implemented in 33695781f9ce2599d18f45de6a465eaefe7d71f4

> Support additional types for generated graphs in Gelly examples
> ---
>
> Key: FLINK-6382
> URL: https://issues.apache.org/jira/browse/FLINK-6382
> Project: Flink
>  Issue Type: Improvement
>  Components: Gelly
>Affects Versions: 1.3.0
>Reporter: Greg Hogan
>Assignee: Greg Hogan
>Priority: Minor
> Fix For: 1.3.0
>
>
> The Gelly examples currently support {{IntValue}}, {{LongValue}}, and 
> {{StringValue}} for {{RMatGraph}}. Allow transformations and tests for all 
> generated graphs for {{ByteValue}}, {{Byte}}, {{ShortValue}}, {{Short}}, 
> {{CharValue}}, {{Character}}, {{Integer}}, {{Long}}, and {{String}}.
> This is additionally of interest for benchmarking and testing modifications 
> to Flink's internal sort.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6421) Unchecked reflection calls in PojoSerializer

2017-04-30 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6421?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15990201#comment-15990201
 ] 

Greg Hogan commented on FLINK-6421:
---

Please describe the attack in greater detail. I am only seeing that we are 
loading a class, not executing arbitrary code, and that the user class itself 
can do these things. The data stream should be protected by SSL.

> Unchecked reflection calls in PojoSerializer
> 
>
> Key: FLINK-6421
> URL: https://issues.apache.org/jira/browse/FLINK-6421
> Project: Flink
>  Issue Type: Bug
>Reporter: Ted Yu
>Priority: Minor
>
> Here is one example:
> {code}
>   String subclassName = source.readUTF();
>   try {
> actualSubclass = Class.forName(subclassName, true, cl);
> {code}
> subclassName may carry a tainted value, allowing an attacker to bypass 
> security checks, obtain unauthorized data, or execute arbitrary code.
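
For what it is worth, a minimal sketch of the kind of guard the report seems to ask for, checking the resolved class against an expected supertype before using it (names and structure are illustrative only, not the actual {{PojoSerializer}} code):

{code:java}
public final class GuardedClassResolution {

    // Hypothetical guard; the method and parameter names are illustrative.
    static Class<?> resolveSubclass(String subclassName, ClassLoader cl, Class<?> expectedSuperClass)
            throws ClassNotFoundException {
        // 'initialize = false' avoids eagerly running static initializers of an untrusted class.
        Class<?> actualSubclass = Class.forName(subclassName, false, cl);
        if (!expectedSuperClass.isAssignableFrom(actualSubclass)) {
            throw new IllegalStateException("Deserialized subclass " + subclassName
                    + " is not a subtype of " + expectedSuperClass.getName());
        }
        return actualSubclass;
    }

    public static void main(String[] args) throws Exception {
        ClassLoader cl = GuardedClassResolution.class.getClassLoader();
        // Accepted: java.lang.Integer is a Number.
        System.out.println(resolveSubclass("java.lang.Integer", cl, Number.class));
        // Rejected: java.lang.String is not a Number.
        try {
            resolveSubclass("java.lang.String", cl, Number.class);
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage());
        }
    }
}
{code}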



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (FLINK-6382) Support additional types for generated graphs in Gelly examples

2017-04-29 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6382?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan updated FLINK-6382:
--
Summary: Support additional types for generated graphs in Gelly examples  
(was: Support additional numeric types for generated graphs in Gelly examples)

> Support additional types for generated graphs in Gelly examples
> ---
>
> Key: FLINK-6382
> URL: https://issues.apache.org/jira/browse/FLINK-6382
> Project: Flink
>  Issue Type: Improvement
>  Components: Gelly
>Affects Versions: 1.3.0
>Reporter: Greg Hogan
>Assignee: Greg Hogan
>Priority: Minor
> Fix For: 1.3.0
>
>
> The Gelly examples currently support {{IntValue}}, {{LongValue}}, and 
> {{StringValue}} for {{RMatGraph}}. Allow transformations and tests for all 
> generated graphs for {{ByteValue}}, {{Byte}}, {{ShortValue}}, {{Short}}, 
> {{CharValue}}, {{Character}}, {{Integer}}, {{Long}}, and {{String}}.
> This is additionally of interest for benchmarking and testing modifications 
> to Flink's internal sort.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (FLINK-6382) Support additional numeric types for generated graphs in Gelly examples

2017-04-29 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6382?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan updated FLINK-6382:
--
Summary: Support additional numeric types for generated graphs in Gelly 
examples  (was: Support all numeric types for generated graphs in Gelly 
examples)

> Support additional numeric types for generated graphs in Gelly examples
> ---
>
> Key: FLINK-6382
> URL: https://issues.apache.org/jira/browse/FLINK-6382
> Project: Flink
>  Issue Type: Improvement
>  Components: Gelly
>Affects Versions: 1.3.0
>Reporter: Greg Hogan
>Assignee: Greg Hogan
>Priority: Minor
> Fix For: 1.3.0
>
>
> The Gelly examples currently support {{IntValue}}, {{LongValue}}, and 
> {{StringValue}} for {{RMatGraph}}. Allow transformations and tests for all 
> generated graphs for {{ByteValue}}, {{Byte}}, {{ShortValue}}, {{Short}}, 
> {{CharValue}}, {{Character}}, {{Integer}}, {{Long}}, and {{String}}.
> This is additionally of interest for benchmarking and testing modifications 
> to Flink's internal sort.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (FLINK-6406) Cleanup useless import

2017-04-29 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6406?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan updated FLINK-6406:
--
Fix Version/s: 1.3.0

> Cleanup useless import 
> ---
>
> Key: FLINK-6406
> URL: https://issues.apache.org/jira/browse/FLINK-6406
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API & SQL
>Affects Versions: 1.3.0
>Reporter: sunjincheng
>Assignee: sunjincheng
> Fix For: 1.3.0
>
>
> When browsing the code, I found that there are some useless references in 
> the following files which need cleanup.
> {{packages.scala}}
> {{ExternalCatalogTable}}
> {{arithmetic.scala}}
> {{array.scala}}
> {{ColumnStats}}



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6406) Cleanup useless import

2017-04-29 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6406?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6406.
-
Resolution: Implemented

Implemented in ea54962ebe274fca9adef32a079de61659be12fd

> Cleanup useless import 
> ---
>
> Key: FLINK-6406
> URL: https://issues.apache.org/jira/browse/FLINK-6406
> Project: Flink
>  Issue Type: Improvement
>  Components: Table API & SQL
>Affects Versions: 1.3.0
>Reporter: sunjincheng
>Assignee: sunjincheng
> Fix For: 1.3.0
>
>
> When browsing the code, I found that there are some useless references in 
> the following files which need cleanup.
> {{packages.scala}}
> {{ExternalCatalogTable}}
> {{arithmetic.scala}}
> {{array.scala}}
> {{ColumnStats}}



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (FLINK-6395) TestBases not marked as abstract

2017-04-29 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6395?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan updated FLINK-6395:
--
Fix Version/s: 1.3.0

> TestBases not marked as abstract
> 
>
> Key: FLINK-6395
> URL: https://issues.apache.org/jira/browse/FLINK-6395
> Project: Flink
>  Issue Type: Improvement
>  Components: Tests
>Affects Versions: 1.3.0
>Reporter: Chesnay Schepler
>Assignee: Chesnay Schepler
>Priority: Trivial
> Fix For: 1.3.0
>
>
> The following test base classes don't contain any test methods and should be 
> marked as abstract:
> BinaryOperatorTestBase
> DriverTestBase
> UnaryOperatorTestBase



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6395) TestBases not marked as abstract

2017-04-29 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6395?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6395.
-
Resolution: Implemented

Implemented in d27d3dda4e850b90267e4b426a3e4993bd0cbac8

> TestBases not marked as abstract
> 
>
> Key: FLINK-6395
> URL: https://issues.apache.org/jira/browse/FLINK-6395
> Project: Flink
>  Issue Type: Improvement
>  Components: Tests
>Affects Versions: 1.3.0
>Reporter: Chesnay Schepler
>Assignee: Chesnay Schepler
>Priority: Trivial
> Fix For: 1.3.0
>
>
> The following test base classes don't contain any test methods and should be 
> marked as abstract:
> BinaryOperatorTestBase
> DriverTestBase
> UnaryOperatorTestBase



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6414) Use scala.binary.version in place of change-scala-version.sh

2017-04-28 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6414?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15989135#comment-15989135
 ] 

Greg Hogan commented on FLINK-6414:
---

[~StephanEwen], [this|https://github.com/apache/flink/pull/885] is the genesis 
for borrowing Spark's {{change-scala-version.sh}} and I'm not seeing the 
referenced issues. Travis is passing and Maven is installing the artifacts 
locally just fine. The one glitch I am seeing is these cases where 
maven-remote-resources-plugin is downloading 2.10 resources for the 2.11 build:

{noformat}
[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) 
@ flink-tests_2.11 ---
Downloading: 
https://repository.apache.org/snapshots/org/apache/flink/flink-clients_2.10/1.3-SNAPSHOT/maven-metadata.xml
Downloaded: 
https://repository.apache.org/snapshots/org/apache/flink/flink-clients_2.10/1.3-SNAPSHOT/maven-metadata.xml
 (1.4 kB at 16 kB/s)
Downloading: 
https://repository.apache.org/snapshots/org/apache/flink/flink-clients_2.10/1.3-SNAPSHOT/flink-clients_2.10-1.3-20170428.120310-111.pom
Downloaded: 
https://repository.apache.org/snapshots/org/apache/flink/flink-clients_2.10/1.3-SNAPSHOT/flink-clients_2.10-1.3-20170428.120310-111.pom
 (7.7 kB at 208 kB/s)
Downloading: 
https://repository.apache.org/snapshots/org/apache/flink/flink-runtime_2.10/1.3-SNAPSHOT/maven-metadata.xml
Downloaded: 
https://repository.apache.org/snapshots/org/apache/flink/flink-runtime_2.10/1.3-SNAPSHOT/maven-metadata.xml
 (1.4 kB at 30 kB/s)
Downloading: 
https://repository.apache.org/snapshots/org/apache/flink/flink-runtime_2.10/1.3-SNAPSHOT/flink-runtime_2.10-1.3-20170428.120229-111.pom
Downloaded: 
https://repository.apache.org/snapshots/org/apache/flink/flink-runtime_2.10/1.3-SNAPSHOT/flink-runtime_2.10-1.3-20170428.120229-111.pom
 (14 kB at 368 kB/s)
Downloading: 
https://repository.apache.org/snapshots/org/apache/flink/flink-optimizer_2.10/1.3-SNAPSHOT/maven-metadata.xml
Downloaded: 
https://repository.apache.org/snapshots/org/apache/flink/flink-optimizer_2.10/1.3-SNAPSHOT/maven-metadata.xml
 (1.4 kB at 36 kB/s)
Downloading: 
https://repository.apache.org/snapshots/org/apache/flink/flink-optimizer_2.10/1.3-SNAPSHOT/flink-optimizer_2.10-1.3-20170428.120253-111.pom
Downloaded: 
https://repository.apache.org/snapshots/org/apache/flink/flink-optimizer_2.10/1.3-SNAPSHOT/flink-optimizer_2.10-1.3-20170428.120253-111.pom
 (4.4 kB at 118 kB/s)

[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) 
@ flink-examples-streaming_2.11 ---
Downloading: 
https://repository.apache.org/snapshots/org/apache/flink/flink-optimizer_2.10/1.3-SNAPSHOT/flink-optimizer_2.10-1.3-20170428.120253-111.jar
Downloading: 
https://repository.apache.org/snapshots/org/apache/flink/flink-runtime_2.10/1.3-SNAPSHOT/flink-runtime_2.10-1.3-20170428.120229-111.jar
Downloading: 
https://repository.apache.org/snapshots/org/apache/flink/flink-clients_2.10/1.3-SNAPSHOT/flink-clients_2.10-1.3-20170428.120310-111.jar
Downloaded: 
https://repository.apache.org/snapshots/org/apache/flink/flink-clients_2.10/1.3-SNAPSHOT/flink-clients_2.10-1.3-20170428.120310-111.jar
 (94 kB at 1.5 MB/s)
Downloaded: 
https://repository.apache.org/snapshots/org/apache/flink/flink-optimizer_2.10/1.3-SNAPSHOT/flink-optimizer_2.10-1.3-20170428.120253-111.jar
 (2.7 MB at 14 MB/s)
Downloaded: 
https://repository.apache.org/snapshots/org/apache/flink/flink-runtime_2.10/1.3-SNAPSHOT/flink-runtime_2.10-1.3-20170428.120229-111.jar
 (6.6 MB at 16 MB/s)
{noformat}

On master this section is empty:

{noformat}
[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) 
@ flink-tests_2.11 ---
{noformat}

> Use scala.binary.version in place of change-scala-version.sh
> 
>
> Key: FLINK-6414
> URL: https://issues.apache.org/jira/browse/FLINK-6414
> Project: Flink
>  Issue Type: Improvement
>  Components: Build System
>Affects Versions: 1.3.0
>Reporter: Greg Hogan
>Assignee: Greg Hogan
>
> Recent commits have failed to modify {{change-scala-version.sh}} resulting in 
> broken builds for {{scala-2.11}}. It looks like we can remove the need for 
> this script by replacing hard-coded references to the Scala version with 
> Flink's maven variable {{scala.binary.version}}.
> I had initially realized that the change script is [only used for 
> building|https://ci.apache.org/projects/flink/flink-docs-release-1.3/setup/building.html#scala-versions]
>  and not for switching the IDE environment.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6416) Potential divide by zero issue in InputGateMetrics#refreshAndGetAvg()

2017-04-28 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6416?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15989121#comment-15989121
 ] 

Greg Hogan commented on FLINK-6416:
---

Wouldn't this produce {{NaN}}?

> Potential divide by zero issue in InputGateMetrics#refreshAndGetAvg()
> -
>
> Key: FLINK-6416
> URL: https://issues.apache.org/jira/browse/FLINK-6416
> Project: Flink
>  Issue Type: Bug
>Reporter: Ted Yu
>Priority: Minor
>
> {code}
> int count = 0;
> for (InputChannel channel : inputGate.getInputChannels().values()) {
>   if (channel instanceof RemoteInputChannel) {
> RemoteInputChannel rc = (RemoteInputChannel) channel;
> int size = rc.unsynchronizedGetNumberOfQueuedBuffers();
> total += size;
> ++count;
>   }
> }
> return total / (float) count;
> {code}
> If count is zero at the end of the loop, the division would produce an exception.
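
As the comment above hints, plain Java floating-point division by zero does not actually throw; a quick standalone check (not Flink code) shows it yields {{NaN}} or {{Infinity}} instead, so the fix would be about reporting a sensible average rather than avoiding an exception:

{code:java}
public final class FloatDivisionDemo {
    public static void main(String[] args) {
        long total = 0;
        int count = 0;

        // Integer division by zero would throw ArithmeticException,
        // but the cast forces floating-point division, which never throws.
        float avg = total / (float) count;

        System.out.println(avg);              // NaN
        System.out.println(1 / (float) 0);    // Infinity
        System.out.println(Float.isNaN(avg)); // true
    }
}
{code}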



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6399) Update default Hadoop download version

2017-04-28 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6399?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15988845#comment-15988845
 ] 

Greg Hogan commented on FLINK-6399:
---

It's in the Flink website repo: 
https://github.com/apache/flink-web/blob/asf-site/downloads.md

> Update default Hadoop download version
> --
>
> Key: FLINK-6399
> URL: https://issues.apache.org/jira/browse/FLINK-6399
> Project: Flink
>  Issue Type: Bug
>  Components: Project Website
>Reporter: Greg Hogan
>
> [Update|http://flink.apache.org/downloads.html] the sentence "If you don’t 
> want to do this, pick the Hadoop 1 version." since Hadoop 1 versions are no 
> longer provided.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6360) Failed to create assembly for Scala 2.11 due to tools/change-scala-version.sh not changing flink-gelly-examples_2.10

2017-04-28 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6360?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6360.
-
Resolution: Fixed

Fixed in 03afbca5d047988e7fd8aff5d8b83ddee19570c9

> Failed to create assembly for Scala 2.11 due to tools/change-scala-version.sh 
> not changing flink-gelly-examples_2.10
> 
>
> Key: FLINK-6360
> URL: https://issues.apache.org/jira/browse/FLINK-6360
> Project: Flink
>  Issue Type: Bug
>  Components: Build System
>Affects Versions: 1.3.0
>Reporter: Jacek Laskowski
>Priority: Minor
>
> I'm building Flink for Scala 2.11 using the following command:
> {code}
> oss; cd flink; gco -- .; gl && \
> ./tools/change-scala-version.sh 2.11 && \
> mvn clean install -DskipTests -Dhadoop.version=2.7.3 -Dscala.version=2.11.7 
> {code}
> For the past couple of days I've been unable to build Flink because of the 
> following error:
> {code}
> [ERROR] Failed to execute goal 
> org.apache.maven.plugins:maven-assembly-plugin:2.6:single (bin) on project 
> flink-dist_2.11: Failed to create assembly: Error adding file to archive: 
> /Users/jacek/dev/oss/flink/flink-dist/../flink-libraries/flink-gelly-examples/target/flink-gelly-examples_2.10-1.3-SNAPSHOT.jar
>  -> [Help 1]
> {code}
> I've been able to trace it down to {{flink-dist/src/main/assemblies/bin.xml}} 
> not being changed by {{./tools/change-scala-version.sh}} so the build tries 
> to include {{flink-gelly-examples_2.10}} rather than 
> {{flink-gelly-examples_2.11}}.
> Please comment if my reasoning is correct or not and I'd be happy to work on 
> it. Thanks.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Closed] (FLINK-6410) build fails after changing Scala to 2.11

2017-04-28 Thread Greg Hogan (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-6410?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Greg Hogan closed FLINK-6410.
-
Resolution: Duplicate

> build fails after changing Scala to 2.11
> 
>
> Key: FLINK-6410
> URL: https://issues.apache.org/jira/browse/FLINK-6410
> Project: Flink
>  Issue Type: Bug
>  Components: Build System
>Affects Versions: 1.3.0
>Reporter: Stephan Ewen
>Assignee: Stephan Ewen
>Priority: Blocker
> Fix For: 1.3.0
>
>
> The {{change-scala-version.sh}} script does not correctly adjust the 
> {{bin.xml}} assembly descriptor.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Created] (FLINK-6414) Use scala.binary.version in place of change-scala-version.sh

2017-04-28 Thread Greg Hogan (JIRA)
Greg Hogan created FLINK-6414:
-

 Summary: Use scala.binary.version in place of 
change-scala-version.sh
 Key: FLINK-6414
 URL: https://issues.apache.org/jira/browse/FLINK-6414
 Project: Flink
  Issue Type: Improvement
  Components: Build System
Affects Versions: 1.3.0
Reporter: Greg Hogan
Assignee: Greg Hogan


Recent commits have failed to modify {{change-scala-version.sh}} resulting in 
broken builds for {{scala-2.11}}. It looks like we can remove the need for this 
script by replacing hard-coded references to the Scala version with Flink's 
maven variable {{scala.binary.version}}.

I had initially realized that the change script is [only used for 
building|https://ci.apache.org/projects/flink/flink-docs-release-1.3/setup/building.html#scala-versions]
 and not for switching the IDE environment.




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6360) Failed to create assembly for Scala 2.11 due to tools/change-scala-version.sh not changing flink-gelly-examples_2.10

2017-04-28 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6360?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15988663#comment-15988663
 ] 

Greg Hogan commented on FLINK-6360:
---

[~till.rohrmann] I tried to remove use of `change-scala-version.sh` yesterday 
and Maven works fine (though supporting three Scala versions may cause issues 
in `flink-scala[-shell]/pom.xml`). IntelliJ does not pick up the changes when 
the `scala-2.11` profile is enabled, so I think we need to hard-code and change 
the top-level artifact IDs but can use `scala.binary.version` elsewhere (which 
would prevent bugs like these).

> Failed to create assembly for Scala 2.11 due to tools/change-scala-version.sh 
> not changing flink-gelly-examples_2.10
> 
>
> Key: FLINK-6360
> URL: https://issues.apache.org/jira/browse/FLINK-6360
> Project: Flink
>  Issue Type: Bug
>  Components: Build System
>Affects Versions: 1.3.0
>Reporter: Jacek Laskowski
>Priority: Minor
>
> I'm building Flink for Scala 2.11 using the following command:
> {code}
> oss; cd flink; gco -- .; gl && \
> ./tools/change-scala-version.sh 2.11 && \
> mvn clean install -DskipTests -Dhadoop.version=2.7.3 -Dscala.version=2.11.7 
> {code}
> For the past couple of days I've been unable to build Flink because of the 
> following error:
> {code}
> [ERROR] Failed to execute goal 
> org.apache.maven.plugins:maven-assembly-plugin:2.6:single (bin) on project 
> flink-dist_2.11: Failed to create assembly: Error adding file to archive: 
> /Users/jacek/dev/oss/flink/flink-dist/../flink-libraries/flink-gelly-examples/target/flink-gelly-examples_2.10-1.3-SNAPSHOT.jar
>  -> [Help 1]
> {code}
> I've been able to trace it down to {{flink-dist/src/main/assemblies/bin.xml}} 
> not being changed by {{./tools/change-scala-version.sh}} so the build tries 
> to include {{flink-gelly-examples_2.10}} rather than 
> {{flink-gelly-examples_2.11}}.
> Please comment if my reasoning is correct or not and I'd be happy to work on 
> it. Thanks.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6393) Add Evenly Graph Generator to Flink Gelly

2017-04-27 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15987983#comment-15987983
 ] 

Greg Hogan commented on FLINK-6393:
---

Hi [~fanzhidongyzby], we'll leave this Jira open until the pull request is 
merged into the repository.

I looked for a description of this graph earlier today and the closest I could 
find was [CirculantGraph|http://mathworld.wolfram.com/CirculantGraph.html], 
which is a generalization. What algorithms would use the EvenlyGraph?

> Add Evenly Graph Generator to Flink Gelly
> -
>
> Key: FLINK-6393
> URL: https://issues.apache.org/jira/browse/FLINK-6393
> Project: Flink
>  Issue Type: New Feature
>  Components: Gelly
>Reporter: FlorianFan
>Assignee: FlorianFan
>Priority: Minor
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> An evenly graph means every vertex in the graph has the same degree, so the 
> graph can be treated as even because all of its edges are distributed evenly. 
> When the vertex degree is 0, an empty graph will be generated. When the 
> vertex degree is vertex count - 1, a complete graph will be generated.
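
A rough sketch of the idea, assuming a circulant-style layout where vertex {{i}} links to the next {{degree}} vertices modulo the vertex count; this only illustrates the degree-0 and degree-(n-1) cases described above, not the generator that was merged:

{code:java}
import java.util.ArrayList;
import java.util.List;

public final class EvenlyGraphSketch {

    // Each vertex gets exactly 'degree' outgoing edges (assumption: circulant-style layout).
    static List<long[]> edges(long vertexCount, long degree) {
        List<long[]> edges = new ArrayList<>();
        for (long source = 0; source < vertexCount; source++) {
            for (long offset = 1; offset <= degree; offset++) {
                edges.add(new long[] {source, (source + offset) % vertexCount});
            }
        }
        return edges;
    }

    public static void main(String[] args) {
        System.out.println(edges(5, 0).size()); // 0  -> empty graph
        System.out.println(edges(5, 4).size()); // 20 -> complete graph (all ordered pairs)
        System.out.println(edges(5, 2).size()); // 10 -> every vertex has out-degree 2
    }
}
{code}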



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (FLINK-6403) constructFlinkClassPath produces nondeterministic classpath

2017-04-27 Thread Greg Hogan (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-6403?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15987489#comment-15987489
 ] 

Greg Hogan commented on FLINK-6403:
---

[~arobe...@fuze.com] this is now fixed in the release-1.2 branch for the 1.2.2 
release.

> constructFlinkClassPath produces nondeterministic classpath
> ---
>
> Key: FLINK-6403
> URL: https://issues.apache.org/jira/browse/FLINK-6403
> Project: Flink
>  Issue Type: Bug
>  Components: Startup Shell Scripts
>Affects Versions: 1.2.0
>Reporter: Andrew Roberts
>Priority: Critical
>
> In 1.2.0, `config.sh` moved from a shell glob to a find-based approach for 
> constructing the classpath from `/lib` that gets sent to most flink commands, 
> e.g. `start-cluster.sh`. The `find` command does not guarantee an ordering, 
> and we saw issues with flink constructing different classpaths on different 
> machines.
> constructFlinkClassPath should be modified to produce a deterministic 
> classpath.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

