[jira] [Commented] (FLINK-1319) Add static code analysis for UDFs
[ https://issues.apache.org/jira/browse/FLINK-1319?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576642#comment-14576642 ]

ASF GitHub Bot commented on FLINK-1319:
---------------------------------------

Github user twalthr commented on the pull request:
https://github.com/apache/flink/pull/729#issuecomment-109875609

    Great news :) Thanks Ufuk!

> Add static code analysis for UDFs
> ---------------------------------
>
>                 Key: FLINK-1319
>                 URL: https://issues.apache.org/jira/browse/FLINK-1319
>             Project: Flink
>          Issue Type: New Feature
>          Components: Java API, Scala API
>            Reporter: Stephan Ewen
>            Assignee: Timo Walther
>            Priority: Minor
>
> Flink's Optimizer takes information that tells it, for each UDF, which fields of the input elements are accessed, modified, or forwarded/copied. This information frequently helps to reuse partitionings, sorts, etc. It may speed up programs significantly, as it can frequently eliminate sorts and shuffles, which are costly.
>
> Right now, users can add lightweight annotations to UDFs to provide this information (such as adding {{@ConstantFields("0->3, 1, 2->1")}}).
>
> We have worked with static code analysis of UDFs before to determine this information automatically. This is an incredible feature, as it "magically" makes programs faster.
>
> For record-at-a-time operations (Map, Reduce, FlatMap, Join, Cross), this works surprisingly well in many cases. We used the "Soot" toolkit for the static code analysis. Unfortunately, Soot is LGPL licensed and thus we did not include any of the code so far.
>
> I propose to add this functionality to Flink in the form of a drop-in addition, to work around the LGPL incompatibility with ASL 2.0. Users could simply download a special "flink-code-analysis.jar" and drop it into the "lib" folder to enable this functionality. We may even add a script to "tools" that downloads that library automatically into the lib folder. This should be legally fine, since we do not redistribute LGPL code and only dynamically link it (the incompatibility with ASL 2.0 is mainly in the patentability, if I remember correctly).
>
> Prior work on this has been done by [~aljoscha] and [~skunert], which could provide a code base to start with.
>
> *Appendix*
> Homepage of the Soot static analysis toolkit: http://www.sable.mcgill.ca/soot/
> Papers on static analysis and optimization: http://stratosphere.eu/assets/papers/EnablingOperatorReorderingSCA_12.pdf and http://stratosphere.eu/assets/papers/openingTheBlackBoxes_12.pdf
> Quick introduction to the Optimizer: http://stratosphere.eu/assets/papers/2014-VLDBJ_Stratosphere_Overview.pdf (Section 6)
> Optimizer for Iterations: http://stratosphere.eu/assets/papers/spinningFastIterativeDataFlows_12.pdf (Sections 4.3 and 5.3)
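For reference, a minimal sketch of the "lightweight annotation" the description mentions, written against the Java DataSet API. This is illustrative only and not part of the proposed analysis tool; the class name is made up, and the exact annotation name and field-expression format vary between Flink versions ({{@ConstantFields}} in older APIs, {{FunctionAnnotation.ForwardedFields}} later), so treat those details as assumptions.

{code}
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.functions.FunctionAnnotation;
import org.apache.flink.api.java.tuple.Tuple3;

// Declares that field 0 is copied through untouched, so the optimizer may keep
// an existing partitioning or sort order on it instead of re-shuffling.
@FunctionAnnotation.ForwardedFields("f0")
public class ScaleSecondField
        implements MapFunction<Tuple3<Long, Double, String>, Tuple3<Long, Double, String>> {

    @Override
    public Tuple3<Long, Double, String> map(Tuple3<Long, Double, String> value) {
        value.f1 = value.f1 * 2.0;   // field 1 is modified, so it is not declared as forwarded
        value.f2 = value.f2.trim();  // field 2 is modified as well
        return value;
    }
}
{code}

The static analysis proposed in this issue would derive exactly this kind of information automatically instead of relying on the user-supplied annotation.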
[GitHub] flink pull request: [FLINK-1319][core] Add static code analysis fo...
Github user twalthr commented on the pull request:
https://github.com/apache/flink/pull/729#issuecomment-109875609

    Great news :) Thanks Ufuk!
[jira] [Commented] (FLINK-1319) Add static code analysis for UDFs
[ https://issues.apache.org/jira/browse/FLINK-1319?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576617#comment-14576617 ]

ASF GitHub Bot commented on FLINK-1319:
---------------------------------------

Github user asfgit closed the pull request at:
https://github.com/apache/flink/pull/729

> Add static code analysis for UDFs
> ---------------------------------
>
>                 Key: FLINK-1319
>                 URL: https://issues.apache.org/jira/browse/FLINK-1319
>             Project: Flink
>          Issue Type: New Feature
>          Components: Java API, Scala API
>            Reporter: Stephan Ewen
>            Assignee: Timo Walther
>            Priority: Minor
[jira] [Commented] (FLINK-1319) Add static code analysis for UDFs
[ https://issues.apache.org/jira/browse/FLINK-1319?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576623#comment-14576623 ]

ASF GitHub Bot commented on FLINK-1319:
---------------------------------------

Github user uce commented on the pull request:
https://github.com/apache/flink/pull/729#issuecomment-109868329

    Initial version is merged in DISABLE mode! Thanks for the contribution. I will file some follow-up JIRAs. :)

> Add static code analysis for UDFs
> ---------------------------------
>
>                 Key: FLINK-1319
>                 URL: https://issues.apache.org/jira/browse/FLINK-1319
>             Project: Flink
>          Issue Type: New Feature
>          Components: Java API, Scala API
>            Reporter: Stephan Ewen
>            Assignee: Timo Walther
>            Priority: Minor
[GitHub] flink pull request: [FLINK-1319][core] Add static code analysis fo...
Github user uce commented on the pull request:
https://github.com/apache/flink/pull/729#issuecomment-109868329

    Initial version is merged in DISABLE mode! Thanks for the contribution. I will file some follow-up JIRAs. :)
[GitHub] flink pull request: [FLINK-1319][core] Add static code analysis fo...
Github user asfgit closed the pull request at:
https://github.com/apache/flink/pull/729
[jira] [Resolved] (FLINK-2177) NullPointer in task resource release
[ https://issues.apache.org/jira/browse/FLINK-2177?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ufuk Celebi resolved FLINK-2177.
--------------------------------
    Resolution: Fixed

Fixed via d433ba9

> NullPointer in task resource release
> ------------------------------------
>
>                 Key: FLINK-2177
>                 URL: https://issues.apache.org/jira/browse/FLINK-2177
>             Project: Flink
>          Issue Type: Bug
>          Components: Distributed Runtime
>    Affects Versions: 0.9
>            Reporter: Stephan Ewen
>            Assignee: Ufuk Celebi
>            Priority: Blocker
>             Fix For: 0.9
>
> {code}
> ==============================
> ==          FATAL          ===
> ==============================
> A fatal error occurred, forcing the TaskManager to shut down: FATAL - exception in task resource cleanup
> java.lang.NullPointerException
>     at org.apache.flink.runtime.io.network.netty.PartitionRequestClientHandler.cancelRequestFor(PartitionRequestClientHandler.java:89)
>     at org.apache.flink.runtime.io.network.netty.PartitionRequestClient.close(PartitionRequestClient.java:182)
>     at org.apache.flink.runtime.io.network.partition.consumer.RemoteInputChannel.releaseAllResources(RemoteInputChannel.java:199)
>     at org.apache.flink.runtime.io.network.partition.consumer.SingleInputGate.releaseAllResources(SingleInputGate.java:332)
>     at org.apache.flink.runtime.io.network.NetworkEnvironment.unregisterTask(NetworkEnvironment.java:368)
>     at org.apache.flink.runtime.taskmanager.Task.run(Task.java:650)
>     at java.lang.Thread.run(Thread.java:701)
> {code}
[jira] [Commented] (FLINK-2087) Add streaming mode to YARN as well
[ https://issues.apache.org/jira/browse/FLINK-2087?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576604#comment-14576604 ]

ASF GitHub Bot commented on FLINK-2087:
---------------------------------------

Github user asfgit closed the pull request at:
https://github.com/apache/flink/pull/788

> Add streaming mode to YARN as well
> ----------------------------------
>
>                 Key: FLINK-2087
>                 URL: https://issues.apache.org/jira/browse/FLINK-2087
>             Project: Flink
>          Issue Type: Sub-task
>          Components: YARN Client
>    Affects Versions: 0.9
>            Reporter: Robert Metzger
>            Assignee: Robert Metzger
>             Fix For: 0.9
>
> Similarly to the regular start scripts, we also need a way to start Flink on YARN in the streaming mode.
[GitHub] flink pull request: [FLINK-2087] Add streaming mode switch to YARN
Github user asfgit closed the pull request at:
https://github.com/apache/flink/pull/788
[GitHub] flink pull request: Added member currentSplit for FileInputFormat....
Github user asfgit closed the pull request at:
https://github.com/apache/flink/pull/791
[jira] [Resolved] (FLINK-2087) Add streaming mode to YARN as well
[ https://issues.apache.org/jira/browse/FLINK-2087?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Robert Metzger resolved FLINK-2087.
-----------------------------------
    Resolution: Fixed

Resolved in http://git-wip-us.apache.org/repos/asf/flink/commit/58b9a377

> Add streaming mode to YARN as well
> ----------------------------------
>
>                 Key: FLINK-2087
>                 URL: https://issues.apache.org/jira/browse/FLINK-2087
>             Project: Flink
>          Issue Type: Sub-task
>          Components: YARN Client
>    Affects Versions: 0.9
>            Reporter: Robert Metzger
>            Assignee: Robert Metzger
>             Fix For: 0.9
[jira] [Commented] (FLINK-1858) Add further checks to QA bot
[ https://issues.apache.org/jira/browse/FLINK-1858?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576478#comment-14576478 ]

Robert Metzger commented on FLINK-1858:
---------------------------------------

We could also add a check which ensures that the commit message starts with [FLINK-]

> Add further checks to QA bot
> ----------------------------
>
>                 Key: FLINK-1858
>                 URL: https://issues.apache.org/jira/browse/FLINK-1858
>             Project: Flink
>          Issue Type: Improvement
>          Components: Build System
>    Affects Versions: master
>            Reporter: Ufuk Celebi
>            Priority: Minor
>
> In the course of release testing, we noticed two problems which can easily be checked by a QA bot (see FLINK-1166).
>
> 1. Check the JARs in the examples folder as well. We don't want to sneak in new examples. ;) Till fixed such an issue for the milestone release in 4343e448c7baa1f0327544001959113e774efb46.
>
> 2. Check that we don't import from our shaded namespace (IntelliJ at least suggests them for imports). I've fixed such an issue for the milestone release in 365cd4005e9b48397dde04b66499adc46543d64b.
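A minimal sketch of the kind of commit-message check suggested above. It is not part of the actual QA bot scripts; the class name, the "[hotfix]" exception, and the exact pattern are assumptions made for illustration only.

{code}
import java.util.regex.Pattern;

public class CommitMessageCheck {

    // Hypothetical policy: a commit subject must start with a "[FLINK-<issue number>]" tag,
    // or with an explicit "[hotfix]" marker.
    private static final Pattern FLINK_TAG = Pattern.compile("^\\[FLINK-\\d+\\]");

    public static boolean isValid(String commitSubject) {
        return FLINK_TAG.matcher(commitSubject).find()
                || commitSubject.startsWith("[hotfix]");
    }

    public static void main(String[] args) {
        System.out.println(isValid("[FLINK-1858] Add commit message check")); // true
        System.out.println(isValid("Fix typo in docs"));                      // false
    }
}
{code}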
[jira] [Resolved] (FLINK-2179) when return value from linkedlist or map and use in filter function display error
[ https://issues.apache.org/jira/browse/FLINK-2179?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Robert Metzger resolved FLINK-2179.
-----------------------------------
    Resolution: Invalid

Please write to u...@flink.apache.org or here http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/ for questions on how to use Flink. This system (the bug tracker) is used by the developers to track system defects and features.

I'm closing this ticket as Invalid. I've answered the question here: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/when-return-value-from-linkedlist-or-map-and-use-in-filter-function-display-error-td1528.html

> when return value from linkedlist or map and use in filter function display error
> ----------------------------------------------------------------------------------
>
>                 Key: FLINK-2179
>                 URL: https://issues.apache.org/jira/browse/FLINK-2179
>             Project: Flink
>          Issue Type: Bug
>            Reporter: hagersaleh
[jira] [Commented] (FLINK-2153) Exclude dependency on hbase annotations module
[ https://issues.apache.org/jira/browse/FLINK-2153?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576470#comment-14576470 ]

ASF GitHub Bot commented on FLINK-2153:
---------------------------------------

Github user rmetzger commented on the pull request:
https://github.com/apache/flink/pull/800#issuecomment-109807683

    This change will cause some Java compiler warnings, because our class `org.apache.flink.addons.hbase.TableInputFormat` has a field of type `org.apache.hadoop.hbase.client.HTable`, which carries two annotations:

    ```java
    @InterfaceAudience.Public
    @InterfaceStability.Stable
    public class HTable implements HTableInterface {
    ```

    These annotations live in the "hbase-annotations" package, so the compiler will not be able to find them and will complain about it (without failing). Still, I think we can merge the change ...

> Exclude dependency on hbase annotations module
> ----------------------------------------------
>
>                 Key: FLINK-2153
>                 URL: https://issues.apache.org/jira/browse/FLINK-2153
>             Project: Flink
>          Issue Type: Bug
>          Components: Build System
>    Affects Versions: 0.9
>            Reporter: Márton Balassi
>            Assignee: Márton Balassi
>
> [ERROR] Failed to execute goal on project flink-hbase: Could not resolve dependencies for project org.apache.flink:flink-hbase:jar:0.9-SNAPSHOT: Could not find artifact jdk.tools:jdk.tools:jar:1.7 at specified path /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/../lib/tools.jar
>
> There is a Spark issue for this [1] with a solution [2].
>
> [1] https://issues.apache.org/jira/browse/SPARK-4455
> [2] https://github.com/apache/spark/pull/3286/files
[GitHub] flink pull request: [FLINK-2153] Exclude Hbase annotations
Github user rmetzger commented on the pull request:
https://github.com/apache/flink/pull/800#issuecomment-109807683

    This change will cause some Java compiler warnings, because our class `org.apache.flink.addons.hbase.TableInputFormat` has a field of type `org.apache.hadoop.hbase.client.HTable`, which carries two annotations:

    ```java
    @InterfaceAudience.Public
    @InterfaceStability.Stable
    public class HTable implements HTableInterface {
    ```

    These annotations live in the "hbase-annotations" package, so the compiler will not be able to find them and will complain about it (without failing). Still, I think we can merge the change ...
[jira] [Commented] (FLINK-2155) Add an additional checkstyle validation for illegal imports
[ https://issues.apache.org/jira/browse/FLINK-2155?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576469#comment-14576469 ]

Robert Metzger commented on FLINK-2155:
---------------------------------------

Thank you. I've merged the website change: https://github.com/apache/flink-web/pull/1

> Add an additional checkstyle validation for illegal imports
> ------------------------------------------------------------
>
>                 Key: FLINK-2155
>                 URL: https://issues.apache.org/jira/browse/FLINK-2155
>             Project: Flink
>          Issue Type: Improvement
>          Components: Build System
>            Reporter: Lokesh Rajaram
>            Assignee: Lokesh Rajaram
>
> Add an additional checkstyle validation for illegal imports. To begin with, the following two package imports are marked as illegal:
> 1. org.apache.commons.lang3.Validate
> 2. org.apache.flink.shaded.*
>
> Implementation based on: http://checkstyle.sourceforge.net/config_imports.html#IllegalImport
[jira] [Commented] (FLINK-1061) Describe how to run the examples from the command line
[ https://issues.apache.org/jira/browse/FLINK-1061?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576465#comment-14576465 ]

ASF GitHub Bot commented on FLINK-1061:
---------------------------------------

Github user rmetzger closed the pull request at:
https://github.com/apache/flink/pull/784

> Describe how to run the examples from the command line
> -------------------------------------------------------
>
>                 Key: FLINK-1061
>                 URL: https://issues.apache.org/jira/browse/FLINK-1061
>             Project: Flink
>          Issue Type: Improvement
>          Components: Documentation
>            Reporter: Robert Metzger
>            Assignee: Robert Metzger
>            Priority: Minor
>              Labels: starter
>             Fix For: 0.9
>
> The examples page here: http://flink.incubator.apache.org/docs/0.6-SNAPSHOT/java_api_examples.html does not describe how users can run the examples from the command line (or the web interface).
>
> This is the thread where the user suggested this: http://flink.incubator.apache.org/docs/0.6-SNAPSHOT/run_example_quickstart.html#comment-1554345488
[GitHub] flink pull request: [FLINK-1061] Document how to run an example
Github user rmetzger closed the pull request at:
https://github.com/apache/flink/pull/784
[GitHub] flink pull request: [FLINK-2092] Adjust documentation for print()/...
Github user rmetzger closed the pull request at:
https://github.com/apache/flink/pull/774
[jira] [Commented] (FLINK-2164) Document batch and streaming startup modes
[ https://issues.apache.org/jira/browse/FLINK-2164?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576464#comment-14576464 ]

ASF GitHub Bot commented on FLINK-2164:
---------------------------------------

Github user rmetzger closed the pull request at:
https://github.com/apache/flink/pull/795

> Document batch and streaming startup modes
> -------------------------------------------
>
>                 Key: FLINK-2164
>                 URL: https://issues.apache.org/jira/browse/FLINK-2164
>             Project: Flink
>          Issue Type: Bug
>          Components: Documentation
>    Affects Versions: 0.9
>            Reporter: Robert Metzger
>            Assignee: Robert Metzger
>             Fix For: 0.9
[jira] [Commented] (FLINK-2092) Document (new) behavior of print() and execute()
[ https://issues.apache.org/jira/browse/FLINK-2092?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576462#comment-14576462 ]

ASF GitHub Bot commented on FLINK-2092:
---------------------------------------

Github user rmetzger closed the pull request at:
https://github.com/apache/flink/pull/774

> Document (new) behavior of print() and execute()
> -------------------------------------------------
>
>                 Key: FLINK-2092
>                 URL: https://issues.apache.org/jira/browse/FLINK-2092
>             Project: Flink
>          Issue Type: Task
>          Components: Documentation
>    Affects Versions: 0.9
>            Reporter: Robert Metzger
>            Assignee: Robert Metzger
>            Priority: Blocker
>             Fix For: 0.9
[GitHub] flink pull request: [FLINK-2164] Document streaming and batch mode
Github user rmetzger closed the pull request at:
https://github.com/apache/flink/pull/795
[jira] [Commented] (FLINK-2092) Document (new) behavior of print() and execute()
[ https://issues.apache.org/jira/browse/FLINK-2092?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576461#comment-14576461 ]

ASF GitHub Bot commented on FLINK-2092:
---------------------------------------

Github user rmetzger commented on the pull request:
https://github.com/apache/flink/pull/774#issuecomment-109802799

    I've merged the change.

> Document (new) behavior of print() and execute()
> -------------------------------------------------
>
>                 Key: FLINK-2092
>                 URL: https://issues.apache.org/jira/browse/FLINK-2092
>             Project: Flink
>          Issue Type: Task
>          Components: Documentation
>    Affects Versions: 0.9
>            Reporter: Robert Metzger
>            Assignee: Robert Metzger
>            Priority: Blocker
>             Fix For: 0.9
[GitHub] flink pull request: [FLINK-2092] Adjust documentation for print()/...
Github user rmetzger commented on the pull request:
https://github.com/apache/flink/pull/774#issuecomment-109802799

    I've merged the change.
[jira] [Resolved] (FLINK-2092) Document (new) behavior of print() and execute()
[ https://issues.apache.org/jira/browse/FLINK-2092?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Robert Metzger resolved FLINK-2092.
-----------------------------------
       Resolution: Fixed
    Fix Version/s: 0.9

Resolved in http://git-wip-us.apache.org/repos/asf/flink/commit/85c55dcb

> Document (new) behavior of print() and execute()
> -------------------------------------------------
>
>                 Key: FLINK-2092
>                 URL: https://issues.apache.org/jira/browse/FLINK-2092
>             Project: Flink
>          Issue Type: Task
>          Components: Documentation
>    Affects Versions: 0.9
>            Reporter: Robert Metzger
>            Assignee: Robert Metzger
>            Priority: Blocker
>             Fix For: 0.9
[jira] [Commented] (FLINK-1061) Describe how to run the examples from the command line
[ https://issues.apache.org/jira/browse/FLINK-1061?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576458#comment-14576458 ]

ASF GitHub Bot commented on FLINK-1061:
---------------------------------------

Github user rmetzger commented on the pull request:
https://github.com/apache/flink/pull/784#issuecomment-109802725

    Thank you for the feedback. I've removed the `file://` scheme and merged the change (the pull request will close as soon as the GH mirror is working again).

> Describe how to run the examples from the command line
> -------------------------------------------------------
>
>                 Key: FLINK-1061
>                 URL: https://issues.apache.org/jira/browse/FLINK-1061
>             Project: Flink
>          Issue Type: Improvement
>          Components: Documentation
>            Reporter: Robert Metzger
>            Assignee: Robert Metzger
>            Priority: Minor
>              Labels: starter
>             Fix For: 0.9
[GitHub] flink pull request: [FLINK-1061] Document how to run an example
Github user rmetzger commented on the pull request:
https://github.com/apache/flink/pull/784#issuecomment-109802725

    Thank you for the feedback. I've removed the `file://` scheme and merged the change (the pull request will close as soon as the GH mirror is working again).
[jira] [Resolved] (FLINK-1061) Describe how to run the examples from the command line
[ https://issues.apache.org/jira/browse/FLINK-1061?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Robert Metzger resolved FLINK-1061.
-----------------------------------
       Resolution: Fixed
    Fix Version/s: 0.9

Resolved in http://git-wip-us.apache.org/repos/asf/flink/commit/defba67b

> Describe how to run the examples from the command line
> -------------------------------------------------------
>
>                 Key: FLINK-1061
>                 URL: https://issues.apache.org/jira/browse/FLINK-1061
>             Project: Flink
>          Issue Type: Improvement
>          Components: Documentation
>            Reporter: Robert Metzger
>            Assignee: Robert Metzger
>            Priority: Minor
>              Labels: starter
>             Fix For: 0.9
[jira] [Commented] (FLINK-2087) Add streaming mode to YARN as well
[ https://issues.apache.org/jira/browse/FLINK-2087?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576455#comment-14576455 ]

ASF GitHub Bot commented on FLINK-2087:
---------------------------------------

Github user rmetzger commented on the pull request:
https://github.com/apache/flink/pull/788#issuecomment-109802590

    Rebased to current master (to rerun tests). If they pass --> merging

> Add streaming mode to YARN as well
> ----------------------------------
>
>                 Key: FLINK-2087
>                 URL: https://issues.apache.org/jira/browse/FLINK-2087
>             Project: Flink
>          Issue Type: Sub-task
>          Components: YARN Client
>    Affects Versions: 0.9
>            Reporter: Robert Metzger
>            Assignee: Robert Metzger
>             Fix For: 0.9
[GitHub] flink pull request: [FLINK-2087] Add streaming mode switch to YARN
Github user rmetzger commented on the pull request:
https://github.com/apache/flink/pull/788#issuecomment-109802590

    Rebased to current master (to rerun tests). If they pass --> merging
[GitHub] flink pull request: Added member currentSplit for FileInputFormat....
Github user rmetzger commented on the pull request:
https://github.com/apache/flink/pull/791#issuecomment-109802550

    ... merged (github mirror broken?!)
[jira] [Commented] (FLINK-2164) Document batch and streaming startup modes
[ https://issues.apache.org/jira/browse/FLINK-2164?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576453#comment-14576453 ]

ASF GitHub Bot commented on FLINK-2164:
---------------------------------------

Github user rmetzger commented on the pull request:
https://github.com/apache/flink/pull/795#issuecomment-109802275

    Merged (github mirror out of sync)

> Document batch and streaming startup modes
> -------------------------------------------
>
>                 Key: FLINK-2164
>                 URL: https://issues.apache.org/jira/browse/FLINK-2164
>             Project: Flink
>          Issue Type: Bug
>          Components: Documentation
>    Affects Versions: 0.9
>            Reporter: Robert Metzger
>            Assignee: Robert Metzger
>             Fix For: 0.9
[GitHub] flink pull request: [FLINK-2164] Document streaming and batch mode
Github user rmetzger commented on the pull request:
https://github.com/apache/flink/pull/795#issuecomment-109802275

    Merged (github mirror out of sync)
[jira] [Resolved] (FLINK-2164) Document batch and streaming startup modes
[ https://issues.apache.org/jira/browse/FLINK-2164?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Robert Metzger resolved FLINK-2164.
-----------------------------------
       Resolution: Fixed
    Fix Version/s: 0.9

Resolved in http://git-wip-us.apache.org/repos/asf/flink/commit/4fa3c0bf

> Document batch and streaming startup modes
> -------------------------------------------
>
>                 Key: FLINK-2164
>                 URL: https://issues.apache.org/jira/browse/FLINK-2164
>             Project: Flink
>          Issue Type: Bug
>          Components: Documentation
>    Affects Versions: 0.9
>            Reporter: Robert Metzger
>            Assignee: Robert Metzger
>             Fix For: 0.9
[GitHub] flink pull request: Added member currentSplit for FileInputFormat....
Github user rmetzger commented on the pull request:
https://github.com/apache/flink/pull/791#issuecomment-109802171

    Merging ...
[jira] [Commented] (FLINK-2164) Document batch and streaming startup modes
[ https://issues.apache.org/jira/browse/FLINK-2164?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576451#comment-14576451 ]

ASF GitHub Bot commented on FLINK-2164:
---------------------------------------

Github user rmetzger commented on the pull request:
https://github.com/apache/flink/pull/795#issuecomment-109802106

    Merging ...

> Document batch and streaming startup modes
> -------------------------------------------
>
>                 Key: FLINK-2164
>                 URL: https://issues.apache.org/jira/browse/FLINK-2164
>             Project: Flink
>          Issue Type: Bug
>          Components: Documentation
>    Affects Versions: 0.9
>            Reporter: Robert Metzger
>            Assignee: Robert Metzger
[GitHub] flink pull request: [FLINK-2164] Document streaming and batch mode
Github user rmetzger commented on the pull request:
https://github.com/apache/flink/pull/795#issuecomment-109802106

    Merging ...
[jira] [Created] (FLINK-2180) Streaming iterate test fails spuriously
Márton Balassi created FLINK-2180:
-------------------------------------

             Summary: Streaming iterate test fails spuriously
                 Key: FLINK-2180
                 URL: https://issues.apache.org/jira/browse/FLINK-2180
             Project: Flink
          Issue Type: Bug
          Components: Streaming
    Affects Versions: 0.9
            Reporter: Márton Balassi

Following output seen occasionally:

Tests run: 3, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 3.667 sec <<< FAILURE! - in org.apache.flink.streaming.api.IterateTest
test(org.apache.flink.streaming.api.IterateTest)  Time elapsed: 3.662 sec  <<< FAILURE!
java.lang.AssertionError: null
    at org.junit.Assert.fail(Assert.java:86)
    at org.junit.Assert.assertTrue(Assert.java:41)
    at org.junit.Assert.assertTrue(Assert.java:52)
    at org.apache.flink.streaming.api.IterateTest.test(IterateTest.java:154)

See: https://travis-ci.org/mbalassi/flink/jobs/65803465
[jira] [Closed] (FLINK-2069) writeAsCSV function in DataStream Scala API creates no file
[ https://issues.apache.org/jira/browse/FLINK-2069?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Márton Balassi closed FLINK-2069.
---------------------------------
    Resolution: Fixed
      Assignee: Márton Balassi

This is already fixed and tested on master.

> writeAsCSV function in DataStream Scala API creates no file
> ------------------------------------------------------------
>
>                 Key: FLINK-2069
>                 URL: https://issues.apache.org/jira/browse/FLINK-2069
>             Project: Flink
>          Issue Type: Bug
>          Components: Streaming
>            Reporter: Faye Beligianni
>            Assignee: Márton Balassi
>            Priority: Blocker
>              Labels: Streaming
>             Fix For: 0.9
>
> When the {{writeAsCSV}} function is used in the DataStream Scala API, no file is created in the specified path.
[jira] [Commented] (FLINK-1844) Add Normaliser to ML library
[ https://issues.apache.org/jira/browse/FLINK-1844?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576282#comment-14576282 ]

Faye Beligianni commented on FLINK-1844:
----------------------------------------

Hey [~tvas], I opened a PR for the normaliser, which I named MinMaxScaler. Any comments are welcome!

Regarding the two tests that I wrote, I think they may be too simple, as I am only checking whether the numbers are in the user-specified range. An attempt to cross-check the result against a dataset of "expectedScaledVectors" would require using the same method for calculating the "expectedScaledVectors" that I used in the implementation of the MinMaxScaler (I wasn't sure whether that would have been correct).

> Add Normaliser to ML library
> ----------------------------
>
>                 Key: FLINK-1844
>                 URL: https://issues.apache.org/jira/browse/FLINK-1844
>             Project: Flink
>          Issue Type: Improvement
>          Components: Machine Learning Library
>            Reporter: Faye Beligianni
>            Assignee: Faye Beligianni
>            Priority: Minor
>              Labels: ML, Starter
>
> In many ML algorithms it is preferable for the features' values to lie within a given range, usually (0,1) [1]. A {{Transformer}} could therefore be implemented to perform that normalisation.
>
> Resources:
> [1] http://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.MinMaxScaler.html
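For context, min-max scaling maps each feature value x into a target range [rangeMin, rangeMax] via x' = (x - xMin) / (xMax - xMin) * (rangeMax - rangeMin) + rangeMin. The sketch below only illustrates that transformation in plain Java; it is not the MinMaxScaler implementation from the PR, and all names in it are made up.

{code}
// Illustrative only: per-feature min-max scaling to [rangeMin, rangeMax].
// Not the Flink ML MinMaxScaler; names and structure are assumptions.
public class MinMaxScaling {

    public static double[] scale(double[] feature, double rangeMin, double rangeMax) {
        double xMin = Double.POSITIVE_INFINITY;
        double xMax = Double.NEGATIVE_INFINITY;
        for (double x : feature) {
            xMin = Math.min(xMin, x);
            xMax = Math.max(xMax, x);
        }
        double span = xMax - xMin;
        double[] scaled = new double[feature.length];
        for (int i = 0; i < feature.length; i++) {
            // If all values are equal, map them to the lower bound of the range.
            double unit = span == 0.0 ? 0.0 : (feature[i] - xMin) / span;
            scaled[i] = unit * (rangeMax - rangeMin) + rangeMin;
        }
        return scaled;
    }

    public static void main(String[] args) {
        double[] result = scale(new double[]{2.0, 4.0, 6.0}, 0.0, 1.0);
        System.out.println(java.util.Arrays.toString(result)); // [0.0, 0.5, 1.0]
    }
}
{code}

This also shows why a test that only checks the output range is weaker than comparing against expected scaled vectors: any constant output inside the range would pass the range check.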
[GitHub] flink pull request: [FLINK-2150][gelly] Added library method for a...
GitHub user andralungu opened a pull request:

    https://github.com/apache/flink/pull/801

    [FLINK-2150][gelly] Added library method for assigning unique labels to vertices

    This PR adds a library method that assigns unique labels to vertices. The following facts are used (see the sketch after this message):

    * a map function generates an n-bit ID (n = number of parallel tasks) based on its own index
    * with each record, a counter c is increased
    * the unique label is then produced by shifting the counter c by the n-bit mapper ID

    Thanks @fhueske for the useful tips! And thanks @rmetzger for the nice Stack Overflow answer (http://stackoverflow.com/questions/30596556/zipwithindex-on-apache-flink).

    If you guys find some spare minutes, you are more than welcome to have a look at what I did here. If not, we'll just let @vasia do what she does best :)

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/andralungu/flink generateUniqueIds

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/flink/pull/801.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #801

----
commit d655187a9626be6aca734fdbe035de62e3ead839
Author: andralungu
Date:   2015-06-07T13:27:24Z

    [FLINK-2150][gelly] Added library method for assigning unique labels to vertcies
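The three bullet points above describe the labeling scheme. The sketch below illustrates the same idea with a plain `RichMapFunction`; it is not the code from the pull request, the class and field names are invented for illustration, and it glosses over details such as type information for the generic output.

```java
import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.configuration.Configuration;

// Each parallel mapper owns the low bits (its subtask index) and counts upwards
// in the high bits, so labels produced by different subtasks can never collide.
public class UniqueLabelMapper<T> extends RichMapFunction<T, Tuple2<Long, T>> {

    private int shift;     // number of bits reserved for the subtask index
    private long taskId;   // index of this parallel subtask
    private long counter;  // per-subtask record counter

    @Override
    public void open(Configuration parameters) {
        int parallelism = getRuntimeContext().getNumberOfParallelSubtasks();
        // Bits needed to encode all subtask indices (0 for parallelism 1).
        shift = 64 - Long.numberOfLeadingZeros(parallelism - 1L);
        taskId = getRuntimeContext().getIndexOfThisSubtask();
        counter = 0L;
    }

    @Override
    public Tuple2<Long, T> map(T value) {
        long label = (counter++ << shift) | taskId;
        return new Tuple2<>(label, value);
    }
}
```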
[jira] [Commented] (FLINK-2150) Add a library method that assigns unique Long values to vertices
[ https://issues.apache.org/jira/browse/FLINK-2150?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576239#comment-14576239 ]

ASF GitHub Bot commented on FLINK-2150:
---------------------------------------

GitHub user andralungu opened a pull request:
https://github.com/apache/flink/pull/801

    [FLINK-2150][gelly] Added library method for assigning unique labels to vertices

> Add a library method that assigns unique Long values to vertices
> -----------------------------------------------------------------
>
>                 Key: FLINK-2150
>                 URL: https://issues.apache.org/jira/browse/FLINK-2150
>             Project: Flink
>          Issue Type: New Feature
>          Components: Gelly
>            Reporter: Vasia Kalavri
>            Assignee: Andra Lungu
>            Priority: Minor
>              Labels: starter
>
> In some graph algorithms, it is required to initialize the vertex values with unique values (e.g. label propagation).
>
> This issue proposes adding a Gelly library method that receives an input graph and initializes its vertex values with unique Long values. This method can then also be used to improve the MusicProfiles example.
[jira] [Created] (FLINK-2179) when return value from linkedlist or map and use in filter function display error
hagersaleh created FLINK-2179:
---------------------------------

             Summary: when return value from linkedlist or map and use in filter function display error
                 Key: FLINK-2179
                 URL: https://issues.apache.org/jira/browse/FLINK-2179
             Project: Flink
          Issue Type: Bug
            Reporter: hagersaleh

When a value is returned from a LinkedList or Map and used in a filter function, an error is displayed when the program is run from the command line, but not when it is run from NetBeans.

public static Map map = new HashMap();

public static void main(String[] args) throws Exception {
    map.put("C_MKTSEGMENT", 2);

    ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
    DataSet customers = env.readCsvFile("/home/hadoop/Desktop/Dataset/customer.csv")
        .fieldDelimiter('|')
        .includeFields("1110").ignoreFirstLine()
        .tupleType(Customer3.class);

    customers = customers.filter(new FilterFunction() {
        @Override
        public boolean filter(Customer3 c) {
            int index1 = Integer.parseInt(map.get("C_MKTSEGMENT").toString());
            return c.getField(index1).equals("AUTOMOBILE");
        }
    });

    customers.print();
    customers.writeAsCsv("/home/hadoop/Desktop/Dataset/out1.csv", "\n", "|", WriteMode.OVERWRITE);
    env.execute("TPCH Query 3 Example");
}

hadoop@ubuntu:~/Desktop/flink-0.7.0-incubating$ bin/flink run /home/hadoop/Desktop/where_operation_final/dist/where_operation_final.jar
06/06/2015 13:12:31: Job execution switched to status RUNNING
06/06/2015 13:12:31: CHAIN DataSource (CSV Input (|) /home/hadoop/Desktop/Dataset/customer.csv) -> Filter (org.apache.flink.examples.java.relational.TPCHQuery3$1) (1/1) switched to SCHEDULED
06/06/2015 13:12:31: CHAIN DataSource (CSV Input (|) /home/hadoop/Desktop/Dataset/customer.csv) -> Filter (org.apache.flink.examples.java.relational.TPCHQuery3$1) (1/1) switched to DEPLOYING
06/06/2015 13:12:31: CHAIN DataSource (CSV Input (|) /home/hadoop/Desktop/Dataset/customer.csv) -> Filter (org.apache.flink.examples.java.relational.TPCHQuery3$1) (1/1) switched to RUNNING
06/06/2015 13:12:31: CHAIN DataSource (CSV Input (|) /home/hadoop/Desktop/Dataset/customer.csv) -> Filter (org.apache.flink.examples.java.relational.TPCHQuery3$1) (1/1) switched to FAILED
java.lang.NullPointerException
    at org.apache.flink.examples.java.relational.TPCHQuery3$1.filter(TPCHQuery3.java:73)
    at org.apache.flink.examples.java.relational.TPCHQuery3$1.filter(TPCHQuery3.java:70)
    at org.apache.flink.api.java.operators.translation.PlanFilterOperator$FlatMapFilter.flatMap(PlanFilterOperator.java:47)
    at org.apache.flink.runtime.operators.chaining.ChainedFlatMapDriver.collect(ChainedFlatMapDriver.java:79)
    at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:215)
    at org.apache.flink.runtime.execution.RuntimeEnvironment.run(RuntimeEnvironment.java:235)
    at java.lang.Thread.run(Thread.java:745)

06/06/2015 13:12:31: Job execution switched to status FAILING
06/06/2015 13:12:31: DataSink(Print to System.out) (1/1) switched to CANCELED
06/06/2015 13:12:31: DataSink(CsvOutputFormat (path: /home/hadoop/Desktop/Dataset/out1.csv, delimiter: |)) (1/1) switched to CANCELED
06/06/2015 13:12:31: Job execution switched to status FAILED

Error: The program execution failed: java.lang.NullPointerException
    at org.apache.flink.examples.java.relational.TPCHQuery3$1.filter(TPCHQuery3.java:73)
    at org.apache.flink.examples.java.relational.TPCHQuery3$1.filter(TPCHQuery3.java:70)
    at org.apache.flink.api.java.operators.translation.PlanFilterOperator$FlatMapFilter.flatMap(PlanFilterOperator.java:47)
    at org.apache.flink.runtime.operators.chaining.ChainedFlatMapDriver.collect(ChainedFlatMapDriver.java:79)
    at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:215)
    at org.apache.flink.runtime.execution.RuntimeEnvironment.run(RuntimeEnvironment.java:235)
    at java.lang.Thread.run(Thread.java:745)
[jira] [Commented] (FLINK-2127) The GSA Documentation has trailing <p>s
[ https://issues.apache.org/jira/browse/FLINK-2127?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14576231#comment-14576231 ]

Maximilian Michels commented on FLINK-2127:
-------------------------------------------

It seems like the real issue is not the {{br}} tag but that the Kramdown parser does not parse HTML in markdown paragraphs (blocks). You need to explicitly start a new HTML block (i.e. by inserting a new line). In my opinion, this is fine because it keeps markdown and HTML separated from each other. So in the example, just add a new line before the {{p}} tag. I fixed that on the master.

What bothers me is that the Kramdown parser's specification actually states the following:

{quote}
The original Markdown syntax specifies that an HTML block must start at the left margin, i.e. no indentation is allowed. Also, the HTML block has to be surrounded by blank lines. Both restrictions are lifted for kramdown documents.
{quote}
http://kramdown.gettalong.org/syntax.html#html-blocks

So, although we are using Kramdown, the parsing behaves more like the original Markdown specification, i.e. HTML code needs to be separated from other markdown text. I verified whether this is really Kramdown's behavior by executing the following in a Ruby shell:

{code}
require 'kramdown'
Kramdown::Document.new("test 1 2 3 <p>bla bla bla</p>\n<p>bla bla bla</p>").to_html
=> "<p>test 1 2 3 <p>bla bla bla</p></p>\n<p>bla bla bla</p>\n"
{code}

Interestingly, it confirms that only HTML code which is separated by a new line gets parsed as actual HTML code. I'm stumped :)

> The GSA Documentation has trailing <p>s
> ---------------------------------------
>
>                 Key: FLINK-2127
>                 URL: https://issues.apache.org/jira/browse/FLINK-2127
>             Project: Flink
>          Issue Type: Bug
>          Components: Documentation, Gelly
>    Affects Versions: 0.9
>            Reporter: Andra Lungu
>            Assignee: Maximilian Michels
>            Priority: Minor
>             Fix For: 0.9
>
> Within the GSA section of the documentation, there are trailing <p class="text-center"> image </p> fragments. It would be nice to remove them :)
[jira] [Assigned] (FLINK-2150) Add a library method that assigns unique Long values to vertices
[ https://issues.apache.org/jira/browse/FLINK-2150?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andra Lungu reassigned FLINK-2150:
----------------------------------
    Assignee: Andra Lungu

> Add a library method that assigns unique Long values to vertices
> -----------------------------------------------------------------
>
>                 Key: FLINK-2150
>                 URL: https://issues.apache.org/jira/browse/FLINK-2150
>             Project: Flink
>          Issue Type: New Feature
>          Components: Gelly
>            Reporter: Vasia Kalavri
>            Assignee: Andra Lungu
>            Priority: Minor
>              Labels: starter