Github user asfgit closed the pull request at:
https://github.com/apache/flink/pull/370
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is
Github user zentol commented on the pull request:
https://github.com/apache/flink/pull/370#issuecomment-75030486
okay, I will remove the chained combiner.
I'm curious though, the issue description specifically mentions the
AllGroupReduce greatly benefiting from a chained
Github user StephanEwen commented on the pull request:
https://github.com/apache/flink/pull/370#issuecomment-74565386
I think we should use the chained reducer from this pull request and drop
the chained combiner change.
Github user StephanEwen commented on the pull request:
https://github.com/apache/flink/pull/370#issuecomment-74236482
I think for the combiner, we cannot do this. The way it is done here easily
blows up the memory by collecting so many records. Bear in mind that a record
is not
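As a rough illustration of the memory concern raised here (a hypothetical sketch, not Flink's actual driver code; all names are made up), buffering every incoming record as a live object grows memory linearly with the input, while a chained all-reduce only ever needs to hold one running aggregate record:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.BinaryOperator;

// Hypothetical sketch contrasting two chaining strategies for a reduce
// over all elements of the input.
public class ChainingSketch {

    // Risky approach: buffer every incoming record as a deserialized object.
    // Heap usage grows linearly with the number of records seen.
    static <T> List<T> bufferAll(List<T> input) {
        List<T> collected = new ArrayList<>();
        collected.addAll(input); // one live object per record
        return collected;
    }

    // Chained AllReduce approach: fold each record into a single running
    // aggregate, so only one record is held at any point in time.
    static <T> T reduceIncrementally(List<T> input, BinaryOperator<T> reduce) {
        T current = null;
        for (T record : input) {
            current = (current == null) ? record : reduce.apply(current, record);
        }
        return current;
    }

    public static void main(String[] args) {
        List<Integer> data = List.of(1, 2, 3, 4);
        // Memory stays constant no matter how long the input is.
        System.out.println(reduceIncrementally(data, Integer::sum)); // 10
    }
}
```

This is why a chained reducer is unproblematic while a chained combiner that collects records is not: the reducer's state is bounded by a single record.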
Github user zentol commented on the pull request:
https://github.com/apache/flink/pull/370#issuecomment-74240805
the SynchronousChainedCombineDriver stores multiple records as well,
doesn't it? Except up to a fixed size instead of a number of elements. Could we
not do the same here?
Github user StephanEwen commented on the pull request:
https://github.com/apache/flink/pull/370#issuecomment-74247098
The synchronous combine driver stores them in the sort buffer in serialized
fashion, which is safe.
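To make the distinction concrete, here is a hypothetical sketch of the "serialized, fixed-size buffer" idea described above (not the actual SynchronousChainedCombineDriver; class and method names are invented): records are written in serialized form into a bounded buffer, and the combiner runs whenever the buffer fills, so memory stays constant regardless of input size.

```java
import java.nio.ByteBuffer;

// Hypothetical sketch: records live in a fixed-size byte buffer in
// serialized form, never as an unbounded collection of live objects.
public class SerializedBufferSketch {
    private final ByteBuffer buffer;
    private long combinedSum; // stand-in for the combiner's partial result

    SerializedBufferSketch(int capacityBytes) {
        this.buffer = ByteBuffer.allocate(capacityBytes);
    }

    void collect(long record) {
        if (buffer.remaining() < Long.BYTES) {
            combineAndReset(); // buffer full: combine in place, then reuse it
        }
        buffer.putLong(record); // stored serialized, not as a live object
    }

    private void combineAndReset() {
        buffer.flip();
        while (buffer.remaining() >= Long.BYTES) {
            combinedSum += buffer.getLong(); // deserialize and fold
        }
        buffer.clear();
    }

    long result() {
        combineAndReset(); // drain whatever is still buffered
        return combinedSum;
    }

    public static void main(String[] args) {
        // 64 bytes hold 8 longs; the input is far larger than the buffer.
        SerializedBufferSketch sketch = new SerializedBufferSketch(64);
        for (long i = 1; i <= 100; i++) {
            sketch.collect(i);
        }
        System.out.println(sketch.result()); // 5050
    }
}
```

The key property is that the buffer's capacity, not the number of input elements, bounds memory usage.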
Github user zentol commented on a diff in the pull request:
https://github.com/apache/flink/pull/370#discussion_r24633778
--- Diff:
flink-runtime/src/main/java/org/apache/flink/runtime/operators/chaining/ChainedAllGroupReduceCombineDriver.java
---
@@ -0,0 +1,118 @@
+/*
+
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/370#issuecomment-73914381
Oh, just saw that you updated your PR.
I won't open a PR. You can have a look at my branch here:
https://github.com/fhueske/flink/tree/chained_all_reduce
Github user zentol commented on the pull request:
https://github.com/apache/flink/pull/370#issuecomment-73923184
sorry for not making it clear that I force-pushed some changes; I forgot
to make a separate branch.
the changes you made to the driver are already in, sorry for
Github user zentol commented on the pull request:
https://github.com/apache/flink/pull/370#issuecomment-73404704
there's something funky going on with the tests here.
I got 2 failing tests in ObjectReuseITCase:
```
Github user fhueske commented on the pull request:
https://github.com/apache/flink/pull/370#issuecomment-73388209
Looks good in general.
You need to make sure though, that you obey the execution object re-usage
settings.
That basically means you need to pay attention to the
GitHub user zentol opened a pull request:
https://github.com/apache/flink/pull/370
[FLINK-785] Chained AllReduce
This is a preliminary PR to see whether I'm on the right track.
I'm wondering whether this would be everything needed to add a Chained
AllReduce, before I