https://issues.apache.org/jira/browse/FLINK-1994
There are two ways to set the effective learning rate:
Method 1) Several pre-baked ways to calculate the effective learning rate,
selected via a switch, e.g.:
val effectiveLearningRate = optimizationMethod match {
// original effective learning rate
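The fragment above is cut off in the archive. As a minimal, self-contained sketch of Method 1, a match over pre-baked schedules could look like the following. The case names (Default, InvScaling, Bottou) and the decay formulas are illustrative assumptions, not the actual FLINK-1994 implementation:

```scala
// Sketch of Method 1: pre-baked learning-rate schedules selected via a switch.
// Names and formulas are assumptions for illustration only.
sealed trait OptimizationMethod
case object Default    extends OptimizationMethod // constant learning rate
case object InvScaling extends OptimizationMethod // decays with sqrt(iteration)
case object Bottou     extends OptimizationMethod // Bottou-style SGD decay

def effectiveLearningRate(method: OptimizationMethod,
                          learningRate: Double,
                          iteration: Int,
                          regularization: Double): Double =
  method match {
    // original effective learning rate: the configured constant
    case Default    => learningRate
    // decay with the inverse square root of the iteration count
    case InvScaling => learningRate / math.sqrt(iteration.toDouble)
    // decay proportional to the regularized iteration count
    case Bottou     => learningRate / (1.0 + learningRate * regularization * iteration)
  }
```

A switch like this keeps the optimizer code unchanged while letting users pick a schedule by name; the alternative (Method 2, truncated above) would presumably let users plug in their own function.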
The vote is over and passes. :) Thanks to everyone who voted!
Result:
8 +1s (7 binding):
Stephan Ewen (binding)
Ufuk Celebi (binding)
Aljoscha Krettek (binding)
Henry Saputra (binding)
Robert Metzger (binding)
Till Rohrmann (binding)
Chiwan Park (non-binding)
Maximilian Michels (binding)
No +0s
I did not look into it and I am not super familiar with Flink's
accumulators... I cannot provide much help right now. Sorry :(
-Matthias
On 08/31/2015 01:01 PM, huangwei (G) wrote:
> Hi,
> I think I can take a simple design for it.
> Do you have any idea?
Hi,
I think I can take a simple design for it.
Do you have any idea?
Maybe. I did not put any thoughts into this...
On 08/31/2015 12:25 PM, Aljoscha Krettek wrote:
> Hi,
> could the metrics not be implemented on top of Flink accumulators? They
> would even be live-updated in the new web front-end.
>
> Cheers,
> Aljoscha
>
> On Mon, 31 Aug 2015 at 11:32 Matthias J. Sax wrote:
Hi,
could the metrics not be implemented on top of Flink accumulators? They
would even be live-updated in the new web front-end.
Cheers,
Aljoscha
On Mon, 31 Aug 2015 at 11:32 Matthias J. Sax
wrote:
> Hi,
>
> custom metrics are currently not supported by the compatibility layer.
> See README here:
> https://github.com/apache/flink/tree/master/flink-contrib/flink-storm-compatibility/flink-storm-compatibility-core
Hi,
custom metrics are currently not supported by the compatibility layer.
See README here:
https://github.com/apache/flink/tree/master/flink-contrib/flink-storm-compatibility/flink-storm-compatibility-core
It is an open question if/how Flink can support this feature.
-Matthias
Hi Matthias and all,
I'd like to work on custom metrics in flink-storm-compatibility.
I have a few questions:
Is there anything like Storm metrics in Flink?
If not, does Flink need an internal metrics mechanism first?
Or should the metrics be used only within flink-storm-compatibility?
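Flink's accumulators, which Aljoscha suggests above as a possible foundation, collect values from parallel tasks and report them with the job result. As a rough sketch of how a Storm-style counter metric could be layered on such an interface, consider the following. The SimpleAccumulator trait below is a simplified stand-in for Flink's Accumulator interface, not the real API:

```scala
// Simplified stand-in for Flink's Accumulator interface, for illustration only.
trait SimpleAccumulator[V, R] {
  def add(value: V): Unit
  def getLocalValue: R
}

// A counter in the spirit of Flink's IntCounter.
class Counter extends SimpleAccumulator[Long, Long] {
  private var count = 0L
  def add(value: Long): Unit = count += value
  def getLocalValue: Long = count
}

// A Storm-style count metric could then delegate to such an accumulator,
// so its value is aggregated and reported wherever accumulators are shown
// (e.g. the web front-end, as Aljoscha suggests).
class CountMetric(acc: SimpleAccumulator[Long, Long]) {
  def incr(): Unit = acc.add(1L)
  def incrBy(n: Long): Unit = acc.add(n)
}
```

Usage would look like: create a Counter, register it under a name, and hand a CountMetric wrapping it to the Storm topology code, which increments it as usual.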
+1
- Verified OpenPGP signatures
- Verified MD5 and SHA checksums
- Executed Java and Scala quickstart examples
- Ran tests on a cluster with Hadoop 2.4.2
On Mon, Aug 31, 2015 at 11:01 AM, Till Rohrmann wrote:
> +1
>
> - Tested against Hadoop 2.7 / Scala 2.10
> - Tested local-cluster and cluster
+1
File signature looks correct
Compiled and passed all tests
Built successfully with Hadoop 2.7 / Scala 2.10 and with Scala 2.11
Ran examples in local mode and on a 3-machine cluster using FliRTT
- ConnectedComponents
- EnumTrianglesBasic
- EnumTrianglesOpt
- KMeans
- PageRankBasic
- Transit
+1
- Tested against Hadoop 2.7 / Scala 2.10
- Tested local-cluster and cluster (4 machine Linux cluster) execution
- Executed all example tests with built-in and external data using FliRTT
- Tested YARN execution
- Tested job submission via the web client
- Tested web interface's job visualization
+1 (binding)
Tested:
On a CDH 5.4.5 cluster with 7 nodes, Hadoop 2.6.0 with YARN HA and HDFS HA
enabled.
- Built and tested Flink from the source release against a custom Hadoop
version (L3, L3.1) from an empty local Maven repository:
mvn clean install -Pvendor-repos -Dhadoop.version=2.6.0-cdh5.4
Matthias J. Sax created FLINK-2599:
--
Summary: Failing Test: SlotCountExceedingParallelismTest
Key: FLINK-2599
URL: https://issues.apache.org/jira/browse/FLINK-2599
Project: Flink
Issue Type: