[ https://issues.apache.org/jira/browse/MAHOUT-1570?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15193654#comment-15193654 ]
ASF GitHub Bot commented on MAHOUT-1570:
----------------------------------------
GitHub user andrewpalumbo reopened a pull request:
https://github.com/apache/mahout/pull/187
MAHOUT-1570: Flink binding b: to review.. not for merge
Still possibly needs to be rebased on top of the Mahout 0.11.2 master.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/andrewpalumbo/mahout flink-binding-b
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/mahout/pull/187.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #187
----
commit b86115c44a5ef998cc073f003ce79472a6b9005e
Author: Till Rohrmann <[email protected]>
Date: 2015-11-16T16:58:36Z
Add type information to flink-bindings
commit 6006e7730723f910bbf3ae7b4cfcdb732411a3ce
Author: Till Rohrmann <[email protected]>
Date: 2015-11-16T17:14:47Z
Remove flinkbindings.blas package object
commit e465b5e68faa3a7549b69e6df566019ea2f01624
Author: Suneel Marthi <[email protected]>
Date: 2015-11-16T17:39:55Z
Merge pull request #1 from tillrohrmann/flink-binding
Add type information to flink bindings
commit a41d0bda894224b238da246ef222c1ca1ce101a2
Author: smarthi <[email protected]>
Date: 2015-11-12T06:10:54Z
NOJira: Add missing license header
commit 1c6f73bb747e2072c0af05aa105ca080a998c0f7
Author: Andrew Palumbo <[email protected]>
Date: 2016-01-10T19:09:39Z
fix for FlinkOpAtB
commit 1ae78e334b85ff90278d153f925539a8651487c7
Author: smarthi <[email protected]>
Date: 2016-01-10T22:47:36Z
Modified to use the Flink Scala API, fixed FlinkOpABt
commit 1ec2047cb0f1e400f3211cec6f65cb7354ac3bb6
Author: smarthi <[email protected]>
Date: 2016-01-10T23:18:43Z
Implemented Sampling methods, other minor fixes
commit f1437460899374648eb3f8fd96c19003c43bcd53
Author: Andrew Palumbo <[email protected]>
Date: 2016-01-11T01:10:43Z
Merge branch 'flink-binding' of https://github.com/smarthi/mahout into
flink-binding-suneel-2016
checked out --theirs
Conflicts:
flink/src/main/scala/org/apache/mahout/flinkbindings/FlinkEngine.scala
flink/src/main/scala/org/apache/mahout/flinkbindings/blas/FlinkOpAtA.scala
flink/src/main/scala/org/apache/mahout/flinkbindings/blas/FlinkOpAtB.scala
commit f457ad44d1cc4670d969ccb771fde30b90525952
Author: smarthi <[email protected]>
Date: 2016-01-16T03:31:20Z
Code reformatting, added TypeInformation check
commit a351af12f0f0e9381504cae9969f3ffe349d5d6f
Author: Andrew Palumbo <[email protected]>
Date: 2016-01-16T03:36:08Z
Merge branch 'flink-binding' of https://github.com/smarthi/mahout into
flink-bindings-b
commit dcfea65fe3ddb6b438558ad1d48d3fb1c8ce04b9
Author: Andrew Palumbo <[email protected]>
Date: 2016-01-16T23:15:09Z
Allow 'Any' key type information for generateTypeInformation[K: ClassTag]:
TypeInformation[K]
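As a rough illustration only (not the committed code), mapping a key's ClassTag to a
Flink TypeInformation with a generic fallback for 'Any' keys might look like this:

    import scala.reflect.ClassTag
    import org.apache.flink.api.common.typeinfo.{BasicTypeInfo, TypeInformation}
    import org.apache.flink.api.java.typeutils.GenericTypeInfo

    // Hypothetical sketch: resolve a Flink TypeInformation from the runtime class
    // of the key, falling back to a generic type for unrecognized (e.g. Any) keys.
    def generateTypeInformation[K: ClassTag]: TypeInformation[K] = {
      val klass = implicitly[ClassTag[K]].runtimeClass
      val info: TypeInformation[_] = klass match {
        case c if c == classOf[Int]    => BasicTypeInfo.INT_TYPE_INFO
        case c if c == classOf[Long]   => BasicTypeInfo.LONG_TYPE_INFO
        case c if c == classOf[String] => BasicTypeInfo.STRING_TYPE_INFO
        case _                         => new GenericTypeInfo(klass.asInstanceOf[Class[K]])
      }
      info.asInstanceOf[TypeInformation[K]]
    }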
commit b94c045160b29b55e08ec1f7025d4d60fb8ae82c
Author: Andrew Palumbo <[email protected]>
Date: 2016-01-17T20:51:17Z
wip
commit a0a319643964cdf0fdded31348104014bd161c36
Author: Andrew Palumbo <[email protected]>
Date: 2016-02-06T19:56:37Z
(1) Refactored Flink's Hadoop1Utils into a Hadoop2Utils object to reflect
their retirement of the Hadoop 1 API.
(2) wip: Flink's checkpoint action seems to not be working; some trial and
error, and a note added in the math-scala base tests to document this problem, i.e.
what should be deterministic tests are not.
(3) This could possibly be an error due to Flink rebalancing
pre-checkpoint action, though unlikely.
(4) Does not compile with today's Flink 1.0-SNAPSHOT due to some changes in
extracting type information made in the last week or so.
commit cbac921171deef7ea559581adae31ed27ccc65d2
Author: Andrew Palumbo <[email protected]>
Date: 2016-02-06T20:54:56Z
Use Object instead of Any
commit 957e9f194f9fbd728c36f2f715f62561b2471a2a
Author: Andrew Palumbo <[email protected]>
Date: 2016-02-06T23:10:59Z
upgrade kryo dep in math-scala to match that used by flink. still getting
OOM errors in tests
commit c82ed88812dfe324a6829ebdf3c25a8c58b67b13
Author: Andrew Palumbo <[email protected]>
Date: 2016-02-07T02:08:11Z
clean up,
make sure flink module is using correct kryo version,
use 'ProblemTestSuite' as example of odd problems
commit 22f456613aa18f16ab35f6861ed279dc8bb1b020
Author: Andrew Palumbo <[email protected]>
Date: 2016-02-07T06:30:47Z
replace accidentally deleted line
commit e0da8bba715d38b87f1d08353c857161dee75c47
Author: Andrew Palumbo <[email protected]>
Date: 2016-02-09T02:27:55Z
wip: drmDfsWrite: explicitly define all Writable subclasses for ds.map()
keys
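A hedged sketch of what "explicitly define all Writable subclasses" could mean for
the ds.map() keys (the helper name and mapping are assumptions, not the committed code):

    import org.apache.hadoop.io.{IntWritable, LongWritable, Text, Writable}

    // Hypothetical helper: pick a concrete Writable subclass per key at write time,
    // since the abstract Writable type alone gives the type system nothing concrete.
    def keyToWritable(key: Any): Writable = key match {
      case i: Int    => new IntWritable(i)
      case l: Long   => new LongWritable(l)
      case s: String => new Text(s)
      case other     => throw new IllegalArgumentException(s"Unsupported key type: ${other.getClass}")
    }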
commit 759d024f2f45b232e333b9fd742902d8401ff16b
Author: Andrew Palumbo <[email protected]>
Date: 2016-03-02T21:10:11Z
Include the java.io.Serializable interface in Matrix and Vector Kryo serializers.
Update the flink pom for the Flink 1.0 release module renaming. wip: failing with a
StackOverflowError in FlinkOpTimesRightMatrix
commit 44a36d7b1c0350b6cda731f1b35f5593e5a31cad
Author: Andrew Palumbo <[email protected]>
Date: 2016-03-02T21:44:11Z
Hack to fix a Kryo StackOverflowError when using native Flink broadcasting of a
Mahout matrix in FlinkOpTimesRight
commit dc9104ca064c1f35f4d31dfa1a9c338308566f2a
Author: Andrew Palumbo <[email protected]>
Date: 2016-03-03T02:14:32Z
set the environment's parallelism to 10 in the DistributedFlinkTestSuite so
that tests like dsqDist(X,Y) do not fail with not enough slots
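The parallelism change amounts to a one-line Flink configuration; a minimal sketch,
assuming the standard Flink Scala ExecutionEnvironment:

    import org.apache.flink.api.scala.ExecutionEnvironment

    // Raise the default parallelism so tests that need many task slots,
    // e.g. dsqDist(X, Y), can be scheduled.
    val env = ExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(10)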
commit b14f38880be9407395b3d3009b831066ef605c65
Author: Andrew Palumbo <[email protected]>
Date: 2016-03-03T02:19:34Z
remove all instances of @RunWith[JUnitRunner] from tests
commit 1079d32f78d8cdaabd8e10d366347e576a962838
Author: Andrew Palumbo <[email protected]>
Date: 2016-03-03T02:22:21Z
Bump flink to 1.1-SNAPSHOT
commit 935e9eb0f743c1170913fa7090a203af6c7c993e
Author: Andrew Palumbo <[email protected]>
Date: 2016-03-03T03:36:31Z
wip, move all 6 failing tests into FailingTestSuite
commit dfb1fae5dea753e9bfc62f3f84fc20cf3c12387d
Author: Andrew Palumbo <[email protected]>
Date: 2016-03-03T04:37:54Z
wip: IntWritable expected to be LongWritable
commit f9c39ea355aaf5b81d3c14f3e9c1d81bc7f3c48e
Author: Andrew Palumbo <[email protected]>
Date: 2016-03-03T22:26:35Z
isolate a single key type to show the error with drm.dfsWrite()
commit 73f8a70a6ba356427f85bc31c74d1a8499fa947a
Author: Andrew Palumbo <[email protected]>
Date: 2016-03-11T00:47:02Z
some clean up
commit f05418cef87ff8f2d9efa59ce3a71f9f84181fda
Author: Andrew Palumbo <[email protected]>
Date: 2016-03-11T01:14:07Z
set flink version to 1.0.0
commit a06f02d426d96a36800d72206692d03c68f8805f
Author: Andrew Palumbo <[email protected]>
Date: 2016-03-11T03:09:52Z
write a simple (Int,Int) unit test for flink hadoop output
commit 29fe274eeaf6f2cb8d01bd6b75627ed70b624291
Author: Andrew Palumbo <[email protected]>
Date: 2016-03-14T01:30:41Z
fix for failing Writables
----
> Adding support for Apache Flink as a backend for the Mahout DSL
> ---------------------------------------------------------------
>
> Key: MAHOUT-1570
> URL: https://issues.apache.org/jira/browse/MAHOUT-1570
> Project: Mahout
> Issue Type: Improvement
> Reporter: Till Rohrmann
> Assignee: Suneel Marthi
> Labels: DSL, flink, scala
> Fix For: 0.12.0
>
>
> With the finalized abstraction of the Mahout DSL plans from the backend
> operations (MAHOUT-1529), it should be possible to integrate further backends
> for the Mahout DSL. Apache Flink would be a suitable candidate for an
> execution backend.
> With respect to the implementation, the biggest difference between Spark and
> Flink at the moment is probably the incremental rollout of plans, which is
> triggered by Spark's actions and which is not supported by Flink yet.
> However, the Flink community is working on this issue. For the moment, it
> should be possible to circumvent this problem by writing intermediate results
> required by an action to HDFS and reading from there.
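The write-to-HDFS workaround described above could look roughly like the following in
the Mahout DSL (a minimal sketch; the dfsWrite/drmDfsRead names are assumed from the
Samsara API, not taken from this patch):

    import org.apache.mahout.math.drm._

    // Materialize an intermediate result to HDFS, then read it back so that
    // subsequent operators consume a persisted DRM rather than an unexecuted Flink plan.
    def materializeViaHdfs(drm: CheckpointedDrm[Int], tmpPath: String)
                          (implicit ctx: DistributedContext): DrmLike[_] = {
      drm.dfsWrite(tmpPath)   // force execution and persist to HDFS
      drmDfsRead(tmpPath)     // re-read as a fresh, already-materialized DRM
    }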
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)