[post] Publish Gelly blog post
Project: http://git-wip-us.apache.org/repos/asf/flink-web/repo Commit: http://git-wip-us.apache.org/repos/asf/flink-web/commit/5889311a Tree: http://git-wip-us.apache.org/repos/asf/flink-web/tree/5889311a Diff: http://git-wip-us.apache.org/repos/asf/flink-web/diff/5889311a Branch: refs/heads/asf-site Commit: 5889311a587a359ed9ab0a5f47b1633deeccb43a Parents: d571b04 Author: Ufuk Celebi <u...@apache.org> Authored: Tue Aug 25 15:33:46 2015 +0200 Committer: Ufuk Celebi <u...@apache.org> Committed: Tue Aug 25 15:33:46 2015 +0200 ---------------------------------------------------------------------- content/blog/feed.xml | 457 +++++++++++++ content/blog/index.html | 39 +- content/blog/page2/index.html | 36 +- content/blog/page3/index.html | 42 +- content/blog/page4/index.html | 29 + content/community.html | 18 +- content/downloads.html | 6 +- content/faq.html | 54 +- content/how-to-contribute.html | 20 +- content/img/blog/GSA-plan.png | Bin 0 -> 54646 bytes content/img/blog/flink-stack.png | Bin 0 -> 32934 bytes content/img/blog/iteration.png | Bin 0 -> 24531 bytes content/img/blog/neighborhood.png | Bin 0 -> 18813 bytes content/img/blog/reduce-on-neighbors.png | Bin 0 -> 34903 bytes content/img/blog/sssp.png | Bin 0 -> 70736 bytes content/img/blog/user-song-graph.png | Bin 0 -> 16140 bytes content/img/blog/user-song-to-user-user.png | Bin 0 -> 42974 bytes content/img/blog/vertex-centric-plan.png | Bin 0 -> 37105 bytes content/index.html | 8 +- content/material.html | 12 +- .../2015/08/24/introducing-flink-gelly.html | 644 +++++++++++++++++++ 21 files changed, 1261 insertions(+), 104 deletions(-) ---------------------------------------------------------------------- http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/blog/feed.xml ---------------------------------------------------------------------- diff --git a/content/blog/feed.xml b/content/blog/feed.xml index 06cc7ea..9ccb9f9 100644 --- a/content/blog/feed.xml +++ b/content/blog/feed.xml @@ -7,6 +7,463 @@ <atom:link href="http://flink.apache.org/blog/feed.xml" rel="self" type="application/rss+xml" /> <item> +<title>Introducing Gelly: Graph Processing with Apache Flink</title> +<description><p>This blog post introduces <strong>Gelly</strong>, Apache Flinkâs <em>graph-processing API and library</em>. Flinkâs native support +for iterations makes it a suitable platform for large-scale graph analytics. +By leveraging delta iterations, Gelly is able to map various graph processing models such as +vertex-centric or gather-sum-apply to Flink dataflows.</p> + +<p>Gelly allows Flink users to perform end-to-end data analysis in a single system. +Gelly can be seamlessly used with Flinkâs DataSet API, +which means that pre-processing, graph creation, analysis, and post-processing can be done +in the same application. 
At the end of this post, we will go through a step-by-step example +in order to demonstrate that loading, transformation, filtering, graph creation, and analysis +can be performed in a single Flink program.</p> + +<p><strong>Overview</strong></p> + +<ol> + <li><a href="#what-is-gelly">What is Gelly?</a></li> + <li><a href="#graph-representation-and-creation">Graph Representation and Creation</a></li> + <li><a href="#transformations-and-utilities">Transformations and Utilities</a></li> + <li><a href="#iterative-graph-processing">Iterative Graph Processing</a></li> + <li><a href="#library-of-graph-algorithms">Library of Graph Algorithms</a></li> + <li><a href="#use-case-music-profiles">Use-Case: Music Profiles</a></li> + <li><a href="#ongoing-and-future-work">Ongoing and Future Work</a></li> +</ol> + +<p><a href="#top"></a></p> + +<h2 id="what-is-gelly">What is Gelly?</h2> + +<p>Gelly is a Graph API for Flink. It is currently supported in both Java and Scala. +The Scala methods are implemented as wrappers on top of the basic Java operations. +The API contains a set of utility functions for graph analysis, supports iterative graph +processing and introduces a library of graph algorithms.</p> + +<center> +<img src="/img/blog/flink-stack.png" style="width:90%;margin:15px" /> +</center> + +<p><a href="#top">Back to top</a></p> + +<h2 id="graph-representation-and-creation">Graph Representation and Creation</h2> + +<p>In Gelly, a graph is represented by a DataSet of vertices and a DataSet of edges. +A vertex is defined by its unique ID and a value, whereas an edge is defined by its source ID, +target ID, and value. A vertex or edge for which a value is not specified will simply have the +value type set to <code>NullValue</code>.</p> + +<p>A graph can be created from:</p> + +<ol> + <li><strong>DataSet of edges</strong> and an optional <strong>DataSet of vertices</strong> using <code>Graph.fromDataSet()</code></li> + <li><strong>DataSet of Tuple3</strong> and an optional <strong>DataSet of Tuple2</strong> using <code>Graph.fromTupleDataSet()</code></li> + <li><strong>Collection of edges</strong> and an optional <strong>Collection of vertices</strong> using <code>Graph.fromCollection()</code></li> +</ol> + +<p>In all three cases, if the vertices are not provided, +Gelly will automatically produce the vertex IDs from the edge source and target IDs.</p> + +<p><a href="#top">Back to top</a></p> + +<h2 id="transformations-and-utilities">Transformations and Utilities</h2> + +<p>These are methods of the Graph class and include common graph metrics, transformations +and mutations as well as neighborhood aggregations.</p> + +<h4 id="common-graph-metrics">Common Graph Metrics</h4> +<p>These methods can be used to retrieve several graph metrics and properties, such as the number +of vertices, edges and the node degrees.</p> + +<h4 id="transformations">Transformations</h4> +<p>The transformation methods enable several Graph operations, using high-level functions similar to +the ones provided by the batch processing API. 
These transformations can be applied one after the +other, yielding a new Graph after each step, in a fashion similar to operators on DataSets:</p> + +<div class="highlight"><pre><code class="language-java"><span class="n">inputGraph</span><span class="o">.</span><span class="na">getUndirected</span><span class="o">().</span><span class="na">mapEdges</span><span class="o">(</span><span class="k">new</span> <span class="nf">CustomEdgeMapper</span><span class="o">());</span></code></pre></div> + +<p>Transformations can be applied on:</p> + +<ol> + <li><strong>Vertices</strong>: <code>mapVertices</code>, <code>joinWithVertices</code>, <code>filterOnVertices</code>, <code>addVertex</code>, â¦</li> + <li><strong>Edges</strong>: <code>mapEdges</code>, <code>filterOnEdges</code>, <code>removeEdge</code>, â¦</li> + <li><strong>Triplets</strong> (source vertex, target vertex, edge): <code>getTriplets</code></li> +</ol> + +<h4 id="neighborhood-aggregations">Neighborhood Aggregations</h4> + +<p>Neighborhood methods allow vertices to perform an aggregation on their first-hop neighborhood. +This provides a vertex-centric view, where each vertex can access its neighboring edges and neighbor values.</p> + +<p><code>reduceOnEdges()</code> provides access to the neighboring edges of a vertex, +i.e. the edge value and the vertex ID of the edge endpoint. In order to also access the +neighboring verticesâ values, one should call the <code>reduceOnNeighbors()</code> function. +The scope of the neighborhood is defined by the EdgeDirection parameter, which can be IN, OUT or ALL, +to gather in-coming, out-going or all edges (neighbors) of a vertex.</p> + +<p>The two neighborhood +functions mentioned above can only be used when the aggregation function is associative and commutative. +In case the function does not comply with these restrictions or if it is desirable to return zero, +one or more values per vertex, the more general <code>groupReduceOnEdges()</code> and +<code>groupReduceOnNeighbors()</code> functions must be called.</p> + +<p>Consider the following graph, for instance:</p> + +<center> +<img src="/img/blog/neighborhood.png" style="width:60%;margin:15px" /> +</center> + +<p>Assume you would want to compute the sum of the values of all incoming neighbors for each vertex. +We will call the <code>reduceOnNeighbors()</code> aggregation method since the sum is an associative and commutative operation and the neighborsâ values are needed:</p> + +<div class="highlight"><pre><code class="language-java"><span class="n">graph</span><span class="o">.</span><span class="na">reduceOnNeighbors</span><span class="o">(</span><span class="k">new</span> <span class="nf">SumValues</span><span class="o">(),</span> <span class="n">EdgeDirection</span><span class="o">.</span><span class="na">IN</span><span class="o">);</span></code></pre></div> + +<p>The vertex with id 1 is the only node that has no incoming edges. 
The result is therefore:</p> + +<center> +<img src="/img/blog/reduce-on-neighbors.png" style="width:90%;margin:15px" /> +</center> + +<p><a href="#top">Back to top</a></p> + +<h2 id="iterative-graph-processing">Iterative Graph Processing</h2> + +<p>During the past few years, many different programming models for distributed graph processing +have been introduced: <a href="http://delivery.acm.org/10.1145/2490000/2484843/a22-salihoglu.pdf?ip=141.23.53.206&amp;id=2484843&amp;acc=ACTIVE%20SERVICE&amp;key=2BA2C432AB83DA15.0F42380CB8DD3307.4D4702B0C3E38B35.4D4702B0C3E38B35&amp;CFID=706313474&amp;CFTOKEN=60107876&amp;__acm__=1440408958_b131e035942130653e5782409b5c0cde">vertex-centric</a>, +<a href="http://researcher.ibm.com/researcher/files/us-ytian/giraph++.pdf">partition-centric</a>, <a href="http://www.eecs.harvard.edu/cs261/notes/gonzalez-2012.htm">gather-apply-scatter</a>, +<a href="http://infoscience.epfl.ch/record/188535/files/paper.pdf">edge-centric</a>, <a href="http://www.vldb.org/pvldb/vol7/p1673-quamar.pdf">neighborhood-centric</a>. +Each of these models targets a specific class of graph applications, and each corresponding +system implementation optimizes the runtime accordingly. In Gelly, we would like to exploit the +flexible dataflow model and the efficient iterations of Flink to support multiple distributed +graph processing models on top of the same system.</p> + +<p>Currently, Gelly has methods for writing vertex-centric programs and provides support for programs +implemented using the gather-sum(accumulate)-apply model. We are also considering adding support +for the partition-centric computation model, using Flink's <code>mapPartition()</code> operator. +This model exposes the partition structure to the user and allows local graph structure exploitation +inside a partition to avoid unnecessary communication.</p> + +<h4 id="vertex-centric">Vertex-centric</h4> + +<p>Gelly wraps Flink's <a href="https://ci.apache.org/projects/flink/flink-docs-release-0.8/spargel_guide.html">Spargel API</a> to +support the vertex-centric, Pregel-like programming model. Gelly's <code>runVertexCentricIteration</code> method accepts two user-defined functions:</p> + +<ol> + <li><strong>MessagingFunction:</strong> defines what messages a vertex sends out for the next superstep.</li> + <li><strong>VertexUpdateFunction:</strong> defines how a vertex will update its value based on the received messages.</li> +</ol> + +<p>The method will execute the vertex-centric iteration on the input Graph and return a new Graph, with updated vertex values.</p> + +<p>Gelly's vertex-centric programming model exploits Flink's efficient delta iteration operators. +Many iterative graph algorithms exhibit non-uniform behavior, where some vertices converge to +their final value faster than others. In such cases, the number of vertices that need to be +recomputed during an iteration decreases as the algorithm moves towards convergence.</p> + +<p>For example, consider a Single Source Shortest Paths problem on the following graph, where S +is the source node, i is the iteration counter and the edge values represent distances between nodes:</p> + +<center> +<img src="/img/blog/sssp.png" style="width:90%;margin:15px" /> +</center> + +<p>In each iteration, a vertex receives distances from its neighbors and adopts the minimum of +these distances and its current distance as the new value. Then, it propagates its new value +to its neighbors. 
If a vertex does not change value during an iteration, there is no need for +it to propagate its old distance to its neighbors; as they have already taken it into account.</p> + +<p>Flinkâs <code>IterateDelta</code> operator permits exploitation of this property as well as the +execution of computations solely on the active parts of the graph. The operator receives two inputs:</p> + +<ol> + <li>the <strong>Solution Set</strong>, which represents the current state of the input and</li> + <li>the <strong>Workset</strong>, which determines which parts of the graph will be recomputed in the next iteration.</li> +</ol> + +<p>In the SSSP example above, the Workset contains the vertices which update their distances. +The user-defined iterative function is applied on these inputs to produce state updates. +These updates are efficiently applied on the state, which is kept in memory.</p> + +<center> +<img src="/img/blog/iteration.png" style="width:60%;margin:15px" /> +</center> + +<p>Internally, a vertex-centric iteration is a Flink delta iteration, where the initial Solution Set +is the vertex set of the input graph and the Workset is created by selecting the active vertices, +i.e. the ones that updated their value in the previous iteration. The messaging and vertex-update +functions are user-defined functions wrapped inside coGroup operators. In each superstep, +the active vertices (Workset) are coGrouped with the edges to generate the neighborhoods for +each vertex. The messaging function is then applied on each neighborhood. Next, the result of the +messaging function is coGrouped with the current vertex values (Solution Set) and the user-defined +vertex-update function is applied on the result. The output of this coGroup operator is finally +used to update the Solution Set and create the Workset input for the next iteration.</p> + +<center> +<img src="/img/blog/vertex-centric-plan.png" style="width:40%;margin:15px" /> +</center> + +<h4 id="gather-sum-apply">Gather-Sum-Apply</h4> + +<p>Gelly supports a variation of the popular Gather-Sum-Apply-Scatter computation model, +introduced by PowerGraph. In GSA, a vertex pulls information from its neighbors as opposed to the +vertex-centric approach where the updates are pushed from the incoming neighbors. +The <code>runGatherSumApplyIteration()</code> accepts three user-defined functions:</p> + +<ol> + <li><strong>GatherFunction:</strong> gathers neighboring partial values along in-edges.</li> + <li><strong>SumFunction:</strong> accumulates/reduces the values into a single one.</li> + <li><strong>ApplyFunction:</strong> uses the result computed in the sum phase to update the current vertexâs value.</li> +</ol> + +<p>Similarly to vertex-centric, GSA leverages Flinkâs delta iteration operators as, in many cases, +vertex values do not need to be recomputed during an iteration.</p> + +<p>Let us reconsider the Single Source Shortest Paths algorithm. In each iteration, a vertex:</p> + +<ol> + <li><strong>Gather</strong> retrieves distances from its neighbors summed up with the corresponding edge values;</li> + <li><strong>Sum</strong> compares the newly obtained distances in order to extract the minimum;</li> + <li><strong>Apply</strong> and finally adopts the minimum distance computed in the sum step, +provided that it is lower than its current value. 
If a vertexâs value does not change during +an iteration, it no longer propagates its distance.</li> +</ol> + +<p>Internally, a Gather-Sum-Apply Iteration is a Flink delta iteration where the initial solution +set is the vertex input set and the workset is created by selecting the active vertices.</p> + +<p>The three functions: gather, sum and apply are user-defined functions wrapped in map, reduce +and join operators respectively. In each superstep, the active vertices are joined with the +edges in order to create neighborhoods for each vertex. The gather function is then applied on +the neighborhood values via a map function. Afterwards, the result is grouped by the vertex ID +and reduced using the sum function. Finally, the outcome of the sum phase is joined with the +current vertex values (solution set), the values are updated, thus creating a new workset that +serves as input for the next iteration.</p> + +<center> +<img src="/img/blog/GSA-plan.png" style="width:40%;margin:15px" /> +</center> + +<p><a href="#top">Back to top</a></p> + +<h2 id="library-of-graph-algorithms">Library of Graph Algorithms</h2> + +<p>We are building a library of graph algorithms in Gelly, to easily analyze large-scale graphs. +These algorithms extend the <code>GraphAlgorithm</code> interface and can be simply executed on +the input graph by calling a <code>run()</code> method.</p> + +<p>We currently have implementations of the following algorithms:</p> + +<ol> + <li>PageRank</li> + <li>Single-Source-Shortest-Paths</li> + <li>Label Propagation</li> + <li>Community Detection (based on <a href="http://arxiv.org/pdf/0808.2633.pdf">this paper</a>)</li> + <li>Connected Components</li> + <li>GSA Connected Components</li> + <li>GSA PageRank</li> + <li>GSA Single-Source-Shortest-Paths</li> +</ol> + +<p>Gelly also offers implementations of common graph algorithms through <a href="https://github.com/apache/flink/tree/master/flink-staging/flink-gelly/src/main/java/org/apache/flink/graph/example">examples</a>. +Among them, one can find graph weighting schemes, like Jaccard Similarity and Euclidean Distance Weighting, +as well as computation of common graph metrics.</p> + +<p><a href="#top">Back to top</a></p> + +<h2 id="use-case-music-profiles">Use-Case: Music Profiles</h2> + +<p>In the following section, we go through a use-case scenario that combines the Flink DataSet API +with Gelly in order to process usersâ music preferences to suggest additions to their playlist.</p> + +<p>First, we read a userâs music profile which is in the form of user-id, song-id and the number of +plays that each song has. We then filter out the list of songs the users do not wish to see in their +playlist. Then we compute the top songs per user (i.e. the songs a user listened to the most). +Finally, as a separate use-case on the same data set, we create a user-user similarity graph based +on the common songs and use this resulting graph to detect communities by calling Gellyâs Label Propagation +library method.</p> + +<p>For running the example implementation, please use the 0.10-SNAPSHOT version of Flink as a +dependency. The full example code base can be found <a href="https://github.com/apache/flink/blob/master/flink-staging/flink-gelly/src/main/java/org/apache/flink/graph/example/MusicProfiles.java">here</a>. The public data set used for testing +can be found <a href="http://labrosa.ee.columbia.edu/millionsong/tasteprofile">here</a>. 
This data set contains <strong>48,373,586</strong> real user-id, song-id and +play-count triplets.</p> + +<p><strong>Note:</strong> The code snippets in this post try to reduce verbosity by skipping type parameters of generic functions. Please have a look at <a href="https://github.com/apache/flink/blob/master/flink-staging/flink-gelly/src/main/java/org/apache/flink/graph/example/MusicProfiles.java">the full example</a> for the correct and complete code.</p> + +<h4 id="filtering-out-bad-records">Filtering out Bad Records</h4> + +<p>After reading the <code>(user-id, song-id, play-count)</code> triplets from a CSV file and after parsing a +text file in order to retrieve the list of songs that a user would not want to include in a +playlist, we use a coGroup function to filter out the mismatches.</p> + +<div class="highlight"><pre><code class="language-java"><span class="c1">// read the user-song-play triplets.</span> +<span class="n">DataSet</span><span class="o">&lt;</span><span class="n">Tuple3</span><span class="o">&lt;</span><span class="n">String</span><span class="o">,</span> <span class="n">String</span><span class="o">,</span> <span class="n">Integer</span><span class="o">&gt;&gt;</span> <span class="n">triplets</span> <span class="o">=</span> + <span class="n">getUserSongTripletsData</span><span class="o">(</span><span class="n">env</span><span class="o">);</span> + +<span class="c1">// read the mismatches dataset and extract the songIDs</span> +<span class="n">DataSet</span><span class="o">&lt;</span><span class="n">Tuple3</span><span class="o">&lt;</span><span class="n">String</span><span class="o">,</span> <span class="n">String</span><span class="o">,</span> <span class="n">Integer</span><span class="o">&gt;&gt;</span> <span class="n">validTriplets</span> <span class="o">=</span> <span class="n">triplets</span> + <span class="o">.</span><span class="na">coGroup</span><span class="o">(</span><span class="n">mismatches</span><span class="o">).</span><span class="na">where</span><span class="o">(</span><span class="mi">1</span><span class="o">).</span><span class="na">equalTo</span><span class="o">(</span><span class="mi">0</span><span class="o">)</span> + <span class="o">.</span><span class="na">with</span><span class="o">(</span><span class="k">new</span> <span class="nf">CoGroupFunction</span><span class="o">()</span> <span class="o">{</span> + <span class="kt">void</span> <span class="nf">coGroup</span><span class="o">(</span><span class="n">Iterable</span> <span class="n">triplets</span><span class="o">,</span> <span class="n">Iterable</span> <span class="n">invalidSongs</span><span class="o">,</span> <span class="n">Collector</span> <span class="n">out</span><span class="o">)</span> <span class="o">{</span> + <span class="k">if</span> <span class="o">(!</span><span class="n">invalidSongs</span><span class="o">.</span><span class="na">iterator</span><span class="o">().</span><span class="na">hasNext</span><span class="o">())</span> <span class="o">{</span> + <span class="k">for</span> <span class="o">(</span><span class="n">Tuple3</span> <span class="n">triplet</span> <span class="o">:</span> <span class="n">triplets</span><span class="o">)</span> <span class="o">{</span> <span class="c1">// valid triplet</span> + <span class="n">out</span><span class="o">.</span><span class="na">collect</span><span class="o">(</span><span class="n">triplet</span><span class="o">);</span> + <span class="o">}</span> + <span class="o">}</span> + <span class="o">}</span> + <span 
class="o">}</span></code></pre></div> + +<p>The coGroup simply takes the triplets whose song-id (second field) matches the song-id from the +mismatches list (first field) and if the iterator was empty for a certain triplet, meaning that +there were no mismatches found, the triplet associated with that song is collected.</p> + +<h4 id="compute-the-top-songs-per-user">Compute the Top Songs per User</h4> + +<p>As a next step, we would like to see which songs a user played more often. To this end, we +build a user-song weighted, bipartite graph in which edge source vertices are users, edge target +vertices are songs and where the weight represents the number of times the user listened to that +certain song.</p> + +<center> +<img src="/img/blog/user-song-graph.png" style="width:90%;margin:15px" /> +</center> + +<div class="highlight"><pre><code class="language-java"><span class="c1">// create a user -&gt; song weighted bipartite graph where the edge weights</span> +<span class="c1">// correspond to play counts</span> +<span class="n">Graph</span><span class="o">&lt;</span><span class="n">String</span><span class="o">,</span> <span class="n">NullValue</span><span class="o">,</span> <span class="n">Integer</span><span class="o">&gt;</span> <span class="n">userSongGraph</span> <span class="o">=</span> <span class="n">Graph</span><span class="o">.</span><span class="na">fromTupleDataSet</span><span class="o">(</span><span class="n">validTriplets</span><span class="o">,</span> <span class="n">env</span><span class="o">);</span></code></pre></div> + +<p>Consult the <a href="https://ci.apache.org/projects/flink/flink-docs-master/libs/gelly_guide.html">Gelly guide</a> for guidelines +on how to create a graph from a given DataSet of edges or from a collection.</p> + +<p>To retrieve the top songs per user, we call the groupReduceOnEdges function as it perform an +aggregation over the first hop neighborhood taking just the edges into consideration. 
We will +basically iterate through the edge value and collect the target (song) of the maximum weight edge.</p> + +<div class="highlight"><pre><code class="language-java"><span class="c1">//get the top track (most listened to) for each user</span> +<span class="n">DataSet</span><span class="o">&lt;</span><span class="n">Tuple2</span><span class="o">&gt;</span> <span class="n">usersWithTopTrack</span> <span class="o">=</span> <span class="n">userSongGraph</span> + <span class="o">.</span><span class="na">groupReduceOnEdges</span><span class="o">(</span><span class="k">new</span> <span class="nf">GetTopSongPerUser</span><span class="o">(),</span> <span class="n">EdgeDirection</span><span class="o">.</span><span class="na">OUT</span><span class="o">);</span> + +<span class="kd">class</span> <span class="nc">GetTopSongPerUser</span> <span class="kd">implements</span> <span class="n">EdgesFunctionWithVertexValue</span> <span class="o">{</span> + <span class="kt">void</span> <span class="nf">iterateEdges</span><span class="o">(</span><span class="n">Vertex</span> <span class="n">vertex</span><span class="o">,</span> <span class="n">Iterable</span><span class="o">&lt;</span><span class="n">Edge</span><span class="o">&gt;</span> <span class="n">edges</span><span class="o">)</span> <span class="o">{</span> + <span class="kt">int</span> <span class="n">maxPlaycount</span> <span class="o">=</span> <span class="mi">0</span><span class="o">;</span> + <span class="n">String</span> <span class="n">topSong</span> <span class="o">=</span> <span class="s">&quot;&quot;</span><span class="o">;</span> + + <span class="k">for</span> <span class="o">(</span><span class="n">Edge</span> <span class="n">edge</span> <span class="o">:</span> <span class="n">edges</span><span class="o">)</span> <span class="o">{</span> + <span class="k">if</span> <span class="o">(</span><span class="n">edge</span><span class="o">.</span><span class="na">getValue</span><span class="o">()</span> <span class="o">&gt;</span> <span class="n">maxPlaycount</span><span class="o">)</span> <span class="o">{</span> + <span class="n">maxPlaycount</span> <span class="o">=</span> <span class="n">edge</span><span class="o">.</span><span class="na">getValue</span><span class="o">();</span> + <span class="n">topSong</span> <span class="o">=</span> <span class="n">edge</span><span class="o">.</span><span class="na">getTarget</span><span class="o">();</span> + <span class="o">}</span> + <span class="o">}</span> + <span class="k">return</span> <span class="k">new</span> <span class="nf">Tuple2</span><span class="o">(</span><span class="n">vertex</span><span class="o">.</span><span class="na">getId</span><span class="o">(),</span> <span class="n">topSong</span><span class="o">);</span> + <span class="o">}</span> +<span class="o">}</span></code></pre></div> + +<h4 id="creating-a-user-user-similarity-graph">Creating a User-User Similarity Graph</h4> + +<p>Clustering users based on common interests, in this case, common top songs, could prove to be +very useful for advertisements or for recommending new musical compilations. 
In a user-user graph, +two users who listen to the same song will simply be linked together through an edge as depicted +in the figure below.</p> + +<center> +<img src="/img/blog/user-song-to-user-user.png" style="width:90%;margin:15px" /> +</center> + +<p>To form the user-user graph in Flink, we will simply take the edges from the user-song graph +(left-hand side of the image), group them by song-id, and then add all the users (source vertex ids) +to an ArrayList.</p> + +<p>We then match users who listened to the same song two by two, creating a new edge to mark their +common interest (right-hand side of the image).</p> + +<p>Afterwards, we perform a <code>distinct()</code> operation to avoid creation of duplicate data. +Considering that we now have the DataSet of edges which present interest, creating a graph is as +straightforward as a call to the <code>Graph.fromDataSet()</code> method.</p> + +<div class="highlight"><pre><code class="language-java"><span class="c1">// create a user-user similarity graph:</span> +<span class="c1">// two users that listen to the same song are connected</span> +<span class="n">DataSet</span><span class="o">&lt;</span><span class="n">Edge</span><span class="o">&gt;</span> <span class="n">similarUsers</span> <span class="o">=</span> <span class="n">userSongGraph</span><span class="o">.</span><span class="na">getEdges</span><span class="o">()</span> + <span class="c1">// filter out user-song edges that are below the playcount threshold</span> + <span class="o">.</span><span class="na">filter</span><span class="o">(</span><span class="k">new</span> <span class="n">FilterFunction</span><span class="o">&lt;</span><span class="n">Edge</span><span class="o">&lt;</span><span class="n">String</span><span class="o">,</span> <span class="n">Integer</span><span class="o">&gt;&gt;()</span> <span class="o">{</span> + <span class="kd">public</span> <span class="kt">boolean</span> <span class="nf">filter</span><span class="o">(</span><span class="n">Edge</span><span class="o">&lt;</span><span class="n">String</span><span class="o">,</span> <span class="n">Integer</span><span class="o">&gt;</span> <span class="n">edge</span><span class="o">)</span> <span class="o">{</span> + <span class="k">return</span> <span class="o">(</span><span class="n">edge</span><span class="o">.</span><span class="na">getValue</span><span class="o">()</span> <span class="o">&gt;</span> <span class="n">playcountThreshold</span><span class="o">);</span> + <span class="o">}</span> + <span class="o">})</span> + <span class="o">.</span><span class="na">groupBy</span><span class="o">(</span><span class="mi">1</span><span class="o">)</span> + <span class="o">.</span><span class="na">reduceGroup</span><span class="o">(</span><span class="k">new</span> <span class="nf">GroupReduceFunction</span><span class="o">()</span> <span class="o">{</span> + <span class="kt">void</span> <span class="nf">reduce</span><span class="o">(</span><span class="n">Iterable</span><span class="o">&lt;</span><span class="n">Edge</span><span class="o">&gt;</span> <span class="n">edges</span><span class="o">,</span> <span class="n">Collector</span><span class="o">&lt;</span><span class="n">Edge</span><span class="o">&gt;</span> <span class="n">out</span><span class="o">)</span> <span class="o">{</span> + <span class="n">List</span> <span class="n">users</span> <span class="o">=</span> <span class="k">new</span> <span class="nf">ArrayList</span><span class="o">();</span> + <span class="k">for</span> <span class="o">(</span><span 
class="n">Edge</span> <span class="n">edge</span> <span class="o">:</span> <span class="n">edges</span><span class="o">)</span> <span class="o">{</span> + <span class="n">users</span><span class="o">.</span><span class="na">add</span><span class="o">(</span><span class="n">edge</span><span class="o">.</span><span class="na">getSource</span><span class="o">());</span> + <span class="k">for</span> <span class="o">(</span><span class="kt">int</span> <span class="n">i</span> <span class="o">=</span> <span class="mi">0</span><span class="o">;</span> <span class="n">i</span> <span class="o">&lt;</span> <span class="n">users</span><span class="o">.</span><span class="na">size</span><span class="o">()</span> <span class="o">-</span> <span class="mi">1</span><span class="o">;</span> <span class="n">i</span><span class="o">++)</span> <span class="o">{</span> + <span class="k">for</span> <span class="o">(</span><span class="kt">int</span> <span class="n">j</span> <span class="o">=</span> <span class="n">i</span><span class="o">+</span><span class="mi">1</span><span class="o">;</span> <span class="n">j</span> <span class="o">&lt;</span> <span class="n">users</span><span class="o">.</span><span class="na">size</span><span class="o">()</span> <span class="o">-</span> <span class="mi">1</span><span class="o">;</span> <span class="n">j</span><span class="o">++)</span> <span clas s="o">{</span> + <span class="n">out</span><span class="o">.</span><span class="na">collect</span><span class="o">(</span><span class="k">new</span> <span class="nf">Edge</span><span class="o">(</span><span class="n">users</span><span class="o">.</span><span class="na">get</span><span class="o">(</span><span class="n">i</span><span class="o">),</span> <span class="n">users</span><span class="o">.</span><span class="na">get</span><span class="o">(</span><span class="n">j</span><span class="o">)));</span> + <span class="o">}</span> + <span class="o">}</span> + <span class="o">}</span> + <span class="o">}</span> + <span class="o">})</span> + <span class="o">.</span><span class="na">distinct</span><span class="o">();</span> + +<span class="n">Graph</span> <span class="n">similarUsersGraph</span> <span class="o">=</span> <span class="n">Graph</span><span class="o">.</span><span class="na">fromDataSet</span><span class="o">(</span><span class="n">similarUsers</span><span class="o">).</span><span class="na">getUndirected</span><span class="o">();</span></code></pre></div> + +<p>After having created a user-user graph, it would make sense to detect the various communities +formed. To do so, we first initialize each vertex with a numeric label using the +<code>joinWithVertices()</code> function that takes a data set of Tuple2 as a parameter and joins +the id of a vertex with the first element of the tuple, afterwards applying a map function. +Finally, we call the <code>run()</code> method with the LabelPropagation library method passed +as a parameter. 
In the end, the vertices will be updated to contain the most frequent label +among their neighbors.</p> + +<div class="highlight"><pre><code class="language-java"><span class="c1">// detect user communities using label propagation</span> +<span class="c1">// initialize each vertex with a unique numeric label</span> +<span class="n">DataSet</span><span class="o">&lt;</span><span class="n">Tuple2</span><span class="o">&lt;</span><span class="n">String</span><span class="o">,</span> <span class="n">Long</span><span class="o">&gt;&gt;</span> <span class="n">idsWithInitialLabels</span> <span class="o">=</span> <span class="n">DataSetUtils</span> + <span class="o">.</span><span class="na">zipWithUniqueId</span><span class="o">(</span><span class="n">similarUsersGraph</span><span class="o">.</span><span class="na">getVertexIds</span><span class="o">())</span> + <span class="o">.</span><span class="na">map</span><span class="o">(</span><span class="k">new</span> <span class="n">MapFunction</span><span class="o">&lt;</span><span class="n">Tuple2</span><span class="o">&lt;</span><span class="n">Long</span><span class="o">,</span> <span class="n">String</span><span class="o">&gt;,</span> <span class="n">Tuple2</span><span class="o">&lt;</span><span class="n">String</span><span class="o">,</span> <span class="n">Long</span><span class="o">&gt;&gt;()</span> <span class="o">{</span> + <span class="nd">@Override</span> + <span class="kd">public</span> <span class="n">Tuple2</span><span class="o">&lt;</span><span class="n">String</span><span class="o">,</span> <span class="n">Long</span><span class="o">&gt;</span> <span class="nf">map</span><span class="o">(</span><span class="n">Tuple2</span><span class="o">&lt;</span><span class="n">Long</span><span class="o">,</span> <span class="n">String</span><span class="o">&gt;</span> <span class="n">tuple2</span><span class="o">)</span> <span class="kd">throws</span> <span class="n">Exception</span> <span class="o">{</span> + <span class="k">return</span> <span class="k">new</span> <span class="n">Tuple2</span><span class="o">&lt;</span><span class="n">String</span><span class="o">,</span> <span class="n">Long</span><span class="o">&gt;(</span><span class="n">tuple2</span><span class="o">.</span><span class="na">f1</span><span class="o">,</span> <span class="n">tuple2</span><span class="o">.</span><span class="na">f0</span><span class="o">);</span> + <span class="o">}</span> + <span class="o">});</span> + +<span class="c1">// update the vertex values and run the label propagation algorithm</span> +<span class="n">DataSet</span><span class="o">&lt;</span><span class="n">Vertex</span><span class="o">&gt;</span> <span class="n">verticesWithCommunity</span> <span class="o">=</span> <span class="n">similarUsersGraph</span> + <span class="o">.</span><span class="na">joinWithVertices</span><span class="o">(</span><span class="n">idsWithInitialLabels</span><span class="o">,</span> <span class="k">new</span> <span class="nf">MapFunction</span><span class="o">()</span> <span class="o">{</span> + <span class="kd">public</span> <span class="n">Long</span> <span class="nf">map</span><span class="o">(</span><span class="n">Tuple2</span> <span class="n">idWithLabel</span><span class="o">)</span> <span class="o">{</span> + <span class="k">return</span> <span class="n">idWithLabel</span><span class="o">.</span><span class="na">f1</span><span class="o">;</span> + <span class="o">}</span> + <span class="o">})</span> + <span class="o">.</span><span class="na">run</span><span 
class="o">(</span><span class="k">new</span> <span class="nf">LabelPropagation</span><span class="o">(</span><span class="n">numIterations</span><span class="o">))</span> + <span class="o">.</span><span class="na">getVertices</span><span class="o">();</span></code></pre></div> + +<p><a href="#top">Back to top</a></p> + +<h2 id="ongoing-and-future-work">Ongoing and Future Work</h2> + +<p>Currently, Gelly matches the basic functionalities provided by most state-of-the-art graph +processing systems. Our vision is to turn Gelly into more than âyet another library for running +PageRank-like algorithmsâ by supporting generic iterations, implementing graph partitioning, +providing bipartite graph support and by offering numerous other features.</p> + +<p>We are also enriching Flink Gelly with a set of operators suitable for highly skewed graphs +as well as a Graph API built on Flink Streaming.</p> + +<p>In the near future, we would like to see how Gelly can be integrated with graph visualization +tools, graph database systems and sampling techniques.</p> + +<p>Curious? Read more about our plans for Gelly in the <a href="https://cwiki.apache.org/confluence/display/FLINK/Flink+Gelly">roadmap</a>.</p> + +<p><a href="#top">Back to top</a></p> + +<h2 id="links">Links</h2> +<p><a href="https://ci.apache.org/projects/flink/flink-docs-master/libs/gelly_guide.html">Gelly Documentation</a></p> +</description> +<pubDate>Mon, 24 Aug 2015 00:00:00 +0200</pubDate> +<link>http://flink.apache.org/news/2015/08/24/introducing-flink-gelly.html</link> +<guid isPermaLink="true">/news/2015/08/24/introducing-flink-gelly.html</guid> +</item> + +<item> <title>Announcing Apache Flink 0.9.0</title> <description><p>The Apache Flink community is pleased to announce the availability of the 0.9.0 release. The release is the result of many months of hard work within the Flink community. It contains many new features and improvements which were previewed in the 0.9.0-milestone1 release and have been polished since then. This is the largest Flink release so far.</p> http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/blog/index.html ---------------------------------------------------------------------- diff --git a/content/blog/index.html b/content/blog/index.html index f377375..375548b 100644 --- a/content/blog/index.html +++ b/content/blog/index.html @@ -146,6 +146,22 @@ <!-- Blog posts --> <article> + <h2 class="blog-title"><a href="/news/2015/08/24/introducing-flink-gelly.html">Introducing Gelly: Graph Processing with Apache Flink</a></h2> + <p>24 Aug 2015</p> + + <p><p>This blog post introduces <strong>Gelly</strong>, Apache Flinkâs <em>graph-processing API and library</em>. Flinkâs native support +for iterations makes it a suitable platform for large-scale graph analytics. +By leveraging delta iterations, Gelly is able to map various graph processing models such as +vertex-centric or gather-sum-apply to Flink dataflows.</p> + +</p> + + <p><a href="/news/2015/08/24/introducing-flink-gelly.html">Continue reading »</a></p> + </article> + + <hr> + + <article> <h2 class="blog-title"><a href="/news/2015/06/24/announcing-apache-flink-0.9.0-release.html">Announcing Apache Flink 0.9.0</a></h2> <p>24 Jun 2015</p> @@ -269,19 +285,6 @@ and offers a new API including definition of flexible windows.</p> <hr> - <article> - <h2 class="blog-title"><a href="/news/2015/01/21/release-0.8.html">Apache Flink 0.8.0 available</a></h2> - <p>21 Jan 2015</p> - - <p><p>We are pleased to announce the availability of Flink 0.8.0. 
This release includes new user-facing features as well as performance and bug fixes, extends the support for filesystems and introduces the Scala API and flexible windowing semantics for Flink Streaming. A total of 33 people have contributed to this release, a big thanks to all of them!</p> - -</p> - - <p><a href="/news/2015/01/21/release-0.8.html">Continue reading »</a></p> - </article> - - <hr> - <!-- Pagination links --> @@ -314,6 +317,16 @@ and offers a new API including definition of flexible windows.</p> <ul id="markdown-toc"> + <li><a href="/news/2015/08/24/introducing-flink-gelly.html">Introducing Gelly: Graph Processing with Apache Flink</a></li> + + + + + + + + + <li><a href="/news/2015/06/24/announcing-apache-flink-0.9.0-release.html">Announcing Apache Flink 0.9.0</a></li> http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/blog/page2/index.html ---------------------------------------------------------------------- diff --git a/content/blog/page2/index.html b/content/blog/page2/index.html index 609a749..d46308c 100644 --- a/content/blog/page2/index.html +++ b/content/blog/page2/index.html @@ -146,6 +146,19 @@ <!-- Blog posts --> <article> + <h2 class="blog-title"><a href="/news/2015/01/21/release-0.8.html">Apache Flink 0.8.0 available</a></h2> + <p>21 Jan 2015</p> + + <p><p>We are pleased to announce the availability of Flink 0.8.0. This release includes new user-facing features as well as performance and bug fixes, extends the support for filesystems and introduces the Scala API and flexible windowing semantics for Flink Streaming. A total of 33 people have contributed to this release, a big thanks to all of them!</p> + +</p> + + <p><a href="/news/2015/01/21/release-0.8.html">Continue reading »</a></p> + </article> + + <hr> + + <article> <h2 class="blog-title"><a href="/news/2015/01/06/december-in-flink.html">December 2014 in the Flink community</a></h2> <p>06 Jan 2015</p> @@ -265,19 +278,6 @@ academic and open source project that Flink originates from.</p> <hr> - <article> - <h2 class="blog-title"><a href="/news/2014/02/18/amazon-elastic-mapreduce-cloud-yarn.html">Use Stratosphere with Amazon Elastic MapReduce</a></h2> - <p>18 Feb 2014 by Robert Metzger (<a href="https://twitter.com/rmetzger_">@rmetzger_</a>)</p> - - <p><div class="lead">Get started with Stratosphere within 10 minutes using Amazon Elastic MapReduce.</div> - -</p> - - <p><a href="/news/2014/02/18/amazon-elastic-mapreduce-cloud-yarn.html">Continue reading »</a></p> - </article> - - <hr> - <!-- Pagination links --> @@ -310,6 +310,16 @@ academic and open source project that Flink originates from.</p> <ul id="markdown-toc"> + <li><a href="/news/2015/08/24/introducing-flink-gelly.html">Introducing Gelly: Graph Processing with Apache Flink</a></li> + + + + + + + + + <li><a href="/news/2015/06/24/announcing-apache-flink-0.9.0-release.html">Announcing Apache Flink 0.9.0</a></li> http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/blog/page3/index.html ---------------------------------------------------------------------- diff --git a/content/blog/page3/index.html b/content/blog/page3/index.html index fbecfe7..9045eed 100644 --- a/content/blog/page3/index.html +++ b/content/blog/page3/index.html @@ -146,6 +146,19 @@ <!-- Blog posts --> <article> + <h2 class="blog-title"><a href="/news/2014/02/18/amazon-elastic-mapreduce-cloud-yarn.html">Use Stratosphere with Amazon Elastic MapReduce</a></h2> + <p>18 Feb 2014 by Robert Metzger (<a 
href="https://twitter.com/rmetzger_">@rmetzger_</a>)</p> + + <p><div class="lead">Get started with Stratosphere within 10 minutes using Amazon Elastic MapReduce.</div> + +</p> + + <p><a href="/news/2014/02/18/amazon-elastic-mapreduce-cloud-yarn.html">Continue reading »</a></p> + </article> + + <hr> + + <article> <h2 class="blog-title"><a href="/news/2014/01/28/querying_mongodb.html">Accessing Data Stored in MongoDB with Stratosphere</a></h2> <p>28 Jan 2014 by Robert Metzger (<a href="https://twitter.com/rmetzger_">@rmetzger_</a>)</p> @@ -282,25 +295,6 @@ The demo shows how static code analysis can be leveraged to reordered UDF operat <hr> - <article> - <h2 class="blog-title"><a href="/news/2012/11/12/btw2013demo.html">Stratosphere Demo Paper Accepted for BTW 2013</a></h2> - <p>12 Nov 2012</p> - - <p> <p>Our demo submission<br /> -<strong><cite>"Applying Stratosphere for Big Data Analytics"</cite></strong><br /> -has been accepted for BTW 2013 in Magdeburg, Germany.<br /> -The demo focuses on Stratosphere's query language Meteor, which has been presented in our paper <cite>"Meteor/Sopremo: An Extensible Query Language and Operator Model"</cite> <a href="/assets/papers/Sopremo_Meteor BigData.pdf">[pdf]</a> at the BigData workshop associated with VLDB 2012 in Istanbul.</p> -<p>Visit our demo, and talk to us if you are going to attend BTW 2013.</p> -<p><strong>Abstract:</strong><br /> -Analyzing big data sets as they occur in modern business and science applications requires query languages that allow for the specification of complex data processing tasks. Moreover, these ideally declarative query specifications have to be optimized, parallelized and scheduled for processing on massively parallel data processing platforms. This paper demonstrates the application of Stratosphere to different kinds of Big Data Analytics tasks. Using examples from different application domains, we show how to formulate analytical tasks as Meteor queries and execute them with Stratosphere. 
These examples include data cleansing and information extraction tasks, and a correlation analysis of microblogging and stock trade volume data that we describe in detail in this paper.</p> - -</p> - - <p><a href="/news/2012/11/12/btw2013demo.html">Continue reading »</a></p> - </article> - - <hr> - <!-- Pagination links --> @@ -333,6 +327,16 @@ Analyzing big data sets as they occur in modern business and science application <ul id="markdown-toc"> + <li><a href="/news/2015/08/24/introducing-flink-gelly.html">Introducing Gelly: Graph Processing with Apache Flink</a></li> + + + + + + + + + <li><a href="/news/2015/06/24/announcing-apache-flink-0.9.0-release.html">Announcing Apache Flink 0.9.0</a></li> http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/blog/page4/index.html ---------------------------------------------------------------------- diff --git a/content/blog/page4/index.html b/content/blog/page4/index.html index 98f552b..28bbe90 100644 --- a/content/blog/page4/index.html +++ b/content/blog/page4/index.html @@ -146,6 +146,25 @@ <!-- Blog posts --> <article> + <h2 class="blog-title"><a href="/news/2012/11/12/btw2013demo.html">Stratosphere Demo Paper Accepted for BTW 2013</a></h2> + <p>12 Nov 2012</p> + + <p> <p>Our demo submission<br /> +<strong><cite>"Applying Stratosphere for Big Data Analytics"</cite></strong><br /> +has been accepted for BTW 2013 in Magdeburg, Germany.<br /> +The demo focuses on Stratosphere's query language Meteor, which has been presented in our paper <cite>"Meteor/Sopremo: An Extensible Query Language and Operator Model"</cite> <a href="/assets/papers/Sopremo_Meteor BigData.pdf">[pdf]</a> at the BigData workshop associated with VLDB 2012 in Istanbul.</p> +<p>Visit our demo, and talk to us if you are going to attend BTW 2013.</p> +<p><strong>Abstract:</strong><br /> +Analyzing big data sets as they occur in modern business and science applications requires query languages that allow for the specification of complex data processing tasks. Moreover, these ideally declarative query specifications have to be optimized, parallelized and scheduled for processing on massively parallel data processing platforms. This paper demonstrates the application of Stratosphere to different kinds of Big Data Analytics tasks. Using examples from different application domains, we show how to formulate analytical tasks as Meteor queries and execute them with Stratosphere. 
These examples include data cleansing and information extraction tasks, and a correlation analysis of microblogging and stock trade volume data that we describe in detail in this paper.</p> + +</p> + + <p><a href="/news/2012/11/12/btw2013demo.html">Continue reading »</a></p> + </article> + + <hr> + + <article> <h2 class="blog-title"><a href="/news/2012/10/15/icde2013.html">Stratosphere Demo Accepted for ICDE 2013</a></h2> <p>15 Oct 2012</p> @@ -220,6 +239,16 @@ We demonstrate our optimizer and a job submission client that allows users to pe <ul id="markdown-toc"> + <li><a href="/news/2015/08/24/introducing-flink-gelly.html">Introducing Gelly: Graph Processing with Apache Flink</a></li> + + + + + + + + + <li><a href="/news/2015/06/24/announcing-apache-flink-0.9.0-release.html">Announcing Apache Flink 0.9.0</a></li> http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/community.html ---------------------------------------------------------------------- diff --git a/content/community.html b/content/community.html index 1a05654..8c51e5e 100644 --- a/content/community.html +++ b/content/community.html @@ -147,17 +147,17 @@ <div class="page-toc"> <ul id="markdown-toc"> - <li><a href="#mailing-lists">Mailing Lists</a></li> - <li><a href="#irc">IRC</a></li> - <li><a href="#stack-overflow">Stack Overflow</a></li> - <li><a href="#issue-tracker">Issue Tracker</a></li> - <li><a href="#source-code">Source Code</a> <ul> - <li><a href="#main-source-repositories">Main source repositories</a></li> - <li><a href="#website-repositories">Website repositories</a></li> + <li><a href="#mailing-lists" id="markdown-toc-mailing-lists">Mailing Lists</a></li> + <li><a href="#irc" id="markdown-toc-irc">IRC</a></li> + <li><a href="#stack-overflow" id="markdown-toc-stack-overflow">Stack Overflow</a></li> + <li><a href="#issue-tracker" id="markdown-toc-issue-tracker">Issue Tracker</a></li> + <li><a href="#source-code" id="markdown-toc-source-code">Source Code</a> <ul> + <li><a href="#main-source-repositories" id="markdown-toc-main-source-repositories">Main source repositories</a></li> + <li><a href="#website-repositories" id="markdown-toc-website-repositories">Website repositories</a></li> </ul> </li> - <li><a href="#people">People</a></li> - <li><a href="#former-mentors">Former mentors</a></li> + <li><a href="#people" id="markdown-toc-people">People</a></li> + <li><a href="#former-mentors" id="markdown-toc-former-mentors">Former mentors</a></li> </ul> </div> http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/downloads.html ---------------------------------------------------------------------- diff --git a/content/downloads.html b/content/downloads.html index 26167a6..566da2e 100644 --- a/content/downloads.html +++ b/content/downloads.html @@ -156,9 +156,9 @@ $( document ).ready(function() { <div class="page-toc"> <ul id="markdown-toc"> - <li><a href="#latest-stable-release-v090">Latest stable release (v0.9.0)</a></li> - <li><a href="#maven-dependencies">Maven Dependencies</a></li> - <li><a href="#all-releases">All releases</a></li> + <li><a href="#latest-stable-release-v090" id="markdown-toc-latest-stable-release-v090">Latest stable release (v0.9.0)</a></li> + <li><a href="#maven-dependencies" id="markdown-toc-maven-dependencies">Maven Dependencies</a></li> + <li><a href="#all-releases" id="markdown-toc-all-releases">All releases</a></li> </ul> </div> http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/faq.html 
---------------------------------------------------------------------- diff --git a/content/faq.html b/content/faq.html index 2da1ab4..187ff30 100644 --- a/content/faq.html +++ b/content/faq.html @@ -166,41 +166,41 @@ under the License. <div class="page-toc"> <ul id="markdown-toc"> - <li><a href="#general">General</a> <ul> - <li><a href="#is-flink-a-hadoop-project">Is Flink a Hadoop Project?</a></li> - <li><a href="#do-i-have-to-install-apache-hadoop-to-use-flink">Do I have to install Apache Hadoop to use Flink?</a></li> + <li><a href="#general" id="markdown-toc-general">General</a> <ul> + <li><a href="#is-flink-a-hadoop-project" id="markdown-toc-is-flink-a-hadoop-project">Is Flink a Hadoop Project?</a></li> + <li><a href="#do-i-have-to-install-apache-hadoop-to-use-flink" id="markdown-toc-do-i-have-to-install-apache-hadoop-to-use-flink">Do I have to install Apache Hadoop to use Flink?</a></li> </ul> </li> - <li><a href="#usage">Usage</a> <ul> - <li><a href="#how-do-i-assess-the-progress-of-a-flink-program">How do I assess the progress of a Flink program?</a></li> - <li><a href="#how-can-i-figure-out-why-a-program-failed">How can I figure out why a program failed?</a></li> - <li><a href="#how-do-i-debug-flink-programs">How do I debug Flink programs?</a></li> - <li><a href="#what-is-the-parallelism-how-do-i-set-it">What is the parallelism? How do I set it?</a></li> + <li><a href="#usage" id="markdown-toc-usage">Usage</a> <ul> + <li><a href="#how-do-i-assess-the-progress-of-a-flink-program" id="markdown-toc-how-do-i-assess-the-progress-of-a-flink-program">How do I assess the progress of a Flink program?</a></li> + <li><a href="#how-can-i-figure-out-why-a-program-failed" id="markdown-toc-how-can-i-figure-out-why-a-program-failed">How can I figure out why a program failed?</a></li> + <li><a href="#how-do-i-debug-flink-programs" id="markdown-toc-how-do-i-debug-flink-programs">How do I debug Flink programs?</a></li> + <li><a href="#what-is-the-parallelism-how-do-i-set-it" id="markdown-toc-what-is-the-parallelism-how-do-i-set-it">What is the parallelism? How do I set it?</a></li> </ul> </li> - <li><a href="#errors">Errors</a> <ul> - <li><a href="#why-am-i-getting-a-nonserializableexception-">Why am I getting a âNonSerializableExceptionâ ?</a></li> - <li><a href="#in-scala-api-i-get-an-error-about-implicit-values-and-evidence-parameters">In Scala API, I get an error about implicit values and evidence parameters</a></li> - <li><a href="#i-get-an-error-message-saying-that-not-enough-buffers-are-available-how-do-i-fix-this">I get an error message saying that not enough buffers are available. How do I fix this?</a></li> - <li><a href="#my-job-fails-early-with-a-javaioeofexception-what-could-be-the-cause">My job fails early with a java.io.EOFException. What could be the cause?</a></li> - <li><a href="#my-job-fails-with-various-exceptions-from-the-hdfshadoop-code-what-can-i-do">My job fails with various exceptions from the HDFS/Hadoop code. What can I do?</a></li> - <li><a href="#in-eclipse-i-get-compilation-errors-in-the-scala-projects">In Eclipse, I get compilation errors in the Scala projects</a></li> - <li><a href="#my-program-does-not-compute-the-correct-result-why-are-my-custom-key-types">My program does not compute the correct result. 
Why are my custom key types</a></li> - <li><a href="#i-get-a-javalanginstantiationexception-for-my-data-type-what-is-wrong">I get a java.lang.InstantiationException for my data type, what is wrong?</a></li> - <li><a href="#i-cant-stop-flink-with-the-provided-stop-scripts-what-can-i-do">I canât stop Flink with the provided stop-scripts. What can I do?</a></li> - <li><a href="#i-got-an-outofmemoryexception-what-can-i-do">I got an OutOfMemoryException. What can I do?</a></li> - <li><a href="#why-do-the-taskmanager-log-files-become-so-huge">Why do the TaskManager log files become so huge?</a></li> - <li><a href="#the-slot-allocated-for-my-task-manager-has-been-released-what-should-i-do">The slot allocated for my task manager has been released. What should I do?</a></li> + <li><a href="#errors" id="markdown-toc-errors">Errors</a> <ul> + <li><a href="#why-am-i-getting-a-nonserializableexception-" id="markdown-toc-why-am-i-getting-a-nonserializableexception-">Why am I getting a âNonSerializableExceptionâ ?</a></li> + <li><a href="#in-scala-api-i-get-an-error-about-implicit-values-and-evidence-parameters" id="markdown-toc-in-scala-api-i-get-an-error-about-implicit-values-and-evidence-parameters">In Scala API, I get an error about implicit values and evidence parameters</a></li> + <li><a href="#i-get-an-error-message-saying-that-not-enough-buffers-are-available-how-do-i-fix-this" id="markdown-toc-i-get-an-error-message-saying-that-not-enough-buffers-are-available-how-do-i-fix-this">I get an error message saying that not enough buffers are available. How do I fix this?</a></li> + <li><a href="#my-job-fails-early-with-a-javaioeofexception-what-could-be-the-cause" id="markdown-toc-my-job-fails-early-with-a-javaioeofexception-what-could-be-the-cause">My job fails early with a java.io.EOFException. What could be the cause?</a></li> + <li><a href="#my-job-fails-with-various-exceptions-from-the-hdfshadoop-code-what-can-i-do" id="markdown-toc-my-job-fails-with-various-exceptions-from-the-hdfshadoop-code-what-can-i-do">My job fails with various exceptions from the HDFS/Hadoop code. What can I do?</a></li> + <li><a href="#in-eclipse-i-get-compilation-errors-in-the-scala-projects" id="markdown-toc-in-eclipse-i-get-compilation-errors-in-the-scala-projects">In Eclipse, I get compilation errors in the Scala projects</a></li> + <li><a href="#my-program-does-not-compute-the-correct-result-why-are-my-custom-key-types" id="markdown-toc-my-program-does-not-compute-the-correct-result-why-are-my-custom-key-types">My program does not compute the correct result. Why are my custom key types</a></li> + <li><a href="#i-get-a-javalanginstantiationexception-for-my-data-type-what-is-wrong" id="markdown-toc-i-get-a-javalanginstantiationexception-for-my-data-type-what-is-wrong">I get a java.lang.InstantiationException for my data type, what is wrong?</a></li> + <li><a href="#i-cant-stop-flink-with-the-provided-stop-scripts-what-can-i-do" id="markdown-toc-i-cant-stop-flink-with-the-provided-stop-scripts-what-can-i-do">I canât stop Flink with the provided stop-scripts. What can I do?</a></li> + <li><a href="#i-got-an-outofmemoryexception-what-can-i-do" id="markdown-toc-i-got-an-outofmemoryexception-what-can-i-do">I got an OutOfMemoryException. 
What can I do?</a></li> + <li><a href="#why-do-the-taskmanager-log-files-become-so-huge" id="markdown-toc-why-do-the-taskmanager-log-files-become-so-huge">Why do the TaskManager log files become so huge?</a></li> + <li><a href="#the-slot-allocated-for-my-task-manager-has-been-released-what-should-i-do" id="markdown-toc-the-slot-allocated-for-my-task-manager-has-been-released-what-should-i-do">The slot allocated for my task manager has been released. What should I do?</a></li> </ul> </li> - <li><a href="#yarn-deployment">YARN Deployment</a> <ul> - <li><a href="#the-yarn-session-runs-only-for-a-few-seconds">The YARN session runs only for a few seconds</a></li> - <li><a href="#the-yarn-session-crashes-with-a-hdfs-permission-exception-during-startup">The YARN session crashes with a HDFS permission exception during startup</a></li> + <li><a href="#yarn-deployment" id="markdown-toc-yarn-deployment">YARN Deployment</a> <ul> + <li><a href="#the-yarn-session-runs-only-for-a-few-seconds" id="markdown-toc-the-yarn-session-runs-only-for-a-few-seconds">The YARN session runs only for a few seconds</a></li> + <li><a href="#the-yarn-session-crashes-with-a-hdfs-permission-exception-during-startup" id="markdown-toc-the-yarn-session-crashes-with-a-hdfs-permission-exception-during-startup">The YARN session crashes with a HDFS permission exception during startup</a></li> </ul> </li> - <li><a href="#features">Features</a> <ul> - <li><a href="#what-kind-of-fault-tolerance-does-flink-provide">What kind of fault-tolerance does Flink provide?</a></li> - <li><a href="#are-hadoop-like-utilities-such-as-counters-and-the-distributedcache-supported">Are Hadoop-like utilities, such as Counters and the DistributedCache supported?</a></li> + <li><a href="#features" id="markdown-toc-features">Features</a> <ul> + <li><a href="#what-kind-of-fault-tolerance-does-flink-provide" id="markdown-toc-what-kind-of-fault-tolerance-does-flink-provide">What kind of fault-tolerance does Flink provide?</a></li> + <li><a href="#are-hadoop-like-utilities-such-as-counters-and-the-distributedcache-supported" id="markdown-toc-are-hadoop-like-utilities-such-as-counters-and-the-distributedcache-supported">Are Hadoop-like utilities, such as Counters and the DistributedCache supported?</a></li> </ul> </li> </ul> http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/how-to-contribute.html ---------------------------------------------------------------------- diff --git a/content/how-to-contribute.html b/content/how-to-contribute.html index 4a691e3..5e714bb 100644 --- a/content/how-to-contribute.html +++ b/content/how-to-contribute.html @@ -147,22 +147,22 @@ <div class="page-toc"> <ul id="markdown-toc"> - <li><a href="#easy-issues-for-starters">Easy Issues for Starters</a></li> - <li><a href="#contributing-code--documentation">Contributing Code & Documentation</a> <ul> - <li><a href="#setting-up-the-infrastructure-and-creating-a-pull-request">Setting up the Infrastructure and Creating a Pull Request</a></li> - <li><a href="#verifying-the-compliance-of-your-code">Verifying the Compliance of your Code</a></li> + <li><a href="#easy-issues-for-starters" id="markdown-toc-easy-issues-for-starters">Easy Issues for Starters</a></li> + <li><a href="#contributing-code--documentation" id="markdown-toc-contributing-code--documentation">Contributing Code & Documentation</a> <ul> + <li><a href="#setting-up-the-infrastructure-and-creating-a-pull-request" id="markdown-toc-setting-up-the-infrastructure-and-creating-a-pull-request">Setting up the
Infrastructure and Creating a Pull Request</a></li> + <li><a href="#verifying-the-compliance-of-your-code" id="markdown-toc-verifying-the-compliance-of-your-code">Verifying the Compliance of your Code</a></li> </ul> </li> - <li><a href="#contribute-changes-to-the-website">Contribute changes to the Website</a> <ul> - <li><a href="#files-and-directories-in-the-website-git-repository">Files and Directories in the website git repository</a></li> - <li><a href="#the-buildsh-script">The <code>build.sh</code> script</a></li> + <li><a href="#contribute-changes-to-the-website" id="markdown-toc-contribute-changes-to-the-website">Contribute changes to the Website</a> <ul> + <li><a href="#files-and-directories-in-the-website-git-repository" id="markdown-toc-files-and-directories-in-the-website-git-repository">Files and Directories in the website git repository</a></li> + <li><a href="#the-buildsh-script" id="markdown-toc-the-buildsh-script">The <code>build.sh</code> script</a></li> </ul> </li> - <li><a href="#how-to-become-a-committer">How to become a committer</a> <ul> - <li><a href="#how-to-use-git-as-a-committer">How to use git as a committer</a></li> + <li><a href="#how-to-become-a-committer" id="markdown-toc-how-to-become-a-committer">How to become a committer</a> <ul> + <li><a href="#how-to-use-git-as-a-committer" id="markdown-toc-how-to-use-git-as-a-committer">How to use git as a committer</a></li> </ul> </li> - <li><a href="#snapshots-nightly-builds">Snapshots (Nightly Builds)</a></li> + <li><a href="#snapshots-nightly-builds" id="markdown-toc-snapshots-nightly-builds">Snapshots (Nightly Builds)</a></li> </ul> </div> http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/img/blog/GSA-plan.png ---------------------------------------------------------------------- diff --git a/content/img/blog/GSA-plan.png b/content/img/blog/GSA-plan.png new file mode 100644 index 0000000..980e4f3 Binary files /dev/null and b/content/img/blog/GSA-plan.png differ http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/img/blog/flink-stack.png ---------------------------------------------------------------------- diff --git a/content/img/blog/flink-stack.png b/content/img/blog/flink-stack.png new file mode 100644 index 0000000..c2bb81d Binary files /dev/null and b/content/img/blog/flink-stack.png differ http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/img/blog/iteration.png ---------------------------------------------------------------------- diff --git a/content/img/blog/iteration.png b/content/img/blog/iteration.png new file mode 100644 index 0000000..1144ef0 Binary files /dev/null and b/content/img/blog/iteration.png differ http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/img/blog/neighborhood.png ---------------------------------------------------------------------- diff --git a/content/img/blog/neighborhood.png b/content/img/blog/neighborhood.png new file mode 100644 index 0000000..abef960 Binary files /dev/null and b/content/img/blog/neighborhood.png differ http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/img/blog/reduce-on-neighbors.png ---------------------------------------------------------------------- diff --git a/content/img/blog/reduce-on-neighbors.png b/content/img/blog/reduce-on-neighbors.png new file mode 100644 index 0000000..63137b8 Binary files /dev/null and b/content/img/blog/reduce-on-neighbors.png differ http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/img/blog/sssp.png
---------------------------------------------------------------------- diff --git a/content/img/blog/sssp.png b/content/img/blog/sssp.png new file mode 100644 index 0000000..8d7f092 Binary files /dev/null and b/content/img/blog/sssp.png differ http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/img/blog/user-song-graph.png ---------------------------------------------------------------------- diff --git a/content/img/blog/user-song-graph.png b/content/img/blog/user-song-graph.png new file mode 100644 index 0000000..2ac5aec Binary files /dev/null and b/content/img/blog/user-song-graph.png differ http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/img/blog/user-song-to-user-user.png ---------------------------------------------------------------------- diff --git a/content/img/blog/user-song-to-user-user.png b/content/img/blog/user-song-to-user-user.png new file mode 100644 index 0000000..4cdcac6 Binary files /dev/null and b/content/img/blog/user-song-to-user-user.png differ http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/img/blog/vertex-centric-plan.png ---------------------------------------------------------------------- diff --git a/content/img/blog/vertex-centric-plan.png b/content/img/blog/vertex-centric-plan.png new file mode 100644 index 0000000..1943d6f Binary files /dev/null and b/content/img/blog/vertex-centric-plan.png differ http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/index.html ---------------------------------------------------------------------- diff --git a/content/index.html b/content/index.html index 38577dd..7ba636e 100644 --- a/content/index.html +++ b/content/index.html @@ -229,6 +229,10 @@ <ul class="list-group"> + <li class="list-group-item"><span>24 Aug 2015</span> » + <a href="/news/2015/08/24/introducing-flink-gelly.html">Introducing Gelly: Graph Processing with Apache Flink</a> + </li> + <li class="list-group-item"><span>24 Jun 2015</span> » <a href="/news/2015/06/24/announcing-apache-flink-0.9.0-release.html">Announcing Apache Flink 0.9.0</a> </li> @@ -244,10 +248,6 @@ <li class="list-group-item"><span>13 Apr 2015</span> » <a href="/news/2015/04/13/release-0.9.0-milestone1.html">Announcing Flink 0.9.0-milestone1 preview release</a> </li> - - <li class="list-group-item"><span>07 Apr 2015</span> » - <a href="/news/2015/04/07/march-in-flink.html">March 2015 in the Flink community</a> - </li> </ul> http://git-wip-us.apache.org/repos/asf/flink-web/blob/5889311a/content/material.html ---------------------------------------------------------------------- diff --git a/content/material.html b/content/material.html index e7fb4f9..fa1425d 100644 --- a/content/material.html +++ b/content/material.html @@ -145,14 +145,14 @@ <div class="page-toc"> <ul id="markdown-toc"> - <li><a href="#apache-flink-logos">Apache Flink Logos</a> <ul> - <li><a href="#portable-network-graphics-png">Portable Network Graphics (PNG)</a></li> - <li><a href="#scalable-vector-graphics-svg">Scalable Vector Graphics (SVG)</a></li> - <li><a href="#photoshop-psd">Photoshop (PSD)</a></li> + <li><a href="#apache-flink-logos" id="markdown-toc-apache-flink-logos">Apache Flink Logos</a> <ul> + <li><a href="#portable-network-graphics-png" id="markdown-toc-portable-network-graphics-png">Portable Network Graphics (PNG)</a></li> + <li><a href="#scalable-vector-graphics-svg" id="markdown-toc-scalable-vector-graphics-svg">Scalable Vector Graphics (SVG)</a></li> + <li><a href="#photoshop-psd"
id="markdown-toc-photoshop-psd">Photoshop (PSD)</a></li> </ul> </li> - <li><a href="#color-scheme">Color Scheme</a></li> - <li><a href="#slides">Slides</a></li> + <li><a href="#color-scheme" id="markdown-toc-color-scheme">Color Scheme</a></li> + <li><a href="#slides" id="markdown-toc-slides">Slides</a></li> </ul> </div>