I will look into it once I have some time (end of this week, or next
week probably)
On Tue, Apr 14, 2015 at 8:51 PM, Robert Metzger wrote:
> Hey Nikolaas,
>
> Thank you for posting on the mailing list. I've met Nikolaas today in
> person and we were talking a bit about an interactive shell for Fl
Hey Nikolaas,
Thank you for posting on the mailing list. I've met Nikolaas today in
person and we were talking a bit about an interactive shell for Flink,
potentially also an integration with Zeppelin.
Great stuff, I'm really looking forward to it :)
We were wondering if somebody from the list has s
Stefan Bunk created FLINK-1890:
--
Summary: Add withReadFields or sth. similar to Scala API
Key: FLINK-1890
URL: https://issues.apache.org/jira/browse/FLINK-1890
Project: Flink
Issue Type: Wish
Ok, so, exactly as I wrote a few e-mails back in this thread, you can do
this with a vertex-centric iteration :-)
All you need to do is call "myGraph.runVertexCentricIteration(new
MyUpdateFunction(), new MyMessagingFunction(), maxIterations)"
and define MyUpdateFunction and MyMessagingFunction.
T
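As a concrete (hypothetical) sketch, assuming the 0.9-era Gelly spargel API — the class names and exact signatures below are assumptions and may differ in your Flink version — a pair of update/messaging functions that propagates the minimum vertex value could look like this:

```scala
import org.apache.flink.graph.spargel.{MessageIterator, MessagingFunction, VertexUpdateFunction}
import org.apache.flink.types.NullValue

// Propagate the minimum vertex value to all neighbors (connected-components style).
class MyUpdateFunction extends VertexUpdateFunction[Long, Long, Long] {
  override def updateVertex(vertexKey: Long, vertexValue: Long,
                            inMessages: MessageIterator[Long]): Unit = {
    var min = vertexValue
    while (inMessages.hasNext) min = math.min(min, inMessages.next())
    // setting a new value schedules this vertex's neighbors for the next superstep
    if (min < vertexValue) setNewVertexValue(min)
  }
}

class MyMessagingFunction extends MessagingFunction[Long, Long, Long, NullValue] {
  override def sendMessages(vertexKey: Long, vertexValue: Long): Unit =
    sendMessageToAllNeighbors(vertexValue)
}

val maxIterations = 10
val result = myGraph.runVertexCentricIteration(
  new MyUpdateFunction(), new MyMessagingFunction(), maxIterations)
```

The iteration converges as soon as no vertex updates its value in a superstep, or after maxIterations, whichever comes first.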
Hi!
I am trying to implement a Scala shell for Flink.
I've started with a simple Scala object whose main function will drop the
user into the interactive Scala shell (REPL) at one point:
import scala.tools.nsc.interpreter.ILoop
import scala.tools.nsc.Settings
object Job {
def main(args: Array
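A minimal, self-contained version of such an entry point (assuming the Scala 2.10/2.11 `scala.tools.nsc` REPL classes are on the classpath) could look like this:

```scala
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.ILoop

object Job {
  def main(args: Array[String]): Unit = {
    val settings = new Settings
    settings.usejavacp.value = true  // make the launching JVM's classpath visible inside the REPL
    new ILoop().process(settings)    // blocks until the user exits the shell
  }
}
```

Without `usejavacp`, the embedded REPL typically cannot resolve classes from the enclosing application, which is the first stumbling block when embedding ILoop.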
Hi Vasia,
by "compute subgraph for Person" I mean exactly all the vertices that can be
reached starting from this node, following the graph edges.
I drafted the graph as a set of vertices (where the id is the subject of
the set of triples and the value is all of its triples)
and a set of edges (p
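The "all vertices reachable from a start node" computation described here is plain forward reachability. As a non-Gelly, in-memory sketch just to pin down the semantics (the vertex values and edge list are hypothetical):

```scala
// BFS-style forward reachability over an in-memory edge list.
def reachableFrom[V](start: V, edges: Seq[(V, V)]): Set[V] = {
  val adj: Map[V, Seq[V]] =
    edges.groupBy(_._1).map { case (src, es) => src -> es.map(_._2) }
  var visited  = Set(start)
  var frontier = List(start)
  while (frontier.nonEmpty) {
    val next = frontier.flatMap(v => adj.getOrElse(v, Nil)).filterNot(visited).distinct
    visited ++= next
    frontier = next
  }
  visited
}

// e.g. reachableFrom("Person", Seq("Person" -> "addr", "addr" -> "city"))
// yields the set {Person, addr, city}
```

On a distributed Gelly graph the same semantics are usually expressed as a vertex-centric iteration that floods a marker from the start vertex, as discussed earlier in this thread.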
@Till: Yes, it works without the parentheses :) Thanks :)
2015-04-14 16:52 GMT+02:00 Felix Neutatz :
> I don't know. I can only see the following:
>
> def collect : scala.collection.mutable.Buffer[T] = { /* compiled code */ }
>
> When do they update the latest snapshot on Maven?
>
>
> 2015-04-14
Hi Flavio,
I'm not quite familiar with RDF or SPARQL, so not all of your code is clear
to me.
Your first TODO is "compute subgraph for Person". Is "Person" a vertex id
in your graph? A vertex value?
And by "subgraph of Person", do you mean all the vertices that can be
reached starting from this n
I don't know. I can only see the following:
def collect : scala.collection.mutable.Buffer[T] = { /* compiled code */ }
When do they update the latest snapshot on Maven?
2015-04-14 15:52 GMT+02:00 Till Rohrmann :
> Could you check the definition of the collect method in the DataSet.scala
> file
Hi,
I assume that your graphEdges is of type DataSet[EdgeType], right?
Depending on the version you're using, the reason might be that the collect
method in the old version does not expect parameters. Thus your parentheses
actually call the apply method on the returned Scala buffer.
In the latest
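Till's point can be reproduced with a plain Scala buffer: when a method is declared without parentheses, writing `()` after it invokes `apply` on the returned value instead. A minimal sketch (the `DataSetLike` class is hypothetical, just to mimic the old signature):

```scala
import scala.collection.mutable

class DataSetLike[T](elems: T*) {
  // declared WITHOUT parentheses, like the old collect signature
  def collect: mutable.Buffer[T] = mutable.Buffer(elems: _*)
}

val ds = new DataSetLike(1, 2, 3)
val all   = ds.collect     // the whole Buffer(1, 2, 3)
val first = ds.collect(0)  // parses as ds.collect.apply(0), i.e. element at index 0
```

So `graphEdges.collect()` against the old parameterless definition does not fail to compile; it silently becomes an indexed access on the buffer, which explains the confusing error.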
I use the latest maven snapshot:
org.apache.flink
flink-scala
0.9-SNAPSHOT
org.apache.flink
flink-clients
0.9-SNAPSHOT
2015-04-14 15:45 GMT+02:00 Robert Metzger :
> Hi,
>
> which version of Flink are you using?
>
> On Tue, Apr 14, 2015 at 3:36 PM, Felix Neutatz
> wrote:
>
> > H
Could you check the definition of the collect method in the DataSet.scala
file? Does it contain parentheses or not?
On Tue, Apr 14, 2015 at 3:48 PM, Felix Neutatz
wrote:
> I use the latest maven snapshot:
>
>
> org.apache.flink
> flink-scala
> 0.9-SNAPSHOT
>
>
> org.apache.flink
> f
Hi,
I want to run the following example:
import org.apache.flink.api.scala._
case class EdgeType(src: Int, target: Int)
object Test {
def main(args: Array[String]) {
implicit val env = ExecutionEnvironment.getExecutionEnvironment
val graphEdges = readEdges("edges.csv")
gr
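A completed version of that example, with a hypothetical `readEdges` helper assumed to wrap `readCsvFile`, might look like this:

```scala
import org.apache.flink.api.scala._

case class EdgeType(src: Int, target: Int)

object Test {
  def main(args: Array[String]): Unit = {
    implicit val env = ExecutionEnvironment.getExecutionEnvironment
    val graphEdges = readEdges("edges.csv")
    // with the snapshot discussed in this thread, collect is parameterless:
    val edges = graphEdges.collect
    edges.foreach(println)
  }

  // hypothetical helper, assumed to delegate to readCsvFile
  def readEdges(path: String)(implicit env: ExecutionEnvironment): DataSet[EdgeType] =
    env.readCsvFile[EdgeType](path)
}
```

Writing `graphEdges.collect()` here is exactly the pitfall from the rest of the thread: against the parameterless definition it turns into `collect.apply(...)` on the returned buffer.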
Hi,
which version of Flink are you using?
On Tue, Apr 14, 2015 at 3:36 PM, Felix Neutatz
wrote:
> Hi,
>
> I want to run the following example:
>
> import org.apache.flink.api.scala._
>
> case class EdgeType(src: Int, target: Int)
>
> object Test {
>def main(args: Array[String]) {
> im
Theodore Vasiloudis created FLINK-1889:
--
Summary: Create optimization framework
Key: FLINK-1889
URL: https://issues.apache.org/jira/browse/FLINK-1889
Project: Flink
Issue Type: New Featu
Maximilian Michels created FLINK-1888:
-
Summary: Download page doesn't contain Maven dependencies for
0.9.0-milestone-1
Key: FLINK-1888
URL: https://issues.apache.org/jira/browse/FLINK-1888
Proje
Sibao Hong created FLINK-1887:
-
Summary: Fix the message in runtime exception
Key: FLINK-1887
URL: https://issues.apache.org/jira/browse/FLINK-1887
Project: Flink
Issue Type: Improvement
hagersaleh created FLINK-1886:
-
Summary: How can left join, right join, and FULL OUTER JOIN be handled
in Flink
Key: FLINK-1886
URL: https://issues.apache.org/jira/browse/FLINK-1886
Project: Flink
Iss
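Until dedicated outer-join operators exist in the DataSet API, the usual workaround is `coGroup`. A hypothetical sketch of a left outer join in the Scala API (operator shapes assumed from the 0.9-era API and may differ):

```scala
import org.apache.flink.api.scala._
import org.apache.flink.util.Collector

val env = ExecutionEnvironment.getExecutionEnvironment
val left:  DataSet[(Int, String)] = env.fromElements(1 -> "a", 2 -> "b")
val right: DataSet[(Int, Double)] = env.fromElements(2 -> 2.0)

// Left outer join: emit every left element, paired with None when no right match exists.
val leftOuter = left.coGroup(right).where(0).equalTo(0) {
  (ls, rs, out: Collector[(Int, String, Option[Double])]) =>
    val rights = rs.toList  // an empty list is exactly the "outer" case
    for (l <- ls)
      if (rights.isEmpty) out.collect((l._1, l._2, None))
      else rights.foreach(r => out.collect((l._1, l._2, Some(r._2))))
}
```

A right outer join swaps the roles of the two iterators, and a full outer join additionally emits unmatched right elements from the same coGroup function.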
Can you push the fix to the release-0.8 branch?
On Mon, Apr 13, 2015 at 12:38 AM, Fabian Hueske wrote:
> We should also get the HadoopOF fix in.
> On Apr 12, 2015 10:14 AM, "Robert Metzger" wrote:
>
> > Hi,
> >
> > in this thread [1] we started a discussion whether we should cut a 0.8.2
> > rel
Hi to all,
I made a simple RDF Gelly test and shared it in my GitHub repo at
https://github.com/fpompermaier/rdf-gelly-test.
I basically set up the Gelly stuff, but I can't proceed to compute the
drafted TODOs.
Could someone help me implement them?
I think this could become a nice example
Markus Holzemer created FLINK-1885:
--
Summary: Bulk mode for gelly
Key: FLINK-1885
URL: https://issues.apache.org/jira/browse/FLINK-1885
Project: Flink
Issue Type: Improvement
Compo