Yeah, unfortunately you need that as well. Once 1.0 is released, you won't
need to do that anymore.
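
For reference, the workaround is just to copy each edge before holding on to
a reference to it. A rough sketch (assuming a graph named orig_graph, as in
the quoted code below; a single up-front copy per edge should be enough):

  val edgePairs = orig_graph.edges
    .map(e => e.copy())                              // defensive copy; the pre-1.0 edge iterator reuses Edge objects
    .map(e => (Edge(e.srcId, e.dstId, e.attr), 1))   // build (Edge, count) pairs from the copied edges
    .collect()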

BTW - you can also just check out the source code from GitHub and build 1.0
yourself. The current branch-1.0 branch is already at release candidate
status, so it should be almost identical to the actual 1.0 release.

https://github.com/apache/spark/tree/branch-1.0
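
On the .reduce question in your message below: getting a single element back
is expected - RDD.reduce combines all of the elements pairwise down to exactly
one value, it never returns one result per input. The function you pass also
needs to be associative and commutative, and the (Edge(0, 0, 0), 0) fallback
means any non-matching combination throws away earlier results. A rough sketch
of the difference (reusing the same (Edge, 1) pairs as in your code; the names
here are just for illustration):

  val pairs = orig_graph.edges
    .map(e => e.copy())
    .map(e => (Edge(e.srcId, e.dstId, e.attr), 1))

  // reduce folds the whole RDD down to a single value, e.g. the total count:
  val total = pairs.map(_._2).reduce(_ + _)

  // one result per distinct edge instead needs something like reduceByKey:
  val perEdge = pairs.reduceByKey(_ + _).collect()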


On Mon, May 19, 2014 at 3:16 PM, GlennStrycker <glenn.stryc...@gmail.com> wrote:

> Thanks, rxin, this worked!
>
> I am having a similar problem with .reduce... do I need to insert .copy()
> functions in that statement as well?
>
> This part works:
> orig_graph.edges.map(_.copy()).flatMap(edge => Seq(edge))
>   .map(edge => (Edge(edge.copy().srcId, edge.copy().dstId, edge.copy().attr), 1))
>   .collect
>
> =Array((Edge(1,4,1),1), (Edge(1,5,1),1), (Edge(1,7,1),1), (Edge(2,5,1),1),
> (Edge(2,6,1),1), (Edge(3,5,1),1), (Edge(3,6,1),1), (Edge(3,7,1),1),
> (Edge(4,1,1),1), (Edge(5,1,1),1), (Edge(5,2,1),1), (Edge(5,3,1),1),
> (Edge(6,2,1),1), (Edge(6,3,1),1), (Edge(7,1,1),1), (Edge(7,3,1),1))
>
> But when I try adding a reduce statement, I only get one element, not 16:
> orig_graph.edges.map(_.copy()).flatMap(edge => Seq(edge))
>   .map(edge => (Edge(edge.copy().srcId, edge.copy().dstId, edge.copy().attr), 1))
>   .reduce( (A,B) => {
>     if (A._1.dstId == B._1.srcId) (Edge(A._1.srcId, B._1.dstId, 2), 1)
>     else if (A._1.srcId == B._1.dstId) (Edge(B._1.srcId, A._1.dstId, 2), 1)
>     else (Edge(0, 0, 0), 0) } )
>
> =(Edge(0,0,0),0)
