Thank you Michael! This looks perfect, but I have a `NoSuchMethodError` that I cannot understand.

I am trying to implement a weighted shortest path algorithm from your `Spark GraphX in Action` book. The part in question is Listing 6.4, "Executing the shortest path algorithm that uses breadcrumbs", from Chapter 6 [here][1].

I have my own graph that I create from two RDDs. There are `344436` vertices and `772983` edges. I can perform an unweighted shortest path computation with the native GraphX library, so I'm confident in the graph construction.
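For reference, that unweighted check is roughly as follows. This is only a minimal sketch: the tiny hard-coded RDDs, the `checkGraph` name and the landmark vertex are placeholders for my real data, and `sc` is the already-created SparkContext.

    import org.apache.spark.rdd.RDD
    import org.apache.spark.graphx._
    import org.apache.spark.graphx.lib.ShortestPaths

    // Toy stand-ins for my real inputs (vertex attribute: Long, edge attribute: Double weight);
    // sc is the SparkContext created elsewhere in the application.
    val verticesRDD: RDD[(VertexId, Long)] =
      sc.parallelize(Seq((1L, 1L), (2L, 2L), (3L, 3L)))
    val edgesRDD: RDD[Edge[Double]] =
      sc.parallelize(Seq(Edge(1L, 2L, 0.5), Edge(2L, 3L, 1.5)))

    // Built-in unweighted shortest paths (hop counts only; the Double edge weights are ignored)
    val checkGraph = Graph(verticesRDD, edgesRDD)
    val unweighted = ShortestPaths.run(checkGraph, Seq(3L))
    unweighted.vertices.collect.foreach(println)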

In this case I use their Dijkstra implementation as follows:

    val my_graph: Graph[(Long),Double] = Graph.apply(verticesRDD, edgesRDD).cache()

    def dijkstra[VD](g: Graph[VD,Double], origin: VertexId) = {
      // Vertex state: (visited?, best distance from origin, breadcrumb path of vertex IDs)
      var g2 = g.mapVertices(
        (vid, vd) => (false,
                      if (vid == origin) 0 else Double.MaxValue,
                      List[VertexId]()))

      for (i <- 1L to g.vertices.count - 1) {
        // Pick the unvisited vertex with the smallest tentative distance
        val currentVertexId =
          g2.vertices.filter(!_._2._1)
            .fold((0L, (false, Double.MaxValue, List[VertexId]())))((a, b) =>
              if (a._2._2 < b._2._2) a else b)
            ._1

        // Send candidate (distance, path) messages from that vertex to its neighbours
        val newDistances = g2.aggregateMessages[(Double, List[VertexId])](
          ctx => if (ctx.srcId == currentVertexId)
                   ctx.sendToDst((ctx.srcAttr._2 + ctx.attr,
                                  ctx.srcAttr._3 :+ ctx.srcId)),
          (a, b) => if (a._1 < b._1) a else b)

        // Keep whichever of the old and new distances is smaller
        g2 = g2.outerJoinVertices(newDistances)((vid, vd, newSum) => {
          val newSumVal =
            newSum.getOrElse((Double.MaxValue, List[VertexId]()))
          (vd._1 || vid == currentVertexId,
           math.min(vd._2, newSumVal._1),
           if (vd._2 < newSumVal._1) vd._3 else newSumVal._2)})
      }

      // Join the (distance, path) results back onto the original vertex attributes
      g.outerJoinVertices(g2.vertices)((vid, vd, dist) =>
        (vd, dist.getOrElse((false, Double.MaxValue, List[VertexId]()))
               .productIterator.toList.tail))
    }

    // Path Finding - random node from which to find all paths
    val v1 = 4000000028222916L

I then call their function with my graph and a random vertex ID. Previously I had issues with `v1` not being recognised as a `Long`, and the `L` suffix solved this.

        val results = dijkstra(my_graph, 1L).vertices.map(_._2).collect
        
        println(results)
        
However, this returns the following:

    Error: Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.ObjectRef.create(Ljava/lang/Object;)Lscala/runtime/ObjectRef;
        at GraphX$.dijkstra$1(GraphX.scala:51)
        at GraphX$.main(GraphX.scala:85)
        at GraphX.main(GraphX.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Line 51 refers to the line `var g2 = g.mapVertices(` and line 85 refers to the line `val results = dijkstra(my_graph, 1L).vertices.map(_._2).collect`.

What method is this exception referring to? I am able to package with `sbt` without error, and I cannot see what method I am calling which does not exist.
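For context, my `build.sbt` is roughly along these lines (a sketch from memory, so the project name and version numbers below are placeholders rather than my exact file):

    // build.sbt (approximate sketch; name and versions are placeholders)
    name := "GraphX"

    version := "0.1"

    scalaVersion := "2.11.8"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core"   % "2.0.1" % "provided",
      "org.apache.spark" %% "spark-graphx" % "2.0.1" % "provided"
    )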

Many thanks!

Brian

  [1]: https://www.manning.com/books/spark-graphx-in-action#downloads

> On 24 Oct 2016, at 16:54, Michael Malak <michaelma...@yahoo.com> wrote:
> 
> Chapter 6 of my book implements Dijkstra's Algorithm. The source code is 
> available to download for free. 
> https://www.manning.com/books/spark-graphx-in-action
> 
> 
> 
> 
> From: Brian Wilson <brian.wilson....@gmail.com>
> To: user@spark.apache.org 
> Sent: Monday, October 24, 2016 7:11 AM
> Subject: Shortest path with directed and weighted graphs
> 
> I have been looking at the ShortestPaths function inbuilt with Spark here 
> <https://github.com/apache/spark/blob/master/graphx/src/main/scala/org/apache/spark/graphx/lib/ShortestPaths.scala>.
> 
> Am I correct in saying there is no support for weighted graphs with this 
> function? By that I mean that it assumes all edges carry a weight = 1
> 
> Many thanks
> 
> Brian 
> 
> 
