OK, I have waded into implementing this and have gotten pretty far, but am now
hitting something I don't understand: a NoSuchMethodError.

The code looks like

    [...]
    val conf = new SparkConf().setAppName(appName)
    //conf.set("fs.default.name", "file://")
    val sc = new SparkContext(conf)

    val lines = sc.textFile(inFileArg)
    val foo = lines.count()
    // The following filters omit comments, so there is no need to filter
    // specifically for comment lines ("#...").
    val edgeTmp = lines.map(line => line.split(" ").slice(0, 3)).
          filter(x => x(0).startsWith("<") && x(0).endsWith(">") &&
                      x(2).startsWith("<") && x(2).endsWith(">")).
          map(x => Edge(hashToVId(x(0)), hashToVId(x(2)), x(1)))
    edgeTmp.foreach(edge => print(edge + "\n"))
    val edges: RDD[Edge[String]] = edgeTmp
    println("edges.count=" + edges.count)

    val properties: RDD[(VertexId, Map[String, Any])] =
        lines.map(line => line.split(" ").slice(0, 3)).
              filter(x => !x(0).startsWith("#")).       // omit RDF comments
              filter(x => !x(2).startsWith("<") || !x(2).endsWith(">")).
              map(x => {
                val m: (VertexId, Map[String, Any]) =
                  (hashToVId(x(0)), Map(x(1).toString -> x(2)))
                m
              })
    properties.foreach(prop => print(prop + "\n"))

    val G = Graph(properties, edges)    // <======== this is line 114
    println(G)

The (short) traceback looks like

Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.spark.graphx.Graph$.apply$default$4()Lorg/apache/spark/storage/StorageLevel;
        at com.cray.examples.spark.graphx.lubm.query9$.main(query9.scala:114)
        at com.cray.examples.spark.graphx.lubm.query9.main(query9.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:303)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Is the method that's not found (".../StorageLevel") something I need to
initialize?  This same code works fine on a toy problem.
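If it helps narrow things down, the defaulted parameter the error names
(`apply$default$4`, returning a StorageLevel) looks like one of the storage-level
arguments of Graph.apply. This is just a sketch of spelling those arguments out
explicitly; the parameter names here assume the newer Graph.apply signature with
storage levels, which I'm not sure exists in the Spark version I'm running:

    import org.apache.spark.storage.StorageLevel

    // Sketch, not tested: pass the defaulted parameters explicitly.
    // Assumes Graph.apply takes (vertices, edges, defaultVertexAttr,
    // edgeStorageLevel, vertexStorageLevel) -- if my runtime GraphX
    // lacks these, that might itself point to a build/runtime mismatch.
    val G2 = Graph(properties, edges,
                   Map[String, Any](),          // defaultVertexAttr for vertices with no properties
                   StorageLevel.MEMORY_ONLY,    // edgeStorageLevel
                   StorageLevel.MEMORY_ONLY)    // vertexStorageLevel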

BTW, this is Spark 1.0, running locally on my laptop.



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/representing-RDF-literals-as-vertex-properties-tp20404p20582.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
