Hi,

I wrote the simple Spark program below and hit a runtime error which suggests
that the JVM cannot find some methods from the scala-reflect library.


package org.apache.spark.examples

import scala.io.Source
import scala.reflect._
import scala.reflect.api.JavaUniverse
import scala.reflect.runtime.universe
import org.apache.spark.SparkContext._
import org.apache.spark.{SparkConf, SparkContext}

import scala.reflect.NameTransformer
import scala.reflect.NameTransformer._


object test {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf().setAppName("test")
    val ctx = new SparkContext(sparkConf)
    val lines = ctx.textFile("data")

    // Group the input lines by their first comma-separated field
    val rules = lines.map { s =>
      val parts = s.split(",")
      (parts(0), s)
    }.distinct().groupByKey().cache()

    // Touching the runtime universe is what triggers the error below
    val ru = scala.reflect.runtime.universe
    println("End.....")

    ctx.stop()
  }
}
 
After compiling the code above, I submitted the application with the command
below, using --driver-class-path to put scala-reflect.jar on the driver's
classpath:

$ spark-submit --master local --class org.apache.spark.examples.test \
    --driver-class-path /MY_GRAPH_PATH/lib/graph-core_2.11-1.9.0.jar:/MY_SPARK_PATH/lib/spark-assembly-1.1.0-hadoop2.4.0.jar:/MY_SCALA_PATH/lib/scala-reflect.jar \
    /MY_APP_PATH/test/bin/test.jar


I then got the following failure:

Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.NameTransformer$.LOCAL_SUFFIX_STRING()Ljava/lang/String;
        at scala.reflect.internal.StdNames$CommonNames.<init>(StdNames.scala:97)
        at scala.reflect.internal.StdNames$Keywords.<init>(StdNames.scala:203)
        at scala.reflect.internal.StdNames$TermNames.<init>(StdNames.scala:288)
        at scala.reflect.internal.StdNames$nme$.<init>(StdNames.scala:1045)
        at scala.reflect.internal.SymbolTable.nme$lzycompute(SymbolTable.scala:16)
        at scala.reflect.internal.SymbolTable.nme(SymbolTable.scala:16)
        at scala.reflect.internal.StdNames$class.$init$(StdNames.scala:1041)
        at scala.reflect.internal.SymbolTable.<init>(SymbolTable.scala:16)
        at scala.reflect.runtime.JavaUniverse.<init>(JavaUniverse.scala:16)
        at scala.reflect.runtime.package$.universe$lzycompute(package.scala:17)
        at scala.reflect.runtime.package$.universe(package.scala:17)
        at RouteChecker$.main(test.scala:32)
        at RouteChecker.main(test.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:483)
        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

However, when I comment out the Spark-related code, compile the program with
plain scalac, and run it directly, it works fine.
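For reference, here is the minimal standalone check I have in mind (a sketch only; VersionCheck is not part of my program): it prints which scala-library the JVM actually loads, then touches the reflection universe, which is the same call that fails in the stack trace above.

```scala
// Diagnostic sketch: confirm the Scala library version on the runtime
// classpath, then force initialization of the reflection universe --
// the step that throws NoSuchMethodError when scala-library and
// scala-reflect come from mismatched Scala versions.
object VersionCheck {
  def main(args: Array[String]): Unit = {
    // Prints e.g. "version 2.10.x"; this should match the version
    // that scala-reflect.jar on the classpath was built for
    println(scala.util.Properties.versionString)

    // Forces scala.reflect.runtime.universe to initialize
    val u = scala.reflect.runtime.universe
    println("reflection universe loaded: " + u.getClass.getName)
  }
}
```

Run standalone with scala, this prints the version and loads the universe without error; the question is why the same reflection call fails under spark-submit with the classpath above.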

Does anyone know what is going on here? Thank you very much.

Best Regards,
Dingfei



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/can-not-found-scala-reflect-related-methods-when-running-spark-program-tp19273.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
