sbt assembly; $SPARK_HOME/bin/spark-submit --class main.scala.TestMain
--master "local[4]" target/scala-2.11/bof-assembly-0.1-SNAPSHOT.jar

using the Spark distribution at:

/opt/spark-1.4.1-bin-hadoop2.6
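
A quick way to rule a Scala version mismatch in or out is to print the runtime Scala version from inside the submitted job; a minimal sketch (the object name is made up here, and the println calls are purely diagnostic):

package main.scala

import org.apache.spark.SparkContext

object VersionCheck {
  def main(args: Array[String]): Unit = {
    // The Scala library visible at runtime: under spark-submit this comes
    // from the Spark assembly, so a distribution built against Scala 2.10
    // will report 2.10.x even if the application jar was compiled with 2.11.
    println(scala.util.Properties.versionString)
    val sc = new SparkContext()
    println(sc.version) // the Spark version, e.g. 1.4.1
    sc.stop()
  }
}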

On Mon, Sep 7, 2015 at 10:20 PM, Jonathan Coveney <jcove...@gmail.com>
wrote:

> How are you building and running it?
>
>
> On Monday, September 7, 2015, Gheorghe Postelnicu <
> gheorghe.posteln...@gmail.com> wrote:
>
>> Interesting idea. Tried that, didn't work. Here is my new SBT file:
>>
>> name := """testMain"""
>>
>> scalaVersion := "2.11.6"
>>
>> libraryDependencies ++= Seq(
>>   "org.apache.spark" %% "spark-core" % "1.4.1" % "provided",
>>   "org.apache.spark" %% "spark-sql" % "1.4.1" % "provided",
>>   "org.scala-lang" % "scala-reflect" % "2.11.6"
>> )
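>>
>> For reference, sbt assembly also needs the sbt-assembly plugin declared
>> in project/plugins.sbt; a minimal sketch (the plugin version is an
>> assumption, use whichever you already have):
>>
>> // project/plugins.sbt
>> addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")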
>>
>>
>> On Mon, Sep 7, 2015 at 9:55 PM, Jonathan Coveney <jcove...@gmail.com>
>> wrote:
>>
>>> Try adding the following to your build.sbt
>>>
>>> libraryDependencies += "org.scala-lang" % "scala-reflect" % "2.11.6"
>>>
>>>
>>> I believe Spark shades the Scala library, and scala-reflect looks like
>>> something you need in an unshaded form.
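>>>
>>> If that doesn't fix it, it's also worth checking that your Scala version
>>> matches the Spark build: the official prebuilt 1.4.1 binaries are compiled
>>> against Scala 2.10 (2.11 requires building Spark from source), so another
>>> option is to build the application with 2.10 as well. A sketch of that
>>> build.sbt, assuming the 2.10 distribution is the one installed:
>>>
>>> scalaVersion := "2.10.4"
>>>
>>> libraryDependencies ++= Seq(
>>>   "org.apache.spark" %% "spark-core" % "1.4.1" % "provided",
>>>   "org.apache.spark" %% "spark-sql" % "1.4.1" % "provided"
>>> )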
>>>
>>>
>>> On Mon, Sep 7, 2015 at 4:48 PM, Gheorghe Postelnicu <
>>> gheorghe.posteln...@gmail.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> The following code fails at runtime when built with SBT and run via
>>>> spark-submit:
>>>>
>>>> package main.scala
>>>>
>>>> import org.apache.spark.SparkContext
>>>> import org.apache.spark.sql.SQLContext
>>>>
>>>> object TestMain {
>>>>   def main(args: Array[String]): Unit = {
>>>>     implicit val sparkContext = new SparkContext()
>>>>     val sqlContext = new SQLContext(sparkContext)
>>>>     import sqlContext.implicits._
>>>> sparkContext.parallelize(1 to 10).map(i => (i, i.toString)).toDF("intCol", "strCol")
>>>>   }
>>>> }
>>>>
>>>> with the following error:
>>>>
>>>> 15/09/07 21:39:21 INFO BlockManagerMaster: Registered BlockManager
>>>> Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
>>>> at main.scala.Bof$.main(Bof.scala:14)
>>>> at main.scala.Bof.main(Bof.scala)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:497)
>>>> at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
>>>> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
>>>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
>>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
>>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>> 15/09/07 21:39:22 INFO SparkContext: Invoking stop() from shutdown hook
>>>>
>>>> whereas the same code works in the Spark shell.
>>>>
>>>> The code is compiled with Scala 2.11.6 and run against the precompiled
>>>> Spark 1.4.1 distribution.
>>>>
>>>> Any suggestion on how to fix this would be much appreciated.
>>>>
>>>> Best,
>>>> Gheorghe
>>>>
>>>>
>>>
>>
