Not sure if this is related at all, but I had an issue getting Grape imports to resolve in Groovysh (which is related to Groovy Shell), which caused me to try tampering with the Grape classloading: http://mail-archives.apache.org/mod_mbox/groovy-dev/201508.mbox/%3CCAByu6UVw1KNVqPnQrjKRCANj6e8od9sGczinz7iDWA1P+=4...@mail.gmail.com%3E
This might be unrelated to your problems, though.

On Tue, Nov 17, 2015 at 9:24 PM, tog <guillaume.all...@gmail.com> wrote:
> Hello
>
> Any more ideas regarding my issue?
>
> Thanks
> Guillaume
>
> On 15 November 2015 at 20:19, tog <guillaume.all...@gmail.com> wrote:
>>
>> Sorry, my previous email is wrong.
>>
>> The block:
>>
>>     groovy.grape.Grape.grab(
>>         groupId: 'org.apache.spark',
>>         artifactId: 'spark-core_2.10',
>>         version: '1.5.2'
>>     )
>>
>> does not seem equivalent to:
>>
>>     @Grab('org.apache.spark:spark-core_2.10:1.5.2')
>>
>> since the imports cannot be found.
>>
>> tog GroovySpark $ groovy GroovySparkWordcount.groovy
>>
>> org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
>>
>> /Users/tog/Work/GroovySpark/GroovySparkWordcount.groovy: 9: unable to resolve class org.apache.spark.api.java.JavaSparkContext
>>  @ line 9, column 1.
>>    import org.apache.spark.api.java.JavaSparkContext
>>    ^
>>
>> /Users/tog/Work/GroovySpark/GroovySparkWordcount.groovy: 8: unable to resolve class org.apache.spark.SparkConf
>>  @ line 8, column 1.
>>    import org.apache.spark.SparkConf
>>    ^
>>
>> 2 errors
>>
>> On 15 November 2015 at 18:55, tog <guillaume.all...@gmail.com> wrote:
>>>
>>> Thanks. Yes, I just realized the typo ... I fixed it and got the very same error.
>>> I am getting lost ;-)
>>>
>>> org.apache.spark.SparkConf@2158ddec
>>> java.lang.ClassNotFoundException: org.apache.spark.rpc.akka.AkkaRpcEnvFactory
>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>     at java.lang.Class.forName0(Native Method)
>>>     at java.lang.Class.forName(Class.java:348)
>>>     at org.apache.spark.rpc.RpcEnv$.getRpcEnvFactory(RpcEnv.scala:40)
>>>     at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:52)
>>>     at org.apache.spark.SparkEnv$.create(SparkEnv.scala:247)
>>>     at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:188)
>>>     at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
>>>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
>>>     at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>>>     at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:80)
>>>     at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:105)
>>>     at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:60)
>>>     at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:235)
>>>     at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:247)
>>>     at Script6.run(Script6.groovy:16)
>>>
>>> On 15 November 2015 at 18:41,
Bahman Movaqar <bah...@bahmanm.com> wrote:
>>>>
>>>> On 11/15/2015 10:03 PM, tog wrote:
>>>>
>>>> > @Grab seems to have a default repo to look into ... with the change you
>>>> > are suggesting I got
>>>> >
>>>> > java.lang.RuntimeException: Error grabbing Grapes -- [unresolved
>>>> > dependency: org.apache.spark#spark core_2.10;1.5.2: not found] at
>>>> > sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>> >
>>>> > How do I define them?
>>>>
>>>> It was a typo on my side. `artifactId` should be "spark-core_2.10" (note
>>>> the `-` character).
>>>>
>>>> --
>>>> Bahman Movaqar
>>>>
>>>> http://BahmanM.com - https://twitter.com/bahman__m
>>>> https://github.com/bahmanm - https://gist.github.com/bahmanm
>>>> PGP Key ID: 0x6AB5BD68 (keyserver2.pgp.com)

--
PGP KeyID: 2048R/EA31CFC9 subkeys.pgp.net
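A note on why the two forms quoted in the thread behave differently: `@Grab` is processed at compile time, so the grabbed jars are already on the classpath when the script's `import` statements are resolved, whereas a plain `groovy.grape.Grape.grab(...)` call only runs at runtime, after compilation has already failed on the unresolved imports. A minimal sketch of working around that by compiling the Spark-dependent code in a second step (the `rootLoader` choice and the two-step split are assumptions for a script run with plain `groovy`, not something confirmed in the thread):

```groovy
// Runtime grab: pass an explicit classLoader so the downloaded jars
// land on a loader the later compilation step can see. Using
// rootLoader here assumes the script runs under the standard
// `groovy` launcher.
groovy.grape.Grape.grab(
    groupId: 'org.apache.spark',
    artifactId: 'spark-core_2.10',
    version: '1.5.2',
    classLoader: this.class.classLoader.rootLoader
)

// Compile the Spark-dependent code only AFTER the grab has run,
// so its imports can resolve against the freshly grabbed jars.
new GroovyShell(this.class.classLoader).evaluate '''
    import org.apache.spark.SparkConf
    println new SparkConf()
'''
```

The key point is ordering: anything that names Spark classes at the top level of the outer script will still fail, because those references are resolved before any runtime code executes.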
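On the later `ClassNotFoundException` for `org.apache.spark.rpc.akka.AkkaRpcEnvFactory`: the stack trace shows Spark resolving the factory reflectively via `Class.forName`, and the lookup falling through to `sun.misc.Launcher$AppClassLoader`, which knows nothing about the Grape-downloaded jars. One thing worth trying (an assumption, not a fix confirmed in the thread) is asking Grape to put the jars on the system classloader so reflective lookups that bypass the Groovy loader can still find them:

```groovy
// Hedged sketch: @GrabConfig(systemClassLoader=true) asks Grape to add
// the grabbed jars to the system classloader, so Spark's reflective
// Class.forName lookup of AkkaRpcEnvFactory can see them.
@GrabConfig(systemClassLoader=true)
@Grab('org.apache.spark:spark-core_2.10:1.5.2')
import org.apache.spark.SparkConf
import org.apache.spark.api.java.JavaSparkContext

// Minimal local-mode context, mirroring the wordcount script in the
// thread; master/app-name values here are illustrative.
def conf = new SparkConf().setMaster('local[*]').setAppName('wordcount')
def sc = new JavaSparkContext(conf)
println sc.parallelize(['a', 'b', 'a']).countByValue()
sc.stop()
```

Note that `@GrabConfig(systemClassLoader=true)` only works when the script is launched with the plain `groovy` command, not from an embedded or precompiled context.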