The only way I know of is to delete/move the servlet-api-2.4.jar.
-Pascal
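
A minimal sketch of that workaround in Groovy itself, assuming a standard install where the jar sits under $GROOVY_HOME/lib (the .disabled suffix is just an arbitrary choice so the move is easy to undo):

    // Sketch only: move Groovy's bundled servlet-api-2.4.jar aside so it no
    // longer clashes with the signed javax.servlet classes that the Spark
    // dependencies pull in.
    def groovyHome = System.getenv('GROOVY_HOME')
    def jar = new File(groovyHome, 'lib/servlet-api-2.4.jar')
    if (jar.exists()) {
        jar.renameTo(new File(groovyHome, 'lib/servlet-api-2.4.jar.disabled'))
        println "Moved ${jar.name} aside"
    } else {
        println "servlet-api-2.4.jar not found under ${groovyHome}/lib"
    }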
On 18.11.2015 at 19:25, tog wrote:
I did recompile your groovysh-grab; is that really the right one? I now
face a new problem:
tog GroovySpark $ $GROOVY_HOME/bin/groovy GroovySparkThroughGroovyShell.groovy
java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
Probably multiple jars with the same name. How can I ask Grab not to
load the servlet-api-2.4.jar which ships with Groovy?
Cheers
Guillaume
On 17 November 2015 at 21:24, Thibault Kruse <tibokr...@googlemail.com> wrote:
you'd have to recompile, my PR is growing old
On Tue, Nov 17, 2015 at 10:08 PM, tog <guillaume.all...@gmail.com> wrote:
> Thibault
>
> Has your change been pushed into Groovy recently, or should I recompile my own
> version to test whether that solves my issue?
> Any other way to test it without having to generate my own version?
>
> Cheers
> Guillaume
>
> On 17 November 2015 at 20:57, Thibault Kruse <tibokr...@googlemail.com>
> wrote:
>>
>> Not sure if this is related at all. But I had an issue getting Grape
>> imports available in Groovysh (which is related to Groovy Shell),
>> which caused me to try and tamper with the Grape classloading:
>>
>> http://mail-archives.apache.org/mod_mbox/groovy-dev/201508.mbox/%3CCAByu6UVw1KNVqPnQrjKRCANj6e8od9sGczinz7iDWA1P+=4...@mail.gmail.com%3E
>>
>> This might be unrelated to your problems, though.
>>
>>
>> On Tue, Nov 17, 2015 at 9:24 PM, tog <guillaume.all...@gmail.com> wrote:
>> > Hello
>> >
>> > Any more ideas regarding my issue?
>> >
>> > Thanks
>> > Guillaume
>> >
>> > On 15 November 2015 at 20:19, tog <guillaume.all...@gmail.com> wrote:
>> >>
>> >> Sorry, my previous email was wrong.
>> >>
>> >> The block:
>> >> groovy.grape.Grape.grab(
>> >>     groupId: 'org.apache.spark',
>> >>     artifactId: 'spark-core_2.10',
>> >>     version: '1.5.2'
>> >> )
>> >>
>> >> does not seem equivalent to:
>> >>
>> >> @Grab('org.apache.spark:spark-core_2.10:1.5.2')
>> >>
>> >> since the imports cannot be found.
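
A hedged sketch of one workaround (my reading of the situation, not verified against this exact script): a runtime Grape.grab() only adds the jars after the script has already been compiled, so top-level imports cannot be resolved; handing the grab an explicit classloader and then loading the Spark classes reflectively avoids the compile-time imports:

    import groovy.grape.Grape

    // Sketch/assumption: grab into the running script's own classloader
    // (a GroovyClassLoader, which Grape accepts as a target).
    Grape.grab(
        [classLoader: this.class.classLoader],
        [groupId: 'org.apache.spark', artifactId: 'spark-core_2.10', version: '1.5.2']
    )

    // Resolve the classes reflectively instead of via top-level imports.
    def sparkConfClass = Class.forName('org.apache.spark.SparkConf', true, this.class.classLoader)
    def conf = sparkConfClass.newInstance().setMaster('local[*]').setAppName('GroovySparkWordcount')

With @Grab, by contrast, the dependency is resolved before compilation, which is why the annotated form can use plain imports.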
>> >>
>> >>
>> >>
>> >>
>> >> ------------------------------------------------------------------------
>> >>
>> >> tog GroovySpark $ groovy GroovySparkWordcount.groovy
>> >>
>> >>
>> >> org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
>> >>
>> >> /Users/tog/Work/GroovySpark/GroovySparkWordcount.groovy: 9: unable to resolve class org.apache.spark.api.java.JavaSparkContext
>> >> @ line 9, column 1.
>> >>    import org.apache.spark.api.java.JavaSparkContext
>> >>    ^
>> >>
>> >> /Users/tog/Work/GroovySpark/GroovySparkWordcount.groovy: 8: unable to resolve class org.apache.spark.SparkConf
>> >> @ line 8, column 1.
>> >>    import org.apache.spark.SparkConf
>> >>    ^
>> >>
>> >> 2 errors
>> >>
>> >>
>> >> On 15 November 2015 at 18:55, tog <guillaume.all...@gmail.com> wrote:
>> >>>
>> >>> Thanks. Yes, I just realized the typo ... I fixed it and get the very
>> >>> same error.
>> >>> I am getting lost ;-)
>> >>>
>> >>>
>> >>>
>> >>> org.apache.spark.SparkConf@2158ddec
>> >>> java.lang.ClassNotFoundException: org.apache.spark.rpc.akka.AkkaRpcEnvFactory
>> >>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>> >>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>> >>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>> >>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>> >>>     at java.lang.Class.forName0(Native Method)
>> >>>     at java.lang.Class.forName(Class.java:348)
>> >>>     at org.apache.spark.rpc.RpcEnv$.getRpcEnvFactory(RpcEnv.scala:40)
>> >>>     at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:52)
>> >>>     at org.apache.spark.SparkEnv$.create(SparkEnv.scala:247)
>> >>>     at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:188)
>> >>>     at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
>> >>>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
>> >>>     at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
>> >>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> >>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> >>>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> >>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>> >>>     at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:80)
>> >>>     at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:105)
>> >>>     at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:60)
>> >>>     at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:235)
>> >>>     at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:247)
>> >>>     at Script6.run(Script6.groovy:16)
>> >>>
>> >>>
>> >>>
>> >>> On 15 November 2015 at 18:41, Bahman Movaqar <bah...@bahmanm.com>
>> >>> wrote:
>> >>>>
>> >>>> On 11/15/2015 10:03 PM, tog wrote:
>> >>>>
>> >>>> > @Grab seems to have a default repo to look into ... with the change
>> >>>> > you are suggesting I got
>> >>>> > java.lang.RuntimeException: Error grabbing Grapes -- [unresolved
>> >>>> > dependency: org.apache.spark#spark core_2.10;1.5.2: not found] at
>> >>>> > sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> >>>> >
>> >>>> > How do I define them?
>> >>>>
>> >>>> It was a typo on my side. `artifactId` should be "spark-core_2.10"
>> >>>> (note the `-` character).
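
For reference, a sketch of the corrected call with the hyphen in place; the failed resolution quoted above ("org.apache.spark#spark core_2.10;1.5.2: not found") shows the module name with a space where the hyphen was missing. Grape resolves against Maven Central by default, so no extra repository should be needed for this artifact:

    groovy.grape.Grape.grab(
        groupId   : 'org.apache.spark',
        artifactId: 'spark-core_2.10',   // note the '-' character
        version   : '1.5.2'
    )

The annotation form @Grab('org.apache.spark:spark-core_2.10:1.5.2') is the equivalent shorthand.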
>> >>>>
>> >>>> --
>> >>>> Bahman Movaqar
>> >>>>
>> >>>> http://BahmanM.com - https://twitter.com/bahman__m
>> >>>> https://github.com/bahmanm - https://gist.github.com/bahmanm
>> >>>> PGP Key ID: 0x6AB5BD68 (keyserver2.pgp.com)
>> >>>>
>> >>>
>> >>>
>> >>>
>> >>> --
>> >>> PGP KeyID: 2048R/EA31CFC9 subkeys.pgp.net
>> >>
>> >>
>> >>
>> >>
>> >> --
>> >> PGP KeyID: 2048R/EA31CFC9 subkeys.pgp.net
>> >
>> >
>> >
>> >
>> > --
>> > PGP KeyID: 2048R/EA31CFC9 subkeys.pgp.net
>
>
>
>
> --
> PGP KeyID: 2048R/EA31CFC9 subkeys.pgp.net
--
PGP KeyID: 2048R/EA31CFC9 subkeys.pgp.net