Seems like you might be running into
https://issues.apache.org/jira/browse/SPARK-10910. I've been busy with
other things but plan to take a look at that one when I find time.
Right now I don't really have a solution, other than making sure your
application's jars do not include the classes the exception is
complaining about.
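
Untested, but if you're on sbt, something like this might do it
("com.example" / "your-dependency" are placeholders for whichever
library is pulling in the slf4j binding):

    // Keep slf4j binding jars out of the application assembly so only
    // one StaticLoggerBinder ends up on the classpath.
    libraryDependencies += ("com.example" %% "your-dependency" % "1.0")
      .exclude("org.slf4j", "slf4j-log4j12")

    // Alternatively, mark the slf4j API itself as provided so Spark's
    // own copy is used at runtime.
    libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.10" % "provided"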

On Wed, Oct 7, 2015 at 10:23 AM, Gary Ogden <gog...@gmail.com> wrote:
> What you suggested seems to have worked for unit tests. But now it throws
> this at run time on Mesos with spark-submit:
>
> Exception in thread "main" java.lang.LinkageError: loader constraint
> violation: when resolving method
> "org.slf4j.impl.StaticLoggerBinder.getLoggerFactory()Lorg/slf4j/ILoggerFactory;"
> the class loader (instance of
> org/apache/spark/util/ChildFirstURLClassLoader) of the current class,
> org/slf4j/LoggerFactory, and the class loader (instance of
> sun/misc/Launcher$AppClassLoader) for resolved class,
> org/slf4j/impl/StaticLoggerBinder, have different Class objects for the type
> LoggerFactory; used in the signature
>       at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:336)
>       at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:284)
>       at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:305)
>       at com.company.spark.utils.SparkJob.<clinit>(SparkJob.java:41)
>       at java.lang.Class.forName0(Native Method)
>       at java.lang.Class.forName(Unknown Source)
>       at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:634)
>       at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
>       at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
>       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
>       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
>
> On 6 October 2015 at 16:20, Marcelo Vanzin <van...@cloudera.com> wrote:
>>
>> On Tue, Oct 6, 2015 at 12:04 PM, Gary Ogden <gog...@gmail.com> wrote:
>> > But we run unit tests differently in our build environment, which is
>> > throwing the error. It's set up like this:
>> >
>> > I suspect this is what you were referring to when you said I have a
>> > problem?
>>
>> Yes, that is what I was referring to. But, in your test environment,
>> you might be able to work around the problem by setting
>> "spark.ui.enabled=false"; that should disable all the code that uses
>> Jersey, so you can use your newer version in your unit tests.
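>>
>> For example (an untested sketch in Scala; "my-test" and the test body
>> are placeholders), a test-scoped context with the UI disabled:
>>
>>     import org.apache.spark.{SparkConf, SparkContext}
>>
>>     val conf = new SparkConf()
>>       .setMaster("local[2]")
>>       .setAppName("my-test")
>>       .set("spark.ui.enabled", "false")  // skip the Jersey-based web UI
>>     val sc = new SparkContext(conf)
>>     try {
>>       // ... run the code under test ...
>>     } finally {
>>       sc.stop()
>>     }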
>>
>>
>> --
>> Marcelo
>
>



-- 
Marcelo
