Re: Spark unit test fails

2015-05-07 Thread NoWisdom
I'm also getting the same error.

Any ideas?





Re: Spark unit test fails

2015-04-06 Thread Manas Kar
Bumping this question up.
Can someone point to an example on GitHub?

..Manas

On Fri, Apr 3, 2015 at 9:39 AM, manasdebashiskar  wrote:

> Hi experts,
>  I am trying to write unit tests for my Spark application, but they fail with a
> javax.servlet.FilterRegistration error.
>
> I am using Spark from CDH 5.3.2; my dependency list is below.
> val spark   = "1.2.0-cdh5.3.2"
> val esriGeometryAPI = "1.2"
> val csvWriter   = "1.0.0"
> val hadoopClient= "2.3.0"
> val scalaTest   = "2.2.1"
> val jodaTime= "1.6.0"
> val scalajHTTP  = "1.0.1"
> val avro= "1.7.7"
> val scopt   = "3.2.0"
> val config  = "1.2.1"
> val jobserver   = "0.4.1"
> val excludeJBossNetty = ExclusionRule(organization = "org.jboss.netty")
> val excludeIONetty = ExclusionRule(organization = "io.netty")
> val excludeEclipseJetty = ExclusionRule(organization = "org.eclipse.jetty")
> val excludeMortbayJetty = ExclusionRule(organization = "org.mortbay.jetty")
> val excludeAsm = ExclusionRule(organization = "org.ow2.asm")
> val excludeOldAsm = ExclusionRule(organization = "asm")
> val excludeCommonsLogging = ExclusionRule(organization = "commons-logging")
> val excludeSLF4J = ExclusionRule(organization = "org.slf4j")
> val excludeScalap = ExclusionRule(organization = "org.scala-lang", artifact = "scalap")
> val excludeHadoop = ExclusionRule(organization = "org.apache.hadoop")
> val excludeCurator = ExclusionRule(organization = "org.apache.curator")
> val excludePowermock = ExclusionRule(organization = "org.powermock")
> val excludeFastutil = ExclusionRule(organization = "it.unimi.dsi")
> val excludeJruby = ExclusionRule(organization = "org.jruby")
> val excludeThrift = ExclusionRule(organization = "org.apache.thrift")
> val excludeServletApi = ExclusionRule(organization = "javax.servlet", artifact = "servlet-api")
> val excludeJUnit = ExclusionRule(organization = "junit")
>
> I found a thread (
> http://apache-spark-user-list.1001560.n3.nabble.com/Fwd-SecurityException-when-running-tests-with-Spark-1-0-0-td6747.html#a6749
> ) that discusses this issue and a workaround for it,
> but that workaround does not solve the problem for me.
> I am using an SBT build that cannot be changed to Maven.
>
> What am I missing?
>
>
> Stack trace
> -
> [info] FiltersRDDSpec:
> [info] - Spark Filter *** FAILED ***
> [info]   java.lang.SecurityException: class
> "javax.servlet.FilterRegistration"'s signer information does not match
> signer information of other classes in the same package
> [info]   at java.lang.ClassLoader.checkCerts(Unknown Source)
> [info]   at java.lang.ClassLoader.preDefineClass(Unknown Source)
> [info]   at java.lang.ClassLoader.defineClass(Unknown Source)
> [info]   at java.security.SecureClassLoader.defineClass(Unknown Source)
> [info]   at java.net.URLClassLoader.defineClass(Unknown Source)
> [info]   at java.net.URLClassLoader.access$100(Unknown Source)
> [info]   at java.net.URLClassLoader$1.run(Unknown Source)
> [info]   at java.net.URLClassLoader$1.run(Unknown Source)
> [info]   at java.security.AccessController.doPrivileged(Native Method)
> [info]   at java.net.URLClassLoader.findClass(Unknown Source)
>
> Thanks
> Manas
>  Manas Kar
>
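For reference, the question above lists the version vals and ExclusionRules but not how they are attached to the dependencies. A minimal sketch of the usual SBT wiring is below, as an illustration only: the choice of modules (spark-core, hadoop-client, scalatest) and which exclusions go on which module are assumptions, not details taken from the actual build.

// Hypothetical wiring of the ExclusionRules above into libraryDependencies.
// Module names and per-module exclusions are assumptions.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % spark excludeAll (
    excludeIONetty, excludeEclipseJetty, excludeMortbayJetty, excludeServletApi
  ),
  "org.apache.hadoop" % "hadoop-client" % hadoopClient excludeAll (
    excludeJBossNetty, excludeMortbayJetty, excludeServletApi, excludeOldAsm
  ),
  "org.scalatest" %% "scalatest" % scalaTest % "test"
)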
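As for the error itself: this SecurityException typically means two copies of the javax.servlet classes end up on the test classpath, one signed (the org.eclipse.jetty.orbit javax.servlet jar that Spark's Jetty dependency pulls in) and one unsigned (usually servlet-api 2.5 arriving via the Hadoop artifacts), and the JVM refuses to load classes from one package with mismatched signer information. A commonly suggested SBT-side workaround is sketched below; it is based on that diagnosis rather than on this specific build, and the coordinates should be verified against the project's real dependency graph.

// Sketch of a commonly suggested workaround (assumed, not verified for this build):
// keep exactly one servlet API on the test classpath.
val excludeJettyOrbit = ExclusionRule(organization = "org.eclipse.jetty.orbit")

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % spark excludeAll (excludeJettyOrbit, excludeServletApi),
  "org.apache.hadoop" % "hadoop-client" % hadoopClient excludeAll (excludeJettyOrbit, excludeServletApi),
  // a single unsigned servlet 3.x API for the tests to run against
  "javax.servlet" % "javax.servlet-api" % "3.0.1" % "test"
)

// Forking the test JVM keeps SBT's own Jetty off the test classpath;
// this is often recommended together with the exclusions above.
fork in Test := true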