Re: Got java.lang.SecurityException: class javax.servlet.FilterRegistration's when running job from IntelliJ IDEA
I've switched to Maven and all the issues are gone now.

2015-01-23 12:07 GMT+01:00 Sean Owen so...@cloudera.com:
> Use mvn dependency:tree or sbt dependency-tree to print all of the
> dependencies. You are probably bringing in additional servlet API libs
> from some other source.
> [...]

--
Best regards,
Marco

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
Re: Got java.lang.SecurityException: class javax.servlet.FilterRegistration's when running job from IntelliJ IDEA
Hi,

I have exactly the same issue. I've tried to mark the libraries as 'provided', but then IntelliJ seems to have removed the libraries locally, that is, I am no longer able to build/run the project in the IDE. Is the issue resolved?

I'm not very experienced with sbt. I've tried to exclude the libraries:

name := "SparkDemo"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" exclude("org.apache.hadoop", "hadoop-client")

libraryDependencies += "org.apache.spark" % "spark-sql_2.10" % "1.2.0"

libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0" excludeAll(ExclusionRule(organization = "org.eclipse.jetty"))

libraryDependencies += "org.apache.hadoop" % "hadoop-mapreduce-client-core" % "2.6.0"

libraryDependencies += "org.apache.hbase" % "hbase-client" % "0.98.4-hadoop2"

libraryDependencies += "org.apache.hbase" % "hbase-server" % "0.98.4-hadoop2"

libraryDependencies += "org.apache.hbase" % "hbase-common" % "0.98.4-hadoop2"

mainClass in Compile := Some("demo.TruckEvents")

but this does not work either.

Thanks,
Marco

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Got-java-lang-SecurityException-class-javax-servlet-FilterRegistration-s-when-running-job-from-intela-tp18035p21332.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
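[A commonly cited sbt workaround for the 'provided' problem described above (not taken from this thread; coordinates and syntax are illustrative for sbt 0.13) is to keep Spark "provided" for packaging but point the run task at the Compile classpath, which still contains provided dependencies:]

```scala
// build.sbt -- hedged sketch, not from this thread: mark Spark "provided"
// so it stays out of the assembled jar, but have `sbt run` use the Compile
// classpath (which includes "provided" entries) so the app is still
// runnable from the build tool. Syntax assumes sbt 0.13.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" % "provided"

run in Compile <<= Defaults.runTask(
  fullClasspath in Compile,      // unlike Runtime, this keeps "provided" deps
  mainClass in (Compile, run),
  runner in (Compile, run)
)
```

[Newer IntelliJ versions also offer an "Include dependencies with 'Provided' scope" checkbox in the run configuration, which addresses the same problem without build changes.]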
Re: Got java.lang.SecurityException: class javax.servlet.FilterRegistration's when running job from IntelliJ IDEA
Use mvn dependency:tree or sbt dependency-tree to print all of the dependencies. You are probably bringing in additional servlet API libs from some other source.

On Fri, Jan 23, 2015 at 10:57 AM, Marco marco@gmail.com wrote:
> [...]
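[On sbt 0.13 the dependency-tree task Sean mentions is not built in; it is typically supplied by the sbt-dependency-graph plugin. A sketch (the plugin version here is illustrative, not confirmed in the thread):]

```scala
// project/plugins.sbt -- hedged sketch: on sbt 0.13 the dependencyTree
// task comes from the sbt-dependency-graph plugin (version illustrative).
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")

// Then, from the sbt shell, run:
//   dependencyTree
// and look for servlet-api artifacts pulled in by more than one
// dependency (e.g. via hadoop-common or hbase-server).
```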
Re: Got java.lang.SecurityException: class javax.servlet.FilterRegistration's when running job from IntelliJ IDEA
Generally this means you have included some javax.servlet dependency in your project deps. You should exclude any of these, as they conflict in this bad way with the copy of the servlet API that Spark brings in.

On Tue, Nov 4, 2014 at 7:55 AM, Jaonary Rabarisoa jaon...@gmail.com wrote:
> [...]
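[Applied to the build posted later in this thread, Sean's advice would translate to something like the following sketch. The organizations excluded here are the usual sources of a second, unsigned servlet-api copy in Hadoop/HBase transitive dependencies; they are an assumption, not confirmed by the thread:]

```scala
// build.sbt -- hedged sketch: strip the unsigned servlet-api copies that
// HBase artifacts typically drag in, leaving only Spark's servlet classes
// on the classpath. The excluded organizations are assumptions.
val servletExclusions = Seq(
  ExclusionRule(organization = "javax.servlet"),
  ExclusionRule(organization = "org.mortbay.jetty")
)

libraryDependencies ++= Seq(
  "org.apache.hbase" % "hbase-client" % "0.98.4-hadoop2",
  "org.apache.hbase" % "hbase-server" % "0.98.4-hadoop2",
  "org.apache.hbase" % "hbase-common" % "0.98.4-hadoop2"
).map(_.excludeAll(servletExclusions: _*))
```

[After changing exclusions, re-import the project in the IDE so the stale servlet-api jar actually leaves the module classpath.]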
Got java.lang.SecurityException: class javax.servlet.FilterRegistration's when running job from IntelliJ IDEA
Hi all,

I have a Spark job that I build with sbt, and I can run it without any problem with sbt run. But when I run it inside IntelliJ IDEA I get the following error:

Exception encountered when invoking run on a nested suite - class javax.servlet.FilterRegistration's signer information does not match signer information of other classes in the same package
java.lang.SecurityException: class javax.servlet.FilterRegistration's signer information does not match signer information of other classes in the same package
	at java.lang.ClassLoader.checkCerts(ClassLoader.java:952)
	at java.lang.ClassLoader.preDefineClass(ClassLoader.java:666)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:794)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:136)
	at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:129)
	at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:98)
	at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:98)
	at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:89)
	at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:67)
	at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:60)
	at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:60)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
	at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:60)
	at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:66)
	at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:60)
	at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:42)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:223)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:98)

How can I solve this?

Cheers,
Jao