Re: Compiling Spark code with IDEA succeeds but running SparkPi fails with "java.lang.SecurityException"

2014-08-11 Thread Ron's Yahoo!
Not sure what your environment is, but this happened to me before because I had 
a couple of servlet-api jars on the classpath that did not match.
I was building a system that programmatically submitted jobs, so I had my own 
jars that conflicted with Spark's. The solution is to run mvn dependency:tree 
from your app, see which jars you have, and make sure to exclude the conflicting ones.
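For illustration, a minimal sketch (the hadoop-client coordinates are only an 
assumption; point the exclusion at whichever dependency mvn dependency:tree 
actually shows pulling in the stale servlet-api):

    # list which dependencies drag in javax.servlet classes
    mvn dependency:tree -Dincludes=javax.servlet

    <!-- pom.xml: exclude the conflicting servlet-api from that dependency -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>2.3.0</version>
      <exclusions>
        <exclusion>
          <groupId>javax.servlet</groupId>
          <artifactId>servlet-api</artifactId>
        </exclusion>
      </exclusions>
    </dependency>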

Thanks,
Ron

On Aug 11, 2014, at 6:36 AM, Zhanfeng Huo  wrote:

> Hi,
> 
> I have compiled the spark-1.0.1 code with IntelliJ IDEA 13.1.4 on Ubuntu 14.04 
> successfully, but when I run the SparkPi example in local mode it fails.
> 
> I set "export SPARK_HADOOP_VERSION=2.3.0" and "export SPARK_YARN=true" in the 
> environment before starting IDEA.
> 
> I have attempted to apply the patch at 
> https://github.com/apache/spark/pull/1271/files, but it has no effect. How 
> can I solve this problem?
> 
> 
> The full message:
> 
> 14/08/11 22:15:56 INFO Utils: Using Spark's default log4j profile: 
> org/apache/spark/log4j-defaults.properties 
> 14/08/11 22:15:56 WARN Utils: Your hostname, syn resolves to a loopback 
> address: 127.0.1.1; using 192.168.159.132 instead (on interface eth0) 
> 14/08/11 22:15:56 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to 
> another address 
> 14/08/11 22:15:56 INFO SecurityManager: Changing view acls to: syn 
> 14/08/11 22:15:56 INFO SecurityManager: SecurityManager: authentication 
> disabled; ui acls disabled; users with view permissions: Set(syn) 
> 14/08/11 22:15:57 INFO Slf4jLogger: Slf4jLogger started 
> 14/08/11 22:15:57 INFO Remoting: Starting remoting 
> 14/08/11 22:15:57 INFO Remoting: Remoting started; listening on addresses 
> :[akka.tcp://spark@192.168.159.132:50914] 
> 14/08/11 22:15:57 INFO Remoting: Remoting now listens on addresses: 
> [akka.tcp://spark@192.168.159.132:50914]
> 14/08/11 22:15:57 INFO SparkEnv: Registering MapOutputTracker 
> 14/08/11 22:15:57 INFO SparkEnv: Registering BlockManagerMaster 
> 14/08/11 22:15:57 INFO DiskBlockManager: Created local directory at 
> /tmp/spark-local-20140811221557-dd19 
> 14/08/11 22:15:57 INFO MemoryStore: MemoryStore started with capacity 804.3 
> MB. 
> 14/08/11 22:15:57 INFO ConnectionManager: Bound socket to port 56061 with id 
> = ConnectionManagerId(192.168.159.132,56061) 
> 14/08/11 22:15:57 INFO BlockManagerMaster: Trying to register BlockManager 
> 14/08/11 22:15:57 INFO BlockManagerInfo: Registering block manager 
> 192.168.159.132:56061 with 804.3 MB RAM 
> 14/08/11 22:15:57 INFO BlockManagerMaster: Registered BlockManager 
> 14/08/11 22:15:57 INFO HttpServer: Starting HTTP Server 
> 14/08/11 22:15:57 INFO HttpBroadcast: Broadcast server started at 
> http://192.168.159.132:39676 
> 14/08/11 22:15:57 INFO HttpFileServer: HTTP File server directory is 
> /tmp/spark-f8474345-0dcd-41c4-9247-3e916d409b27 
> 14/08/11 22:15:57 INFO HttpServer: Starting HTTP Server 
> Exception in thread "main" java.lang.SecurityException: class 
> "javax.servlet.FilterRegistration"'s signer information does not match signer 
> information of other classes in the same package 
> at java.lang.ClassLoader.checkCerts(ClassLoader.java:952) 
> at java.lang.ClassLoader.preDefineClass(ClassLoader.java:666) 
> at java.lang.ClassLoader.defineClass(ClassLoader.java:794) 
> at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142) 
> at java.net.URLClassLoader.defineClass(URLClassLoader.java:449) 
> at java.net.URLClassLoader.access$100(URLClassLoader.java:71) 
> at java.net.URLClassLoader$1.run(URLClassLoader.java:361) 
> at java.net.URLClassLoader$1.run(URLClassLoader.java:355) 
> at java.security.AccessController.doPrivileged(Native Method) 
> at java.net.URLClassLoader.findClass(URLClassLoader.java:354) 
> at java.lang.ClassLoader.loadClass(ClassLoader.java:425) 
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308) 
> at java.lang.ClassLoader.loadClass(ClassLoader.java:358) 
> at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:136) 
> at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:129) 
> at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:98) 
> at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:98) 
> at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:89) 
> at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:64) 
> at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:57) 
> at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:57) 
> at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59) 
> at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47) 
> at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:57) 
> at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:66) 
> at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:60) 
> at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:42
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:

Re: Re: Compiling Spark code with IDEA succeeds but running SparkPi fails with "java.lang.SecurityException"

2014-08-13 Thread Zhanfeng Huo
Thank you, Ron. That helps a lot.

I want to debug the Spark code to trace state transitions, so I use sbt as my 
build tool and compile the Spark code in IntelliJ IDEA.
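
For reference, the equivalent exclusion in an sbt build might look like the 
sketch below (the hadoop-client module is only an assumption; exclude whatever 
your dependency report shows bringing in the old servlet-api):

    // build.sbt: keep the conflicting servlet-api jar off the classpath
    libraryDependencies += ("org.apache.hadoop" % "hadoop-client" % "2.3.0")
      .exclude("javax.servlet", "servlet-api")

    // or drop it from every dependency at once
    libraryDependencies := libraryDependencies.value.map(
      _.excludeAll(ExclusionRule(organization = "javax.servlet")))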



Zhanfeng Huo
 