Re: Re[2]: HBase 0.96+ with Spark 1.0+

2014-09-12 Thread Aniket Bhatnagar
Hi Reinis

See if the exclude suggestions from me and Sean work for you. If not, can
you turn on verbose class loading to see where
javax.servlet.ServletRegistration is loaded from? The class should load
from "org.mortbay.jetty" % "servlet-api" % jettyVersion. If it loads from
some other jar, you would have to exclude that jar from your build.
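In case it helps, a minimal sketch of turning that on from sbt (an assumption
on my part that your tests run in a forked JVM; -verbose:class itself is a
standard JVM flag):

```scala
// Sketch: make the forked test JVM log every class it loads and where from.
// Output lines look like "[Loaded javax.servlet.ServletRegistration from file:/...jar]",
// which tells you exactly which jar the class came from.
fork in Test := true
javaOptions in Test += "-verbose:class"
```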

Hope it helps.

Thanks,
Aniket

On 12 September 2014 02:21, sp...@orbit-x.de wrote:

 Thank you, Aniket, for your hint!

 Alas, I am facing a really hellish situation, it seems, because I have
 integration tests using BOTH Spark and HBase (minicluster). Thus I get
 either:

 class javax.servlet.ServletRegistration's signer information does not
 match signer information of other classes in the same package
 java.lang.SecurityException: class javax.servlet.ServletRegistration's
 signer information does not match signer information of other classes in
 the same package
 at java.lang.ClassLoader.checkCerts(ClassLoader.java:943)
 at java.lang.ClassLoader.preDefineClass(ClassLoader.java:657)
 at java.lang.ClassLoader.defineClass(ClassLoader.java:785)

 or:

 [info]   Cause: java.lang.ClassNotFoundException:
 org.mortbay.jetty.servlet.Context
 [info]   at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
 [info]   at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
 [info]   at java.security.AccessController.doPrivileged(Native Method)
 [info]   at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
 [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
 [info]   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
 [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
 [info]   at
 org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:661)
 [info]   at
 org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:552)
 [info]   at
 org.apache.hadoop.hdfs.server.namenode.NameNode.init(NameNode.java:720)

 I have been searching the web for a week already, trying to figure out how
 to make this work :-/
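 In case it is useful, one way to see which artifact drags in the servlet
 classes is the sbt-dependency-graph plugin (an assumption on my part that it
 fits your sbt version; the plugin version below is illustrative):

```scala
// project/plugins.sbt -- sketch only; check for the release matching your sbt.
// The plugin adds a dependency-tree task that prints the resolved dependency
// graph, so you can spot which dependency pulls in javax.servlet.
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4")
```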

 Any help or hints are greatly appreciated.
 reinis


 --
 -Original-Nachricht-
 Von: Aniket Bhatnagar aniket.bhatna...@gmail.com
 An: sp...@orbit-x.de
 Cc: user user@spark.apache.org
 Datum: 11-09-2014 20:00
 Betreff: Re: Re[2]: HBase 0.96+ with Spark 1.0+


 Dependency hell... My fav problem :).

 I had run into a similar issue with HBase and Jetty. I can't remember the
 exact fix, but here are excerpts from my dependencies that may be relevant:

 val hadoop2Common = "org.apache.hadoop" % "hadoop-common" % hadoop2Version excludeAll(
   ExclusionRule(organization = "javax.servlet"),
   ExclusionRule(organization = "javax.servlet.jsp"),
   ExclusionRule(organization = "org.mortbay.jetty")
 )

 val hadoop2MapRedClient = "org.apache.hadoop" % "hadoop-mapreduce-client-core" % hadoop2Version

 val hbase = "org.apache.hbase" % "hbase" % hbaseVersion excludeAll(
   ExclusionRule(organization = "org.apache.maven.wagon"),
   ExclusionRule(organization = "org.jboss.netty"),
   ExclusionRule(organization = "org.mortbay.jetty"),
   ExclusionRule(organization = "org.jruby") // Don't need HBase's JRuby; it pulls in a whole lot of other dependencies like joda-time.
 )

 val sparkCore = "org.apache.spark" %% "spark-core" % sparkVersion
 val sparkStreaming = "org.apache.spark" %% "spark-streaming" % sparkVersion
 val sparkSQL = "org.apache.spark" %% "spark-sql" % sparkVersion
 val sparkHive = "org.apache.spark" %% "spark-hive" % sparkVersion
 val sparkRepl = "org.apache.spark" %% "spark-repl" % sparkVersion

 val sparkAll = Seq(
   sparkCore excludeAll(
     ExclusionRule(organization = "org.apache.hadoop")), // We assume Hadoop 2 and hence omit Hadoop 1 dependencies
   sparkSQL,
   sparkStreaming,
   hadoop2MapRedClient,
   hadoop2Common,
   "org.mortbay.jetty" % "servlet-api" % "3.0.20100224"
 )
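 For completeness, a sketch of how these vals would be wired into the build
 (assumes the version vals are defined elsewhere in the same file):

```scala
// Sketch: feeding the excerpted module definitions into sbt's dependency list.
libraryDependencies ++= sparkAll :+ hbase
```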

 On Sep 11, 2014 8:05 PM, sp...@orbit-x.de wrote:

 Hi guys,

 any luck with this issue, anyone?

 I tried all the possible exclusion combos as well, to no avail.

 thanks for your ideas
 reinis

 -Original-Nachricht-
  Von: Stephen Boesch java...@gmail.com
  An: user user@spark.apache.org
  Datum: 28-06-2014 15:12
  Betreff: Re: HBase 0.96+ with Spark 1.0+
 
  Hi Siyuan,
 Thanks for the input. We prefer to use SparkBuild.scala instead of Maven. I
 did not see any protobuf.version-related settings in that file. But, as
 noted by Sean Owen, the issue we are presently facing is the duplicate,
 incompatible javax.servlet entries, apparently from the org.mortbay
 artifacts.


 
  2014-06-28 6:01 GMT-07:00 Siyuan he hsy...@gmail.com:
  Hi Stephen,
 
 I am using Spark 1.0 + HBase 0.96.2. This is what I did:
 1) rebuild Spark using: mvn -Dhadoop.version=2.3.0
 -Dprotobuf.version=2.5.0 -DskipTests clean package
 2) In spark-env.sh, set
 SPARK_CLASSPATH=/path-to/hbase-protocol-0.96.2-hadoop2.jar

 
 Hopefully it can help.
 Siyuan


 
  On Sat, Jun 28, 2014 at 8:52

Re[2]: HBase 0.96+ with Spark 1.0+

2014-09-11 Thread spark
Hi guys,

any luck with this issue, anyone?

I tried all the possible exclusion combos as well, to no avail.

thanks for your ideas
reinis

-Original-Nachricht- 
 Von: Stephen Boesch java...@gmail.com 
 An: user user@spark.apache.org 
 Datum: 28-06-2014 15:12 
 Betreff: Re: HBase 0.96+ with Spark 1.0+ 
 
 Hi Siyuan,
 Thanks for the input. We prefer to use SparkBuild.scala instead of Maven. I
did not see any protobuf.version-related settings in that file. But, as noted
by Sean Owen, the issue we are presently facing is the duplicate, incompatible
javax.servlet entries, apparently from the org.mortbay artifacts.
 
 
 
 2014-06-28 6:01 GMT-07:00 Siyuan he hsy...@gmail.com:
 Hi Stephen,
 
I am using Spark 1.0 + HBase 0.96.2. This is what I did:
1) rebuild Spark using: mvn -Dhadoop.version=2.3.0 -Dprotobuf.version=2.5.0
-DskipTests clean package
2) In spark-env.sh, set
SPARK_CLASSPATH=/path-to/hbase-protocol-0.96.2-hadoop2.jar

 
Hopefully it can help.
Siyuan
 
 
 
 On Sat, Jun 28, 2014 at 8:52 AM, Stephen Boesch java...@gmail.com wrote:
  
 
Thanks Sean. I had actually already added an exclusion rule for
org.mortbay.jetty, and that had not resolved it.
 
Just in case I used your precise formulation:

 
val excludeMortbayJetty = ExclusionRule(organization = "org.mortbay.jetty")
..

  ,("org.apache.spark" % "spark-core_2.10" % sparkVersion withSources()).excludeAll(excludeMortbayJetty)
  ,("org.apache.spark" % "spark-sql_2.10" % sparkVersion withSources()).excludeAll(excludeMortbayJetty)

 
However the same error still recurs:

 
14/06/28 05:48:35 INFO HttpServer: Starting HTTP Server
[error] (run-main-0) java.lang.SecurityException: class 
javax.servlet.FilterRegistration's signer information does not match signer 
information of other classes in the same package
java.lang.SecurityException: class javax.servlet.FilterRegistration's signer 
information does not match signer information of other classes in the same 
package
 
 

 

 

 
 2014-06-28 4:22 GMT-07:00 Sean Owen so...@cloudera.com:

 This sounds like an instance of roughly the same item as in
 https://issues.apache.org/jira/browse/SPARK-1949  Have a look at
 adding that exclude to see if it works.
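  A sketch of what such an exclude could look like in build.sbt (coordinates
  are illustrative, not a verified fix):

```scala
// Illustrative sketch: strip the conflicting jetty/servlet artifacts from the
// HBase dependency so only one copy of javax.servlet lands on the classpath.
val hbaseDep = ("org.apache.hbase" % "hbase" % "0.96.2-hadoop2").excludeAll(
  ExclusionRule(organization = "org.mortbay.jetty"),
  ExclusionRule(organization = "javax.servlet")
)
```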
 

 On Fri, Jun 27, 2014 at 10:21 PM, Stephen Boesch java...@gmail.com wrote:
  The present trunk is built and tested against HBase 0.94.
 
 
  I have tried various combinations of versions of HBase 0.96+ and Spark 1.0+
  and all end up with
 
  14/06/27 20:11:15 INFO HttpServer: Starting HTTP Server
  [error] (run-main-0) java.lang.SecurityException: class
  javax.servlet.FilterRegistration's signer information does not match
  signer information of other classes in the same package
  java.lang.SecurityException: class javax.servlet.FilterRegistration's
  signer information does not match signer information of other classes in the
  same package
  at java.lang.ClassLoader.checkCerts(ClassLoader.java:952)
 
 
  I have tried a number of different ways to exclude javax.servlet related
  jars. But none have avoided this error.
 
  Anyone have a (small-ish) build.sbt that works with later versions of HBase?
 
 
  
 
 
  
 
 
  
 
  




-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Re[2]: HBase 0.96+ with Spark 1.0+

2014-09-11 Thread Aniket Bhatnagar
Dependency hell... My fav problem :).

I had run into a similar issue with HBase and Jetty. I can't remember the
exact fix, but here are excerpts from my dependencies that may be relevant:

val hadoop2Common = "org.apache.hadoop" % "hadoop-common" % hadoop2Version excludeAll(
  ExclusionRule(organization = "javax.servlet"),
  ExclusionRule(organization = "javax.servlet.jsp"),
  ExclusionRule(organization = "org.mortbay.jetty")
)

val hadoop2MapRedClient = "org.apache.hadoop" % "hadoop-mapreduce-client-core" % hadoop2Version

val hbase = "org.apache.hbase" % "hbase" % hbaseVersion excludeAll(
  ExclusionRule(organization = "org.apache.maven.wagon"),
  ExclusionRule(organization = "org.jboss.netty"),
  ExclusionRule(organization = "org.mortbay.jetty"),
  ExclusionRule(organization = "org.jruby") // Don't need HBase's JRuby; it pulls in a whole lot of other dependencies like joda-time.
)

val sparkCore = "org.apache.spark" %% "spark-core" % sparkVersion
val sparkStreaming = "org.apache.spark" %% "spark-streaming" % sparkVersion
val sparkSQL = "org.apache.spark" %% "spark-sql" % sparkVersion
val sparkHive = "org.apache.spark" %% "spark-hive" % sparkVersion
val sparkRepl = "org.apache.spark" %% "spark-repl" % sparkVersion

val sparkAll = Seq(
  sparkCore excludeAll(
    ExclusionRule(organization = "org.apache.hadoop")), // We assume Hadoop 2 and hence omit Hadoop 1 dependencies
  sparkSQL,
  sparkStreaming,
  hadoop2MapRedClient,
  hadoop2Common,
  "org.mortbay.jetty" % "servlet-api" % "3.0.20100224"
)

On Sep 11, 2014 8:05 PM, sp...@orbit-x.de wrote:

 Hi guys,

 any luck with this issue, anyone?

 I tried all the possible exclusion combos as well, to no avail.

 thanks for your ideas
 reinis

 -Original-Nachricht-
  Von: Stephen Boesch java...@gmail.com
  An: user user@spark.apache.org
  Datum: 28-06-2014 15:12
  Betreff: Re: HBase 0.96+ with Spark 1.0+
 
  Hi Siyuan,
  Thanks for the input. We prefer to use SparkBuild.scala instead of Maven. I
 did not see any protobuf.version-related settings in that file. But, as
 noted by Sean Owen, the issue we are presently facing is the duplicate,
 incompatible javax.servlet entries, apparently from the org.mortbay
 artifacts.


 
  2014-06-28 6:01 GMT-07:00 Siyuan he hsy...@gmail.com:
  Hi Stephen,
 
 I am using Spark 1.0 + HBase 0.96.2. This is what I did:
 1) rebuild Spark using: mvn -Dhadoop.version=2.3.0
 -Dprotobuf.version=2.5.0 -DskipTests clean package
 2) In spark-env.sh, set
 SPARK_CLASSPATH=/path-to/hbase-protocol-0.96.2-hadoop2.jar

 
 Hopefully it can help.
 Siyuan


 
  On Sat, Jun 28, 2014 at 8:52 AM, Stephen Boesch java...@gmail.com
 wrote:
 
 
 Thanks Sean. I had actually already added an exclusion rule for
 org.mortbay.jetty, and that had not resolved it.
 
 Just in case I used your precise formulation:

 
 val excludeMortbayJetty = ExclusionRule(organization = "org.mortbay.jetty")
 ..

   ,("org.apache.spark" % "spark-core_2.10" % sparkVersion withSources()).excludeAll(excludeMortbayJetty)
   ,("org.apache.spark" % "spark-sql_2.10" % sparkVersion withSources()).excludeAll(excludeMortbayJetty)

 
 However the same error still recurs:

 
 14/06/28 05:48:35 INFO HttpServer: Starting HTTP Server
 [error] (run-main-0) java.lang.SecurityException: class
 javax.servlet.FilterRegistration's signer information does not match
 signer information of other classes in the same package
 java.lang.SecurityException: class javax.servlet.FilterRegistration's
 signer information does not match signer information of other classes in
 the same package



 

 

 
  2014-06-28 4:22 GMT-07:00 Sean Owen so...@cloudera.com:

  This sounds like an instance of roughly the same item as in
  https://issues.apache.org/jira/browse/SPARK-1949  Have a look at
  adding that exclude to see if it works.
 

  On Fri, Jun 27, 2014 at 10:21 PM, Stephen Boesch java...@gmail.com
 wrote:
   The present trunk is built and tested against HBase 0.94.
  
  
   I have tried various combinations of versions of HBase 0.96+ and Spark
 1.0+
   and all end up with
  
   14/06/27 20:11:15 INFO HttpServer: Starting HTTP Server
   [error] (run-main-0) java.lang.SecurityException: class
   javax.servlet.FilterRegistration's signer information does not match
   signer information of other classes in the same package
   java.lang.SecurityException: class javax.servlet.FilterRegistration's
   signer information does not match signer information of other classes
 in the
   same package
   at java.lang.ClassLoader.checkCerts(ClassLoader.java:952)
  
  
   I have tried a number of different ways to exclude javax.servlet
 related
   jars. But none have avoided this error.
  
   Anyone have a (small-ish) build.sbt that works with later versions of
 HBase?
  
  
 


 


 

 





Re[2]: HBase 0.96+ with Spark 1.0+

2014-09-11 Thread spark
Thank you, Aniket, for your hint!

Alas, I am facing a really hellish situation, it seems, because I have
integration tests using BOTH Spark and HBase (minicluster). Thus I get either:

class javax.servlet.ServletRegistration's signer information does not match 
signer information of other classes in the same package
java.lang.SecurityException: class javax.servlet.ServletRegistration's signer 
information does not match signer information of other classes in the same 
package
    at java.lang.ClassLoader.checkCerts(ClassLoader.java:943)
    at java.lang.ClassLoader.preDefineClass(ClassLoader.java:657)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:785)

or:

[info]   Cause: java.lang.ClassNotFoundException: 
org.mortbay.jetty.servlet.Context
[info]   at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
[info]   at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
[info]   at java.security.AccessController.doPrivileged(Native Method)
[info]   at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
[info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
[info]   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
[info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
[info]   at 
org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:661)
[info]   at 
org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:552)
[info]   at 
org.apache.hadoop.hdfs.server.namenode.NameNode.init(NameNode.java:720)

I have been searching the web for a week already, trying to figure out how to
make this work :-/

Any help or hints are greatly appreciated.
reinis



-Original-Nachricht-
Von: Aniket Bhatnagar aniket.bhatna...@gmail.com
An: sp...@orbit-x.de
Cc: user user@spark.apache.org
Datum: 11-09-2014 20:00
Betreff: Re: Re[2]: HBase 0.96+ with Spark 1.0+


Dependency hell... My fav problem :).
I had run into a similar issue with HBase and Jetty. I can't remember the
exact fix, but here are excerpts from my dependencies that may be relevant:
val hadoop2Common = "org.apache.hadoop" % "hadoop-common" % hadoop2Version excludeAll(
  ExclusionRule(organization = "javax.servlet"),
  ExclusionRule(organization = "javax.servlet.jsp"),
  ExclusionRule(organization = "org.mortbay.jetty")
)
val hadoop2MapRedClient = "org.apache.hadoop" % "hadoop-mapreduce-client-core" % hadoop2Version
val hbase = "org.apache.hbase" % "hbase" % hbaseVersion excludeAll(
  ExclusionRule(organization = "org.apache.maven.wagon"),
  ExclusionRule(organization = "org.jboss.netty"),
  ExclusionRule(organization = "org.mortbay.jetty"),
  ExclusionRule(organization = "org.jruby") // Don't need HBase's JRuby; it pulls in a whole lot of other dependencies like joda-time.
)
val sparkCore = "org.apache.spark" %% "spark-core" % sparkVersion
val sparkStreaming = "org.apache.spark" %% "spark-streaming" % sparkVersion
val sparkSQL = "org.apache.spark" %% "spark-sql" % sparkVersion
val sparkHive = "org.apache.spark" %% "spark-hive" % sparkVersion
val sparkRepl = "org.apache.spark" %% "spark-repl" % sparkVersion
val sparkAll = Seq(
  sparkCore excludeAll(
    ExclusionRule(organization = "org.apache.hadoop")), // We assume Hadoop 2 and hence omit Hadoop 1 dependencies
  sparkSQL,
  sparkStreaming,
  hadoop2MapRedClient,
  hadoop2Common,
  "org.mortbay.jetty" % "servlet-api" % "3.0.20100224"
)

On Sep 11, 2014 8:05 PM, sp...@orbit-x.de wrote:
Hi guys,

any luck with this issue, anyone?

I tried all the possible exclusion combos as well, to no avail.

thanks for your ideas
reinis

-Original-Nachricht-
 Von: Stephen Boesch java...@gmail.com
 An: user user@spark.apache.org
 Datum: 28-06-2014 15:12
 Betreff: Re: HBase 0.96+ with Spark 1.0+

 Hi Siyuan,
 Thanks for the input. We prefer to use SparkBuild.scala instead of Maven. I
did not see any protobuf.version-related settings in that file. But, as noted
by Sean Owen, the issue we are presently facing is the duplicate, incompatible
javax.servlet entries, apparently from the org.mortbay artifacts.
 
 

 2014-06-28 6:01 GMT-07:00 Siyuan he hsy...@gmail.com:
 Hi Stephen,

I am using Spark 1.0 + HBase 0.96.2. This is what I did:
1) rebuild Spark using: mvn -Dhadoop.version=2.3.0 -Dprotobuf.version=2.5.0
-DskipTests clean package
2) In spark-env.sh, set
SPARK_CLASSPATH=/path-to/hbase-protocol-0.96.2-hadoop2.jar


Hopefully it can help.
Siyuan
 
 

 On Sat, Jun 28, 2014 at 8:52 AM, Stephen Boesch java...@gmail.com wrote:
  

Thanks Sean. I had actually already added an exclusion rule for
org.mortbay.jetty, and that had not resolved it.

Just in case I used your precise formulation:


val excludeMortbayJetty = ExclusionRule(organization = "org.mortbay.jetty")
..

  ,("org.apache.spark" % "spark-core_2.10" % sparkVersion withSources()).excludeAll(excludeMortbayJetty)
  ,("org.apache.spark" % "spark-sql_2.10

Re: Re[2]: HBase 0.96+ with Spark 1.0+

2014-09-11 Thread Sean Owen
This was already answered at the bottom of this same thread -- read below.

On Thu, Sep 11, 2014 at 9:51 PM,  sp...@orbit-x.de wrote:
 class javax.servlet.ServletRegistration's signer information does not
 match signer information of other classes in the same package
 java.lang.SecurityException: class javax.servlet.ServletRegistration's
 signer information does not match signer information of other classes in the
 same package
 at java.lang.ClassLoader.checkCerts(ClassLoader.java:943)
 at java.lang.ClassLoader.preDefineClass(ClassLoader.java:657)
 at java.lang.ClassLoader.defineClass(ClassLoader.java:785)

  2014-06-28 4:22 GMT-07:00 Sean Owen so...@cloudera.com:

  This sounds like an instance of roughly the same item as in
  https://issues.apache.org/jira/browse/SPARK-1949 Have a look at
  adding that exclude to see if it works.
