Looks like - Session 0x0 for server null - and that GSSException class isn't
being loaded.

On Wed, Nov 19, 2014 at 1:13 PM, Mark Webb <[email protected]> wrote:

> So when I add in the line:
> <DynamicImport-Package>*</DynamicImport-Package>
> to my pom.xml, I get the following error:
> Bundle XYZ is not compatible with this blueprint extender
> When I take out that line from the pom.xml file, I get the following stack
> trace in the ServiceMix log file.  It appears that I'm attempting to
> connect to HBase, but now am running into authentication and/or class
> loading errors.
> 15:06:58,053 | WARN  | .120.1.128:2181) | ClientCnxn | 230 -
> org.apache.hadoop.zookeeper - 3.4.6 | Session 0x0 for server null,
> unexpected error, closing socket connection and attempting reconnect
> java.lang.NoClassDefFoundError: org/ietf/jgss/GSSException
>         at
> org.apache.zookeeper.ClientCnxn$SendThread.startConnect(ClientCnxn.java:945)[230:org.apache.hadoop.zookeeper:3.4.6]
>         at
> org.apache.zookeeper.ClientCnxn$SendThread.run(ClientCnxn.java:1003)[230:org.apache.hadoop.zookeeper:3.4.6]
> Caused by: java.lang.ClassNotFoundException: org.ietf.jgss.GSSException not
> found by org.apache.hadoop.zookeeper [230]
>         at
> org.apache.felix.framework.BundleWiringImpl.findClassOrResourceByDelegation(BundleWiringImpl.java:1460)[org.apache.felix.framework-4.0.3.jar:]
>         at
> org.apache.felix.framework.BundleWiringImpl.access$400(BundleWiringImpl.java:72)[org.apache.felix.framework-4.0.3.jar:]
>         at
> org.apache.felix.framework.BundleWiringImpl$BundleClassLoader.loadClass(BundleWiringImpl.java:1843)[org.apache.felix.framework-4.0.3.jar:]
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)[:1.7.0_71]
>         ... 2 more
> 15:06:58,173 | WARN  | md-feed_Worker-1 | RecoverableZooKeeper | 232 -
> org.apache.servicemix.bundles.hbase - 0.94.6.1 | Possibly transient
> ZooKeeper exception:
> org.apache.zookeeper.KeeperException$ConnectionLossException:
> KeeperErrorCode = ConnectionLoss for /hbase/hbaseid
> On Wed, Nov 19, 2014 at 2:01 PM, <[email protected]> wrote:
>> Here,
>> https://github.com/jgoodyear/ApacheKarafCookbook/blob/master/chapter9/chapter-9-recipe1/src/main/java/com/packt/hadoop/demo/hdfs/HdfsConfigServiceImpl.java
>>
>>
>> Line 30 - we replace the TCCL with our class loader, so the new threads
>> that those external libraries spawn (and you can pretty much bet they
>> will) use our CL.
>>
>>
>>
>>
>> ClassLoader tccl = Thread.currentThread().getContextClassLoader();
>>
>> try {
>>     Thread.currentThread().setContextClassLoader(getClass().getClassLoader());
>>     // ... call into the HBase/Hadoop client code here ...
>> } finally {
>>     Thread.currentThread().setContextClassLoader(tccl);
>> }
>>
>> Just put the original one back afterwards.
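Pulled together as a standalone class, the swap-and-restore pattern (and the point above about spawned threads inheriting the context class loader) looks roughly like this - the class and thread names are illustrative, not taken from the cookbook code:

```java
public class TcclSwapDemo {
    public static void main(String[] args) throws Exception {
        // Save the current thread context class loader (TCCL).
        ClassLoader tccl = Thread.currentThread().getContextClassLoader();
        ClassLoader mine = TcclSwapDemo.class.getClassLoader();
        try {
            // Swap in our class loader.
            Thread.currentThread().setContextClassLoader(mine);
            // A thread created here inherits the parent's TCCL, which is why
            // library-spawned worker threads end up using our class loader.
            Thread worker = new Thread(() ->
                System.out.println(Thread.currentThread().getContextClassLoader() == mine));
            worker.start();
            worker.join();
        } finally {
            // Always restore the original TCCL, even on exceptions.
            Thread.currentThread().setContextClassLoader(tccl);
        }
        System.out.println(Thread.currentThread().getContextClassLoader() == tccl);
    }
}
```

Both lines print "true": the worker inherited the swapped-in loader, and the original TCCL was restored on the way out.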
>>
>> On Wed, Nov 19, 2014 at 11:50 AM, Mark Webb <[email protected]> wrote:
>>
>> > I'm not exactly sure what you mean by "replacing the context class
>> > loader".  Is this done in the java code, or the blueprint file?
>> > Thanks for your help on this.
>> > On Wed, Nov 19, 2014 at 1:26 PM, Mark Webb <[email protected]>
>> wrote:
>> >> I believe you are correct.  I will add that to my pom.xml and see if it
>> >> helps.
>> >>
>> >>
>> >> On Wed, Nov 19, 2014 at 12:21 PM, <[email protected]> wrote:
>> >>
>> >>> Replacing the context class loader is probably the trick you need,
>> >>> combined with:
>> >>>
>> >>> <DynamicImport-Package>*</DynamicImport-Package>
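For reference, in a maven-bundle-plugin build that instruction would sit roughly here (a sketch only - the plugin version and your other instructions will differ):

```xml
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <!-- Lets the bundle wire packages it never imports statically,
           e.g. org.ietf.jgss.* from the JRE -->
      <DynamicImport-Package>*</DynamicImport-Package>
    </instructions>
  </configuration>
</plugin>
```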
>> >>>
>> >>> On Wed, Nov 19, 2014 at 9:59 AM, <[email protected]> wrote:
>> >>>
>> >>> > I did write a chapter on that for the Apache Karaf CookBook.
>> >>> >
>> https://www.packtpub.com/application-development/apache-karaf-cookbook
>> >>> > The example code is here -
>> >>> > https://github.com/jgoodyear/ApacheKarafCookbook
>> >>> > I just write to HDFS but I suspect you'll get a head start from
>> >>> > that stuff.  And the goal in that chapter wasn't to embed Hadoop,
>> >>> > rather to be able to externally access an existing system.
>> >>> > /je
>> >>> > On Wed, Nov 19, 2014 at 9:40 AM, Mark Webb <[email protected]>
>> >>> wrote:
>> >>> >> Has anyone had any luck getting a Camel route in ServiceMix to
>> >>> >> write to an HBase database?  I've been running into problems with
>> >>> >> Avro incompatibilities (fixed in 5.1.4) and now with LoginModule
>> >>> >> not found errors.  I have to believe that someone has gotten this
>> >>> >> all to work, but I am getting frustrated and looking for some
>> >>> >> assistance.  I've read a few blogs online stating that HBase uses
>> >>> >> classloaders that make integration with an OSGi container very
>> >>> >> difficult.  Does anyone know of a tutorial or documentation on how
>> >>> >> to get ServiceMix/HBase/Camel working together?
>> >>> >> Thank you,
>> >>> >> Mark
>> >>>
>> >>
>> >>
>>
