I'm not exactly sure what you mean by "replacing the context class
loader".  Is this done in the Java code, or in the blueprint file?

Thanks for your help on this.
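
Also, for the DynamicImport-Package part, I'm assuming it goes into the
maven-bundle-plugin instructions in my pom.xml, roughly like the sketch
below (trimmed to just the relevant plugin section)?

<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <DynamicImport-Package>*</DynamicImport-Package>
    </instructions>
  </configuration>
</plugin>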


On Wed, Nov 19, 2014 at 1:26 PM, Mark Webb <[email protected]> wrote:

> I believe you are correct.  I will add that to my pom.xml and see if it
> helps.
>
>
> On Wed, Nov 19, 2014 at 12:21 PM, <[email protected]> wrote:
>
>> Replacing the context class loader is probably the trick, and combining
>> that with
>>
>> <DynamicImport-Package>*</DynamicImport-Package>
>>
>> is probably what you need.
>>
>> On Wed, Nov 19, 2014 at 9:59 AM, <[email protected]> wrote:
>>
>> > I did write a chapter on that for the Apache Karaf Cookbook.
>> > https://www.packtpub.com/application-development/apache-karaf-cookbook
>> > The example code is here -
>> > https://github.com/jgoodyear/ApacheKarafCookbook
>> > I just write to HDFS, but I suspect you’ll get a head start from that
>> > stuff.
>> > And the goal in that chapter wasn’t to embed Hadoop, but rather to
>> > externally access an existing system.
>> > /je
>> > On Wed, Nov 19, 2014 at 9:40 AM, Mark Webb <[email protected]>
>> wrote:
>> >> Has anyone had any luck getting a Camel route in ServiceMix to write
>> >> to an HBase database?  I've been running into problems with Avro
>> >> incompatibilities (fixed in 5.1.4) and now with LoginModule not found
>> >> errors.  I have to believe that someone has gotten this all to work,
>> >> but I am getting frustrated and looking for some assistance.  I've
>> >> read a few blogs online that state that HBase uses classloaders that
>> >> make integration with an OSGi container very difficult.  Does anyone
>> >> know of a tutorial or documentation on how to get
>> >> ServiceMix/HBase/Camel working together?
>> >> Thank you,
>> >> Mark
>>
>
>
