You can use wrap:mvn:groupId/artifactId/version - the wrap handler will
dynamically create the OSGi manifest for you. (There's more info in the Karaf
manual included in the binary distribution.)
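
For example (the Hadoop coordinates below are placeholders - substitute the real groupId/artifactId/version of the jar you need), from a Karaf 2.x console:

```
karaf@root> osgi:install wrap:mvn:org.apache.hadoop/hadoop-common/x.y.z
```

The wrap: prefix hands the plain jar to Pax URL, which generates a manifest on the fly so the jar can be installed like any other bundle.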

----- Reply message -----
From: "Geoffry Roberts" <[email protected]>
To: <[email protected]>
Subject: Camel and Hadoop in Karaf
Date: Thu, Oct 6, 2011 21:18
I did some further thinking on this.  

Does anyone know if the camel-hdfs feature includes all necessary dependencies, 
e.g. hadoop-common-xx.jar, or must these be added to the Karaf classpath to get 
things to work?  
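
(One way to check, assuming a Karaf 2.x console, would be to ask Karaf what the feature pulls in:

```
karaf@root> features:info camel-hdfs
```

That should list the bundles and configuration the feature installs.)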


While we're on the subject, does anyone know how to add a library that is not 
an OSGi bundle to the Karaf classpath?  I tried dropping the aforementioned 
hadoop-common-xx.jar into the lib/endorsed directory and then restarted Karaf.  
I got a ClassNotFoundException on the console, yet the class in question is 
indeed included in the jar file.  What gives?


If I remove the jar then restart the error does not occur.

log4j:ERROR Could not instantiate class [org.apache.hadoop.metrics.jvm.EventCounter].
java.lang.ClassNotFoundException: org.apache.hadoop.metrics.jvm.EventCounter not found by org.ops4j.pax.logging.pax-logging-service [3]
        at org.apache.felix.framework.ModuleImpl.findClassOrResourceByDelegation(ModuleImpl.java:787)
        at org.apache.felix.framework.ModuleImpl.access$400(ModuleImpl.java:71)
        at org.apache.felix.framework.ModuleImpl$ModuleClassLoader.loadClass(ModuleImpl.java:1768)
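
(My guess - and this is an assumption, not something I've verified - is that lib/endorsed only extends the JVM's endorsed classpath, which bundles like pax-logging-service can't see. A workaround I've seen suggested is to drop the jar into lib/ext and boot-delegate the package in etc/config.properties:

```
# etc/config.properties - append to any existing bootdelegation value
org.osgi.framework.bootdelegation = org.apache.hadoop.metrics.jvm
```

)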



On 4 October 2011 08:25, Geoffry Roberts <[email protected]> wrote:

Jean,

Yes, both camel-spring and camel-hdfs are showing as installed; both are also 
showing as active.

The aforementioned error message is all that is given.  The level is set to 
debug. The blueprint shows as active but nothing happens. 



On 3 October 2011 22:01, Jean-Baptiste Onofré <[email protected]> wrote:


Hi Geoffry,



did you install the camel-spring and camel-hdfs features in Karaf?

Could you send the log (log:display)?

Regards
JB



On 10/03/2011 10:52 PM, Geoffry Roberts wrote:


All,



I'm having a problem getting things to work with Hadoop's HDFS.  This is
my first try with it, so I'm just trying to do something simple: I want
to read a file from HDFS and write its contents to the console.

Can anyone see what I'm doing wrong?  Thanks.



Here's the error (from karaf):



org.osgi.service.blueprint.container.ComponentDefinitionException:
Unable to intialize bean camel-16

Here's my blueprint:



<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
  <camelContext xmlns="http://camel.apache.org/schema/blueprint">
    <route>
      <from uri="hdfs://qq000:54310/user/hadoop/epistate.xmi?noop=true" />
      <to uri="stream:out" />
    </route>
  </camelContext>
</blueprint>



Here's where hdfs is bound (from core-site.xml in the hadoop config):



<configuration>
  ...
  <property>
    <name>fs.default.name</name>
    <value>hdfs://qq000:54310</value>
  </property>
</configuration>



--
Geoffry Roberts

-- 
Jean-Baptiste Onofré
[email protected]
http://blog.nanthrax.net
Talend - http://www.talend.com



-- 
Geoffry Roberts