Hi,
Our architecture team wants to run Hadoop/HBase and our MapReduce jobs in an
OSGi container, to take advantage of the OSGi framework for a
pluggable architecture.
I have searched the net and it looks like people are working on this or have
already had some success. Can someone please share pointers or their experience?
Hi Ninad,
I don't know if anyone has looked at this for Hadoop Core or HBase
(although there is this Jira:
https://issues.apache.org/jira/browse/HADOOP-4604), but there is some
work on making ZooKeeper's jar OSGi-compliant at
https://issues.apache.org/jira/browse/ZOOKEEPER-425.
Cheers,
Tom
To: "Neelesh Salgaonkar", "rakhi khatwani"
Sent: Thursday, June 11, 2009 11:31:24 AM GMT -08:00 US/Canada Pacific
Subject: Re: Running Hadoop/Hbase in a OSGi container
Ninad Raut wrote:
OSGi provides navigability to your components and creates a life cycle for
each of those components, viz. install, start, stop, undeploy, etc.
This is the reason why we are thinking of creating components using OSGi.
The problem we are facing is our components using mapreduce and
Steve,
Thanks for the reply. I will surely have a look at it. I made a typo
with "navigability"; I meant pluggability.
Ninad.
On Fri, Jun 12, 2009 at 3:56 PM, Steve Loughran wrote:
> Ninad Raut wrote:
>
>> OSGi provides navigability to your components and create a life cycle for
>> each of those
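The component life cycle discussed upthread (install, start, stop, undeploy) is what makes OSGi pluggable: the container drives each component through those states. As a rough illustration, here is a minimal, framework-free sketch of that state machine in plain Java. The class and method names are hypothetical, and this is not the real OSGi API (a real bundle would implement `org.osgi.framework.BundleActivator`); it only pictures the transitions a container enforces.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of an OSGi-style component life cycle.
// Not the real OSGi API -- just the install/start/stop/uninstall
// state transitions a container would enforce for each component.
public class LifecycleSketch {
    enum State { INSTALLED, ACTIVE, RESOLVED, UNINSTALLED }

    static class Component {
        final String name;
        State state = State.INSTALLED;          // install = construction here
        final List<String> log = new ArrayList<>();

        Component(String name) { this.name = name; }

        void start() {
            // Only an installed or resolved component may be started.
            if (state == State.INSTALLED || state == State.RESOLVED) {
                state = State.ACTIVE;
                log.add(name + ":start");
            }
        }

        void stop() {
            // Only an active component may be stopped.
            if (state == State.ACTIVE) {
                state = State.RESOLVED;
                log.add(name + ":stop");
            }
        }

        void uninstall() {
            stop();                              // stop first if still active
            state = State.UNINSTALLED;
            log.add(name + ":uninstall");
        }
    }

    public static void main(String[] args) {
        Component c = new Component("mapreduce-job");
        c.start();
        c.stop();
        c.uninstall();
        System.out.println(c.state + " " + c.log);
    }
}
```

In a real OSGi container the framework calls the equivalents of these methods for you, which is what would let a MapReduce component be swapped in or out without restarting the rest of the system.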