https://cwiki.apache.org/confluence/display/AMBARI/Stacks+and+Services
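The Stacks and Services page above walks through the layout: a stack directory with a metainfo.xml that declares the service and its components, plus Python command scripts that Ambari invokes for install/start/stop/status. As a rough illustration only, a command script for a hypothetical DRUID historical component might look like the sketch below; the service name, the /opt/druid install location, the log path, and the exact Druid launch command are assumptions for your own setup, not something defined by Ambari.

# Hypothetical command script, e.g.
# stacks/<YOUR_STACK>/<VERSION>/services/DRUID/package/scripts/historical.py
from resource_management import Script, Execute


class DruidHistorical(Script):
    def install(self, env):
        # Installs any OS packages listed in the service's metainfo.xml.
        self.install_packages(env)

    def start(self, env):
        # Assumes the Druid tarball is already unpacked under /opt/druid and
        # that this classpath/command matches your Druid version.
        Execute("cd /opt/druid && nohup java -cp 'config/_common:config/historical:lib/*' "
                "io.druid.cli.Main server historical > /var/log/druid/historical.log 2>&1 &")

    def stop(self, env):
        # pkill returns non-zero when nothing matches, so guard with || true.
        Execute("pkill -f 'io.druid.cli.Main server historical' || true")

    def status(self, env):
        # Ambari polls this; raise ComponentIsNotRunning when the process is gone.
        import subprocess
        from resource_management.core.exceptions import ComponentIsNotRunning
        if subprocess.call(["pgrep", "-f", "io.druid.cli.Main server historical"]) != 0:
            raise ComponentIsNotRunning()


if __name__ == "__main__":
    DruidHistorical().execute()

You register each component and point to its script from the service's metainfo.xml; the wiki page above covers how the stack directory is laid out.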


On Wed, Jun 4, 2014 at 10:35 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepuj...@gmail.com> wrote:

> Custom stack? Is this a new feature in 1.6.0?
> Is this exposed from the Web UI? Or is this only a REST interface?
>
>
> On Wed, Jun 4, 2014 at 10:00 PM, Pradeep Gollakota <pradeep...@gmail.com>
> wrote:
>
>> Ambari has a concept of custom stacks, so you can write a custom stack
>> to deploy Druid. At installation time, you can choose to install your Druid
>> stack but not the Hadoop stack.
>>
>>
>> On Wed, Jun 4, 2014 at 9:21 AM, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepuj...@gmail.com>
>> wrote:
>>
>>> Hello
>>> I really like the way Ambari can be used to set up a cluster effortlessly.
>>> I would like to use these benefits:
>>>
>>> 1. Druid is a distributed columnar store (druid.io)
>>> 2. Druid has several node types, such as historical, realtime, broker, and
>>> coordinator.
>>> 3. Each of these can be started by unpacking a tar and starting the Java
>>> class with Java 1.7. It's that simple.
>>> 4. Now I would like to use Ambari to install Druid on these various
>>> nodes, just like we install Hadoop on various nodes and then assign various
>>> services to each of them.
>>>
>>> How easy or difficult is it to remove hadoop* from Ambari and replace it
>>> with druid* or any other installation?
>>>
>>> What do you think? Can Ambari be made that generic?
>>>
>>> 1. Ambari allows you to install Hadoop on a bunch of nodes and provides
>>> monitoring (service, CPU, disk, I/O).
>>> 2. Can Ambari be made Hadoop-agnostic, or generic enough that any
>>> cluster could be installed with it?
>>>  --
>>> Deepak
>>>
>>>
>>
>
>
> --
> Deepak
>
>