Hi JB, I like the idea. Thanks for the explanation of the objectives;
for me it's +1.

Regards
--Filippo

2014-10-15 8:08 GMT+02:00 Jean-Baptiste Onofré <[email protected]>:

> For Sirona, the scope is the same, but the implementation/view is
> different (I'm on the Sirona PPMC ;)). However, I see Decanter being able
> to send to/interact with Sirona.
>
> As explained in the proposal, I don't want to require "external"
> middlewares for monitoring (for now, Sirona runs in Tomcat for instance).
>
> Kibana is available as a feature, but it's optional: if users want a
> ready-to-use solution, they can install the decanter-collector-*,
> decanter-simple-scheduler, decanter-appender-elasticsearch, elasticsearch,
> and kibana features. But if they don't want to use Kibana or
> Elasticsearch, they can use an alternative appender
> (decanter-appender-jdbc, decanter-appender-zabbix,
> decanter-appender-nagios, or decanter-appender-sirona, for instance).
>
> For Codahale, good idea. Let me take a look at a collector for that.
>
> Regards
> JB
>
>
> On 10/15/2014 07:53 AM, Łukasz Dywicki wrote:
>>
>> I think there is a project which might have a similar scope: Sirona.
>> I like the general idea, but I don't like the idea of embedding Kibana.
>> Forcing usage of any particular tool is just wrong. It also makes sense
>> to start supporting Codahale metrics from the very beginning, as this
>> library is getting more and more popular.
>>
>> +1 from me
>>
>> Best regards,
>> Lukasz
>>
>> 2014-10-15 5:17 GMT+02:00 Andreas Pieber <[email protected]>:
>>
>>> Hey,
>>>
>>> The collection definitely sounds like a perfect idea for a Karaf
>>> sub-project to me.
>>> Besides the great potential of the components, I especially like the
>>> fitting name 😊 +1
>>>
>>> Kind regards,
>>> Andreas
>>>
>>> On Oct 14, 2014 5:13 PM, "Jean-Baptiste Onofré" <[email protected]> wrote:
>>>
>>>> Hi all,
>>>>
>>>> First of all, sorry for this long e-mail ;)
>>>>
>>>> Some weeks ago, I blogged about using ELK
>>>> (Logstash/Elasticsearch/Kibana) with Karaf, Camel, ActiveMQ, etc. to
>>>> provide a monitoring dashboard (to know what's happening in Karaf and
>>>> to be able to store it for a long period):
>>>>
>>>> http://blog.nanthrax.net/2014/03/apache-karaf-cellar-camel-activemq-monitoring-with-elk-elasticsearch-logstash-and-kibana/
>>>>
>>>> While this solution works fine, it has some drawbacks:
>>>> - it requires additional middleware on the machines: in addition to
>>>> Karaf itself, we have to install Logstash, Elasticsearch nodes, and the
>>>> Kibana console
>>>> - it's not usable "out of the box": you need at least to configure
>>>> Logstash (with the different input/output plugins) and Kibana (to
>>>> create the dashboards that you need)
>>>> - it doesn't cover all the monitoring needs, especially in terms of
>>>> SLA: we want to be able to raise alerts depending on events (for
>>>> instance, when a regex is matched in the log messages, when a feature
>>>> is uninstalled, when a JMX metric is greater than a given value, etc.)
>>>>
>>>> Actually, Karaf (and the related projects) already provides most (if
>>>> not all) of the data required for monitoring. However, it would be very
>>>> helpful to have some "glue", ready to use and more user friendly,
>>>> including storage of the metrics/monitoring data.
>>>>
>>>> With this in mind, I started a prototype of a monitoring solution for
>>>> Karaf and the applications running in Karaf.
>>>> The purpose is to be very extensible, flexible, and easy to install
>>>> and use.
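The SLA checks described in the proposal (a regex matched in the log messages, a JMX metric above a given value) boil down to simple predicates over the collected data. A minimal sketch in plain Java follows; the class and method names are illustrative only, not the actual Decanter API:

```java
import java.util.regex.Pattern;

// Illustrative sketch of the SLA checks mentioned in the proposal;
// these names are NOT the actual Decanter API.
public class SlaSketch {

    // true when a harvested JMX metric exceeds the configured threshold
    static boolean metricBreached(double value, double threshold) {
        return value > threshold;
    }

    // true when a log message matches the configured alerting regex
    static boolean logMatched(String message, Pattern alertPattern) {
        return alertPattern.matcher(message).find();
    }

    public static void main(String[] args) {
        System.out.println(metricBreached(95.0, 90.0));        // true: raise an alert
        System.out.println(logMatched("feature foo uninstalled",
                Pattern.compile("uninstalled")));              // true: raise an alert
    }
}
```

In the proposed design, checks like these would live next to the collectors, since the SLA policies are tied to the data each collector harvests.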
>>>>
>>>> In terms of architecture, we can find the following components:
>>>>
>>>> 1/ Collectors & SLA Policies
>>>> The collectors are services responsible for harvesting monitoring
>>>> data. We have two kinds of collectors:
>>>> - the polling collectors are invoked periodically by a scheduler
>>>> - the event-driven collectors react to events
>>>> Two collectors are already available:
>>>> - the JMX collector is a polling collector which harvests all MBean
>>>> attributes
>>>> - the Log collector is an event-driven collector, implementing a
>>>> PaxAppender, which reacts when a log message occurs
>>>> We can plan the following collectors:
>>>> - a Camel Tracer collector would be an event-driven collector, acting
>>>> as a Camel interceptor. It would allow tracing any Exchange in Camel.
>>>>
>>>> It's very dynamic (thanks to OSGi services), so it's possible to add
>>>> a new custom collector (a user/custom implementation).
>>>>
>>>> The collectors are also responsible for checking the SLA. As the SLA
>>>> policies are tied to the collected data, it makes sense that the
>>>> collector validates the SLA and calls/delegates the alert to the SLA
>>>> services.
>>>>
>>>> 2/ Scheduler
>>>> The scheduler service is responsible for calling the polling
>>>> collectors, gathering the harvested data, and delegating it to the
>>>> dispatcher.
>>>> We already have a simple scheduler (just a thread), but we can plan a
>>>> Quartz scheduler (for advanced cron/trigger configuration) and another
>>>> one leveraging the Karaf scheduler.
>>>>
>>>> 3/ Dispatcher
>>>> The dispatcher is called by the scheduler or by the event-driven
>>>> collectors to dispatch the collected data to the appenders.
>>>>
>>>> 4/ Appenders
>>>> The appender services are responsible for sending/storing the
>>>> collected data to target systems.
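The JMX polling collector described in 1/ essentially walks the platform MBean server and reads attribute values on each scheduler tick. A minimal standalone sketch using only the standard javax.management API (this is not Decanter's actual code, which harvests all MBeans; a single well-known MBean is shown here):

```java
import java.lang.management.ManagementFactory;
import javax.management.MBeanAttributeInfo;
import javax.management.MBeanInfo;
import javax.management.MBeanServer;
import javax.management.ObjectName;

// Sketch of what a JMX polling collector does per tick: read MBean
// attributes from the platform MBean server. Illustrative only.
public class JmxCollectorSketch {
    public static void main(String[] args) throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        ObjectName memory = new ObjectName("java.lang:type=Memory");
        MBeanInfo info = server.getMBeanInfo(memory);
        // harvest every readable attribute of the Memory MBean
        for (MBeanAttributeInfo attr : info.getAttributes()) {
            if (attr.isReadable()) {
                System.out.println(attr.getName() + " = "
                        + server.getAttribute(memory, attr.getName()));
            }
        }
    }
}
```

The real collector would hand the harvested attribute map to the scheduler, which passes it on to the dispatcher.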
>>>> For now, we have two appenders:
>>>> - a log appender which just logs the collected data
>>>> - an elasticsearch appender which sends the collected data to an
>>>> Elasticsearch instance. For now, it uses an "external" Elasticsearch,
>>>> but I'm working on an elasticsearch feature allowing Elasticsearch to
>>>> be embedded in Karaf (it's mostly done).
>>>> We can plan the following other appenders:
>>>> - redis, to send the collected data to the Redis messaging system
>>>> - jdbc, to store the collected data in a database
>>>> - jms, to send the collected data to a JMS broker (like ActiveMQ)
>>>> - camel, to send the collected data to a Camel direct-vm/vm endpoint
>>>> of a route (it would create an internal route)
>>>>
>>>> 5/ Console/Kibana
>>>> The console is composed of two parts:
>>>> - an AngularJS or Bootstrap layer allowing configuration of the SLA
>>>> and global settings
>>>> - an embedded Kibana instance with a pre-configured dashboard (when
>>>> the elasticsearch appender is used). We will have a set of already
>>>> created Lucene queries and a kind of "Karaf/Camel/ActiveMQ/CXF"
>>>> dashboard template. The Kibana instance will be embedded in Karaf (not
>>>> external).
>>>>
>>>> Of course, we have ready-to-use features, allowing very easy
>>>> installation of the modules that we want.
>>>>
>>>> I named the prototype Karaf Decanter. I don't have a preference about
>>>> the name or the location of the code (it could be a Karaf subproject
>>>> like Cellar or Cave, or directly in the Karaf codebase).
>>>>
>>>> Thoughts?
>>>>
>>>> Regards
>>>> JB
>>>> --
>>>> Jean-Baptiste Onofré
>>>> [email protected]
>>>> http://blog.nanthrax.net
>>>> Talend - http://www.talend.com
>>>
>>
>
> --
> Jean-Baptiste Onofré
> [email protected]
> http://blog.nanthrax.net
> Talend - http://www.talend.com
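Putting the pieces of the proposal together, the collector → scheduler → dispatcher → appender pipeline can be sketched in plain Java. All names here are illustrative; the real prototype wires these parts up as dynamic OSGi services rather than direct references:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// Rough sketch of the proposed Decanter pipeline; names are illustrative.
public class PipelineSketch {

    // a polling collector harvests a map of monitoring data
    interface PollingCollector {
        Map<String, Object> collect();
    }

    // the dispatcher fans collected data out to every registered appender
    static class Dispatcher {
        private final List<Consumer<Map<String, Object>>> appenders = new ArrayList<>();
        void register(Consumer<Map<String, Object>> appender) { appenders.add(appender); }
        void dispatch(Map<String, Object> data) { appenders.forEach(a -> a.accept(data)); }
    }

    public static void main(String[] args) {
        PollingCollector collector = () -> {
            Map<String, Object> data = new LinkedHashMap<>();
            data.put("heap.used", 123456L);   // stand-in for a harvested JMX value
            return data;
        };

        Dispatcher dispatcher = new Dispatcher();
        // a "log appender" that just prints, like the existing log appender
        dispatcher.register(data -> System.out.println("log appender: " + data));

        // the simple scheduler would invoke this periodically; one tick shown
        dispatcher.dispatch(collector.collect());
    }
}
```

Because appenders are just consumers of the collected data, adding an elasticsearch, jdbc, or jms appender means registering one more service, without touching the collectors or the scheduler.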
