Hi,
The way forward for log monitoring is the Log Analytics solution that
we are working on. The old log publishing method is broken and cannot be
used with the latest Carbon release products, because it is tightly
coupled with Cassandra, Hadoop, etc., and hence we can't use it.
Since the Log Analytics solution will take some time to arrive, can we
release the log publishing part of the Log Analytics solution ASAP so
that others can publish logs to DAS?
Suho
On Wed, Dec 2, 2015 at 12:08 PM, Malith Dhanushka wrote:
Yes. The log analyzer, which is being written on top of the DAS platform,
will be based on the logstash HTTP publisher.
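To make that concrete, a minimal logstash pipeline along those lines could look like the sketch below. This is an assumption-heavy illustration: the log path, receiver URL, and port are placeholders I made up, not the actual Log Analyzer configuration.

```
# Hypothetical sketch only: path, URL, and port are placeholders.
input {
  file {
    path => "/opt/wso2/repository/logs/wso2carbon.log"
    start_position => "beginning"
  }
}
output {
  # logstash's stock http output posts each event to a REST endpoint
  http {
    url => "http://das.example.com:9763/endpoints/logReceiver"
    http_method => "post"
    format => "json"
  }
}
```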
Thanks
On Wed, Dec 2, 2015 at 11:58 AM, Sinthuja Ragendran
wrote:
Hi Suho/Anjana,
I noticed that we are working on a feature called Log Analyzer. Is this for
centralized logging?
If not, what's the approach we are taking for $subject with DAS?
Thanks
On Wed, Dec 2, 2015 at 11:16 AM, Anuruddha Liyanarachchi <
anurudd...@wso2.com> wrote:
On Wed, Dec 2, 2015 at 12:10 PM, Sriskandarajah Suhothayan
wrote:
Just publishing the logs to DAS is
Hi DAS team,
The current log publishing is broken.
What's the recommended log publishing approach going forward?
Suho
On Wed, Dec 2, 2015 at 11:27 AM, Imesh Gunaratne wrote:
Hi Suho,
In the log analysis solution we are using an HTTP publisher (OOB from
logstash) to publish data to a REST endpoint. Rather than coupling the
publishing agent to a log4j appender, this would be a much cleaner way;
there are a couple of other options too, fluentd for example.
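As a rough illustration of that decoupling, a standalone publisher could wrap each raw log line in a JSON envelope and POST it to a receiver over HTTP, independent of any log4j appender. This is only a sketch under assumptions: the endpoint URL and the field names (`host`, `appName`, `message`) are hypothetical, not the actual DAS log event schema.

```python
import json
import urllib.request
from datetime import datetime, timezone

# Hypothetical receiver endpoint; the real URL and payload schema
# would come from the Log Analytics solution, not this sketch.
RECEIVER_URL = "http://localhost:9763/endpoints/logReceiver"

def build_log_event(line, host="node-1", app="carbon"):
    """Wrap one raw log line in a JSON envelope for HTTP publishing."""
    return json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "host": host,
        "appName": app,
        "message": line.rstrip("\n"),
    })

def publish(line):
    """POST the wrapped log line to the receiver, decoupled from log4j."""
    req = urllib.request.Request(
        RECEIVER_URL,
        data=build_log_event(line).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # fire-and-forget for the sketch
```

The point of the design is that the server just writes plain log files; shipping them becomes a separate concern that can be swapped between logstash, fluentd, or a small agent like this.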
If users need to
Hi,
I am trying to publish Carbon logs to DAS and I am facing the following
problems.
*In Carbon 4.2.0 products (APIM 1.9.1):*
Stream definitions are created for each day [1]; therefore I can't use a
common event receiver to persist data.
*In Carbon 4.4.0 products (ESB 4.9.0):*
Throws class not