Re: [SLUG] Open source log analyser tools (or: Alternatives to Splunk)
On 9 May 2012 15:20, Mark Walkom markwal...@gmail.com wrote:
> We are looking at Splunk for syslog analysis to close a hole in our application visibility, but it's expensive. I've looked at alternatives like logstash and graylog2, but I wanted to see if anyone had experiences they'd be willing to share on either Splunk or others. This was raised a couple of years ago, but I figure the scene has changed a fair bit since then!

Splunk isn't expensive, and it's bloody amazing: US$6,000 for a perpetual licence at 500 megs a day. How much log data are you generating that this seems expensive? I doubt you'll find anything that comes close.

--
Simon Rumble
si...@rumble.net

--
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html
Re: [SLUG] Open source log analyser tools (or: Alternatives to Splunk)
We're looking at 2G a day, which is AU$30K a year, and we're on the S end of SME, so it's a hell of a lot. The only way we could cut this volume down would be to write a custom parser that reads the application logs, strips out all the replicated crap (mostly environment-variable stuff), and spits the cleaned logs out to a separate dir for Splunk to read. But then we'd need to deal with the extra storage requirements. Again, when you are a small operation with a small budget, money rules.
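The "custom parser" idea above could be as simple as a filter that drops the replicated environment-variable lines before Splunk ever sees the file, shrinking the licensed daily volume. A minimal sketch, assuming the noise lines look like `NAME=value` dumps (the pattern and sample lines are invented for illustration, not taken from the thread):

```python
#!/usr/bin/env python
"""Copy application logs to a separate dir for Splunk to index,
dropping replicated environment-variable dumps along the way."""
import re

# Lines that look like environment-variable dumps, e.g. "PATH=/usr/bin"
# (hypothetical pattern; adjust to whatever the app actually repeats).
ENV_LINE = re.compile(r"^[A-Z_][A-Z0-9_]*=")

def strip_env_noise(lines):
    """Yield only the lines worth paying to index."""
    for line in lines:
        if not ENV_LINE.match(line):
            yield line

if __name__ == "__main__":
    sample = [
        "2012-05-09 17:04:01 INFO transfer started\n",
        "PATH=/usr/local/bin:/usr/bin\n",
        "LANG=en_AU.UTF-8\n",
        "2012-05-09 17:04:05 INFO transfer complete\n",
    ]
    for line in strip_env_noise(sample):
        print(line, end="")
```

The trade-off is exactly the one noted above: the cleaned copies need their own storage, so the saving on licensing has to beat the extra disk.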
Re: [SLUG] Open source log analyser tools (or: Alternatives to Splunk)
Hi,

It really depends on what your needs are. I don't know much about the fancy things Splunk does, but you can do some cool things with logstash. There is also a nice UI for logstash, Kibana (https://github.com/rashidkpc/Kibana).

On 9 May 2012 17:04, Mark Walkom markwal...@gmail.com wrote:
> We're looking at 2G a day, which is AUS$30K a year. And we're on the S end of SME so it's a hell of a lot. [...]
Re: [SLUG] Open source log analyser tools (or: Alternatives to Splunk)
On 9 May 2012 17:13, Daniel Solsona dsols...@gmail.com wrote:
> Hi,
> It really depends on what your needs are.

We want to track a file as it hits our comms server (via FTP/HTTP), transfers to our app server, gets processed, and is then generated back out to the comms server and picked up by the recipient. That will use the following logs:

1. FTP/HTTP daemon
2. FTP process
3. Application processing (x2, in then out)
   1. Here we will probably have to link the in and out processes using some database queries
4. FTP process
5. FTP/HTTP daemon

It's convoluted, but our architecture restricts what we can do, and I don't know if we can make enough changes in the time I have to make it easier. I can probably create the parser I mentioned earlier.
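Before reaching for a full log-analysis stack, the five-log trail above can be prototyped as a simple correlation script: collect every line from each source that mentions the file, in pipeline order. A rough sketch, assuming the file name appears verbatim in each log (the log lines and source names here are invented; the database lookup to link the app's "in" and "out" jobs is left out):

```python
"""Trace one file's journey across several log sources."""

def trace_file(filename, sources):
    """Return (source, line) pairs mentioning the file, in source order.

    sources is a list of (source_name, iterable_of_lines) pairs,
    ordered the way the file moves through the pipeline.
    """
    trail = []
    for source_name, lines in sources:
        for line in lines:
            if filename in line:
                trail.append((source_name, line.strip()))
    return trail

if __name__ == "__main__":
    # Hypothetical excerpts from the logs listed above.
    sources = [
        ("ftp-daemon", ["09:00 RETR invoice.csv from client\n"]),
        ("app-in",     ["09:01 processing invoice.csv job=42\n"]),
        ("app-out",    ["09:02 generated invoice.csv job=42\n"]),
        ("ftp-daemon", ["09:03 STOR invoice.csv to recipient\n"]),
    ]
    for source, line in trace_file("invoice.csv", sources):
        print(source, "->", line)
```

A substring match like this falls over as soon as files get renamed between stages, which is where the job-ID database queries mentioned above would have to come in.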
Re: [SLUG] Open source log analyser tools (or: Alternatives to Splunk)
With logstash you have an agent on each server, where you configure inputs, filters and outputs.

Inputs: file, syslog, etc.
Filters: grep, regexp. Here you can do magic.
Outputs: file, elasticsearch, redis, amqp, etc. (lots of possibilities)

Check the docs to see if there is any filter/output that works for you. I reckon you can do what you need with logstash, but you could probably do the same with some scripts.

On May 9, 2012 5:26 PM, Mark Walkom markwal...@gmail.com wrote:
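The input/filter/output wiring described above looks roughly like this in a logstash agent config. This is only a sketch: the path, type tag and grok pattern are placeholders, and plugin options vary between logstash releases, so check the docs for your version.

```
# Minimal logstash agent config (placeholder values throughout).
input {
  file {
    path => "/var/log/app/*.log"   # which logs to tail
    type => "app"
  }
}
filter {
  grok {
    type => "app"
    # Split each line into a timestamp and the rest of the message.
    pattern => "%{SYSLOGTIMESTAMP:ts} %{GREEDYDATA:msg}"
  }
}
output {
  elasticsearch { }                # index events for searching
}
```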
Re: [SLUG] Open source log analyser tools (or: Alternatives to Splunk)
On 9 May 2012 19:23, Daniel Solsona dsols...@gmail.com wrote:
> With logstash you have an agent on each server, where you configure inputs, filters and outputs. Inputs: file, syslog, etc. Filters: grep, regexp. Here you can do magic. Outputs: file, elasticsearch, redis, amqp, etc. (lots of possibilities) Check the docs to see if there is any filter/output that works for you. I reckon you can do what you need with logstash, but you could probably do the same with some scripts.

We want a nice front end for the helpdesk, plus we'd like to potentially roll it out to other aspects of the business. BI is something that would be a good value-add to sell as well :)
[SLUG] Open source log analyser tools (or: Alternatives to Splunk)
We are looking at Splunk for syslog analysis to close a hole in our application visibility, but it's expensive. I've looked at alternatives like logstash and graylog2, but I wanted to see if anyone had experiences they'd be willing to share on either Splunk or others. This was raised a couple of years ago, but I figure the scene has changed a fair bit since then!

Cheers,
Mark