Hi,
I got much further now! It is now actually shipping all the logs to Elasticsearch, and I can see the logs there. I now use the following configuration (it is indeed a JSON file, not Logstash):
module(load="imfile")
module(load="mmjsonparse")
module(load="omelasticsearch")

template(name="logstash-index" type="list") {
    constant(value="logstash-")
    property(name="timereported" dateFormat="rfc3339" position.from="1" position.to="4")
    constant(value=".")
    property(name="timereported" dateFormat="rfc3339" position.from="6" position.to="7")
    constant(value=".")
    property(name="timereported" dateFormat="rfc3339" position.from="9" position.to="10")
}

input(type="imfile"
      File="/var/log/nginx/access.json"
      Tag="nginxulyaoth"
      PersistStateInterval="10000"
      StateFile="nginxulyaoth"
      Severity="info"
      MaxSubmitAtOnce="20000"
      Facility="user"
      Ruleset="nginxrule")

template(name="uly-nginx" type="list") {
    constant(value="{")
    constant(value="\"@timestamp\":\"")  property(name="timereported" dateFormat="rfc3339")
    constant(value="\",\"host\":\"")     property(name="hostname")
    constant(value="\",\"severity\":\"") property(name="syslogseverity-text")
    constant(value="\",\"facility\":\"") property(name="syslogfacility-text")
    constant(value="\",\"tag\":\"")      property(name="syslogtag" format="json")
    constant(value="\",\"message\":\"")  property(name="msg" format="json")
    constant(value="\"}")
}

ruleset(name="nginxrule") {
    action(type="mmjsonparse" name="jsonparse")
    action(type="omelasticsearch"
           server="loghost.ulyaoth.net"
           serverport="9200"
           template="uly-nginx"
           searchIndex="logstash-index"
           dynSearchIndex="on"
           errorFile="/var/log/rsyslog/ES-error.log")
}
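To cross-check what the template actually produces before it reaches Elasticsearch, one option (a sketch, not from the thread; the file path is made up) is to write the same template to a local file in the same ruleset:

```
# Hypothetical debug action: writes exactly what omelasticsearch would post,
# using the same "uly-nginx" template, to a local file for inspection.
action(type="omfile" file="/var/log/rsyslog/uly-nginx-check.log" template="uly-nginx")
```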
This is what the debug output says, so it all looks correct:
8566.967355859:7fa3b2bbe700: omelasticsearch: beginTransaction
8566.967361436:7fa3b2bbe700: Action 0x7fa3bde6cd90 transitioned to state: itx
8566.967366495:7fa3b2bbe700: entering actionCalldoAction(), state: itx
8566.968573170:7fa3b2bbe700: omelasticsearch: pData replyLen = '103'
8566.968588657:7fa3b2bbe700: omelasticsearch: pData reply: '{"_index":"logstash-index","_type":"events","_id":"OvpqDR7WT4uPosXVV2n74Q","_version":1,"created":true}'
8566.968636314:7fa3b2bbe700: omelasticsearch: error record: '{ "request": { "url": "http://10.8.153.71:9200/logstash-index/events?", "postdata": "{\"@timestamp\":\"2014-10-08T10:35:55.943964+02:00\",\"host\":\"loghost\",\"severity\":\"info\",\"facility\":\"user\",\"tag\":\"nginxulyaoth\",\"message\":\"{ \\\"@timestamp\\\": \\\"2014-10-08T10:35:55+02:00\\\", \\\"message\\\": \\\"127.0.0.1 - admin [08/Oct/2014:10:38:47 +0200] \\\\\\\"GET /__status HTTP/1.1\\\\\\\" 200 1443 \\\\\\\"-\\\\\\\" \\\\\\\"Go 1.1 package http\\\\\\\"\\\", \\\"tags\\\": [\\\"nginx_access\\\"], \\\"realip\\\": \\\"\\\"127.0.0.1\\\", \\\"proxyip\\\": \\\"-\\\", \\\"remote_user\\\": \\\"admin\\\", \\\"contenttype\\\": \\\"application/json\\\", \\\"bytes\\\": 1443, \\\"duration\\\": \\\"0.012\\\", \\\"status\\\": \\\"200\\\", \\\"request\\\": \\\"GET /__status HTTP/1.1\\\", \\\"method\\\": \\\"GET\\\", \\\"referrer\\\": \\\"-\\\", \\\"useragent\\\": \\\"Go 1.1 package http\\\" }\"}" }, "reply": { "_index": "logstash-index", "_type": "events", "_id": "OvpqDR7WT4uPosXVV2n74Q", "_version": 1, "created": true }}'
8566.968711815:7fa3b2bbe700: omelasticsearch: result doAction: 0 (bulkmode 0)
8566.968718959:7fa3b2bbe700: Action 0x7fa3bde6cd90 transitioned to state: rdy
It now seems to place the full JSON string as a single "message" field in Kibana:

message: { "@timestamp": "2014-10-08T10:35:55+02:00", "message": "127.0.0.1 - - [08/Oct/2014:10:35:55 +0200] \"GET /test HTTP/1.1\" 200 84 \"-\" \"curl/7.30.0\"", "tags": ["nginx_access"], "realip": ""127.0.0.1", "proxyip": "-", "remote_user": "-", "contenttype": "application/json; charset=utf-8", "bytes": 84, "duration": "0.006", "status": "200", "request": "GET /test HTTP/1.1", "method": "GET", "referrer": "-", "useragent": "curl/7.30.0" }
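That nesting is what the "uly-nginx" template produces: it serializes the raw msg property as one escaped string. Since mmjsonparse puts the parsed fields under the $! property tree, a subtree template could send them as a real JSON object instead. A sketch (the template name is made up, and bare-JSON input may need the mmjsonparse cookie parameter cleared, depending on rsyslog version):

```
# mmjsonparse expects the "@cee:" cookie by default; cookie=""
# (where supported) makes it accept bare JSON lines.
action(type="mmjsonparse" name="jsonparse" cookie="")

# Serialize the whole $! property tree as the Elasticsearch document.
template(name="uly-nginx-json" type="subtree" subtree="$!")
```

The omelasticsearch action would then reference template="uly-nginx-json" so each nginx field becomes its own document field rather than one escaped "message" string.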
-----Original Message-----
From: "Radu Gheorghe" <[email protected]>
To: "Sjir Bagmeijer" <[email protected]>;
"rsyslog-users" <[email protected]>;
Cc:
Sent: 2014-10-08 (Wed) 17:30:00
Subject: Re: [rsyslog] json files directly to ES
Hi,
I'm not sure where Logstash fits in this picture - I thought you'd get JSONs
from a file and send them to Elasticsearch.
ES version 1.1.1 should be OK, I'm not sure which rsyslog version you're on
but all recent ones should work fine.
You seem to have stuff in ES, but all the documents are empty (or at least _source is), so you either:
- have a mapping where you don't include anything in _source. Less likely, you would have probably remembered that. You can check by running: curl localhost:9200/logstash-2014.10.07/_mapping?pretty
- rsyslog sends empty logs to ES. You can confirm this by writing the same logs to a file with the same template you use for ES. If logs are indeed empty, follow David's advice of using RSYSLOG_DebugFormat to see what rsyslog sees in each variable. Maybe things are parsed incorrectly.
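[The RSYSLOG_DebugFormat check mentioned above could look like this; a sketch using the built-in debug template, with a made-up file path:]

```
# Dump every property rsyslog sees for each message, using the
# built-in RSYSLOG_DebugFormat template.
action(type="omfile" file="/var/log/rsyslog/debugformat.log"
       template="RSYSLOG_DebugFormat")
```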
Best regards,
Radu

--
Performance Monitoring * Log Analytics * Search Analytics
Solr & Elasticsearch Support * http://sematext.com/
On Wed, Oct 8, 2014 at 8:47 AM, Sjir Bagmeijer <[email protected]>
wrote:
Hello, and thank you once more for the responses. I have been testing, but it seems that whatever I try, the logs are somehow not saved in Elasticsearch, even though it says they were.
I looked at Elasticsearch, but there are no errors in its logs or any strange behavior apart from this.
The output of the command below is as follows:
[loghost ~]# curl localhost:9200/logstash-2014.10.07/_search?pretty
{
  "took" : 78,
  "timed_out" : false,
  "_shards" : {
    "total" : 5,
    "successful" : 5,
    "failed" : 0
  },
  "hits" : {
    "total" : 8232067,
    "max_score" : 1.0,
    "hits" : [ {
      "_index" : "logstash-2014.10.07",
      "_type" : "events",
      "_id" : "NWKSiq5NTvysGkVZ9OR8XA",
      "_score" : 1.0, "_source" : { }
    }, {
      "_index" : "logstash-2014.10.07",
      "_type" : "events",
      "_id" : "-5wf5CK_R5iCe1RF0zzsPg",
      "_score" : 1.0, "_source" : { }
    }, {
      "_index" : "logstash-2014.10.07",
      "_type" : "events",
      "_id" : "LB8gFrTaRGyI5YreIPBS9w",
      "_score" : 1.0, "_source" : { }
    }, {
      "_index" : "logstash-2014.10.07",
      "_type" : "events",
      "_id" : "aCck3E1GTqeanVAeHGGDsg",
      "_score" : 1.0, "_source" : { }
    }, {
      "_index" : "logstash-2014.10.07",
      "_type" : "events",
      "_id" : "uCAMM4TGRD205AEMZMWtTQ",
      "_score" : 1.0, "_source" : { }
    }, {
      "_index" : "logstash-2014.10.07",
      "_type" : "events",
      "_id" : "5JzfuFyXRiCxhhj73A249w",
      "_score" : 1.0, "_source" : { }
    }, {
      "_index" : "logstash-2014.10.07",
      "_type" : "events",
      "_id" : "zNHmX6udT5GB7z5H2qD5bw",
      "_score" : 1.0, "_source" : { }
    }, {
      "_index" : "logstash-2014.10.07",
      "_type" : "events",
      "_id" : "x_-nDe19SCCIuOKsd92CWw",
      "_score" : 1.0, "_source" : { }
    }, {
      "_index" : "logstash-2014.10.07",
      "_type" : "events",
      "_id" : "s9wg8HhnRA6XSTf2kRMc1A",
      "_score" : 1.0, "_source" : { }
    }, {
      "_index" : "logstash-2014.10.07",
      "_type" : "events",
      "_id" : "TGDNq7udT3uU5r-W6uI9TQ",
      "_score" : 1.0, "_source" : { }
    } ]
  }
}
It looks to me like the logs are somehow not saved. I tried another rsyslog configuration, shipping the logs to a TCP port and letting Logstash catch them; that works without issues:
if $programname == 'default-nginx-accesslog' then @loghost:5544
if $programname == 'default-nginx-accesslog' then ~
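[Note that in the legacy syntax above, a single "@" actually forwards over UDP; "@@" is TCP. The modern RainerScript equivalent of those two lines, as a sketch, would be:]

```
# Forward matching messages over TCP, then discard them
# (equivalent of the legacy "~" discard action).
if $programname == 'default-nginx-accesslog' then {
    action(type="omfwd" target="loghost" port="5544" protocol="tcp")
    stop
}
```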
But this way it seems I have to grok all the logs myself. Any idea where else I could look? Is there perhaps a version requirement of Elasticsearch in order to send to it directly?
I currently use (RPMs on RHEL 7):
elasticsearch-1.1.1-1
logstash-1.4.1-1
Thanks again,
Sjir Bagmeijer
_______________________________________________
rsyslog mailing list
http://lists.adiscon.net/mailman/listinfo/rsyslog
http://www.rsyslog.com/professional-services/
What's up with rsyslog? Follow https://twitter.com/rgerhards
NOTE WELL: This is a PUBLIC mailing list, posts are ARCHIVED by a myriad of
sites beyond our control. PLEASE UNSUBSCRIBE and DO NOT POST if you DON'T LIKE
THAT.