Hi Jose!
The script worked beautifully! haha
Thank you very much!
Off topic: I'm thinking of improving the rules for some Windows security
events, so that OSSEC generates alerts for, for example, the Windows logon
types. I don't know if there is already a topic or ongoing work on this.
I would then publish them on GitHub. I would like to contribute, if possible.
On Thursday, September 29, 2016 at 09:53:05 UTC-3, jose wrote:
>
> Hi Roberto,
>
> About your ossecall index, you wrote this in your mail:
>
> But in the file "template => /etc/logstash/elastic-ossec-template2.json" I
> modified lines 3 and 8.
> Line 3: from "template": "ossec *" to "template": "ossecall *"
> Line 8: from "ossec": to "ossecall":
>
> Do you have a space between "ossec"/"ossecall" and the wildcard? If you do,
> you should not. And regarding the curl procedure:
>
> $ cd ~/ossec_tmp/ossec-wazuh/extensions/ElasticSearch/ && curl -XPUT "http://localhost:9200/_template/ossec/" -d "@elastic-ossec-template.json"
>
> You need to apply the templates for both indices.
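>
> For example, applying both templates could look like this (just a sketch: the
> template name "ossecall" and the path of your second template file,
> /etc/logstash/elastic-ossec-template2.json, are assumptions based on what you
> described):
>
> cd ~/ossec_tmp/ossec-wazuh/extensions/ElasticSearch/
> # Template for the alerts index (ossec-*)
> curl -XPUT "http://localhost:9200/_template/ossec/" -d "@elastic-ossec-template.json"
> # Template for the archives index (ossecall-*), using your modified copy (assumed path)
> curl -XPUT "http://localhost:9200/_template/ossecall/" -d "@/etc/logstash/elastic-ossec-template2.json"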
>
> For your last question: in this mail you have a bash script to reindex the
> indices. Please use it carefully, and check with curl
> 'localhost:9200/_cat/indices?v' after every step that the script is working
> correctly.
>
> This script has 4 steps:
>
> 1. We move every index that does not have the mapping applied to a backup
> index; we do that with the reindex option so that the new template is applied
> (see the example call right after this list).
> 2. After the reindex has finished, we can delete the old index.
> 3. Now we can move the backup index back to the original name.
> 4. When step 3 has finished, we can delete the backup index.
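>
> Just to illustrate what step 1 does under the hood (this is not part of the
> script; the index name and host below are only examples), the reindex is a
> single call to the Elasticsearch _reindex API:
>
> # Copy all documents from the original daily index into a backup ("-b") index,
> # which gets created with the new template/mapping applied.
> curl -XPOST "localhost:9200/_reindex" -d '{
>   "source": { "index": "ossec-2016.09.28" },
>   "dest":   { "index": "ossec-2016.09.28-b" }
> }'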
>
> Please take a look at lines 72, 73 and 76, 77 in order to change the index
> name from ossec-$index_elastic_name to ossecall-$index_elastic_name (for both
> src_index and dst_index), because you will probably need to run this script
> for each of your two indices.
>
> This is one of a few utilities that Wazuh will release soon.
>
> #!/bin/bash
>
> # Copyright (C) 2015-2016 Wazuh, Inc. All rights reserved.
> # Wazuh.com
> #
> # This program is a free software; you can redistribute it
> # and/or modify it under the terms of the GNU General Public
> # License (version 2) as published by the FSF - Free Software
> # Foundation.
>
> # Elasticsearch Reindexing
> # Requires:
> # Elasticsearch 2.3 or later
>
> if [ $# -ne 4 ]
> then
>     echo "Usage: ./wazuh_elastic_reindex_index.sh date_from date_to elasticsearch_ip step"
>     echo -e "\tDate format: YYYY-MM-DD"
>     echo -e "\tStep: 1|2|3|4"
>     echo -e "\tExample: ./wazuh_elastic_reindex_index.sh 2016-08-26 2016-09-01 10.0.0.20 1"
>     echo -e "\tNote: Each step takes time to perform the required actions. Review: tail -f /var/log/elasticsearch/ossec.log"
>     exit 0
> fi
>
> ## Arguments
> FROM=$1
> TO=$2
> ELASTIC_IP=$3
> STEP=$4
>
> ## Main
> startdate=$(date -d $FROM +"%Y%m%d")
> enddate=$(date -d $TO +"%Y%m%d")
>
> if [ $startdate -ge $enddate ];
> then
>     echo "date_from $startdate is not earlier than date_to $enddate, please review these arguments";
>     exit 1
> fi
>
> startdate=$(date -I -d "$FROM") || exit 1
> enddate=$(date -I -d "$TO") || exit 1
>
> echo -e "\n### Start reindexing [STEP $STEP], from $startdate to $enddate. Are you sure? Please confirm with YES/NO."
> read ADDRANSWER
>
> exist_index () {
>     request="$ELASTIC_IP:9200/$1"
>     exist=`curl -s -XHEAD -i $request | head -n 1 | cut -d' ' -f2`
> }
>
> reindex () {
>     request="$ELASTIC_IP:9200/_reindex"
>     request_body='{ "source": { "index": "'"$1"'" }, "dest": { "index": "'"$2"'" }}'
>     curl_result=`curl -s -XPOST $request -d "$request_body"`
>     echo $curl_result
> }
>
> delete_index () {
>     request="$ELASTIC_IP:9200/$1"
>     curl_result=`curl -s -XDELETE $request`
>     echo $curl_result
> }
>
> if [ "$ADDRANSWER" == 'YES' ]
> then
>     d="$startdate"
>     while [ "$d" != "$enddate" ]; do
>         index_elastic_name=`echo $d | sed 's/-/\./g'`
>
>         if [ $STEP == '1' ] || [ $STEP == '2' ]; then
>             src_index="ossec-$index_elastic_name"
>             dst_index="ossec-$index_elastic_name-b"
>             exist_index $src_index
>         elif [ $STEP == '3' ] || [ $STEP == '4' ]; then
>             src_index="ossec-$index_elastic_name-b"
>             dst_index="ossec-$index_elastic_name"
>             exist_index $src_index
>         else
>             echo "Bad argument: step: $STEP"
>             exit 1
>         fi
>
>         if [ "$exist" != '404' ]; then
>             if [ $STEP == '1' ]; then
>                 echo "### 1. Reindexing: $src_index -> $dst_index"
>                 reindex $src_index $dst_index
>             elif [ $STEP == '2' ]; then
>                 echo "### 2. Deleting old index: $src_index"
>                 delete_index $src_index
>             elif [ $STEP == '3' ]; then
>                 echo "### 3. Reindexing: $src_index -> $dst_index"
>                 reindex $src_index $dst_index
>             elif [ $STEP == '4' ]; then
>                 echo "### 4. Deleting intermediate index: $src_index"
>                 delete_index $src_index
>             fi
>         else
>             echo "### Index $src_index does not exist. Skipping."
>         fi
>
>         # Update date.
>         d=$(date -I -d "$d + 1 day")
>     done
>
>     echo -e "\nPlease check 'curl -XGET ${ELASTIC_IP}:9200/_cat/indices' to re-check the indices"
>     echo "Reindexing ended [STEP $STEP]."
> else
>     echo "Exiting: the run was not confirmed with YES."
> fi
>
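> For example, a possible run for one of your indices could look like this (the
> dates and IP are just placeholders; check the indices between steps):
>
> ./wazuh_elastic_reindex_index.sh 2016-08-26 2016-09-01 10.0.0.20 1   # reindex to the -b backup indices
> curl 'localhost:9200/_cat/indices?v'                                 # verify the -b indices exist
> ./wazuh_elastic_reindex_index.sh 2016-08-26 2016-09-01 10.0.0.20 2   # delete the old indices
> ./wazuh_elastic_reindex_index.sh 2016-08-26 2016-09-01 10.0.0.20 3   # reindex back to the original names
> ./wazuh_elastic_reindex_index.sh 2016-08-26 2016-09-01 10.0.0.20 4   # delete the -b indices
>
> Then repeat the same four steps after changing the index prefix in the script
> to ossecall- for your second index.
>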
> I hope this helps.
>
> Regards
> -----------------------
> Jose Luis Ruiz
> Wazuh Inc.
> [email protected] <javascript:>
>
> On September 29, 2016 at 7:25:09 AM, [email protected]
> ([email protected]) wrote:
>
> Hi Jose, thanks for the reply!
>
> Indeed, today the index is using the template. But only the ossec index; the
> ossecall index did not work, and its fields still appear as "Analyzed Field".
>
> I did not run this procedure:
> $ cd ~/ossec_tmp/ossec-wazuh/extensions/ElasticSearch/ && curl -XPUT "http://localhost:9200/_template/ossec/" -d "@elastic-ossec-template.json"
>
> I just added the Logstash output that I mentioned.
>
> But in the file "template => /etc/logstash/elastic-ossec-template2.json" I
> modified lines 3 and 8.
> Line 3: from "template": "ossec *" to "template": "ossecall *"
> Line 8: from "ossec": to "ossecall":
>
> I do not know if it was really necessary to do this. I did it because I
> decided to create a separate index for the logs from the archives.json file,
> where OSSEC logs everything.
>
> About "After that, probably you will need to reindex all your index to
> apply the new template."
> Do you have any procedure to do this?
>
>
> On Wednesday, September 28, 2016 at 18:01:12 UTC-3, jose wrote:
>>
>> Hi Roberto,
>>
>> Have you applied the custom mapping?
>>
>>
>> http://documentation.wazuh.com/en/latest/ossec_elk_elasticsearch.html#ossec-alerts-template
>>
>> If you have the custom mapping applied, and the template in Logstash, you
>> need to wait until the next day, when the next index is created with the new
>> mapping and template.
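>>
>> A quick way to check whether the new daily index picked up the template (the
>> index name and date below are only an example) is to look at its mapping:
>>
>> curl -XGET "localhost:9200/ossec-2016.09.29/_mapping?pretty"
>>
>> The string fields defined in the template should show something like
>> "index": "not_analyzed".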
>>
>> After that, you will probably need to reindex all your indices to apply the
>> new template.
>>
>>
>> Regards
>> -----------------------
>> Jose Luis Ruiz
>> Wazuh Inc.
>> [email protected]
>>
>> On September 28, 2016 at 3:26:38 PM, [email protected]
>> ([email protected]) wrote:
>>
>> Hi Pedro!
>>
>> I am using OSSEC Wazuh, and I have a question about indices.
>> I had set up Logstash without using the file "elastic-ossec-template.json",
>> but I saw it would be good to use it. I want to use some fields in Kibana,
>> and they show up as "Analyzed Field", like "AgentName".
>>
>> I put the template in the Logstash configuration, but the index has
>> not changed to "not analyzed".
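>>
>> In case it helps to diagnose this, one way to see which templates
>> Elasticsearch actually has installed (assuming it listens on localhost:9200)
>> is:
>>
>> curl -XGET "localhost:9200/_template/ossec?pretty"
>> curl -XGET "localhost:9200/_template/ossecall?pretty"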
>>
>>
>> My logstash output :
>>
>> output {
>>
>>   # for archives.json log
>>   if [type] == "ossecall" {
>>     elasticsearch {
>>       hosts => "127.0.0.1:9200"
>>       index => "ossecall-%{+YYYY.MM.dd}"
>>       document_type => "ossecall"
>>       template => "/etc/logstash/elastic-ossec-template2.json"
>>       template_name => "ossecall"
>>       template_overwrite => true
>>     }
>>   }
>>   # for alerts.json log
>>   else {
>>     elasticsearch {
>>       hosts => "127.0.0.1:9200"
>>       index => "ossec-%{+YYYY.MM.dd}"
>>       document_type => "ossec"
>>       template => "/etc/logstash/elastic-ossec-template.json"
>>       template_name => "ossec"
>>       template_overwrite => true
>>     }
>>   }
>> }
>>
>> Can you help me?
>>
>>
>>
>> On Thursday, June 2, 2016 at 08:25:09 UTC-3, Pedro S wrote:
>>>
>>> Hi Maxim,
>>>
>>> How are you forwarding the alerts/archives to Kibana?
>>>
>>> I think you will need the archives JSON output setting. If you are using
>>> Wazuh <http://wazuh.com/>, edit *ossec.conf* and add the following
>>> setting:
>>>
>>> <global>
>>>   <logall_json>yes</logall_json>
>>> </global>
>>>
>>>
>>>
>>> Once you do it, you will find the new archives.json events file at:
>>>
>>> /var/ossec/logs/archives/archives.json
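>>>
>>> After changing ossec.conf you need to restart OSSEC for the setting to take
>>> effect; then you can confirm events are being written with something like:
>>>
>>> /var/ossec/bin/ossec-control restart
>>> tail -f /var/ossec/logs/archives/archives.json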
>>>
>>>
>>>
>>> The next step is to forward these archive events to Elasticsearch; in
>>> order to do that, we need to edit the Logstash configuration.
>>>
>>> My personal advice for indexing archive events is to create a dedicated
>>> index pattern just for them, so you will be able to distinguish between
>>> events and alerts, by adding the following configuration inside the
>>> "output" section:
>>>
>>> output {
>>>   if [type] == "ossec-alerts" {
>>>     elasticsearch {
>>>       hosts => ["127.0.0.1:9200"]
>>>       index => "ossec-%{+YYYY.MM.dd}"
>>>       document_type => "ossec"
>>>       template => "/etc/logstash/elastic-ossec-template.json"
>>>       template_name => "ossec"
>>>       template_overwrite => true
>>>     }
>>>   }
>>>   if [type] == "ossec-archives" {
>>>     elasticsearch {
>>>       hosts => ["127.0.0.1:9200"]
>>>       index => "ossec-archives-%{+YYYY.MM.dd}"
>>>       document_type => "ossec"
>>>       template => "/etc/logstash/elastic-ossec-template.json"
>>>       template_name => "ossec"
>>>       template_overwrite => true
>>>     }
>>>   }
>>> }
>>>
>>>
>>> Later in Kibana you will need to create a new index pattern
>>> (Settings -> Indices) matching "ossec-archives-*".
>>>
>>> If you need to "reindex" or read a log file from the beginning using
>>> Logstash, you can use the file input with the option *start_position* set
>>> to *beginning* (+ info)
>>> <https://www.elastic.co/guide/en/logstash/current/plugins-inputs-file.html#plugins-inputs-file-start_position>
>>>
>>>
>>>
>>> On Monday, May 30, 2016 at 4:53:10 PM UTC+2, Maxim Surdu wrote:
>>>>
>>>> I have these archive files with logs, but in Kibana I cannot see them.
>>>> Can I reindex these files?
>>>> If so, please help me step by step.
>>>>
>>>> On Thursday, May 19, 2016 at 10:17:51 UTC+3, Maxim Surdu wrote:
>>>>>
>>>>> Hi dear community,
>>>>>
>>>>> I had a problem with Logstash. After I resolved it, I saw that logs are
>>>>> missing in Kibana. How can I fix the problem and reindex all my logs
>>>>> into Kibana?
>>>>> I would be thankful if someone could help me step by step.
>>>>>
>>>>>
>>>>> I appreciate your help, and a lot of respect for the developers and the
>>>>> community!
>>>>>
>>>>>
--
---
You received this message because you are subscribed to the Google Groups
"ossec-list" group.
To unsubscribe from this group and stop receiving emails from it, send an email
to [email protected].
For more options, visit https://groups.google.com/d/optout.