GitHub user dlyle65535 opened a pull request:

    https://github.com/apache/incubator-metron/pull/561

    METRON-913: Create IP Report in Zeppelin

    ## Contributor Comments
    For a given IP address,  produces a report of:
    * Most Frequent Connections (Yaf, defaults to 24 hours) 
    * Recent Connections (Yaf, defaults to 1 hour)
    * Top DNS Queries (Bro, defaults to 24 hours)
    * All Ports Used (Yaf, defaults to 24 hours)
    * HTTP User Agents (Bro, defaults to 24 hours)
    
    ## Test Plan
    ### Setup
    The setup is exactly the same as @justinleet described in PR-556, repeated 
here:
    
    To test, spin up full-dev. Ensure data flows through for the sensors. For Yaf, which is not enabled by default, it'll be necessary to start the sensor-stub:
    ```
    service sensor-stubs start yaf
    ```
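
    To confirm that Yaf events are actually being indexed, one optional sanity check (assuming Elasticsearch is reachable at node1:9200, as in the validation queries below) is to list the indices and look for a yaf entry:
    ```
    curl 'node1:9200/_cat/indices?v' | grep -i yaf
    ```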
    
    It'll also be necessary to add Yaf to the list of sensors run in Ambari. To 
do so, stop Metron, edit "Metron Parsers" to include "yaf" (or if the other 
sensors already have data output, feel free to just make it "yaf" only).
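
    For reference, and assuming the field takes a comma-separated list of parser names with bro and snort already present, the edited "Metron Parsers" value would look something like:
    ```
    bro,snort,yaf
    ```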
    
    Let data flow through for a little while (enough so that we can reasonably make range filters).
    
    Once some data has gone through, we'll need an instance of Zeppelin. Because of the size of the Vagrant instance, we'll want to shut down unneeded services. Shut down Metron, Kibana, Storm, Kafka, and HBase.
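
    If you'd rather script that shutdown than click through the UI, each service can also be stopped through the Ambari REST API. This is only a sketch, assuming full-dev's defaults of an admin/admin login and a cluster named "metron_cluster" (adjust both if yours differ):
    ```
    # Stop one service; repeat for METRON, KIBANA, STORM, KAFKA, and HBASE
    curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
      -d '{"RequestInfo":{"context":"Stop service"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
      'http://node1:8080/api/v1/clusters/metron_cluster/services/METRON'
    ```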
    
    Install Zeppelin from "Actions - Add Service". It'll prompt you to install Spark and Hive; do so. Configuration is pretty trivial: all that's necessary is to set an arbitrary Hive database password. Let this run. The Hive service check likes to fail on our Vagrant, but it's benign (an impersonation configuration issue unrelated to actually running our queries). Ignore it and accept the installation.
    
    From Metron's "Service Actions", run the "Zeppelin Notebook Import" to load our notebooks into Zeppelin. Use the quick links to navigate to the Zeppelin UI.
    
    Go into the "Metron IP Report" notebook. Instructions are in the first 
paragraph of the readme.
    
    Note: If you've run for less than an hour (which I recommend), the first 2 
reports will have the same data.
    
    If you wish to validate the data, use the following queries. The first 4 
will validate against Elasticsearch and the final report will validate against 
HDFS. This is because we don't store an unanalyzed user_data field.
    
    Run the following from an ssh session to node1 (as root).
    
    ### Most Frequent Connections (and Recent Connections)
    ```
    curl -XPOST 'node1:9200/y*/_search?pretty' -H 'Content-Type: application/json' -d'
    {
       "size": 0,
       "query": {
          "filtered": {
             "query": {
                "match": {
                   "ip_src_addr": "192.168.138.158"
                }
             }
          }
       },
       "aggs": {
          "ip_dst_addr": {
             "terms": {
                "field": "ip_dst_addr"
             }
          }
       }
    }
    '
    ```
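
    The Recent Connections report draws on the same Yaf data, just limited to the last hour, so to validate it separately you can add a time filter to the query above. This is only a sketch, assuming the indexed timestamp field is named `timestamp` and is mapped as a date type; if it is stored as raw epoch milliseconds, substitute explicit millisecond bounds for `now-1h`:
    ```
    curl -XPOST 'node1:9200/y*/_search?pretty' -H 'Content-Type: application/json' -d'
    {
       "size": 0,
       "query": {
          "filtered": {
             "query": {
                "match": {
                   "ip_src_addr": "192.168.138.158"
                }
             },
             "filter": {
                "range": {
                   "timestamp": {
                      "gte": "now-1h"
                   }
                }
             }
          }
       },
       "aggs": {
          "ip_dst_addr": {
             "terms": {
                "field": "ip_dst_addr"
             }
          }
       }
    }
    '
    ```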
    ### Top DNS Queries 
    ```
    curl -XPOST 'node1:9200/b*/_search?pretty' -H 'Content-Type: application/json' -d'
    {
       "size": 0,
       "query": {
          "bool": {
             "must": [
                {
                   "term": {
                      "ip_src_addr": "192.168.138.158"
                   }
                },
                {
                   "term": {
                      "protocol": "dns"
                   }
                }
             ]
          }
       },
       "aggs": {
          "domain_name": {
             "terms": {
                "field": "query"
             }
          }
       }
    }
    '
    ```
    ### All Ports Used
    ```
    curl -XPOST 'node1:9200/y*/_search?pretty' -H 'Content-Type: application/json' -d'
    {
       "size": 0,
       "query": {
          "bool": {
             "must": [
                {
                   "term": {
                      "ip_src_addr": "192.168.138.158"
                   }
                }
             ]
          }
       },
       "aggs": {
          "user_agent": {
             "terms": { "field" : "ip_dst_port"}
          }
       }
    }
    '
    ```
    ### HTTP User Agents (verify count of MSIE 8.0 User Agent)
    ```
    hdfs dfs -cat /apps/metron/indexing/indexed/bro/en* | grep http | grep 'MSIE 8.0' | wc -l
    ```
    ## Pull Request Checklist
    
    Thank you for submitting a contribution to Apache Metron.  
    Please refer to our [Development 
Guidelines](https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=61332235)
 for the complete guide to follow for contributions.  
    Please refer also to our [Build Verification 
Guidelines](https://cwiki.apache.org/confluence/display/METRON/Verifying+Builds?show-miniview)
 for complete smoke testing guides.  
    
    
    In order to streamline the review of the contribution, we ask that you follow these guidelines and double check the following:
    
    ### For all changes:
    - [X] Is there a JIRA ticket associated with this PR? If not one needs to 
be created at [Metron 
Jira](https://issues.apache.org/jira/browse/METRON/?selectedTab=com.atlassian.jira.jira-projects-plugin:summary-panel).
 
    - [X] Does your PR title start with METRON-XXXX where XXXX is the JIRA 
number you are trying to resolve? Pay particular attention to the hyphen "-" 
character.
    - [X] Has your PR been rebased against the latest commit within the target 
branch (typically master)?
    
    
    ### For code changes:
    - [X] Have you included steps to reproduce the behavior or problem that is 
being changed or addressed?
    - [X] Have you included steps or a guide to how the change may be verified 
and tested manually?
    - [X] Have you ensured that the full suite of tests and checks have been 
executed in the root incubating-metron folder via:
      ```
      mvn -q clean integration-test install && build_utils/verify_licenses.sh 
      ```
    
    - [N/A] Have you written or updated unit tests and or integration tests to 
verify your changes?
    - [N/A] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)? 
    - [X] Have you verified the basic functionality of the build by building 
and running locally with Vagrant full-dev environment or the equivalent?
    
    ### For documentation related changes:
    - [N/A] Have you ensured that format looks appropriate for the output in which it is rendered by building and verifying the site-book? If not, run the following commands and then verify the changes via `site-book/target/site/index.html`:
    
      ```
      cd site-book
      bin/generate-md.sh
      mvn site:site
      ```
    
    #### Note:
    Please ensure that once the PR is submitted, you check travis-ci for build 
issues and submit an update to your PR as soon as possible.
    It is also recommended that [travis-ci](https://travis-ci.org) is set up 
for your personal repository such that your branches are built there before 
submitting a pull request.
    


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/dlyle65535/incubator-metron METRON-913

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/incubator-metron/pull/561.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #561
    
----
commit 1cace9ff29f31301d74fa6a7b2630d471452e985
Author: David Lyle <dlyle65...@gmail.com>
Date:   2017-05-02T18:02:48Z

    METRON-913: Create IP Report in Zeppelin

----

