Github user cestella commented on the issue:
https://github.com/apache/incubator-metron/pull/541
# Testing Plan
## Preliminaries
* Please perform the following tests on the `full-dev` vagrant environment.
* Set an environment variable to indicate `METRON_HOME`:
`export METRON_HOME=/usr/metron/0.4.0`
## Ensure Data Flows from the Indices
Ensure that with a basic full-dev deployment, data flows into the Elasticsearch indices and into HDFS.
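One quick way to confirm both sinks, sketched below under full-dev assumptions (Elasticsearch on `node1:9200`, indexed data under `/apps/metron/indexing/indexed` — both are assumptions, adjust for your environment):

```
# Assumed full-dev defaults; adjust host/port/path for your environment.
ES_HOST=node1
ES_PORT=9200
# Helper that builds the _cat/indices URL to query
es_indices_url() { echo "http://${1}:${2}/_cat/indices?v"; }
# List the Elasticsearch indices (look for bro and snort index entries):
# curl -s "$(es_indices_url "$ES_HOST" "$ES_PORT")"
# Confirm data is also landing in HDFS:
# hadoop fs -ls -R /apps/metron/indexing/indexed
```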
## (Optional) Free Up Space on the Virtual Machine
First, let's free up some headroom on the virtual machine. On a multinode cluster this step would not be necessary.
* From Ambari, stop and disable the Metron service
* Kill monit via `service monit stop`
* Kill the sensors via `service sensor-stubs stop`
## Install and start pycapa
```
# set env vars
export PYCAPA_HOME=/opt/pycapa
export PYTHON27_HOME=/opt/rh/python27/root
# Install these packages via yum (RHEL, CentOS)
yum -y install epel-release centos-release-scl
yum -y install "@Development tools" python27 python27-scldevel python27-python-virtualenv libpcap-devel libselinux-python
# Setup directories
mkdir $PYCAPA_HOME && chmod 755 $PYCAPA_HOME
# Grab pycapa from git
cd ~
git clone https://github.com/apache/incubator-metron.git
cp -R ~/incubator-metron/metron-sensors/pycapa* $PYCAPA_HOME
# Create the virtualenv
cd $PYCAPA_HOME
export LD_LIBRARY_PATH="/opt/rh/python27/root/usr/lib64"
${PYTHON27_HOME}/usr/bin/virtualenv pycapa-venv
# Build it
cd ${PYCAPA_HOME}/pycapa
# activate the virtualenv
source ${PYCAPA_HOME}/pycapa-venv/bin/activate
pip install -r requirements.txt
python setup.py install
# Run it
cd ${PYCAPA_HOME}/pycapa-venv/bin
pycapa --producer --topic pcap -i eth1 -k node1:6667
```
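Before moving on, it is worth confirming that pycapa is actually producing to Kafka. A minimal sketch, assuming the HDP Kafka install location and the full-dev zookeeper address (both assumptions; adjust as needed):

```
# Assumed HDP Kafka location and zookeeper address; adjust for your environment.
KAFKA_HOME=/usr/hdp/current/kafka-broker
ZK=node1:2181
# Helper that builds the console-consumer command for a topic
consumer_cmd() { echo "${KAFKA_HOME}/bin/kafka-console-consumer.sh --zookeeper ${ZK} --topic ${1}"; }
# Run it to watch raw packets arriving on the pcap topic (Ctrl-C to stop):
# $(consumer_cmd pcap)
```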
## Ensure pycapa can write to HDFS
* Ensure that `/apps/metron/pcap` exists and is writable by the user running the pcap topology. If not, then:
```
sudo su - hdfs
hadoop fs -mkdir -p /apps/metron/pcap
hadoop fs -chown metron:hadoop /apps/metron/pcap
hadoop fs -chmod 775 /apps/metron/pcap
exit
```
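To double-check that the ownership and mode took effect, the HDFS listing is the real check; the local `stat` below just demonstrates the same octal-mode pattern:

```
# HDFS check: expect something like "drwxrwxr-x ... metron hadoop ... /apps/metron/pcap"
# hadoop fs -ls -d /apps/metron/pcap
# The same 775 check, demonstrated on a local directory:
mkdir -p /tmp/pcap-perms-demo && chmod 775 /tmp/pcap-perms-demo
stat -c '%a' /tmp/pcap-perms-demo
```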
* Start the pcap topology via `$METRON_HOME/bin/start_pcap_topology.sh`
* Watch the topology in the Storm UI, and once the number of packets ingested exceeds 3k, kill the packet capture utility started earlier. Ensure that at least 3 files exist on HDFS by running `hadoop fs -ls /apps/metron/pcap`
Note that if your MR job fails because of a lack of user directory for
`root`, then the following will create the directory appropriately:
```
sudo su - hdfs
hadoop fs -mkdir /user/root
hadoop fs -chown root:hadoop /user/root
hadoop fs -chmod 755 /user/root
exit
```
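The steps above can also be written in a create-only-if-missing form, so rerunning them never errors on an existing directory; the local `ensure_dir` below is just the same guard pattern on a regular filesystem:

```
# HDFS version of create-if-missing (run as hdfs):
# hadoop fs -test -d /user/root || hadoop fs -mkdir /user/root
# The same guard pattern on a local filesystem:
ensure_dir() { [ -d "$1" ] || mkdir -p "$1"; }
ensure_dir /tmp/metron-user-dir-demo
```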
### Regression Test
#### Fixed
* Run a fixed pcap query by executing a command similar to the following:
```
$METRON_HOME/bin/pcap_query.sh fixed --ip_dst_port 8080 -st "20170425" -df "yyyyMMdd"
```
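The `-st` value above is a hard-coded example date; when rerunning this plan later, a start time of "yesterday" in `yyyyMMdd` form can be computed with GNU `date` (as on the full-dev CentOS box):

```
# Start time: yesterday, formatted to match -df "yyyyMMdd"
START_TIME="$(date -d '1 day ago' +%Y%m%d)"
echo "$START_TIME"
# Then, e.g.:
# $METRON_HOME/bin/pcap_query.sh fixed --ip_dst_port 8080 -st "$START_TIME" -df "yyyyMMdd"
```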
* Verify the MR job finishes successfully. Upon completion, you should see multiple files named with relatively current datestamps in your current directory, e.g. `pcap-data-20160617160549737+0000.pcap`
* Copy the files to your local machine and verify you can open them in Wireshark. Ensure that they contain only packets with a destination port of 8080.
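If `tshark` (Wireshark's command-line counterpart) is installed wherever the files were copied, the same port check can be scripted instead of eyeballed; the filename below is just the example name from above and is hypothetical:

```
# Hypothetical filename; substitute one of the files the MR job produced.
PCAP_FILE=pcap-data-20160617160549737+0000.pcap
# Count packets NOT destined for port 8080; a correct result set yields 0:
# tshark -r "$PCAP_FILE" -Y 'tcp.dstport != 8080' | wc -l
```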
#### Stellar
* Run a Stellar pcap query by executing a command similar to the following:
```
$METRON_HOME/bin/pcap_query.sh query --query "ip_dst_port == 8080" -st "20170425" -df "yyyyMMdd"
```
* Verify the MR job finishes successfully. Upon completion, you should see multiple files named with relatively current datestamps in your current directory, e.g. `pcap-data-20160617160549737+0000.pcap`
* Copy the files to your local machine and verify you can open them in Wireshark. Ensure that they contain only packets with a destination port of 8080.
### Binary Payload Search : Strings
#### Fixed
* Run a fixed pcap query by executing a command similar to the following:
```
$METRON_HOME/bin/pcap_query.sh fixed --ip_dst_port 8080 --packet_filter "\`persist\`" -st "20170425" -df "yyyyMMdd"
```
* Verify the MR job finishes successfully. Upon completion, you should see multiple files named with relatively current datestamps in your current directory, e.g. `pcap-data-20160617160549737+0000.pcap`
* Copy the files to your local machine and verify you can open them in Wireshark. Ensure that they contain only packets with a destination port of 8080 that are API calls involving `/api/v1/persist/wizard-data` in Ambari.
#### Stellar
* Run a Stellar pcap query by executing a command similar to the following:
```
$METRON_HOME/bin/pcap_query.sh query --query "ip_dst_port == 8080 && BYTEARRAY_MATCHER('\`persist\`', packet)" -st "20170425" -df "yyyyMMdd"
```
* Verify the MR job finishes successfully. Upon completion, you should see multiple files named with relatively current datestamps in your current directory, e.g. `pcap-data-20160617160549737+0000.pcap`
* Copy the files to your local machine and verify you can open them in Wireshark. Ensure that they contain only packets with a destination port of 8080 that are API calls involving `/api/v1/persist/wizard-data` in Ambari.
### Binary Payload Search : Hex Regex
#### Stellar
NOTE: For the astute reader, 0x1F90 in hex is 8080 in decimal.
* Run a Stellar pcap query by executing a command similar to the following:
```
$METRON_HOME/bin/pcap_query.sh query --query "BYTEARRAY_MATCHER('1F90', packet) && BYTEARRAY_MATCHER('\`persist\`', packet)" -st "20170425" -df "yyyyMMdd"
```
* Verify the MR job finishes successfully. Upon completion, you should see multiple files named with relatively current datestamps in your current directory, e.g. `pcap-data-20160617160549737+0000.pcap`
* Copy the files to your local machine and verify you can open them in Wireshark. Ensure that they contain only packets with a destination port of 8080 that are API calls involving `/api/v1/persist/wizard-data` in Ambari.
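The hex/decimal equivalence in the note above is easy to confirm from the shell:

```
printf '%d\n' 0x1F90   # hex 0x1F90 -> decimal 8080
printf '%X\n' 8080     # decimal 8080 -> hex 1F90
```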