And another follow-up question, if that's OK. I put a date field in for OXCODE and was able to run the script once I lined up the paths etc., and I get the report below. What's interesting is that other scripts I've tried give the same result: the table fields come back with no data, but the summary and "flows processed" lines have valid numbers. Am I missing something with nfdump, or what do you think would give me no data in the actual fields but an aggregate at the end? See the following output as an example.
Report for 062014 on 2014/06/29

The following table shows upload top users, which can include servers

Top 25 Src IP Addr ordered by bytes:
Date first seen  Duration  Proto  Src IP Addr  Flows(%)  Packets(%)  Bytes(%)  pps  bps  bpp
Summary: total flows: 0, total bytes: 0, total packets: 0, avg bps: 0, avg pps: 0, avg bpp: 0
Time window: 2014-05-10 07:14:57 - 2014-06-30 00:15:39
Total flows processed: 75701700, Blocks skipped: 0, Bytes read: 6964735848
Sys: 22.658s flows/second: 3340976.7  Wall: 24.632s flows/second: 3073273.5

The following table shows download top users, which can include servers

Top 25 Dst IP Addr ordered by bytes:
Date first seen  Duration  Proto  Dst IP Addr  Flows(%)  Packets(%)  Bytes(%)  pps  bps  bpp
Summary: total flows: 0, total bytes: 0, total packets: 0, avg bps: 0, avg pps: 0, avg bpp: 0
Time window: 2014-05-10 07:14:57 - 2014-06-30 00:15:39
Total flows processed: 75701700, Blocks skipped: 0, Bytes read: 6964735848
Sys: 22.294s flows/second: 3395515.8  Wall: 22.293s flows/second: 3395757.7

The time window also seems odd. Any pointers would be most appreciated.
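(For context on the paths involved: the script quoted below derives both the nfsen profile subdirectory and the report file name from GNU date. A minimal sketch of just that date logic — assuming GNU coreutils date, as in Giles's script:)

```shell
#!/bin/sh
# Sketch of the date arithmetic the cron script below relies on (assumes GNU date).
YESTERDAY=$(date -d "yesterday" +"%Y/%m/%d")   # nfsen profile subdirectory layout: YYYY/MM/DD
DUMPFILE=$(date -d "yesterday" +"%Y%m%d")      # report file name stem: YYYYMMDD
echo "profile subdir: $YESTERDAY"
echo "report file:    $DUMPFILE.txt"
```

(A wrong path here usually shows up as zero flows processed; zero-row tables alongside a large "Total flows processed" count, as in the output above, more likely means the files were read but the filter expression matched no flows.)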
Thanks
Scott

On Jun 13, 2014, at 12:41 PM, Giles Coochey <[email protected]> wrote:

> On 13/06/2014 17:01, Ge Moua wrote:
>> Maybe a combo of:
>> * cron
>> * nfdump <by time window>
>> * wrapped inside shell, awk, sed, perl, python, etc of choice
>>
>> not the exact recipe but more so for what ingredients that can be used
> Incredibly, someone asked me to do this today, so without any ado, I put
> the following in /etc/cron.daily/reports.sh
>
> #!/bin/sh
> YESTERDAY=`date -d "yesterday" +"%Y/%m/%d"`
> DUMPFILE=`date -d "yesterday" +"%Y%m%d"`
> OXCODE="XXYYY"
> SUBNET="43"
> RISM="[email protected]"
> mkdir -p /opt/reporting/data/$OXCODE;
> echo "Report for $OXCODE on $YESTERDAY" > /opt/reporting/data/$OXCODE/$DUMPFILE.txt
> echo >> /opt/reporting/data/$OXCODE/$DUMPFILE.txt
> echo "The following table shows upload top users, which can include servers" >> /opt/reporting/data/$OXCODE/$DUMPFILE.txt
> echo >> /opt/reporting/data/$OXCODE/$DUMPFILE.txt
> nfdump -M /usr/local/nfsen/profiles-data/live/asa5510/$YESTERDAY -R . -n 20 -s srcip/bytes "src net 192.168.$SUBNET.0/24" >> /opt/reporting/data/$OXCODE/$DUMPFILE.txt
> echo >> /opt/reporting/data/$OXCODE/$DUMPFILE.txt
> echo "The following table shows download top users, which can include servers" >> /opt/reporting/data/$OXCODE/$DUMPFILE.txt
> echo >> /opt/reporting/data/$OXCODE/$DUMPFILE.txt
> nfdump -M /usr/local/nfsen/profiles-data/live/asa5510/$YESTERDAY -R . -n 20 -s dstip/bytes "dst net 192.168.$SUBNET.0/24" >> /opt/reporting/data/$OXCODE/$DUMPFILE.txt
> mailx -s "$OXCODE Report for $YESTERDAY" -r "Networks <[email protected]>" $RISM < /opt/reporting/data/$OXCODE/$DUMPFILE.txt
>
>> Regards,
>> Ge Moua
>> University of Minnesota Alumnus
>> [email protected]
>> --
>>
>> On 6/13/14, 10:55 AM, Scott Granados wrote:
>>> Hi, I'm new to NFSEN so apologize if this has been asked before.
>>> I've been asked to generate weekly or daily reports of things like TOP
>>> AS, top IP addresses or subnets, etc.
>>> It's not obvious to me in the tool how to do this and in fact doesn't
>>> seem possible natively. I did some googling and found a reference to
>>> some scripts that could be run in CRON that would generate these
>>> reports and email them, but the actual pointer to the tar file
>>> containing the scripts seemed broken. Does anyone have a pointer to
>>> scripts that could be adjusted to fit this purpose or some suggestions
>>> on a starting point as to how I can automate some of the reporting?
>>>
>>> Thank you
>>> Scott
>>>
>>> ------------------------------------------------------------------------------
>>> HPCC Systems Open Source Big Data Platform from LexisNexis Risk Solutions
>>> Find What Matters Most in Your Big Data with HPCC Systems
>>> Open Source. Fast. Scalable. Simple. Ideal for Dirty Data.
>>> Leverages Graph Analysis for Fast Processing & Easy Data Exploration
>>> http://p.sf.net/sfu/hpccsystems
>>> _______________________________________________
>>> Nfsen-discuss mailing list
>>> [email protected]
>>> https://lists.sourceforge.net/lists/listinfo/nfsen-discuss
>
> --
> Regards,
>
> Giles Coochey, CCNP, CCNA, CCNAS
> NetSecSpec Ltd
> +44 (0) 8444 780677
> +44 (0) 7983 877438
> http://www.coochey.net
> http://www.netsecspec.co.uk
> [email protected]
