I was never able to get libdnet to compile, with the following errors:
C:\MinGW\bin\gcc.exe -mdll -O -Wall -I../include -I../../WPdpack/Include -IC:\Python27\include -IC:\Python27\PC -c ./dnet.c -o build\temp.win-amd64-2.7\Release\.\dnet.o
In file included from ../include/dnet.h:12:0,
                 from ./dnet.c:22:
../include/dnet/os.h:26:17: error: conflicting types for 'ssize_t'
 typedef long ssize_t;
                 ^
In file included from c:\mingw\include\io.h:33:0,
                 from C:\Python27\include/pyconfig.h:68,
                 from C:\Python27\include/Python.h:8,
                 from ./dnet.c:4:
c:\mingw\include\sys\types.h:136:18: note: previous declaration of 'ssize_t' was here
 typedef _ssize_t ssize_t;
                  ^
...
./dnet.c:2729:29: error: lvalue required as left operand of assignment
 ((PyObject*)__pyx_v_next) = Py_None; Py_INCREF(((PyObject*)__pyx_v_next));
                           ^
...
./dnet.c:2741:32: error: lvalue required as left operand of assignment
 ((PyObject *)__pyx_v_next) = __pyx_3;
                            ^
Instead, I just installed Python 2.6.3, and I run Python 2.7.6 64-bit for other things.
Here is the post for posterity.
It may be appealing to have the MaxMind GeoIP databases available to you for a variety of reasons.
Updating the databases:
echo "wget http://www.maxmind.com/download/geoip/database/asnum/GeoIPASNum.dat.gz" > /usr/local/share/GeoIP/getgeoipdbs.sh
echo "gunzip -df ./GeoIPASNum.dat.gz" >> /usr/local/share/GeoIP/getgeoipdbs.sh
echo "mv -f ./GeoIPASNum.dat /usr/local/share/GeoIP/" >> /usr/local/share/GeoIP/getgeoipdbs.sh
echo "wget http://geolite.maxmind.com/download/geoip/database/GeoLiteCountry/GeoIP.dat.gz" >> /usr/local/share/GeoIP/getgeoipdbs.sh
echo "gunzip -df ./GeoIP.dat.gz" >> /usr/local/share/GeoIP/getgeoipdbs.sh
echo "mv -f ./GeoIP.dat /usr/local/share/GeoIP/" >> /usr/local/share/GeoIP/getgeoipdbs.sh
echo "wget http://www.maxmind.com/download/geoip/database/GeoLiteCity.dat.gz" >> /usr/local/share/GeoIP/getgeoipdbs.sh
echo "gunzip -df ./GeoLiteCity.dat.gz" >> /usr/local/share/GeoIP/getgeoipdbs.sh
echo "mv -f ./GeoLiteCity.dat /usr/local/share/GeoIP/GeoIPCity.dat" >> /usr/local/share/GeoIP/getgeoipdbs.sh
chmod u+x /usr/local/share/GeoIP/getgeoipdbs.sh
/usr/local/share/GeoIP/getgeoipdbs.sh
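For reference, the gunzip-and-move step can be sketched in a few lines of stdlib Python (a hedged Python 3 equivalent, not the author's script; the URL and paths below come from the shell version above):

```python
# Sketch of the fetch / gunzip -df / mv -f steps, standard library only.
import gzip
import shutil
import urllib.request

def fetch(url, dest_gz):
    # equivalent of: wget URL -O dest_gz
    with urllib.request.urlopen(url) as resp, open(dest_gz, "wb") as out:
        shutil.copyfileobj(resp, out)

def install(src_gz, dest_dat):
    # equivalent of: gunzip -df src_gz && mv -f result dest_dat
    with gzip.open(src_gz, "rb") as src, open(dest_dat, "wb") as dst:
        shutil.copyfileobj(src, dst)

# e.g. fetch("http://www.maxmind.com/download/geoip/database/GeoLiteCity.dat.gz",
#            "GeoLiteCity.dat.gz")
#      install("GeoLiteCity.dat.gz", "/usr/local/share/GeoIP/GeoIPCity.dat")
```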
Create a cron job:
crontab -e
# cron table entry (without the leading "# "):
# 0 0 1 * * bash /usr/local/share/GeoIP/getgeoipdbs.sh
Download, build, and install the Python API:
cd
wget http://www.maxmind.com/download/geoip/api/c/GeoIP-latest.tar.gz
tar zxvf GeoIP-latest.tar.gz
cd GeoIP-*
./configure && make && make install
cp ./apps/geoipupdate-pureperl.pl /usr/local/bin/
wget http://www.maxmind.com/download/geoip/api/python/GeoIP-Python-latest.tar.gz
tar zxvf GeoIP-Python-latest.tar.gz
cd GeoIP-Python-*/
python setup.py build
python setup.py install
echo /usr/local/lib > /etc/ld.so.conf.d/local_libs.conf
ldconfig
Test the API:
I ran `strace` on test.py, and it accesses `/usr/local/share/GeoIP/GeoIP.dat` (the “GeoLiteCountry” database), which appears to be the default when creating an object with GeoIP.new(GeoIP.[read database to here]).
However, test.py and the other examples contain calls to GeoIP.open() that open the other databases, so you can easily point at any of the databases above.
The Python API is very straightforward and the README/docs are minimal. The C docs contain the real information about the core C library (the Python API looks like a thin wrapper).
Too bad the guy doesn’t cover O(n log n + C).
You may want to set up PyTools for Visual Studio to get an environment going (love that IntelliSense!).
Saw this via Hacker News while drinking my coffee.
Neat. But I have XP on a VM on my Macbook, so how do I install this on XP?
3) Download and install Python (I prefer 2.7, not really sure why).
6) Upgrade distribute:
pip install --upgrade distribute
7) Grab some modules (compiled by Christoph Gohlke) [the following links are for python 2.7]: scipy, numpy, matplotlib, PyMC, MySQL-python, ipython, pyzmq, tornado, pyreadline, pygments, pyqt, pyside, lxml.
8) Run the IDE: C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\devenv.exe
9) Create a python project.
10) Check out the list of Python interpreters and the status of the IntelliSense DB generation (built by analyzing the installed modules) via: Tools> Python Tools> Python Interpreters.
11) Start an interactive Python shell with Tools> Python Tools> “[interpreter] Interactive”
12) Execute the Python app within the interactive shell with Tools> Python Tools> Execute Python in Python Interactive (or Shift+Ctrl+F5).
1) There is a bug in the class analyzer for IntelliSense completion database generation where it attempts to parse some modules’ pyd files. This occurs with scipy (see Zooba’s response). If you do not wish to rename the `Microsoft.PythonTools.Analyzer.exe` executable, you can build from source, as described in Zooba’s response on Monday, July 22nd, 2013.
2) If you get an “SSL3_GET_SERVER_CERTIFICATE:certificate verify failed” error, you will have to append the CA that issues your certs to a bundle file. You can use the above process and then copy over the resulting pip cacert.pem as:
move C:\Python27\Lib\site-packages\pip\cacert.pem C:\Python27\Lib\site-packages\pip\cacert.pem.bak
pscp root@BOX:/usr/local/lib/python2.7/site-packages/pip/cacert.pem C:\Python27\Lib\site-packages\pip
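If the extra CA cert is already on the Windows box in PEM form, the backup-and-append can also be done with a few lines of stdlib Python (a sketch; `bundle` and `extra_pem` are placeholder paths, and you would point `bundle` at pip's real cacert.pem):

```python
# Sketch: back up pip's CA bundle, then append an extra PEM-encoded CA to it.
import shutil

def append_ca(bundle, extra_pem):
    shutil.copyfile(bundle, bundle + ".bak")  # keep a backup, as above
    with open(extra_pem, "rb") as src, open(bundle, "ab") as dst:
        dst.write(b"\n")  # make sure the PEM blocks don't run together
        shutil.copyfileobj(src, dst)
```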
I have an HTTPS proxy in place that uses a self-contained intermediate CA.
When using `pip` to install a python egg, I was receiving the following error:
SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
Add a CA to a certificate bundle:
Your ca-bundle.crt may be located in another directory.
cp /etc/pki/tls/certs/ca-bundle.crt /etc/pki/tls/certs/ca-bundle.crt.bak
openssl x509 -text -in untrusted_cacert.cer >> /etc/pki/tls/certs/ca-bundle.crt
openssl verify -CAfile /etc/pki/tls/certs/ca-bundle.crt untrusted-intermediateca.cer # should result in "OK"
openssl s_client -showcerts -connect pypi.python.org:443 # should see "Verify return code: 0 (ok)" in the output
Add a CA to be trusted by pip:
I had to run `strace` on `pip install` to see what it was referencing, as I was still getting the above error.
cp /usr/local/lib/python2.7/site-packages/pip/cacert.pem /usr/local/lib/python2.7/site-packages/pip/cacert.pem.bak
openssl x509 -text -in untrusted_cacert.cer >> /usr/local/lib/python2.7/site-packages/pip/cacert.pem
pip-2.7 -v install matplotlib -v
The following is a conversation with a financial quant I know about finding outliers in data, and the unreliability of mean and standard deviation for data that is not normally distributed.
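As a concrete illustration of the point, here is a hedged sketch of one common robust alternative, the median absolute deviation (MAD); the 0.6745 scale factor and the 3.5 cutoff are conventional choices for the modified z-score, not something from the conversation:

```python
# Flag outliers with a modified z-score built on the median and MAD, which,
# unlike mean/stddev, are not dragged around by the outliers themselves.
from statistics import median

def mad_outliers(xs, cutoff=3.5):
    med = median(xs)
    mad = median(abs(x - med) for x in xs)
    if mad == 0:
        return []
    # 0.6745 makes the modified z-score comparable to a standard z-score
    return [x for x in xs if abs(0.6745 * (x - med) / mad) > cutoff]

data = [9, 10, 10, 11, 10, 9, 300]  # one wild point in otherwise tight data
print(mad_outliers(data))           # -> [300]
```

With mean/stddev the 300 inflates both statistics, so naive z-score thresholds become far less sensitive; the median-based version flags it cleanly.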
Here is a quick method I used today, before implementing an argus probe in a sustainable way, to create and parse PCAPs to find high-bandwidth offenders.
ifconfig eth0 promisc
netstat -i | grep eth # check for the P flag
mkdir /pcaps/
nohup tcpdump -i 1 -w /pcaps/pcap -C 20 -W 250 'host 192.168.100.27' -Z root & # not really secure, look at -Z
Note that querycsv.py/sqlite3.py doesn’t like _underscores_, -dashes-, or names that contain only numbers.
This will generate /pcaps/pcapNNN…
Process PCAPs on a Windows box with tshark.exe:
echo echo ipsrc,ipdst,dstporttcp,srcporttcp,len ^> %1.csv > %temp%\tshark_len.bat
echo ".\Wireshark\tshark.EXE" -r %1 -T fields -e ip.src -e ip.dst -e tcp.dstport -e tcp.srcport -e frame.len -E separator=, -E quote=d ^>^> %1.csv >> %temp%\tshark_len.bat
for /r . %G in (pcap*) do %temp%\tshark_len.bat %G
`tcpdump` has relatively the same syntax.
Query CSVs with SQL statements:
Use a python module to quickly return calculations (http://pythonhosted.org/querycsv/):
pip install querycsv
Find the total bytes incoming to host:
querycsv.py -i test.csv "select ipsrc, ipdst, dstporttcp, srcporttcp, sum(len) as 'len_in_bytes' from test group by ipsrc"

Find the total bytes outgoing from host:

querycsv.py -i test.csv "select ipsrc, ipdst, dstporttcp, srcporttcp, sum(len) as 'len_in_bytes' from test group by ipdst"
For all files:
echo ipsrc,ipdst,dstporttcp,srcporttcp,len > mergedpcap.csv
cat *.csv | grep -v "ipsrc,ipdst,dstporttcp,srcporttcp,len" >> mergedpcap.csv
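Since the CSVs were generated on a Windows box where cat/grep aren't available, here is a stdlib Python equivalent of the merge (a sketch; it assumes the per-pcap CSVs match *.csv in the current directory and share the header above):

```python
# Merge all per-pcap CSVs into one file, writing the header once and
# skipping the repeated header row inside each input file.
import glob

HEADER = "ipsrc,ipdst,dstporttcp,srcporttcp,len"

def merge_csvs(out_path="mergedpcap.csv", pattern="*.csv"):
    with open(out_path, "w") as out:
        out.write(HEADER + "\n")
        for name in sorted(glob.glob(pattern)):
            if name == out_path:
                continue  # don't re-read our own output file
            with open(name) as f:
                for line in f:
                    if line.strip() != HEADER:
                        out.write(line)
```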
Find the total bytes incoming to host:
querycsv.py -i mergedpcap.csv "select distinct dstporttcp, ipsrc, ipdst, sum(len) as 'len_in_bytes' from mergedpcap group by ipsrc" > incoming.log

The record where ipsrc is the targeted host (in this case 192.168.100.27) will return the TOTAL length of all packets sent from the targeted host (all uploaded, yes UPloaded, bytes).

Find the total bytes outgoing from host:

querycsv.py -i mergedpcap.csv "select distinct dstporttcp, ipsrc, ipdst, sum(len) as 'len_in_bytes' from mergedpcap group by ipdst" > outgoing.log
The record where ipdst is the targeted host (in this case 192.168.100.27) will return the TOTAL length of all packets sent to the targeted host (all downloaded, yes DOWNloaded, bytes).
You’ll notice that querycsv first imports the CSV data into an in-memory sqlite3 database. This makes offering a full set of SQL queries and functions trivial.
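That trick is easy to reproduce directly with the stdlib. Here is a sketch of the idea (not querycsv's actual code; the table and column names are taken from the CSVs above, and it trusts the CSV header, so don't feed it untrusted files):

```python
# Load a CSV into an in-memory sqlite3 table, then run arbitrary SQL on it.
import csv
import sqlite3

def query_csv(csv_path, table, sql):
    con = sqlite3.connect(":memory:")
    with open(csv_path, newline="") as f:
        rows = list(csv.reader(f))
    header, data = rows[0], rows[1:]
    # columns get no declared type; sqlite's affinity rules still let sum()
    # treat numeric-looking text like "1500" as a number
    con.execute("create table %s (%s)" % (table, ", ".join(header)))
    con.executemany(
        "insert into %s values (%s)" % (table, ", ".join("?" * len(header))),
        data,
    )
    return con.execute(sql).fetchall()
```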
There exist other options to solve this sort of situation:
1) PCAP to SQL platforms: pcap2sql and c5 sigma.
2) SQL querying PCAP directly: PacketQ (which lacks some SQL queries and functions, see here). Here is a neat example of displaying some results.
3) Robust solutions like pcapr-local, with integration into Mu Studio.