Tuesday, 29 May 2012

We Have the Port Scans, What Now?


It's been a while, I hope you're good. I'm fine thanks, busy as sin but isn't that always the way? So where did we leave off? From reading back through my previous post, we'd scanned our little guts out and pulled a list of all ports that were open and all the services that can be interacted with. Boy haven't we been busy! 

It just so happens that now is when the real fun begins. Have a bit of a peruse through the results; not that easy to read, aye? Sure, we can quickly find some 135, 445 and have a quick fiddle through the lovely, lovely file shares, but where's the automation? This post should cover some basics about gathering even more data from the services we've identified, using our ever-faithful set of tools such as nikto, gnome-web-photo, curl et al, and keeping the data usable.

First things first, let's bring all of our results together in a more machine-readable way. From the previous post we've grabbed all of our nmap output in the three decent formats: plain, greppable and XML. For the purposes of this post we'll be using the XML format and parsing it with xmlstarlet (for those of you who aren't already using starlet, grab a copy; it's a brilliant little command-line parser that I can't live without. Nessus, nmap, surecheck, anything that dumps XML suddenly becomes friendly to use again!)
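
Before we get to the main event, a quick sanity check never hurts. Something along these lines (very much a rough sketch, using the same XML file we're about to feed the big one-liner) will list each live host alongside its open ports:

root@bt:~# xmlstarlet sel -T -t -m "//host[status/@state='up']" -v "address[@addrtype='ipv4']/@addr" -o ": " -m "ports/port[state/@state='open']" -v @portid -o " " -b -n port_scans/hot-targets.tcp.services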

The little gem I've been using for a while now is:

cat port_scans/hot-targets.tcp.services | xmlstarlet sel -T -t -m "//state[@state='open']" -m ../../.. -v address/@addr -m hostnames/hostname -i @name -o '  (' -v @name -o ')' -b -b -b -o "," -m .. -v @portid -o ',' -v @protocol -o "," -m service -v @name -i "@tunnel='ssl'" -o 's' -b -o "," -v @product -o ' ' -v @version -v @extrainfo -b -n - | sed 's_^\([^\t ]*\)\( ([^)]*)\)\?\t\([^\t ]*\)_\1.\3\2_' | sort -n -t.

This is a slightly bastardised version of this one-liner, brought to you by the lovely folks at Redspin. It takes an nmap XML output file (singular in this case) and creates output like this:

10.13.37.10,22,tcp,ssh,OpenSSH 4.3protocol 2.0
10.13.37.10,2301,tcp,http,CompaqHTTPServer *** httpd
10.13.37.10,2381,tcp,http,Apache httpd SSL-only mode
10.13.37.10,3260,tcp,iscsi,
10.13.37.10,5988,tcp,http,Web-Based *** httpd
10.13.37.10,5989,tcp,https,Web-Based *** httpd
10.13.37.11,427,tcp,svrloc,
10.13.37.11,443,tcp,https,VMware ESXi Server httpd
10.13.37.11,5989,tcp,tcpwrapped,
10.13.37.11,8000,tcp,http-alt,
10.13.37.11,8042,tcp,fs-agent,
10.13.37.11,8045,tcp,unknown,
10.13.37.11,80,tcp,http,
10.13.37.11,8100,tcp,tcpwrapped,
10.13.37.11,902,tcp,vmware-auths,VMware Authentication Daemon 1.10

Isn't this a lot more greppable than -oG? Having the data dumped out to CSV allows us to rapidly move through and select the exact services we want to interrogate. An example:

root@bt:~# cat output.csv | grep http
10.13.37.10,2301,tcp,http,CompaqHTTPServer *** httpd
10.13.37.10,2381,tcp,http,Apache httpd SSL-only mode
10.13.37.10,5988,tcp,http,Web-Based *** httpd
10.13.37.10,5989,tcp,https,Web-Based *** httpd
10.13.37.11,443,tcp,https,VMware ESXi Server httpd
10.13.37.11,8000,tcp,http-alt,
10.13.37.11,80,tcp,http, 

An even better example:

root@bt:~# cat output.csv | grep http | cut -f 1,2 -d "," | tr "," ":"
10.13.37.10:2301
10.13.37.10:2381
10.13.37.10:5988
10.13.37.10:5989
10.13.37.11:443
10.13.37.11:8000
10.13.37.11:80

An even better example still:

root@bt:~# cat output.csv | grep http | cut -f 1,2 -d "," | tr "," ":" | while read line; do /pentest/web/nikto/nikto.pl -config /pentest/web/nikto/nikto.conf -h $line -output $line.txt; done
....snip....

You get the idea.
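
If there are a lot of them and you'd rather not sit through the loop, the same list drops into xargs quite happily; a sketch, assuming a reasonably recent GNU xargs for the -P (parallel) flag:

root@bt:~# cat output.csv | grep http | cut -f 1,2 -d "," | tr "," ":" | xargs -P 4 -I {} /pentest/web/nikto/nikto.pl -config /pentest/web/nikto/nikto.conf -h {} -output {}.txt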

While we're on the subject, a nice precursor to nikto is a bit of web scouring. We've all been in the situation where we're on an internal test with limited time only to discover 100 web servers spread across the network; it's always a case of best efforts, and being left wondering if the ones we missed were the ones that would've bent over. This is where webscour comes in. It's a little script from Geoff over at Cyberis (available here) that, given a list of addresses, grabs screenshots (using gnome-web-photo) and header information from each web server and produces a handy HTML file to view them all from. Suddenly all the default content and HP OpenViews can be found quickly, and we can move straight on to the accounting app running Classic ASP on IIS 4 that hasn't been used in five years.

Incidentally, this can be run in a similar fashion to the one-liner above:

root@bt:~# cat output.csv | grep http | cut -f 1,2 -d "," | tr "," ":" | ./webscour.pl webservers.htm
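
If gnome-web-photo is misbehaving on your build (it can be a touch flaky), a rough and ready fallback is to skip the screenshots and just pull headers and page titles with curl. Something like the below should do, though it won't work out http vs https for you, which is half the point of webscour:

root@bt:~# cat output.csv | grep http | cut -f 1,2 -d "," | tr "," ":" | while read hostport; do echo "== $hostport =="; curl -skI -m 10 http://$hostport; curl -sk -m 10 http://$hostport | grep -io "<title>[^<]*</title>"; done > webheaders.txt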

As you can imagine there is a lot more we can do with web servers: DirBuster, skipfish the site-killer and any other forced-browse tool or fuzzer can usually be driven in this way, and as always the more data the better.
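
By way of a sketch rather than gospel, here's the same host:port list driving dirb (DirBuster's command-line cousin); treat the wordlist path as a placeholder for wherever it lives on your build:

root@bt:~# cat output.csv | grep http | cut -f 1,2 -d "," | tr "," ":" | while read hostport; do dirb http://$hostport /pentest/web/dirb/wordlists/common.txt -o dirb_$hostport.txt; done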

Next steps as far as web servers are concerned usually involve getting this information into Burp, so we can play with it properly. Buby is a sensible choice, and further down the line we'll look into automated spidering and active scanning from the terminal, replaying nikto/dirbuster output directly into Burp, and utilising the FuzzDB to profile any CMSs we come across. But that, unfortunately, will have to wait for another post.

As usual, for any thoughts, critiques or straightforward calling-out, either head down to the comments or hit me up on Twitter. Hwyl am Nawr!

Final Thoughts

It wouldn't be fair if I didn't go off topic at least once in a post, so here you go...

cat output.csv | grep 161,udp | cut -f 1 -d "," | /pentest/enumeration/snmp/onesixtyone/onesixtyone -c /pentest/enumeration/snmp/onesixtyone/dict.txt -o onesixtyone.out -i -

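If that turns up a valid community string, the obvious next step is to point snmpwalk at the host and see what falls out; a sketch using one of the hosts from earlier and the ever-popular public string:

root@bt:~# snmpwalk -v1 -c public 10.13.37.10
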
Have some SNMP fun, y'all!

7 comments:

  1. Nice post, I like where you're going with this. I've never tried out xmlstarlet before, but I'm definitely going to start playing around with it after reading this over.

  2. Thanks for the mentions, guys. A similar script written by @jjkakakk can be found here http://www.cyberis.co.uk/downloads/gnmap.pl and works on gnmap (greppable) output rather than XML.

    e.g. cat /tmp/out.gmap | /opt/scripts/gnmap.pl | grep http | cut -f 1,2 -d "," | tr "," ":" | xargs -i nikto -h {}

    Unfortunately gnome-web-photo is a bit flaky on the distros I've come across, but even just grabbing a title can be a nice starting point for servers of interest.

    Geoff

  3. Awesome post; however, I can only seem to get the script to show the MAC address together with the ports and not the IPs. Any tips as to how to have it display only the IPs with ports? Note I am using the -oX option with the latest nmap 6.

    Replies
    1. I should've replied to this months ago, apologies. If you kick off the command with a cat port_scans/hot-targets.tcp.services | grep -v mac | xmlstarlet... that should sort out your problem.

  4. Hello,

    Great post.

    I found the xmlstarlet piece interesting. I tried to walk through it here http://pjhartlieb.blogspot.com/2013/02/parsing-output-with-xmlstarlet.html

    The only bit I can't pin down is the last "sed" statement at the end. My output is the same whether I include it or not. What am I missing?

    -pjh

    Replies
    1. You seem to be the only person who caught my typo; I cut off the command a bit early. It should read:

      | sed 's_^\([^\t ]*\)\( ([^)]*)\)\?\t\([^\t ]*\)_\1.\3\2_' | sort -n -t. -k1,1 -k2,2 -k3,3 -k4,4 -k5,5 | sed 's_^\(\([0-9]\{1,3\}\.\)\{3\}[0-9]\{1,3\}\)\.\([^ \t]*\)\( ([^)]*)\)\?_\1\4\t\3_'

      It's a host/port sort.

      Hope that helps, and apologies for the confusion.

    2. ah .. excellent ... let me go back and update ..thanks!
