February 25, 2019

DNS & Web Enumeration Reference

Your enumeration strategy will depend largely on your scope. Scope defines a lot, and can be a crucial factor in the success of your engagement.

If your scope is an entire company, and not just a set of hosts (or, hopefully not, a single host, shudder), then you will need to adapt your strategy accordingly.

For example, there's little use in doing OSINT and Recon for a physical office building if your scope is a selection of development hosts in an isolated VLAN.

This article covers mainly passive and active host-based recon. OSINT enumeration will be covered in a separate article, coming soon.

Passive Recon & Discovery of Assets

Source: https://0x00sec.org/t/osint-passive-recon-and-discovery-of-assets/6715

Assuming you have a starting point, such as a domain, that is a great place to begin. Domain enumeration is usually a gold-mine, as DNS is often massively underrated by defensive security teams. Another fun fact: many compliance regulations require valid SSL certificates on external services, so those services very often have to be tied to real domain names (and some even use Let's Encrypt!).

First thing I like to do is break out DNSDumpster and `Sublist3r`.

DNS Dumpster

https://dnsdumpster.com is a very valuable tool and will even draw you a little map of the domain's infrastructure, which is invaluable. Try to look for patterns in the IP ASNs. Let's take 0x00sec.org as an example domain to do some passive recon on.

DNS Dumpster Example Map

If you look at this you can learn so much before even hitting the domain. Look at the A records and where DNS is hosted. These can contain hugely informative pointers such as:

  • Where is their mail hosted?
  • Where are they hosting DNS? Are they using a WAF or Proxy service like cloudflare?
  • What platform are they hosting on? They might reuse this service for other things.
  • What countries / geo locations do they host in?
  • Is there any pattern to the type of subdomains? Can you try and enumerate more? For example (server1.mydomain.com), or (mustang.mydomain.com, charger.mydomain.com).
  • Where did they register the domain? Are there other domains registered from the same registrar?
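The subdomain-pattern idea in those last bullets can be sketched as a quick candidate generator to feed into a resolver later. This is only an illustration: mydomain.com, the server1..server5 scheme, and the muscle-car naming theme are all made up.

```shell
# Sketch: expand a guessed naming scheme into candidate subdomains.
# (mydomain.com and the name themes are illustrative only.)
for i in $(seq 1 5); do echo "server$i.mydomain.com"; done > candidates.txt
for name in mustang charger camaro; do echo "$name.mydomain.com"; done >> candidates.txt
cat candidates.txt
```

Feed candidates.txt to whatever resolver or brute-forcer you prefer; the point is that a few observed names often imply dozens of guessable ones.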

Now that you have this information, you can correlate it with other DNS enumeration tools, and start to build up a better picture.


Sublist3r

Sublist3r is a valuable tool, although I do feel it produces a lot of false positives. It pulls subdomains from historical DNS records, SSL certificate registries, VirusTotal, PassiveDNS, DNSDumpster, and multiple search engines.

Take the results from Sublist3r with a grain of salt. Like all the tools we're going to use here, it isn't a magic bullet for DNS enumeration, but rather another tool in the toolbox.

sublist3r -d 0x00sec.org


                 ____        _     _ _     _   _____
                / ___| _   _| |__ | (_)___| |_|___ / _ __
                \___ \| | | | '_ \| | / __| __| |_ \| '__|
                 ___) | |_| | |_) | | \__ \ |_ ___) | |
                |____/ \__,_|_.__/|_|_|___/\__|____/|_|

                # Coded By Ahmed Aboul-Ela - @aboul3la
[-] Enumerating subdomains now for 0x00sec.org
[-] Searching now in Baidu..
[-] Searching now in Yahoo..
[-] Searching now in Google..
[-] Searching now in Bing..
[-] Searching now in Ask..
[-] Searching now in Netcraft..
[-] Searching now in DNSdumpster..
[-] Searching now in Virustotal..
[-] Searching now in ThreatCrowd..
[-] Searching now in SSL Certificates..
[-] Searching now in PassiveDNS..
[-] Total Unique Subdomains Found: 11

You'll notice that if you try to resolve some of these, they won't go anywhere. They're still very valuable to add to the 'investigate more' list, though.
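One way to split results into a 'resolves' pile and an 'investigate more' pile is to loop over them with getent. This is a sketch under assumptions: getent is Linux-specific, and both hostnames below are made-up examples (localhost will resolve from /etc/hosts; the .invalid name never will).

```shell
# Sketch: split candidate subdomains into resolving / non-resolving piles.
# (Hostnames are made-up examples; getent is Linux-specific.)
cat <<'EOF' > subs.txt
localhost
notreal.0x00sec.invalid
EOF
: > resolves.txt
: > investigate.txt
while read -r d; do
  if getent hosts "$d" > /dev/null; then
    echo "$d" >> resolves.txt      # resolved: live host
  else
    echo "$d" >> investigate.txt   # didn't resolve: historical/stale record?
  fi
done < subs.txt
cat investigate.txt
```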



aiodnsbrute

aiodnsbrute is a little un-maintained, but it's a really great tool that I use on every engagement because of how simple it is.

Watch out for domains with wildcard DNS support; handling wildcards is a feature aiodnsbrute lacks, but you can work around it with a simple grep -v "<wildcardip>".

aiodnsbrute 0x00sec.org


[*] Brute forcing 0x00sec.org with a maximum of 512 concurrent tasks...
[*] Wordlist loaded, brute forcing 1000 DNS records
[*] Using recursive DNS with the following servers: ['', '']             
[+] www.0x00sec.org
[+] status.0x00sec.org                                                 
[+] irc.0x00sec.org                                                    
100%|█████████████████████████████████████████████| 1000/1000 [00:06<00:00, 15679.59records/s]
[*] completed, 3 subdomains found.

We're lucky here because 0x00sec.org doesn't use wildcard records. If a target does, we can always run:

aiodnsbrute 0x00sec.org | grep -v the.ip.they.use.for.wildcards
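To make the filtering concrete, here's the same grep -v idea run against some canned brute-force output. Everything here is invented for illustration: the hostnames, and 203.0.113.99 playing the part of the wildcard address.

```shell
# Sketch: sample brute-force output; 203.0.113.99 stands in for the
# wildcard IP that every nonsense subdomain resolves to.
cat <<'EOF' > results.txt
www.victim.com 203.0.113.10
mail.victim.com 203.0.113.20
asdfqwer.victim.com 203.0.113.99
zxcvuiop.victim.com 203.0.113.99
EOF
# Drop every line that hit the wildcard address, keeping real records.
grep -v '203.0.113.99' results.txt > filtered.txt
cat filtered.txt
```

The junk subdomains disappear, leaving only records that point somewhere distinct.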

What I like about this tool is that with a custom wordlist, and a fast resolver like Cloudflare's as your DNS, you can test thousands of subdomains within seconds. This is great for sanity checking. Do watch out though: if your rate is too high, Cloudflare might start dropping your traffic.

Another thing to watch: if they do have wildcard records, there might be legitimate subdomains pointing to that same IP, so you'll need to do a vhost bruteforce to properly identify them (with the wildcard IP as the target). You can use auxiliary/scanner/http/vhost_scanner in Metasploit for that (but it is not passive, and it is very noisy, so be aware of that).

Active Reconnaissance

By now you'll have identified some hosts. Hopefully you've also utilised other OSINT techniques, reverse whois among them.

If my goal isn't to remain stealthy, and just to go full-ham, then I generally run a combination of nmap scans simultaneously:

  1. A slow, intense full-TCP scan for later reference
  2. A "quick" UDP scan
  3. A full-port UDP scan

UDP is massively overlooked; many people actually forget it exists.

Intense Slow Scan

sudo nmap -v -sS -A -Pn -T5 -p- scanme.nmap.org

Quick UDP Scan

sudo nmap -v -sU -T5 target.com

Full UDP Scan

sudo nmap -v -sU -T5 -p- victim.com

It also pays to use the -oN flag to save the results to a file for later reference.

HTTP Enumeration

Now that you've scanned and have open ports, the next thing I always do is explore HTTP. HTTP is a hugely valuable source of shells and information; you can usually find _something_ over HTTP.

Dirbusting with Gobuster

Again, if you're going full-ham, you'll want to dirbust these hosts. It's shocking what you can find with the SecLists big.txt wordlist. For dirbusting I tend to use gobuster, as it is nice and simple and just works.

gobuster -w path/to/SecLists/Discovery/Web-Content/big.txt -u https://victim.com/

Pulling headers

Something I love doing is pulling headers. Things like Set-Cookie headers can be super helpful in fingerprinting a web framework, as can just grabbing plain banners: PHP versions, Apache/nginx versions, and so on.

curl -v -I --user-agent "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/68.0.3440.106 Safari/537.36" https://google.com
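To show what you're actually looking for in that output, here's a sketch that greps the fingerprint-relevant headers out of a saved response. The headers below are fabricated examples, not real output from any host.

```shell
# Sketch: a saved response (fabricated) standing in for `curl -I` output.
cat <<'EOF' > headers.txt
HTTP/1.1 200 OK
Server: nginx/1.14.0
X-Powered-By: PHP/7.2.10
Set-Cookie: PHPSESSID=abc123; path=/; HttpOnly
Content-Type: text/html; charset=UTF-8
EOF
# Keep only the headers that leak platform/framework information.
grep -iE '^(server|x-powered-by|set-cookie):' headers.txt
```

Here the Server banner gives the web server version, X-Powered-By gives the PHP version, and the PHPSESSID cookie name alone tells you it's PHP.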

Pulling DNS records from HTTPS Certs - Credit to Florian Hansemann

Might find some juicy subdomains in there! Or better yet, you might find some other hosts that the same cert is signed for. Multi-Domain ftw!

echo | openssl s_client -connect 0x00sec.org:443  | openssl x509 -noout -text | grep DNS | sed 's/,/\n/g'
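You can convince yourself the extraction pipeline works without touching a live host by generating a throwaway certificate with SANs and running the same grep/sed against it. Assumptions: OpenSSL 1.1.1+ for the -addext flag, and all the names below are made up.

```shell
# Sketch: throwaway self-signed cert with two SAN entries (made-up names).
openssl req -x509 -newkey rsa:2048 -nodes -keyout key.pem -out cert.pem \
  -days 1 -subj "/CN=0x00sec.local" \
  -addext "subjectAltName=DNS:www.0x00sec.local,DNS:irc.0x00sec.local" 2>/dev/null
# Same extraction as against a live host: one DNS name per line.
openssl x509 -in cert.pem -noout -text | grep DNS | sed 's/,/\n/g'
```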


Robots.txt

It's shocking how many people forget about robots.txt; it's literally a piece of text telling Google what NOT to index. Sometimes it lists redundant directories, and sometimes it lists /verysecretdirectory. Be careful though: this can be used to honeypot you (I've done this many times before!).
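Mining a saved robots.txt for its disallowed paths is a one-liner. The file content below is an invented sample.

```shell
# Sketch: a saved robots.txt (invented sample content).
cat <<'EOF' > robots.txt
User-agent: *
Disallow: /admin/
Disallow: /backups/
Allow: /
EOF
# Print just the paths the site asked crawlers to skip.
awk '/^Disallow:/ {print $2}' robots.txt
```

Those paths go straight onto the 'investigate more' list.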


Nikto

My experience with nikto isn't particularly positive; it has never found much of interest for me. But I know people who run it on every engagement, so it's worth a mention. Better to have too much info, right? Even if it's just to filter out the stuff you don't think is relevant.

nikto -h https://yoursite.com/


WPScan

WPScan is invaluable if you're scanning WordPress, but it is very verbose, and it may show up 10 RCEs when in fact none of them actually work.

I always use Docker for wpscan, as I find it doesn't seem to play well natively. But maybe that's just me; it would probably be fine on Kali.

Enumerate Users

docker run -ti -v $(pwd):/wpscan/tmp/ --rm wpscanteam/wpscan --log tmp/log.txt --force --random-agent --batch --enumerate u -u https://wordpress-site.local/

Enumerate Plugins

docker run -ti -v $(pwd):/wpscan/tmp/ --rm wpscanteam/wpscan --log tmp/log.txt --force --random-agent --batch --enumerate p -u https://wordpress-site.local/

Web framework fingerprinting

A key part of your web-based reconnaissance should be assessing what frameworks and technologies are in use.

Wappalyzer API

curl 'https://api.wappalyzer.com/lookup-basic/beta/?url=https://google.com/'

This will return something like:

[{"name":"Google Web Server","icon":"Google.svg"}]

If you want to parse it, use jq

curl 'https://api.wappalyzer.com/lookup-basic/beta/?url=https://google.com/' | jq


[
  {
    "name": "Google Web Server",
    "icon": "Google.svg"
  }
]

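As a concrete, offline example of the same jq filtering, here it is run against a locally saved copy of that sample response (assuming jq is installed):

```shell
# Sketch: save the sample Wappalyzer-style response locally, then
# pull out just the technology names with jq.
echo '[{"name":"Google Web Server","icon":"Google.svg"}]' > wapp.json
jq -r '.[].name' wapp.json
```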

Chameleon

Chameleon is a tool that scrapes IBM X-Force, McAfee, and a few other classification platforms. It identifies the type of website: a forum, an email client, a business website, etc.

This doesn't tell you anything about the technologies used, but it may show you what the site might have been at one point, or provide some extra background without ever visiting it. It's one of the more passive ways to identify a website.

python2 chameleon.py --proxy a --submit --domain 0x00sec.org


I hope this was helpful! If you have anything to add, send an email to delta@navisec.io and we'll be sure to add it in. We are constantly updating these references so be sure to check back every now and then to see if we've added any new tricks.

We use these reference guides internally for our Red Teamers and Offensive Security Engineers, so it just made sense to give this away for everybody to use and read!