DNSENUM Tutorial – DNS Information Gathering Tool | KYB Tutorial 2

Hey Friends, we are back with Know Your Backtrack, i.e. KYB Tutorial 2. Today we will learn about another information gathering tool, DNSENUM. The purpose of Dnsenum is to gather as much information as possible about a domain. Basically, it is a Perl script that performs the operations below:

1) Get the host’s address (A record).
2) Get the nameservers (threaded).
3) Get the MX records (threaded).
4) Perform AXFR queries on nameservers and get BIND versions (threaded).
5) Get extra names and subdomains via Google scraping (Google query = “allinurl: -www site:domain”).
6) Brute force subdomains from a file; can also perform recursion on subdomains that have NS records (all threaded).
7) Calculate C class domain network ranges and perform whois queries on them (threaded).
8) Perform reverse lookups on net ranges (C class and/or whois net ranges) (threaded).
9) Write IP blocks to the domain_ips.txt file.
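To illustrate operation 7 above: the “C class network range” of a host is simply its /24 block, obtained by keeping the first three octets of the IP and zeroing the last one. A minimal sketch in Python (the function name and the example IP are mine, not part of dnsenum):

```python
def class_c_network(ip):
    """Return the /24 ("class C") block that dnsenum would run whois on."""
    octets = ip.split(".")
    # Keep the first three octets and zero out the host octet.
    return ".".join(octets[:3]) + ".0/24"

# A host record of 93.184.216.34 falls in the 93.184.216.0/24 range.
print(class_c_network("93.184.216.34"))  # → 93.184.216.0/24
```

dnsenum collects these /24 blocks from all the A records it finds, then queries whois once per block rather than once per host.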

DNSENUM Tutorial – KYB 2

So let’s start with a basic example. Suppose we want to gather DNS information for a domain, say hackingloops.com; here is how we proceed:

1. To start DNSENUM, first start Backtrack, then follow the path below:

Backtrack >> Information Gathering >> Network Analysis >> DNS Analysis >> dnsenum

A terminal will now open with the DNSENUM script loaded, along with a list of all the sub-commands we can use with DNSENUM.

2. Now, to gather DNS information for hackingloops, type the below command in the terminal:

./dnsenum.pl hackingloops.com

Below are screenshots:

DNSENUM Tutorial: How to use DNSENUM

So we can see that by giving just one simple command we get the name servers, mail servers, host addresses and much more. The results vary from website to website, and the scope expands when the target website has subdomains.
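The run can also be tuned with flags from the option list later in this tutorial. For example (the DNS server and thread count here are just illustrative choices, not recommendations):

```shell
# Send A, NS and MX queries to a specific DNS server, raise the thread
# count, and skip the reverse-lookup phase to speed things up.
./dnsenum.pl --dnsserver 8.8.8.8 --threads 10 --noreverse hackingloops.com
```

This is handy when the default resolver is slow or filtered.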

We can also use DNSENUM to scrape the subdomains of a website from Google. To do this, we need to type the below command:

./dnsenum.pl -p 1 -s 1 example.com

Scraping will not work on websites that have no subdomains, or on websites that have restricted wildcard scraping.
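When the target does have indexed subdomains, the scraping limits can be raised. For example (the page and subdomain counts here are arbitrary values for illustration):

```shell
# Process 5 Google result pages and scrape up to 100 subdomain names.
# Note: -p only takes effect when -s is also specified.
./dnsenum.pl -p 5 -s 100 example.com
```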

That’s all for today, friends. Hope this helps. You can practice multiple combinations of the sub-commands listed below to extract juicy DNS information for any domain.


  --dnsserver    Use this DNS server for A, NS and MX queries.
  --enum         Shortcut option equivalent to --threads 5 -s 20 -w.
  -h, --help     Print this help message.
  --noreverse    Skip the reverse lookup operations.
  --private      Show and save private IPs at the end of the file domain_ips.txt.
  --subfile      Write all valid subdomains to this file.
  -t, --timeout  The TCP and UDP timeout values in seconds (default: 10s).
  --threads      The number of threads that will perform different queries.
  -v, --verbose  Be verbose: show all the progress and all the error messages.
  -p, --pages    The number of Google search pages to process when scraping names; the default is 20 pages. The -s switch must be specified.
  -s, --scrap    The maximum number of subdomains that will be scraped from Google.
  -f, --file     Read subdomains from this file to perform brute force.
  -u, --update   Update the file specified with the -f switch with valid subdomains.
                 a (all)  Update using all results.
                 g        Update using only Google scraping results.
                 r        Update using only reverse lookup results.
                 z        Update using only zone transfer results.
  -r, --recursion  Recursion on subdomains; brute force all discovered subdomains that have an NS record.


  -d, --delay    The maximum number of seconds to wait between whois queries; the value is chosen randomly (default: 3s).
  -w, --whois    Perform whois queries on C class network ranges.


  -e, --exclude  Exclude PTR records that match the regexp from reverse lookup results; useful for invalid hostnames.




  -o, --output   Output in XML format. Can be imported into MagicTree.
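Putting several of the options above together, a fuller run might look like this (the wordlist and output file names are my own placeholders; substitute your own):

```shell
# Brute force subdomains from a wordlist, update that wordlist with all
# valid results (-u a), and save the findings as XML for MagicTree.
./dnsenum.pl --enum -f dns.txt -u a -o hackingloops.xml hackingloops.com
```

Since --enum expands to --threads 5 -s 20 -w, this single command combines scraping, brute forcing and whois lookups in one pass.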


Author Bio

Lokesh Singh

Hey Friends, this is Lokesh Singh, your friend who loves to share knowledge with friends, as I believe in "Sharing is Caring". If you like our tutorials, you can send your gratitude by saying thanks or clicking any of our sponsor ads.
