Passive Sources

What is passive subdomain enumeration?

Passive subdomain enumeration is a technique of querying passive DNS datasets provided by sources (SecurityTrails, Censys, Shodan, BinaryEdge, VirusTotal) to obtain the subdomains of a particular target.
It's highly recommended to read this first, before proceeding further.

What tools to use?

  • There are in total around 58 passive DNS sources that we can query (list). It's difficult to query these third-party services manually, so various tools have been developed that do this work on our behalf.
  • We can also scrape the data from the Internet Archives.
  • Organizations host their source code on GitHub, and security researchers post their recon data, which may contain subdomains of our target.
  1. Passive DNS gathering tools
  2. Internet Archive
  3. Github Scraping
  4. The Rapid7 Project Sonar

A) Passive DNS gathering tools

1) Amass​

  • Author: OWASP (mainly caffix).
  • Language: Go
  • Total Passive Sources: 58
Amass is a Swiss army knife for subdomain enumeration and performs passive enumeration the best. This is because it queries the largest number of third-party services, which results in more subdomains for a particular target. These are the sources that amass queries.

Configuring amass:

  • Since amass is written in Go, you need your Go environment properly set up. (Steps to set up the Go environment)
Installation:
go get -v github.com/OWASP/Amass/v3/...
Setting up Amass config file:
  • ​Link to my amass config file for reference.
  • By default, amass config file is located at $HOME/.config/amass/config.ini
  • Amass uses API keys mentioned in the config to query the third-party passive DNS sources.
  • There are in total 18 services on which you can sign up and get a free API key that will be used to query the large datasets.
Check this article on how to create API keys.
  • Now let's set up our API keys in the config.ini config file.
  • Open the config file in a text editor, then uncomment the required lines and add your API keys.
  • Refer to my config file (this is exactly how your amass config file should look).
# https://otx.alienvault.com (Free)
[data_sources.AlienVault]
[data_sources.AlienVault.Credentials]
apikey = dca0d4d692a6fd757107333d43d5f284f9a38f245d267b1cd72b4c5c6d5c31
# How to add 2 API keys for a single service
# https://app.binaryedge.com (Free)
[data_sources.BinaryEdge]
ttl = 10080
[data_sources.BinaryEdge.account1]
apikey = d749e0d3-ff9e-gcd0-a913-b5e62f6f216a
[data_sources.BinaryEdge.account2]
apikey = afdb97ff-t65e-r47f-bba7-c51dc5d83347

Running Amass:

  • After setting up the API keys, we are good to run amass.
amass enum -passive -d example.com -config config.ini -o output.txt
Flags:-
  • enum - Perform DNS enumeration
  • passive - passively collect information through the data sources mentioned in the config file.
  • config - Specify the location of your config file (default: $HOME/.config/amass/config.ini )
  • o - Output filename
🧙‍♂️
Tip:- After configuring your config file, you can verify whether the API keys have been set up correctly with this command:-
amass enum -list -config config.ini


2) Subfinder​

The Subfinder tool provides the most subdomains compared to any other tool
🚀
. After all, it's developed by the great ProjectDiscovery team, on whose tools most security researchers depend. So setting up API keys will definitely provide you more subdomains. Simply, the best.

Configuring Subfinder:
⚙

Installation:
GO111MODULE=on go get -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder
Setting up Subfinder config file:
  • Subfinder's default config file location is at $HOME/.config/subfinder/config.yaml
  • When you install subfinder for the first time, the config file doesn't get generated, so you should run the subfinder -h command once to generate it.
  • For subfinder you can obtain free API keys by signing up on 18 passive DNS sources. (Here is the list of sources.)
  • The subfinder config file follows YAML (YAML Ain't Markup Language) syntax, so you need to be careful not to break the syntax. It's better to use a text editor and set up syntax highlighting.
Example config file:-
  • ​Link to my subfinder config file for reference.
  • Some passive sources like Censys and PassiveTotal have 2 keys, such as an APP-Id & Secret. For such sources, both values need to be mentioned with a colon (:) between them. (Check how I have mentioned the "Censys" source values, APP-id:Secret, in the example below.)
  • Subfinder automatically detects its config file only if it is at the default location.
securitytrails: []
censys:
- ac244e2f-b635-4581-878a-33f4e79a2c13:dd510d6e-1b6e-4655-83f6-f347b363def9
shodan:
- AAAAClP1bJJSRMEYJazgwhJKrggRwKA
github:
- d23a554bbc1aabb208c9acfbd2dd41ce7fc9db39
- asdsd54bbc1aabb208c9acfbd2dd41ce7fc9db39
passivetotal:
- sample-[email protected]:sample_password
🧙‍♂️
Tip:- You can verify your YAML config file syntax on yamllint.com

Running Subfinder:

subfinder -d example.com -all -config config.yaml -o output.txt
Flags:-
  • d - Specify our target domain
  • all - Use all passive sources (slow enumeration but more results)
  • config - Config file location
🧙‍♂️
Tip:- To view the sources that require API keys, use the subfinder -ls command.


3) Assetfinder​

  • Author: tomnomnom​
  • Language: Go
  • Total passive sources: 9
I don't know why I included this tool
😂
just because it's built by the legend Tomnomnom? It doesn't give any unique subdomains compared to other tools, but it's extremely fast.
go get -u github.com/tomnomnom/assetfinder
Running:
assetfinder --subs-only example.com > output.txt


4) Findomain​

  • Author: Edu4rdSHL​
  • Language: Rust
  • Total Passive sources: 16
Findomain is one of the standard subdomain finder tools in the industry and another extremely fast enumeration tool. It has a paid version that offers many more features like subdomain monitoring, resolution, and lower resource consumption.

Configuring Findomain:
⚙

Installation:-
  • Depending on your architecture, download the binary from here
wget -N -c https://github.com/Findomain/Findomain/releases/latest/download/findomain-linux
mv findomain-linux /usr/local/bin/findomain
chmod 755 /usr/local/bin/findomain
strip -s /usr/local/bin/findomain
Configuration:-
  • You need to define API keys in your .bashrc or .zshrc.
  • Findomain will pick them up automatically.
export findomain_virustotal_token="API_KEY"
export findomain_spyse_token="API_KEY"
export findomain_fb_token="API_KEY"
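After adding these exports, reload your shell config (a standard shell step, not specific to findomain) so the variables are available in your current session:
source ~/.bashrc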

Running Findomain:

findomain -t example.com -u output.txt
Flags:-
  • t - target domain
  • u - Output file

B) Internet Archives

Internet Archives are web crawlers and indexing systems that crawl every website on the internet. Hence, they have historical data of any website that once existed. These can be a useful source to grab subdomains of a particular target that once existed and perform permutations on them to get more valid subdomains.
When queried, Internet Archives give back URLs. Since we are only concerned with the subdomains, we need to process those URLs to grab only unique subdomains from them.
For this, we use a tool called unfurl. When given URLs through stdin along with the "domains" mode, it extracts the domain part from them, as illustrated below.
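For example, piping a single URL (a made-up one, just for illustration) through unfurl returns only the hostname:
echo "https://blog.example.com/2021/01/post?id=1" | unfurl -u domains
# prints: blog.example.com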

5) Gauplus​

Gauplus extracts data from internet crawling services. I prefer Gauplus over the original gau, as it sometimes returns more results and completes execution faster.

Installation:

GO111MODULE=on go get -u -v github.com/bp0lr/gauplus

Running gauplus:

gauplus -t 5 -random-agent -subs example.com | unfurl -u domains | anew output.txt
Flags:
  • t - threads
  • random-agent - use random agents while querying
  • subs - include subdomains of the target domain
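The pipeline above also uses tomnomnom's anew, which appends only lines not already present in the output file. If you don't have it installed:
go get -u github.com/tomnomnom/anew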

6) Waybackurls​

Waybackurls returns some unique data that gauplus/gau couldn't find as the sources are different. Hence, we need to include waybackurls in our arsenal.

Installation:

go get github.com/tomnomnom/waybackurls

Running Waybackurls:

waybackurls example.com | unfurl -u domains | sort -u > output.txt

C) Github Scraping

It's often seen that organizations host their source code on GitHub. Also, various security researchers host their recon data in public repositories.

7) Github-subdomains

The github-subdomains tool helps to extract subdomains of your target from GitHub.
Installation:
go get -u github.com/gwen001/github-subdomains
Configuring github-subdomains​​:
⚙
  • For github-subdomains to scrape subdomains from GitHub, you need to specify a list of GitHub access tokens.
  • Here is an article on how to make these access tokens.
  • These access tokens are used by the tool to perform searches and find data on your behalf.
  • I recommend that you make at least 10 tokens from 3 different accounts (30 in total) to avoid rate limiting.
  • Specify 1 token per line, as in the sample below.
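For reference, a tokens.txt would look something like this, one token per line (placeholder values, not real tokens):
ghp_AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
ghp_BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
ghp_CCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCC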
Running github-subdomains:
github-subdomains -d example.com -t tokens.txt -o output.txt
Flags:
  • d - target
  • t - file containing tokens
  • o - output file

D) Rapid7 Project Sonar dataset

Project Sonar is a security research project by Rapid7 that conducts internet-wide scans. Rapid7 has been generous and made this data freely available to the public. Project Sonar contains 12 different datasets with a total size of over 45.6 TB, which are updated on a regular basis. You can read this guide on how to parse these datasets on your own.
So this internet-wide DNS dataset could be an excellent resource for us to grab our subdomains, right? But querying such large datasets could take significant time. That's when Crobat comes to the rescue.
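If you just want to try the dataset quickly, the Crobat API can also be queried directly with curl. The /subdomains/{domain} endpoint and JSON-array response shown below reflect how the public API was documented, so treat them as assumptions and check the current docs:
# endpoint path and response format are assumptions based on the public Crobat API docs
curl -s "https://sonar.omnisint.io/subdomains/example.com" | jq -r '.[]' > output.txt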

8) Crobat​

  • Author: Cgboal​
  • Language: Go
Cgboal has done excellent work parsing and indexing the whole Rapid7 Sonar dataset into MongoDB and creating an API to query this database. This Crobat API is freely available at https://sonar.omnisint.io/. Moreover, he developed a command-line tool that uses this API and returns results at blazing speed.

Installation:

go get github.com/cgboal/sonarsearch/cmd/crobat

Running:

crobat -s example.com > output.txt
Flags:
  • s - Target Name

🏁
That's it!!! Done with passive things
🏁

Liked my work? Don't hesitate to buy me a coffee XDD

❤ 💙 💚 https://www.buymeacoffee.com/siddheshparab 💚 💙 ❤
