Passive Sources

What is passive subdomain enumeration?

Passive subdomain enumeration is a technique for querying the passive DNS datasets provided by services like SecurityTrails, Censys, Shodan, BinaryEdge, VirusTotal, WhoisXML API, etc. to obtain the subdomains of a particular target. Here we don't send any active probes to our target; instead, we passively scrape information that is already available on the internet.

In total, there are around 90 passive DNS sources/services that provide such datasets for querying. Manually querying each of these third-party services is tedious, so various tools have been developed to automate the process.
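To get a feel for what these tools automate, here's a minimal sketch that queries one such passive dataset (crt.sh's certificate transparency logs) by hand. The endpoint and jq filter are real, but treat this as an illustration only; the tools below do this across dozens of sources for you.

# Manually query crt.sh (one passive source) for *.example.com,
# then extract unique hostnames from the JSON response (requires curl + jq)
curl -s "https://crt.sh/?q=%25.example.com&output=json" | jq -r '.[].name_value' | sort -u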

It's highly recommended to read this section first, before proceeding further.

  1. Passive DNS enumeration tools

  2. Internet Archive

  3. Github Scraping

  4. GitLab Scraping

A) Passive DNS gathering tools

1) Amass

  • Author: OWASP (mainly caffix).

  • Language: Go

  • Total Passive Sources: 82

Amass is a Swiss army knife for subdomain enumeration and is the best performer at passive enumeration. Amass queries the largest number of third-party services, which results in more subdomains for a particular target. These are the passive sources that Amass queries.

⚙️ Configuring amass:

  • Since Amass is written in Go, you need your Go environment properly set up (Steps to set up a Go environment)

Installation:

go install -v github.com/owasp-amass/amass/v3/...@master
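If the install succeeds, the binary lands in $GOPATH/bin (usually $HOME/go/bin); a quick version check confirms it's reachable from your PATH:

# Sanity check: confirm amass is installed and on your PATH
amass -version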

Setting up Amass config file:

  • Link to my amass config file for reference.

  • To make it possible for Amass to query the passive DNS datasets, it is necessary to set up the API keys of those services in the Amass configuration file.

  • By default, the Amass config file is located at $HOME/.config/amass/config.ini

To learn how to create API keys, check out this article.

  • Now let's set up our API keys in the config.ini config file.

  • Open the config file in a text editor and then uncomment the required lines and add your API keys.

  • Refer to my config file (this is exactly how your amass config file should look)

# https://otx.alienvault.com (Free)
[data_sources.AlienVault]
[data_sources.AlienVault.Credentials]
apikey = dca0d4d692a6fd757107333d43d5f284f9a38f245d267b1cd72b4c5c6d5c31


# How to add 2 API keys for a single service
# https://app.binaryedge.com (Free)
[data_sources.BinaryEdge]
ttl = 10080
[data_sources.BinaryEdge.account1]
apikey = d749e0d3-ff9e-gcd0-a913-b5e62f6f216a
[data_sources.BinaryEdge.account2]
apikey = afdb97ff-t65e-r47f-bba7-c51dc5d83347

Running Amass:

  • After setting up API keys now we are good to run amass.

amass enum -passive -d example.com -config config.ini -o output.txt

Flags:-

  • enum - Perform DNS enumeration

  • d - Specify the target domain

  • passive - Passively collect information through the data sources mentioned in the config file

  • config - Specify the location of your config file (default: $HOME/.config/amass/config.ini)

  • o - Output filename

🧙‍♂️Tip: After configuring your config file, you can verify whether the API keys have been set up correctly with this command: amass enum -list -config config.ini

2) Subfinder

Subfinder is yet another great tool that one should have in their pipeline. There are some unique sources that subfinder queries which Amass doesn't. This tool has been developed by the famous ProjectDiscovery team, whose tools are used by every other bug bounty hunter.

⚙️Configuring Subfinder:

Installation:

go install -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest

Setting up Subfinder configuration file:

  • Subfinder's default config file location is at $HOME/.config/subfinder/provider-config.yaml

  • After your first installation, if you find that the configuration file hasn't been populated by default, run subfinder once again to get it generated.

  • The subfinder config file follows YAML (YAML Ain't Markup Language) syntax, so you need to be careful not to break the syntax. It's best to use a text editor with syntax highlighting set up.

Example config file:-

  • Link to my subfinder config file for reference.

  • Some passive sources like Censys and PassiveTotal use 2 keys in combination to authenticate a user. For such services, both values need to be specified with a colon (:) between them. (Check how the "Censys" source values are specified as APP-id:Secret in the example below.)

  • Subfinder automatically detects its config file only if it is at the default location.

securitytrails: []
censys:
  - ac244e2f-b635-4581-878a-33f4e79a2c13:dd510d6e-1b6e-4655-83f6-f347b363def9
shodan:
  - AAAAClP1bJJSRMEYJazgwhJKrggRwKA
github:
  - d23a554bbc1aabb208c9acfbd2dd41ce7fc9db39
  - asdsd54bbc1aabb208c9acfbd2dd41ce7fc9db39
passivetotal:
  - sample-email@user.com:password123
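Because a single stray tab or bad indent will silently break the whole config, it can be worth sanity-checking the YAML before running subfinder. A minimal sketch, assuming Python 3 with the PyYAML package is available:

# Try to parse the config with PyYAML; any syntax error is reported with its line number
python3 -c 'import yaml, sys; yaml.safe_load(open(sys.argv[1]))' "$HOME/.config/subfinder/provider-config.yaml" && echo "config OK"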

Running Subfinder:

subfinder -d example.com -all -config config.yaml -o output.txt

Flags:-

  • d - Specify our target domain

  • all - Use all passive sources (slower enumeration but more results)

  • config - Config file location

  • o - Output filename

🧙‍♂️ Tip:- To view the sources that require API keys, use the subfinder -ls command

3) Assetfinder

  • Author: tomnomnom

  • Language: Go

  • Total passive sources: 9

I don't know why I included this tool 😂 maybe just because it's built by the legend Tomnomnom? It doesn't give any unique subdomains compared to other tools, but it's extremely fast.

go install github.com/tomnomnom/assetfinder@latest

Running:

assetfinder --subs-only example.com > output.txt

4) Findomain

  • Author: Edu4rdSHL

  • Language: Rust

  • Total Passive sources: 21

Findomain is one of the standard subdomain finder tools in the industry, and another extremely fast enumeration tool. It also has a paid version that offers many more features, like subdomain monitoring, resolution, and lower resource consumption.

Configuring Findomain: ⚙️

Installation:-

  • Depending on your architecture, download the binary from here

wget -N -c https://github.com/Findomain/Findomain/releases/download/9.0.0/findomain-linux.zip
unzip findomain-linux.zip
mv findomain /usr/local/bin/findomain
chmod 755 /usr/local/bin/findomain

Configuration:-

  • You need to define API keys in your .bashrc or .zshrc.

  • Findomain will pick them up automatically.

export findomain_virustotal_token="API_KEY"
export findomain_fb_token="API_KEY"
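After adding the exports, reload your shell configuration (or open a new terminal) so the variables are actually set:

# Reload the shell config so findomain can see the new variables
source ~/.bashrc    # or: source ~/.zshrc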

Running Findomain:

findomain -t example.com -u output.txt

Flags:-

  • t - Target domain

  • u - Output file

B) Internet Archives

Internet Archives deploy their own web crawlers and indexing systems that crawl each website on the internet. Hence, they hold historical data for all the websites that once existed. This makes Internet Archives a useful source for grabbing subdomains of a particular target that existed at some point; we can later perform permutations (more on this later) on them to find more valid subdomains.

When queried, Internet Archives give back URLs. Since we are only concerned with the subdomains, we need to process those URLs and grab only the unique FQDN subdomains from them.

For this, we use a tool called unfurl. This tool helps to extract the domain name from a list of URLs.
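For example (the URL here is made up), unfurl's domains mode strips each URL down to just its hostname:

# unfurl keeps only the hostname portion of each URL; -u de-duplicates the output
echo "https://blog.example.com/2020/01/post?id=1" | unfurl -u domains
# blog.example.com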

5) Gau

Gau works by querying 4 internet archive services (AlienVault OTX, the Wayback Machine, Common Crawl, and URLScan) and grabs all the URLs that their internet-wide crawlers have ever recorded. Through this process we get tons of URLs belonging to our target that once existed. After collecting the URLs, we extract only the domain/subdomain part from them.

Installation:

go install github.com/lc/gau/v2/cmd/gau@latest

Running gau:

gau --threads 5 --subs example.com |  unfurl -u domains | sort -u -o output_unfurl.txt

Flags:

  • threads - How many workers to spawn

  • subs - Include subdomains of the target domain

6) Waybackurls

Waybackurls works similarly to Gau, but I have found that it returns some unique data that Gau couldn't find. Hence, we need to include waybackurls in our arsenal as well.

Installation:

go install github.com/tomnomnom/waybackurls@latest

Running Waybackurls:

waybackurls example.com |  unfurl -u domains | sort -u -o output.txt
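Since gau and waybackurls each find URLs the other misses, it's worth merging the two subdomain lists from above into one de-duplicated file (the output filename here is arbitrary):

# Combine the gau and waybackurls results and drop duplicates
cat output_unfurl.txt output.txt | sort -u > archive_subdomains.txt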

C) GitHub Scraping

7) Github-subdomains

Organizations sometimes host their source code on GitHub, and employees working at these organizations sometimes leak source code there as well. Additionally, I have come across instances where security researchers host their reconnaissance data in public repositories. The tool github-subdomains can help you extract these exposed/leaked subdomains of your target from GitHub.

Installation:

go install github.com/gwen001/github-subdomains@latest

⚙️Configuring github-subdomains:

  • For github-subdomains to scrape subdomains from GitHub, you need to specify a list of GitHub access tokens.

  • Here is an article on how you can generate your GitHub access tokens.

  • These access tokens are used by the tool to perform searches and find subdomains on your behalf.

  • I recommend that you create at least 10 tokens from each of 3 different accounts (30 in total) to avoid rate limiting.

  • Specify 1 token per line, as in the sample file below.
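The tokens file is just plain text; something like this (dummy values shown):

# tokens.txt - one GitHub personal access token per line (dummy values)
ghp_AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
ghp_BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB
ghp_CCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCC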

Running github-subdomains:

github-subdomains -d example.com -t tokens.txt -o output.txt

Flags:

  • d - target

  • t - file containing tokens

  • o - output file

D) Rapid7 Project Sonar dataset (deprecated)

Project Sonar is a security research project by Rapid7 that conducts internet-wide scans. Rapid7 has generously made this data freely available to the public. Project Sonar contains 8 different datasets with a total size of over 66.6 TB, updated on a regular basis. You can read this guide on how to parse these datasets on your own.
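If you do want to parse the raw data yourself, the forward-DNS files are gzipped, newline-delimited JSON with a name field per record. A minimal sketch, assuming you've already downloaded a snapshot (the filename below is illustrative):

# Stream-decompress the forward-DNS dataset, keep records mentioning our target,
# then extract the hostnames (requires zcat/gzip and jq)
zcat fdns_a.json.gz | grep -F '.example.com' | jq -r '.name' | sort -u > sonar_subs.txt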

This internet-wide DNS dataset could be an excellent resource for us to grab our subdomains, right? But querying such large datasets can take significant time. That's where Crobat comes to the rescue.

Cgboal has done an excellent job of parsing and indexing the whole Rapid7 Sonar dataset into MongoDB and creating an API to query this database. This Crobat API is freely available at https://sonar.omnisint.io/. Moreover, he developed a command-line tool that uses this API and returns results at blazing-fast speed.
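The API could also be queried directly; at the time of writing, the subdomains endpoint looked like the request below (given the "deprecated" tag on this section, the service may no longer respond):

# Query the Crobat API for subdomains; it returns a JSON array of hostnames
curl -s "https://sonar.omnisint.io/subdomains/example.com" | jq -r '.[]'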

Installation:

go install github.com/cgboal/sonarsearch/cmd/crobat@latest

Running:

crobat -s example.com > output.txt

Flags:

  • s - Get subdomains for the target domain

🏁That's it !!! Done with passive things 🏁

Liked my work? Don't hesitate to buy me a coffee XDD

❤️💙💚 https://www.buymeacoffee.com/siddheshparab 💚 💙 ❤️
