Tips and Tricks
Below are some helpful tricks to help you in your adventures.
Change Verbosity During Scan
Press enter during a BBOT scan to change the log level. This will allow you to see debugging messages, etc.
Kill Individual Module During Scan
Sometimes a certain module can get stuck or slow down the scan. If this happens and you want to kill it, just type "kill <module>" in the terminal and press enter. This will kill and disable the module for the rest of the scan.
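As a sketch of what this looks like in practice (the module name here is just a hypothetical example), you would type the following directly into the running scan's terminal:

```
# typed into BBOT's interactive terminal while the scan is running;
# "httpx" is an example module name
kill httpx
```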
Common Config Changes
Boost Massdns Thread Count
If you have a fast internet connection or are running BBOT from a cloud VM, you can speed up subdomain enumeration by cranking the threads for massdns. The default is 1000, which is about 1MB/s of DNS traffic:
# massdns with 5000 resolvers, about 5MB/s
bbot -t evilcorp.com -f subdomain-enum -c modules.massdns.max_resolvers=5000
Enable Web Spider
The web spider is controlled with three config values:
- web_spider_depth (default: 1): the maximum directory depth allowed. This is to prevent the spider from delving too deep into a website.
- web_spider_distance (0 == all spidering disabled, default: 0): the maximum number of links that can be followed in a row. This is designed to limit the spider in cases where web_spider_depth fails (e.g. for an ecommerce website with thousands of base-level URLs).
- web_spider_links_per_page (default: 25): the maximum number of links per page that can be followed. This is designed to save you in cases where a single page has hundreds or thousands of links.
Here is a typical example:
spider.yml:
web_spider_depth: 2
web_spider_distance: 2
web_spider_links_per_page: 25
# run the web spider against www.evilcorp.com
bbot -t www.evilcorp.com -m httpx -c spider.yml
You can also pair the web spider with subdomain enumeration:
# spider every subdomain of evilcorp.com
bbot -t evilcorp.com -f subdomain-enum -c spider.yml
Custom HTTP Proxy
If you need to route BBOT's web traffic through a proxy (e.g. Burp Suite), you can use the http_proxy config option like so:
# enumerate subdomains, take web screenshots, proxy through Burp
bbot -t evilcorp.com -f subdomain-enum -m gowitness -c http_proxy=http://127.0.0.1:8080
Display HTTP_RESPONSE Events
BBOT's httpx module emits HTTP_RESPONSE events, but by default they're hidden from output. These events contain the full raw HTTP body along with headers, etc. If you want to see them, you can modify omit_event_types in the config:
omit_event_types:
  - URL_UNVERIFIED
  # - HTTP_RESPONSE
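If you keep this snippet in its own config file, you can pass it to BBOT with -c, the same way the spider.yml example does elsewhere on this page (the filename here is hypothetical):

```shell
# my_config.yml is a hypothetical file containing the omit_event_types snippet above
bbot -t evilcorp.com -m httpx -c my_config.yml
```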
Display Out-of-scope Events
By default, BBOT only shows in-scope events (with a few exceptions for things like storage buckets). If you want to see events that BBOT is emitting internally (such as for DNS resolution, etc.), you can increase scope_report_distance in the config or on the command line like so:
# display events up to scope distance 2 (default == 0)
bbot -f subdomain-enum -t evilcorp.com -c scope_report_distance=2
Speed Up Scans By Disabling DNS Resolution
If you already have a list of discovered targets (e.g. URLs), you can speed up the scan by skipping BBOT's DNS resolution. You can do this by setting dns_resolution to false:
# disable the creation of new events from DNS resolution
bbot -m httpx gowitness wappalyzer -t urls.txt -c dns_resolution=false
Display URL_UNVERIFIED Events
URL_UNVERIFIED events are URLs that haven't yet been visited by httpx. Once httpx visits them, it reraises them as URLs, tagged with their resulting status code.
For example, when excavate gets an HTTP_RESPONSE event, it extracts links from the raw HTTP response as URL_UNVERIFIEDs and then passes them back to httpx to be visited.
By default, URL_UNVERIFIEDs are hidden from output. If you want to see all of them, including the out-of-scope ones, you can do it by changing omit_event_types and scope_report_distance in the config like so:
# visit www.evilcorp.com and extract all the links
bbot -t www.evilcorp.com -m httpx -c omit_event_types= scope_report_distance=2