The RobotsDisallowed project is a harvest of the robots.txt Disallowed directories of the world's top websites, specifically those of the Alexa 100K and the Majestic 100K.
This list of Disallowed directories is a great way to supplement content discovery during a web security assessment or bug bounty, since the website owner is basically saying,
"Don't go here; there's sensitive stuff in there!"
In other words, it's a list of potential high-value targets.
So what we did was take the Alexa Top 100,000 websites, download their robots.txt files, extract all Disallowed directories, and then perform a bunch of cleanup on them (they are a mess) to make the lists as useful as possible during web assessments.
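The extraction step described above can be sketched in a few lines. This is a minimal, illustrative parser, not the project's actual tooling: the function name and the sample robots.txt body are assumptions for demonstration. It pulls the path out of each `Disallow:` directive, strips comments, and drops blanks and duplicates.

```python
def extract_disallowed(robots_txt: str) -> list[str]:
    """Return the unique Disallow paths found in a robots.txt body."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop trailing comments and whitespace
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path and path not in paths:    # skip blank (allow-all) and duplicate entries
                paths.append(path)
    return paths

# Hypothetical robots.txt body for demonstration:
sample = """
User-agent: *
Disallow: /admin/
Disallow: /backup/   # old site dumps
Disallow:
Disallow: /admin/
"""

print(extract_disallowed(sample))  # ['/admin/', '/backup/']
```

Running this across 100,000 downloaded robots.txt files and merging the results is essentially the harvesting step; the real cleanup work (normalizing wildcards, deduplicating across sites) is where most of the effort goes.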
My personal favorite option, however, is the curated.txt list, because it's only around 500 items: a hand-picked collection of directories containing high-value strings and content.
In short, it's the best of the best. I blogged more about it here.
TL;DR: If you want the best use of your time, go with curated.txt.
This concept is not new. The RAFT project was the first to do this, and we thank them for pioneering the idea. But that project is now dead and gone, and since the idea works best when it's kept up to date, we decided to give it a refresh in the form of RobotsDisallowed.
If you have any ideas on what to improve, please email us at [email protected] or submit a pull request.
Thank you, and happy hacking!