MongoDB Scraper
Keep door closed at all times
MongoDB is a NoSQL database, and it's very handy when you don't want the constraints of a fixed schema.
Sadly, it ships with very insecure default settings: if left untouched, MongoDB will accept connections without any username or password.
According to Shodan, there are more than 60k MongoDB instances freely accessible over the Internet. What if we started crawling them all?
The MongoDB Scraper script does exactly that: given a list of target IPs, it connects to each one and, if any password field is found, dumps the contents.
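To make the idea concrete, here is a minimal sketch of that core loop, my own reconstruction rather than the actual MongoDB Scraper source: connect anonymously to each host and flag any collection whose documents contain a password-like field. The helper and function names (`password_keys`, `scan_host`) and the list of suspect field names are assumptions for illustration.

```python
def password_keys(document):
    """Return the keys of a document that look like credential fields."""
    suspects = ("password", "passwd", "pwd")
    return [k for k in document if any(s in k.lower() for s in suspects)]

def scan_host(ip, port=27017):
    """Enumerate databases and collections on one (open) MongoDB host."""
    # pymongo is imported lazily so password_keys() above stays usable
    # without the driver installed.
    from pymongo import MongoClient

    client = MongoClient(ip, port, serverSelectionTimeoutMS=3000)
    # On an unsecured instance this listing needs no credentials at all.
    for db_name in client.list_database_names():
        db = client[db_name]
        for coll_name in db.list_collection_names():
            sample = db[coll_name].find_one()
            if sample and password_keys(sample):
                print(f"{ip}/{db_name}.{coll_name}: {password_keys(sample)}")

# Usage (assuming targets loaded from data.json):
#   for target in targets:
#       scan_host(target)
```

The timeout matters: most IPs in a bulk scan are unreachable or firewalled, so a short `serverSelectionTimeoutMS` keeps the run from stalling on dead hosts.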
Requirements are pretty light: you only need pymongo (for obvious reasons) and colorlog (to output some nice log messages):
pip install pymongo
pip install colorlog
Then you have to put all your targets inside the data.json file, as a JSON encoded array:
["123.456.789", "987.654.321"]
If you exported a search result from Shodan, name it data_raw.json and run the parse_data.py script: it will extract the IP addresses and dump them into the data.json file.
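A hypothetical equivalent of that parsing step might look like the sketch below. The field name `ip_str` is an assumption based on Shodan's usual JSON result format, and `extract_ips`/`parse_data` are illustrative names, not the actual parse_data.py internals; adjust to match your export.

```python
import json

def extract_ips(records):
    """Collect unique IP addresses from Shodan-style result records."""
    seen = []
    for record in records:
        ip = record.get("ip_str")  # Shodan's conventional IP field (assumed)
        if ip and ip not in seen:
            seen.append(ip)
    return seen

def parse_data(raw_path="data_raw.json", out_path="data.json"):
    """Turn a Shodan export into the flat IP array the scraper expects."""
    with open(raw_path) as fh:
        records = json.load(fh)
    with open(out_path, "w") as fh:
        json.dump(extract_ips(records), fh)
```

Deduplicating here is cheap insurance: Shodan exports often list the same host more than once, and there is no point scanning it twice.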
Finally you're ready to go:
python mongodb-scraper.py
Get an alert on large collections
You can instruct MongoDB Scraper to send you an email when it finds a very large collection (by default, larger than 1 million rows). Simply rename the settings-dist.json file to settings.json and fill in all the fields (please note: you need an SMTP server to send the emails).
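For orientation, a filled-in settings file might look something like the fragment below. Every key name here is a guess at what settings-dist.json contains, not the file's actual schema; use the keys shipped in settings-dist.json itself.

```json
{
    "threshold": 1000000,
    "email": {
        "from": "scraper@example.com",
        "to": "you@example.com",
        "smtp_host": "smtp.example.com",
        "smtp_port": 587,
        "smtp_user": "you@example.com",
        "smtp_pass": "secret"
    }
}
```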