This module implements a generic web crawler.

Features:
- Fully asynchronous operation (several hundred simultaneous requests)
- Supports the /robots.txt exclusion standard
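The /robots.txt support can be handled with the standard library's robot-exclusion parser. A minimal sketch (the robots.txt content and the "mybot" agent name are made up for illustration; a real crawler would fetch robots.txt from each host):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Check whether a given user agent may fetch a URL.
print(parser.can_fetch("mybot", "https://example.com/index.html"))   # True
print(parser.can_fetch("mybot", "https://example.com/private/data")) # False
```

A crawler would typically cache one parsed RobotFileParser per host and consult it before scheduling each fetch.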
Configurable limits:
- Number of concurrent fetchers
- Bandwidth throttling (bits per second)
- Number of concurrent fetchers per host
- Delay between successive fetches from the same host
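The global and per-host concurrency limits, plus the same-host delay, can be sketched with asyncio semaphores. All the knob names (MAX_FETCHERS, MAX_PER_HOST, HOST_DELAY) and the simulated fetch below are assumptions for illustration, not the module's actual interface; bandwidth throttling is omitted, and the per-host delay here is approximate under concurrency:

```python
import asyncio
import time

MAX_FETCHERS = 10   # assumed: global cap on concurrent fetchers
MAX_PER_HOST = 2    # assumed: cap on concurrent fetchers per host
HOST_DELAY = 0.05   # assumed: minimum seconds between fetches from one host

global_sem = asyncio.Semaphore(MAX_FETCHERS)
host_sems = {}      # host -> per-host semaphore
host_last = {}      # host -> monotonic time of last fetch

async def fetch(host, path):
    sem = host_sems.setdefault(host, asyncio.Semaphore(MAX_PER_HOST))
    async with global_sem, sem:
        # Enforce (approximately) a minimum delay between same-host fetches.
        wait = host_last.get(host, 0.0) + HOST_DELAY - time.monotonic()
        if wait > 0:
            await asyncio.sleep(wait)
        host_last[host] = time.monotonic()
        await asyncio.sleep(0.01)  # stand-in for the real network round trip
        return (host, path)

async def main():
    urls = [("example.com", f"/page{i}") for i in range(5)]
    return await asyncio.gather(*(fetch(h, p) for h, p in urls))

results = asyncio.run(main())
print(results)
```

In a real crawler the simulated sleep would be an HTTP request, and the bandwidth limit would be enforced inside the fetch by pacing reads against a bits-per-second budget.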
- Supports HTTP and HTTPS