Scrape speed variance

I'd love to be able to add a scrape variance variable so that scrape timings vary randomly, more closely mimicking human browsing behavior.

E.g.
Request interval: 2000ms
Request variance: 10000ms
Page load delay: 2000ms

Each request would then wait a random interval between 2000ms and 12000ms, producing less regular timings that are harder for bot detection to flag.
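The proposed behavior could be sketched like this (a minimal illustration, not the tool's actual implementation; the setting names are assumptions based on the request above):

```python
import random
import time

# Hypothetical settings mirroring the proposed options.
REQUEST_INTERVAL_MS = 2000   # base delay between requests
REQUEST_VARIANCE_MS = 10000  # extra random delay added on top, 0..variance

def next_delay_ms():
    """Pick a delay in [interval, interval + variance] uniformly at random."""
    return REQUEST_INTERVAL_MS + random.uniform(0, REQUEST_VARIANCE_MS)

# Each request waits a different, randomized amount of time.
for _ in range(3):
    delay = next_delay_ms()       # somewhere between 2000 and 12000 ms
    # time.sleep(delay / 1000)    # enable in a real scraper
```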

This could be useful, and it's probably not hard to implement. But I find websites are more likely to check the frequency and number of requests, e.g. more than 1 request every 5 seconds, or more than 12 requests per minute. You can avoid this just by setting longer delays. You could also avoid using "round" numbers in your delays, so something like:
Request interval: 2237ms
Page load delay: 5379ms
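If you'd rather not hand-pick such numbers, you could jitter the round values once at configuration time. A small sketch (the helper name and jitter range are illustrative assumptions):

```python
import random

def unround(base_ms, jitter_ms=500):
    """Offset a round base delay by a small random amount (illustrative helper)."""
    return base_ms + random.randint(1, jitter_ms)

request_interval = unround(2000)  # e.g. 2237
page_load_delay = unround(5000)   # e.g. 5379
```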

Also, web scrapers and website owners are in an arms race, and bot detection tools are only getting more sophisticated.