PROBLEM:
-I've created a sitemap that goes root -> post links -> (1) post text and (2) comments.
-I've added multiple selectors at each of those levels (and used the hotkeys S, P, C to tailor the amount of data I need from each section).
-When scraping, I set Request interval (ms) to 16000 (to give me time to scroll down as far as seems appropriate) and Page load delay (ms) to 3000. FYI, IT'S AN INFINITE SCROLL WEBSITE.
Most of the time the scrape doesn't gather all the data: posts are usually fine, but for comments it only picks up 10%-40% of what I've actually scrolled through, even when I scroll down a fair amount.
This is evident when I export to Excel.
{
  "_id": "redditAM",
  "startUrl": ["https://www.reddit.com/r/AskMen/top/?t=year"],
  "selectors": [
    {
      "id": "post titles as links",
      "parentSelectors": ["_root"],
      "type": "SelectorLink",
      "selector": "a.SQnoC3ObvgnGjWt90zD9Z",
      "multiple": true,
      "delay": 0
    },
    {
      "id": "Posts",
      "parentSelectors": ["post titles as links"],
      "type": "SelectorText",
      "selector": "div._3xX726aBn29LDbsDtzr_6E",
      "multiple": true,
      "delay": 0,
      "regex": ""
    },
    {
      "id": "comments",
      "parentSelectors": ["post titles as links"],
      "type": "SelectorText",
      "selector": "div._3cjCphgls6DH-irkVaA0GM",
      "multiple": true,
      "delay": 0,
      "regex": ""
    }
  ]
}
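From the Web Scraper docs, I gather that infinite scroll pages are normally handled with an "Element scroll down" selector (type "SelectorElementScroll") rather than manual scrolling during the request interval, so the comments text selector would be nested under it. A sketch of what I think that fragment would look like — the container selector "div.Comment" is a guess on my part, not something I've verified against Reddit's markup:

```json
{
  "id": "comment containers",
  "parentSelectors": ["post titles as links"],
  "type": "SelectorElementScroll",
  "selector": "div.Comment",
  "multiple": true,
  "delay": 2000
}
```

The "comments" text selector would then use "comment containers" as its parent instead of "post titles as links". Is this the right approach, or should manual scrolling with a long request interval work too?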