Scraper crashes when scraping a large site

I'm trying to scrape a large site. After about ~20,000 pages (1,924,941 records, ~1GB in CouchDB), the scraper seems to crash every time. Are there limits to Web Scraper? Are there any performance tweaks I can make on my side? I'm running the scraper on my 2017 iMac with 24GB of memory, using CouchDB as the storage backend.

Thanks in advance,
Seb