Prevent Chrome from crashing when working on a big project

Chrome crashes after scraping for a few hours (error message with code) when working on big projects (around 1 million rows).
It crashes at around 4k rows, with 20 columns, approximately 80k cells.

What practices are recommended to prevent or work around the browser crash? Please suggest.

Usually there is a way to limit the scraper, e.g. create a limited paginator, or do a two-stage scrape: in stage 1, collect all the product-page URLs from the main page; in stage 2, use a separate sitemap that takes those URLs as start URLs. Scrape in smaller batches, say 4,000-5,000 URLs max at a time.
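The batching step between the two stages can be sketched in a few lines of Python. This is only an illustration, assuming the stage-1 scrape produced a flat list of product-page URLs; the example URLs below are placeholders, and the batch size matches the 4,000-URL suggestion above.

```python
def chunk(urls, size=4000):
    """Split the URL list into batches of at most `size` URLs each."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

# e.g. 9,000 collected URLs -> three batches of 4000, 4000, and 1000,
# each small enough to feed into a separate stage-2 run as start URLs
urls = [f"https://example.com/product/{i}" for i in range(9000)]
batches = chunk(urls)
print([len(b) for b in batches])  # [4000, 4000, 1000]
```

Each batch can then be pasted into the start-URL list of its own stage-2 sitemap run, keeping any single run well under the point where the browser crashed.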