Chrome crashes without running scrapes

Web Scraper version: 0.4.2
Chrome version: 79.0.3945.117 (64-bit)
OS: Windows 10 x64

It is enough to activate Web Scraper for Chrome to crash repeatedly, 2-10 minutes after browser start. There are some sitemaps stored in Web Scraper, but no scrape is running. As I said, simply activating Web Scraper is enough to make Chrome crash. It is definitely Web Scraper - I tested Chrome with it as the only active extension.

This is hard to diagnose without a sitemap or URL. I'm guessing there is a lot of data; there is a known issue with Web Scraper crashing when there is too much data, especially when using a scroller.

On my PC it crashes at around 1500 rows when there is a scroller and at around 4000 rows when there is no scroller. The workaround is to limit your scraper to 1500/4000 rows per session, or whatever the row limit turns out to be on your PC.
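For illustration, one way to apply that per-session limit is to split a paginated start URL range across several copies of the sitemap, so each run stays under the row count at which the crash appears. This is only a sketch in the regular sitemap export format; the _id, URL and selector are made up, not taken from your sitemap:

```json
{
  "_id": "products-part-1",
  "startUrl": ["https://example.com/products?page=[1-30]"],
  "selectors": [
    {
      "id": "title",
      "type": "SelectorText",
      "parentSelectors": ["_root"],
      "selector": "h3.product-title",
      "multiple": true,
      "regex": "",
      "delay": 0
    }
  ]
}
```

A second copy ("products-part-2") would then cover page=[31-60], and so on, with each part scraped in its own session.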

  1. What do you mean by activating the extension? Are you launching a scraper and then it crashes?
  2. What is crashing? A tab or entire browser?
  3. Do you have CouchDB integration enabled?
  4. Please provide error logs from the background page:

To access error messages, follow these steps:

  1. Open chrome://extensions/ or go to manage extensions
  2. Enable “developer mode” at the top right
  3. Open Web Scraper's “background page”
  4. A new popup window should appear.
  5. Go to “Console” tab. You should see Web Scraper log messages and errors there.

What do you mean by activating the extension?
I mean just activating it: extensions → Web Scraper → launch, as in the screenshot:

What is crashing?
Entire browser

Do you have CouchDB integration enabled?
No. As I said, no scrape process is running.

Please provide error logs
Done as you recommended - there are no errors in the log, only this:

background_script.js:18739 initial configuration {storageType: "local", sitemapDb: "scraper-sitemaps", dataDb: ""}
background_script.js:18702 initializing Background Script message listener

Finally I got two warnings displayed; here they are:


The errors are displayed right after extension activation, without any scrape running.

  1. Could there be anything unique in the sitemaps that you made or in the data that you scraped? For example: sitemaps with lots of selectors, sitemaps with lots of start URLs, or large data blocks extracted from a page (e.g. the entire HTML of a page).
  2. Try installing and using our dev extension - https://chrome.google.com/webstore/detail/web-scraper-dev/fkelmgkpjgpicpdgjbocohgnfkipebgh. The dev extension doesn't contain any fix related to this kind of problem, but if it runs without crashing, that would indicate that your Chrome internal storage could be corrupted.
  3. Close all Chrome windows and start Chrome from the command line - maybe an additional error message will show up there. Try this guide - https://www.windows-commandline.com/open-chrome-from-command-line/ (see the sketch after this list).
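A minimal sketch of such a launch, assuming a default Windows install path (adjust it if Chrome is installed elsewhere); the --enable-logging --v=1 flags additionally make Chrome write a verbose chrome_debug.log into its user data directory, which may capture the crash:

```
:: Launch Chrome with verbose logging (writes chrome_debug.log to the user data directory)
"C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" --enable-logging --v=1
```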

So, now I've got it: regardless of whether a scrape runs or not, the mere existence of a sitemap (500 kB in size) makes Chrome crash. Here is the sitemap: https://linx.li/078rmx3d.json

BTW, Firefox (latest) does not seem to have this issue - no crashes so far with the same sitemap.