Can't download large data set

I have set up a crawler on a large website (~1,700 pages and about 300k records). The crawler runs seamlessly. If I interrupt it near the start, I can view the data in the console and export it. However, once the crawler has collected a certain amount of data, I can no longer see the data in the browser and I can't download it.

When I click "Browse", the tool hangs on the "loading" message indefinitely. When I click export to CSV (which is ultimately the goal), the screen hangs on the "Loading data from storage..." message.

Sitemap:
{"_id":"bincheck","startUrl":["https://bincheck.org"],"selectors":[{"id":"country","type":"SelectorLink","selector":"a.list-group-item","parentSelectors":["_root"],"multiple":true,"delay":0},{"id":"pagination","type":"SelectorLink","selector":"li.next a","parentSelectors":["country","pagination"],"multiple":false,"delay":0},{"id":"data","type":"SelectorTable","selector":"table.table","parentSelectors":["country","pagination"],"multiple":true,"columns":[{"header":"Bank Name","name":"Bank Name","extract":true},{"header":"Brand","name":"Brand","extract":true},{"header":"BIN","name":"BIN","extract":true}],"delay":0,"tableDataRowSelector":"tbody tr","tableHeaderRowSelector":"thead.cf tr"}]}

Hello,

it seems that you are scraping too much data for this extension to handle.

Perhaps you could increase the delays to make the extension's job easier, but I am not sure that will solve the problem.
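For example (just a sketch, 2000 is an arbitrary value in milliseconds, adjust it to your needs), you would raise the "delay" field on the pagination and data selectors in your sitemap JSON:

{"id":"pagination","type":"SelectorLink","selector":"li.next a","parentSelectors":["country","pagination"],"multiple":false,"delay":2000}
{"id":"data","type":"SelectorTable","selector":"table.table","parentSelectors":["country","pagination"],"multiple":true,"columns":[...],"delay":2000, ...}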

Otherwise, you will have to run another scrape to collect the URLs of all the different countries (189 countries).
Then you split the work by grouping the countries in batches of 20, 40 or 60 (I don't know exactly; it is again a matter of how much volume can be processed). You inject these groups of country URLs into the JSON code and import each group as its own sitemap, as sketched below.

You will also have to check the selector order (pagination --> data versus data --> pagination), because I am not sure your sitemap behaves correctly for small countries where the "next" button is not present.
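Here is a minimal sketch of what one of those grouped sitemaps could look like. The country URLs are placeholders (use the real ones collected by your first scrape), the country selector is removed, and pagination/data are re-parented to _root; double check this against your own pages:

{"_id":"bincheck-group-1","startUrl":["https://bincheck.org/<country-url-1>","https://bincheck.org/<country-url-2>","... the rest of this group ..."],"selectors":[{"id":"pagination","type":"SelectorLink","selector":"li.next a","parentSelectors":["_root","pagination"],"multiple":false,"delay":2000},{"id":"data","type":"SelectorTable","selector":"table.table","parentSelectors":["_root","pagination"],"multiple":true,"columns":[{"header":"Bank Name","name":"Bank Name","extract":true},{"header":"Brand","name":"Brand","extract":true},{"header":"BIN","name":"BIN","extract":true}],"delay":2000,"tableDataRowSelector":"tbody tr","tableHeaderRowSelector":"thead.cf tr"}]}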

Please check whether there are any errors in the background script during the export process.

To access error messages follow these steps:

  1. Open chrome://extensions/ or go to manage extensions
  2. Enable “developer mode” at the top right
  3. Open Web Scraper's “background page”
  4. A new popup window should appear.
  5. Go to the “Console” tab. You should see Web Scraper log messages and errors there.