Deleting/Resetting local storage files

Hi there,

I encountered an issue with large datasets, where Firefox and Chrome crashed when trying to create the CSV files for export. It would be great to have an option to reset the local database without deleting the sitemaps (which is what happens when I simply delete the files).

Bumping this feature request.

I used to be able to run 1000+ page scrapes and the CSV would export no problem. I'm now finding that anything more than about 300-500 pages causes the CSV export to crash or do nothing. I can only assume this is because of cache issues.

@97hills @snied

Hi, could you please provide us with more details regarding the issue?

Please make sure to follow the steps described in the following post: How to submit a video bug report - #2

Hi - sorry for late reply.

I think I've followed the instructions above. However, because it was such a large job, I've skipped the scraping part and jumped straight to the issue I was having with the download. Video here: !AtlHvhq39mvXt9leuF8-4q-afrmZQw?e=KcUc6C

When I click to export to CSV, it doesn't seem to do much (indeed, I clicked twice). I think when I first started using the add-on, it took less than a minute to download a large scraping job. I originally thought the export just wasn't working, but after trialling it more for this bug report, it seems that it is just a lot slower.

This scrape is a 1,000-page scrape on web pages like this

and the CSV export ended up being about 16,000 rows, which took 11 minutes to export. It is a similar issue on another computer.

It may be as simple as I've forgotten how long it previously took to export data. However, I've got a lot of 100,000+ row files saved on my computer and I really don't recall waiting that long for the files to download. That is what has led me to believe it's a bloated cache. But admittedly, I have tried a fresh install in a new browser and had the same issue.
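For what it's worth, one common cause of slow CSV exports at this scale is assembling the whole file by repeated string concatenation instead of collecting rows and joining once (or streaming them into a Blob). The sketch below is purely hypothetical and not the extension's actual code; `buildCsv`, `csvEscape`, and the row shape are names I made up to illustrate the general technique:

```javascript
// Hypothetical sketch: build a CSV by collecting row strings and joining
// once at the end, rather than `result += row`, which can degrade badly
// on tens of thousands of rows.

// Quote a field if it contains a delimiter, quote, or newline (RFC 4180 style).
function csvEscape(field) {
  const s = String(field);
  return /[",\n]/.test(s) ? '"' + s.replace(/"/g, '""') + '"' : s;
}

// Turn each row (an array of fields) into one line, then join all lines once.
function buildCsv(rows) {
  const lines = rows.map((row) => row.map(csvEscape).join(","));
  return lines.join("\n");
}

const csv = buildCsv([
  ["url", "title"],
  ["https://example.com", 'Contains "quotes", and commas'],
]);
// In a browser, this string could then be wrapped in a Blob for download.
```

Whether the extension's slowdown actually comes from string handling is only a guess; it could just as easily be the stored data volume itself.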

Chrome version: 122.0.6261.70
Windows 11 Home 22H2

I can provide the sitemap separately if needed

Please let me know if you need further info

@97hills Hi. Thank you for providing these details!

Could you please provide us with the sitemap you are referring to? Did you also try exporting the scraped data in XLSX format, and did you experience the same issue?

We would also appreciate it if you could share the specifications of your computer.

Looking forward to your reply.

Specifications of the computer I normally use:
11th Gen Intel(R) Core(TM) i7-1165G7 @ 2.80GHz
16.0 GB RAM

But I have also tested on a virtual machine which had 32GB of RAM. Not sure about the processor but probably similar to above.

I also tried XLSX but found the same thing happening.

If it is struggling to compile the CSV, then perhaps it just needs a progress bar.
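To illustrate the progress-bar idea: exporting in chunks, with a brief yield to the event loop between batches, keeps the UI responsive and gives the export something to report progress against. This is a hypothetical sketch; the chunk size, the `onProgress` callback, and the row shape are all my own assumptions, not the extension's API:

```javascript
// Hypothetical sketch: export rows in chunks, yielding to the event loop
// between batches and reporting progress after each one.
async function exportInChunks(rows, onProgress, chunkSize = 1000) {
  const parts = [];
  for (let i = 0; i < rows.length; i += chunkSize) {
    const chunk = rows.slice(i, i + chunkSize);
    parts.push(chunk.map((row) => row.join(",")).join("\n"));
    // Report how many rows are done so a progress bar can be updated.
    onProgress(Math.min(i + chunkSize, rows.length), rows.length);
    // Yield so the browser can repaint between batches.
    await new Promise((resolve) => setTimeout(resolve, 0));
  }
  return parts.join("\n");
}
```

At a 1,000-row chunk size, a 16,000-row export like the one above would fire 16 progress updates, enough to drive a simple percentage display.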

I have sent you the sitemap separately


@snied @97hills

Hello, our development team has performed a slight optimization to the way scraped data is stored, and after testing we have not encountered any issues exporting larger datasets.

This change will be implemented within the next extension release.

If you encounter any further issues, please let us know.