Browser extension limits without a subscription

Is the browser extension, when used locally, limited to extracting a certain number of pages or entries?
Mine always stops at around 360 entries.

I couldn't find any number on the official page, so I'm asking here.

There are no limits, but it's better not to put more than 4,000-5,000 links in a sitemap.


No real limits, except perhaps the import limit of 10,000 URLs (you cannot import a sitemap that has more than 10,000 URLs). Scraping itself is not limited, though: WS will navigate to as many links as your sitemap provides, subject only to your time and RAM. I have used WS for some really big scraping jobs of 50K to 250K pages, but I usually split my scrapers into batches of 20K or less. This also lets me run concurrent (simultaneous) scrapers on other machines.
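The batch-splitting approach above can be sketched in a few lines of Python. This is a minimal sketch, not an official tool: it assumes the usual exported-sitemap JSON layout (`_id`, `startUrl` as an array, `selectors`), so check your own export before relying on those field names.

```python
import json

def split_sitemap(sitemap, batch_size=20_000):
    """Split one sitemap dict into several smaller ones, each with at
    most `batch_size` start URLs, so they can be run as separate jobs.
    Field names (_id, startUrl, selectors) assume the exported JSON layout."""
    urls = sitemap["startUrl"]
    batches = []
    for i in range(0, len(urls), batch_size):
        part = dict(sitemap)  # shallow copy; selectors are shared, which is fine here
        part["_id"] = f"{sitemap['_id']}-part{i // batch_size + 1}"
        part["startUrl"] = urls[i:i + batch_size]
        batches.append(part)
    return batches

# Hypothetical example: a 50K-page job split into 20K batches
sitemap = {
    "_id": "big-job",
    "startUrl": [f"https://example.com/page/{n}" for n in range(50_000)],
    "selectors": [],
}
parts = split_sitemap(sitemap)
print(len(parts))  # 3 batches: 20K + 20K + 10K
for part in parts:
    print(part["_id"], len(part["startUrl"]))
```

Each resulting dict can be dumped with `json.dumps(part)` and imported as its own sitemap, which is what makes running the batches on different machines straightforward.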


Unfortunately, no matter how many pages I scanned, it always stopped after 360 entries.
In the cloud, with the same sitemap, I couldn't reproduce this behavior...

Then the problem must be something else; is the page blocking you? Try increasing the page load delay.

Are you using a sitemap with a list of links, or a navigation map?
If you have the links and haven't found another solution,
you can make groups of 360 with an index page they can be read from; it's like building a queue.
Or import them all, and when it stops, remove the ones that are already scraped.
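The "groups of 360 plus an index page" idea can be sketched like this. Everything here (file names, directory layout) is made up for illustration; the point is just to generate one small link page per group and an index that links to all of them, so a scraper can walk the index like a queue.

```python
import tempfile
from pathlib import Path

def write_link_groups(links, group_size=360, out_dir="queue"):
    """Write one HTML page per group of `group_size` links, plus an
    index.html linking to every group page. Returns the group file names."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    pages = []
    for i in range(0, len(links), group_size):
        name = f"group-{i // group_size + 1}.html"
        body = "\n".join(f'<a href="{u}">{u}</a>' for u in links[i:i + group_size])
        (out / name).write_text(f"<html><body>{body}</body></html>")
        pages.append(name)
    index = "\n".join(f'<a href="{p}">{p}</a>' for p in pages)
    (out / "index.html").write_text(f"<html><body>{index}</body></html>")
    return pages

# Hypothetical example: 800 links split into groups of 360
out_dir = tempfile.mkdtemp()
groups = write_link_groups(
    [f"https://example.com/item/{n}" for n in range(800)],
    out_dir=out_dir,
)
print(len(groups))  # 3 group pages (360 + 360 + 80 links)
```

Serving the `queue` directory locally and pointing the scraper at `index.html` gives you resumable batches: if a run stops, you can delete the group pages that already finished.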


The site I crawled was limiting it.
With another site it's unstoppable now 😛
Thanks for the answers 🙂