Planning a large-scale data collection project and trying to decide between the Professional and Scale tiers. I have questions about both pricing and technical capabilities.
Context:
- Scraping multiple high-traffic travel/accommodation websites
- These sites have aggressive bot detection
- Need reliable operation with minimal blocking
- Pages contain multiple images (listings, photos, media)
Package Decision - My Concern:
The Professional tier only includes datacenter proxies, so if I get blocked I have no fallback. The Scale tier allows adding residential proxies ($2.50/GB) as a backup.
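To put the residential pricing in perspective (my own back-of-envelope assumption about page weight, not a vendor figure): if an image-heavy listing page averages around 2 MB, then 100,000 pages routed through residential proxies would be roughly 200 GB, or about $500 at $2.50/GB, on top of the subscription itself.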
Package Questions:
- For high-traffic sites with strong anti-bot measures, which tier do you recommend?
- Are residential proxies necessary, or can datacenter proxies work effectively?
- Can I upgrade from Professional to Scale mid-project if blocking occurs?
Critical: URL Credits & Images
When scraping product pages with multiple images, how are credits counted?
Example: 1 product page with 10 images = 1 credit or 11 credits?
Also, the Scale tier lists "Image download" as a feature. Does this mean image downloads are metered separately from URL credits, or do they fall under the tier's unlimited URL allowance?
This is crucial: if each image consumes its own credit, the Professional tier becomes unviable for us instantly.
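To make the stakes concrete (hypothetical volumes, just to illustrate the arithmetic): at 50,000 product pages averaging 10 images each, per-image billing would mean 50,000 × 11 = 550,000 credits instead of 50,000, an 11x difference in how far any credit package goes.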
Additional Requirements:
- API support for a Java/Spring Boot backend? (See the sketch after this list for the kind of call I have in mind.)
- Automatic exports to S3/Google Cloud?
- Any data volume limits beyond URL credits?
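For the Java question above, here is a minimal sketch of the integration I'm picturing. The endpoint URL, query parameter, and auth header are placeholders I invented, not your documented API; I'd swap in whatever the real API expects:

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class ScrapeCallSketch {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Target page to scrape (placeholder URL), encoded for use as a query parameter.
        String target = URLEncoder.encode(
                "https://example-travel-site.com/listing/12345",
                StandardCharsets.UTF_8);

        // Hypothetical API endpoint and auth header -- my guesses, not the documented API.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.scraper.example/v1/scrape?url=" + target))
                .header("Authorization", "Bearer MY_API_KEY")
                .GET()
                .build();

        // Fire the request and dump status plus raw body for inspection.
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}
```

In production this would live in a Spring @Service using WebClient rather than a main method, but the HTTP shape of the call would be the same.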
Any guidance appreciated!