Hi everyone. I have a list of a few hundred URLs from one website (a tiny fraction of the whole site).
Is it possible to scrape these links in bulk, without having to replace the metadata every time?
Yes, it is. You can import multiple URLs. However, doing a few hundred of them is a little more complicated.
I made a video a year ago that will show you how to do it - https://www.youtube.com/watch?v=ToMPE4wyon8&t=7s
I'm not an expert, and I'm sure someone on here knows an easier way; however, this will get you there.
Feel free to DM me if you have any questions.
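If you're comfortable with a script instead of importing by hand, here's a minimal sketch of the bulk approach, assuming Python with the requests and beautifulsoup4 packages; the file names (urls.txt, metadata.csv) are just placeholders for illustration, not part of the video's method:

```python
# Minimal sketch: read a list of URLs, fetch each page once,
# and save the title/description metadata to a CSV.
import csv
import requests
from bs4 import BeautifulSoup

def fetch_metadata(url):
    """Fetch one page and pull its title and meta description."""
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    description = desc_tag.get("content", "") if desc_tag else ""
    return title, description

def main():
    # urls.txt: one URL per line (hypothetical input file)
    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    with open("metadata.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["url", "title", "description"])
        for url in urls:
            try:
                title, description = fetch_metadata(url)
            except requests.RequestException as exc:
                title, description = "", f"error: {exc}"
            writer.writerow([url, title, description])

if __name__ == "__main__":
    main()
```

Because the metadata is written out once per URL, you only re-run the script when the list changes, rather than replacing metadata each time.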
Thank you! It helped!