Strange request: Need to check index status for hundreds of specific urls

flatdeadsquirrel

Junior Member
Joined
Feb 21, 2021
Messages
104
Reaction score
65
Hi all. We are carrying out a test and we are stuck on this bit.

We basically need to monitor the Google index status of several hundred specific URLs and track them daily. We just want to know which URLs are still indexed in Google and which are no longer indexed. We would run this check daily for about a month or two.

We thought it would be best to load all the URLs into rank tracking software by adding the Google query site:domain.com/testurl/ for each URL tested, but both Semrush and Rank Tracker strip the colon, rendering the query useless.

So I'm now thinking: does Scrapebox have the ability to check this? We don't have an issue with buying loads of proxies to do the checks daily.

(@Sweetfunny)

 
Sure, you can use Scrapebox for this purpose, and it has many other features that make it worth the price.
But you need good rotating (backconnect) residential proxies to do so; even dedicated proxies won't last more than a short period unless you own thousands of them.
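If you'd rather script the check yourself instead of (or alongside) Scrapebox, here's a rough Python sketch of the same idea: run a site: query for each URL through a rotating proxy endpoint and record whether Google returns anything. The proxy URL is a placeholder for whatever your proxy provider gives you, and the "did not match any documents" string check is just a heuristic that you'd need to verify against live responses (it can differ by locale, and Google may serve a CAPTCHA page instead).

```python
# Rough sketch of a DIY daily index check, assuming a rotating (backconnect)
# residential proxy endpoint. PROXY_URL is a placeholder, not a real service.
import time
import requests

PROXY_URL = "http://user:pass@rotating-proxy.example.com:8000"  # placeholder
HEADERS = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}

def is_indexed(url: str) -> bool:
    """Run a site: query for one URL and guess whether Google returns any result."""
    params = {"q": f"site:{url}", "num": "1"}
    resp = requests.get(
        "https://www.google.com/search",
        params=params,
        headers=HEADERS,
        proxies={"http": PROXY_URL, "https": PROXY_URL},
        timeout=30,
    )
    resp.raise_for_status()
    # Heuristic: Google shows a "did not match any documents" message
    # when nothing is indexed for the site: query.
    return "did not match any documents" not in resp.text

def run_daily_check(urls):
    results = {}
    for url in urls:
        try:
            results[url] = is_indexed(url)
        except requests.RequestException:
            results[url] = None  # unknown (blocked/timeout), retry later
        time.sleep(5)  # keep the query rate low even behind rotating proxies
    return results

if __name__ == "__main__":
    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]
    for url, status in run_daily_check(urls).items():
        label = "indexed" if status else ("not indexed" if status is False else "unknown")
        print(f"{url}\t{label}")
```

You'd run this once a day (cron or Task Scheduler) and diff the output files to see which URLs dropped out of the index. Scrapebox's index checker does essentially the same thing with a nicer interface and built-in proxy handling.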
 
I second Singhavn's Scrapebox recommendation, and yeah, you're gonna need a load of proxies.
 