Almost everybody who uses SER talks about using lists. Some talk about verified lists, but I don't hear people talking about indexed ones.

I tried scraping a list with ScrapeBox and came up with 360k URLs. I broke it down into lists of 10k each and ran the Index Checker addon. I set connections to 1 and even increased the delay to 4 seconds. With 10 private proxies, my IP still got blocked after only about 150 indexed URLs and some 400 not indexed; I barely finished checking 1,000 URLs, if that. At this rate it will take forever just to find which URLs on the list are indexed. Unless I'm doing it wrong. How?

What's the point of getting a backlink from a site that isn't even indexed? I imagine if they aren't indexed already, they probably never will be. Even worse if they've actually been de-indexed. So they're verified. So what? I bought a list from Fiverr: 100k+ URLs, and I only got 900 verified links out of it. I didn't actually check, but I'd guess maybe 50 of those ended up indexed. I know that since it's Fiverr, it's probably not a 'quality' list, but how differently are the so-called quality lists built?

I read somewhere on BHW about how people create huge lists: they basically find the URLs other people are already spamming. So if they can spam there, I can too, but those URLs are mostly not indexed and never will be. A huge list just means a huge waste of time. I'm all ears if I'm wrong.
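For reference, the workflow I'm describing (split a big scrape into fixed-size lists, then check each URL one connection at a time with a delay, rotating through private proxies) looks roughly like this as a Python sketch. This is just my assumption of what an index check does, not how the SER Index Checker addon actually works internally; the proxy address, the `site:` query approach, and the "did not match" string are all placeholders:

```python
import itertools
import time
import urllib.parse
import urllib.request

def chunk(urls, size=10_000):
    """Split a scraped list into fixed-size sublists (e.g. 360k -> 36 x 10k)."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def is_indexed(url, proxy, delay=4.0):
    """Hypothetical single-connection check: does a site: query return results?
    One connection with a 4-second delay, same settings I used in the addon."""
    time.sleep(delay)  # throttle to (hopefully) avoid the IP block I ran into
    query = urllib.parse.quote(f"site:{url}")
    req = urllib.request.Request(
        f"https://www.google.com/search?q={query}",
        headers={"User-Agent": "Mozilla/5.0"},
    )
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy, "https": proxy})
    )
    with opener.open(req, timeout=15) as resp:
        page = resp.read().decode("utf-8", errors="replace")
    # Placeholder heuristic: a zero-result page contains this phrase.
    return "did not match any documents" not in page

# Rotate through the 10 private proxies, one request at a time.
proxies = itertools.cycle(["http://user:pass@proxy1:8080"])  # placeholder list
```

Even with the delay, a single shared exit IP per proxy gets flagged fast at this volume, which is exactly the blocking problem I hit.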