I don't have a dedicated server; I'm just on my home PC with home broadband, so it's not too fast. It might not seem like a lot to you guys, but it was a lot for me: I had my laptop on for 3 days harvesting URLs and got up to around 15,000,000 URLs.
Then I couldn't take it anymore and had to stop it. I figured there were already so many dupes in there that there wouldn't be many uniques left over anyway.
I waited for it to say it had finished posting and had extracted the URLs into separate files (anything over 1 million goes into its own file in the ScrapeBox folder). But then an error message popped up. I've had this happen before, but not on f*king 15 million URLs. I mean, come on.
Sweetfunny, is there any way to recover these URLs? ScrapeBox is great, but how can I trust the thing when, right when you need it to harvest large amounts of URLs, it cuts out on you and you're left with jack sh*t? I could cry.
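In case it helps, here's a rough Python sketch of what I'd try while waiting for an answer: if ScrapeBox already wrote some of those 1-million-URL split files to disk before the error, you may be able to merge and dedupe them yourself. The folder path and the *.txt pattern below are guesses, so adjust them to whatever is actually sitting in your ScrapeBox folder.

```python
import glob
import os

# Hypothetical path to wherever ScrapeBox dropped the split files;
# change this to match your actual install/session folder.
SCRAPEBOX_DIR = r"C:\scrapebox\harvested"
OUTPUT_FILE = "recovered_uniques.txt"

seen = set()   # every URL written so far; ~15M entries needs a few GB of RAM
recovered = 0

with open(OUTPUT_FILE, "w", encoding="utf-8") as out:
    # Walk every text file ScrapeBox managed to write before it died
    for path in glob.glob(os.path.join(SCRAPEBOX_DIR, "*.txt")):
        with open(path, "r", encoding="utf-8", errors="ignore") as f:
            for line in f:
                url = line.strip()
                # Skip blank lines and anything we've already written
                if url and url not in seen:
                    seen.add(url)
                    out.write(url + "\n")
                    recovered += 1

print(f"Recovered {recovered} unique URLs into {OUTPUT_FILE}")
```

The in-memory set is the simple route; if a home PC can't hold 15 million URLs in RAM, storing a hash of each URL instead of the full string (or sorting the merged file externally and removing adjacent duplicates) would cut the memory use considerably.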