For example, I'm posting 1 million URLs and I decide to abort. When you abort, there is no option to export the remaining URLs. This is so lame; there should be one. Say you abort at the 200,000th URL, so there are still 800,000 URLs left to process. But since you can't resume, it either starts from 0 again or you have to split the list manually, which is pretty hard.
Yeah, I agree with you! Scrapebox should include a scheduling feature too. For example, after scraping, I could schedule it to start commenting, or whatever else I want queued up.
I've had my fair share of this happening. Once in a while I want to stop/pause posting, only to realize nothing will be saved, so I just let it run. So yes, it definitely should be in!
I will grab a copy of Scrapebox next week. Are you facing any problems using it? Please post your comments. Thanks.
I also agree with this feature request. In my country we have power problems, so I can't run the whole list in one sequence; I have to split the list into pieces, and that's troubling me. Sweetfunny should take action.
Yep, you're right. It should add another feature too: SB should auto-save your project every minute or so. That way, if it crashes, when you restart the program it picks up where it left off (like document recovery in MS Word, for example).
Yes, exactly. Sometimes I have power outage problems and all my work gets wasted. There should be an auto state-saving feature, say every 10 minutes or so, with a configurable interval. And at the very least, an export-remaining feature has to be added.
In the meantime, you can just abort and export the failed ones. That should let you more or less continue where you left off.
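And if splitting a big list by hand is the painful part, a quick throwaway script can do it for you outside Scrapebox. This is just a sketch, nothing Scrapebox-specific; the urls.txt filename and the 100k chunk size are placeholders to adjust:

```python
# Minimal sketch: split a big URL list (one URL per line) into smaller
# files so each piece can be loaded and posted separately.
from itertools import islice

CHUNK_SIZE = 100_000   # URLs per output file; pick whatever suits your setup
SOURCE = "urls.txt"    # hypothetical input file, one URL per line

with open(SOURCE, encoding="utf-8") as src:
    part = 1
    while True:
        # Read the next CHUNK_SIZE lines; an empty chunk means we're done.
        chunk = list(islice(src, CHUNK_SIZE))
        if not chunk:
            break
        with open(f"urls_part{part}.txt", "w", encoding="utf-8") as out:
            out.writelines(chunk)
        part += 1
```

If the power cuts out mid-run, you only lose the chunk that was in progress instead of the whole list.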