Scrapebox Duplicate Domain Usage

Discussion in 'Black Hat SEO' started by mpsheng, Jul 21, 2010.

  1. mpsheng

    mpsheng Newbie

    Joined:
    Apr 13, 2010
    Messages:
    46
    Likes Received:
    5
    Hi,

    I have a question regarding the SB remove duplicate domain functionality.

    If the scraper has 10 URLs from the same domain, and I remove duplicate domains, 9 of the URLs will get deleted.

    That is such a shame. Is it possible to delete 9 of the links at random, so that next time, in 1-2 months, I can retry posting to a different URL on the same domain?

    Example:

    abc.com/1
    abc.com/2
    abc.com/3

    I want to be able to post to all 3 (one at a time over a 3-month period).
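
    Roughly what I have in mind, if I had to do it outside Scrapebox, is something like this Python sketch (the file names are just examples, and it assumes each line is a full URL with http://):

        import random
        from collections import defaultdict
        from urllib.parse import urlparse

        # Load the harvested list (example file name).
        with open("harvested_urls.txt") as f:
            urls = [line.strip() for line in f if line.strip()]

        # Group URLs by domain.
        by_domain = defaultdict(list)
        for url in urls:
            by_domain[urlparse(url).netloc.lower()].append(url)

        # Keep one random URL per domain for this run, and save the
        # leftovers so a different URL on the same domain can be tried later.
        this_run, leftovers = [], []
        for domain_urls in by_domain.values():
            random.shuffle(domain_urls)
            this_run.append(domain_urls[0])
            leftovers.extend(domain_urls[1:])

        with open("post_now.txt", "w") as f:
            f.write("\n".join(this_run))
        with open("retry_later.txt", "w") as f:
            f.write("\n".join(leftovers))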

    Thanks.
     
  2. volund

    volund Supreme Member

    Joined:
    Jan 24, 2010
    Messages:
    1,224
    Likes Received:
    751
    Occupation:
    Trying to make a buck or two
    I do not think there is a very easy way to do this. You could remove duplicate URLs instead of domains, then export them to an Excel file and sort them by domain. That seems like a lot of manual work, though.
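
    If you do not mind stepping outside Scrapebox for a minute, a short script can do the sorting and splitting for you. Here is a rough Python sketch (file names are made up) that splits a deduped list into monthly batches with at most one URL per domain in each batch:

        from collections import defaultdict
        from urllib.parse import urlparse

        # Load the list after running "remove duplicate URLs" (example file name).
        with open("deduped_urls.txt") as f:
            urls = [line.strip() for line in f if line.strip()]

        # Group by domain, then deal the URLs out round-robin so each
        # monthly batch contains at most one URL per domain.
        by_domain = defaultdict(list)
        for url in urls:
            by_domain[urlparse(url).netloc.lower()].append(url)

        batches = defaultdict(list)
        for domain_urls in by_domain.values():
            for month, url in enumerate(domain_urls, start=1):
                batches[month].append(url)

        for month, batch in sorted(batches.items()):
            with open("month_%d.txt" % month, "w") as f:
                f.write("\n".join(batch))

    You would then feed month_1.txt into Scrapebox now, month_2.txt next month, and so on.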
     
  3. mpsheng

    mpsheng Newbie

    Joined:
    Apr 13, 2010
    Messages:
    46
    Likes Received:
    5
    Scrapebox will just re-sort them alphabetically when you reload them.

    I think Scrapebox should have a "random posting" feature.