
[Help] How to stop ScrapeBox from automatically removing harvested URLs

Discussion in 'Black Hat SEO Tools' started by keval007, Sep 4, 2012.

  1. keval007

    keval007 Junior Member

    Joined:
    Jun 12, 2012
    Messages:
    145
    Likes Received:
    26
    Occupation:
    Web Scraper & PHP Developer
    Hi,

    I have used ScrapeBox to harvest URLs, but when harvesting finishes, ScrapeBox displays a message with the following text:


    I don't want ScrapeBox to remove URLs! Can anyone give me the steps to change this setting in SB?
     
  2. Zak_A

    Zak_A Jr. VIP Premium Member

    Joined:
    Mar 16, 2008
    Messages:
    808
    Likes Received:
    873
    Gender:
    Male
    Occupation:
    WP designer & developer
    Location:
    Western Europe
    There is a setting in the "Options" or "Settings" menu that makes SB automatically remove duplicate domains after every harvesting session (there is an additional blacklist feature that removes every blacklisted domain from your harvested lists, so you won't end up spamming Matt Cutts' blog, etc.)

    Just uncheck these options if you don't want them :)
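
    For anyone curious what those two filters actually do to your list, here is a minimal Python sketch of the same idea (this is just an illustration, not ScrapeBox's actual code; the function name and sample URLs are made up):

    ```python
    from urllib.parse import urlparse

    def filter_harvest(urls, blacklist=frozenset()):
        """Keep one URL per domain, skipping blacklisted domains.

        Mimics the effect of SB's "remove duplicate domains" pass
        plus the blacklist filter, applied to a harvested list.
        """
        seen = set()
        kept = []
        for url in urls:
            domain = urlparse(url).netloc.lower()
            if domain in blacklist or domain in seen:
                continue  # drop blacklisted and repeat domains
            seen.add(domain)
            kept.append(url)
        return kept

    harvested = [
        "http://example.com/page1",
        "http://example.com/page2",   # same domain, removed as duplicate
        "http://spam-target.net/post",  # on the blacklist, removed
    ]
    print(filter_harvest(harvested, blacklist={"spam-target.net"}))
    # → ['http://example.com/page1']
    ```

    So unchecking those options simply skips this filtering step and leaves the raw harvested list untouched.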
     
    • Thanks Thanks x 2
  3. keval007

    keval007 Junior Member

    Joined:
    Jun 12, 2012
    Messages:
    145
    Likes Received:
    26
    Occupation:
    Web Scraper & PHP Developer

    I have done it. Thanks for the help.