
Scrapebox Tips

Discussion in 'Black Hat SEO' started by websitezbuilder, Nov 8, 2011.

  1. websitezbuilder

    websitezbuilder Regular Member

    Joined:
    Feb 23, 2011
    Messages:
    271
    Likes Received:
    362
    Location:
    Australia
    Hey,

    New to SB and just wondering: I just had a whole heap of directory submissions done by one of my VAs, but they've only given me a list of the directories they submitted to and the logins for each... Is there any chance of checking whether any of the links are live yet just by putting the main URL of each directory into SB?

    I just tried the check links feature by loading my website URL from a txt file into "websites" and the list of directories from another txt file into the "blogs" section, but that didn't work... Are there any other ways of mass-checking links if I've only got the main directory URLs?

    Thanks in Advance,

    wb
     
  2. WizIMS

    WizIMS Power Member

    Joined:
    Sep 24, 2011
    Messages:
    684
    Likes Received:
    870
    Location:
    Skype - Wiz.IMS
    Home Page:
    If you just want to check whether the VAs actually added your website to the directories or not, just pick a few random URLs from the list and check manually.

    If you want to check whether the links are indexed, just wait a few weeks and check with any backlink checker (though they all have their shortcomings).

    Sorry if I've misunderstood, anyway :/
     
  3. upl8t

    upl8t Regular Member

    Joined:
    Apr 9, 2008
    Messages:
    475
    Likes Received:
    84
    Location:
    New Scotland
    You have to have the actual page/URL that your link is on; Scrapebox doesn't spider a site to try to find your link. If you search Google, I believe I've seen software that does this. It was originally built for people doing link exchanges, to make sure the other person kept your link live.
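
    For reference, a minimal sketch of the kind of tool described above, assuming Python with the requests and beautifulsoup4 libraries: it spiders a single directory (staying on the same host) and reports any page that links back to your domain. The directory URL, target domain, and page cap are placeholders, not anything from Scrapebox itself.

    Code:
    from collections import deque
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    START_URL = "http://www.example-directory.com/"  # directory main URL (placeholder)
    TARGET = "yoursite.com"                          # domain you submitted (placeholder)
    MAX_PAGES = 500                                  # cap so a huge directory doesn't crawl forever

    def find_backlinks(start_url, target, max_pages=MAX_PAGES):
        host = urlparse(start_url).netloc
        queue, seen, hits = deque([start_url]), {start_url}, []
        while queue and len(seen) <= max_pages:
            url = queue.popleft()
            try:
                resp = requests.get(url, timeout=10)
            except requests.RequestException:
                continue  # skip pages that time out or refuse the connection
            soup = BeautifulSoup(resp.text, "html.parser")
            for a in soup.find_all("a", href=True):
                link = urljoin(url, a["href"])
                if target in link:
                    hits.append((url, link))  # this page carries your link
                elif urlparse(link).netloc == host and link not in seen:
                    seen.add(link)
                    queue.append(link)  # stay on the directory's own host
        return hits

    if __name__ == "__main__":
        for page, link in find_backlinks(START_URL, TARGET):
            print(f"Found {link} on {page}")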
     
  4. charmol

    charmol Junior Member

    Joined:
    Oct 7, 2011
    Messages:
    173
    Likes Received:
    84
    Location:
    ~| TrOpiCaL PaRaDiSe |~
    I think if you scraped with a keyword like

    Code:
    site:www.example.com "ANCHOR TEXT"
    That would work, wouldn't it? You could use the merge function to create a list of keywords for all the sites you wanted to check.

    No doubt someone smarter than me will correct me if I'm wrong :)

    ~|ChaRoN|~

    EDIT: obviously, replace the words ANCHOR TEXT with your actual anchor ;)
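
    If you want to build that merged keyword list outside of Scrapebox, a rough sketch of what the merge step produces, assuming Python, might look like the snippet below; the file names directories.txt, anchors.txt, and queries.txt are placeholders for your own lists.

    Code:
    from urllib.parse import urlparse

    # Read the directory URLs and the anchor texts you used (placeholder file names).
    with open("directories.txt") as f:
        directories = [line.strip() for line in f if line.strip()]
    with open("anchors.txt") as f:
        anchors = [line.strip() for line in f if line.strip()]

    # Write one site:domain "anchor" query per directory/anchor pair.
    with open("queries.txt", "w") as out:
        for directory in directories:
            domain = urlparse(directory).netloc or directory  # accept bare domains too
            for anchor in anchors:
                out.write(f'site:{domain} "{anchor}"\n')

    The resulting queries.txt can then be loaded into the harvester as your keyword list.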
     
  5. s4nt0s

    s4nt0s Jr. VIP Jr. VIP Premium Member

    Joined:
    Jul 10, 2009
    Messages:
    3,664
    Likes Received:
    1,940
    Location:
    Texas
    Ya, I think what Charon said above would be the best way to approach this. I'd be interested to hear if anyone else has different methods for checking the links.
     
  6. charmol

    charmol Junior Member

    Joined:
    Oct 7, 2011
    Messages:
    173
    Likes Received:
    84
    Location:
    ~| TrOpiCaL PaRaDiSe |~
    The only problem that I can see with my above post is that you would have to scrape an ENTIRE site just to find your single link. This may be OK with smaller sites, but I'm guessing that the sites you're submitting to will be huge.

    I think that using this method would be quite resource- and time-intensive, and you would NEED to use proxies, because if you ping thousands of pages on a single site at the same time, they will almost certainly ban your IP, and if you are only using a few proxies, you could easily get them all banned.

    My advice: scrape as many public proxies as you can within SB and use those; don't risk the private ones you've spent good money on.

    I'd be interested in hearing about your results if you do use this method.

    ~|ChaRoN|~
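
    As a rough illustration of that advice, assuming Python with the requests library, the sketch below checks a list of candidate pages for your link through randomly chosen proxies, with a pause between requests so no single IP hammers the directory. The proxy addresses, file name, target domain, and delay are all placeholders.

    Code:
    import random
    import time

    import requests

    TARGET = "yoursite.com"            # domain whose link you are looking for (placeholder)
    PROXIES = [                        # replace with the public proxies you scraped in SB
        "http://1.2.3.4:8080",
        "http://5.6.7.8:3128",
    ]
    DELAY_SECONDS = 5                  # pause between requests to stay gentle

    def page_has_link(url, target, proxies):
        proxy = random.choice(proxies)  # rotate proxies so one IP doesn't take all the load
        try:
            resp = requests.get(url, proxies={"http": proxy, "https": proxy}, timeout=15)
            return target in resp.text
        except requests.RequestException:
            return False               # dead proxy or unreachable page

    if __name__ == "__main__":
        with open("pages_to_check.txt") as f:   # candidate pages, one per line (placeholder)
            pages = [line.strip() for line in f if line.strip()]
        for page in pages:
            status = "LIVE" if page_has_link(page, TARGET, PROXIES) else "not found"
            print(f"{page}: {status}")
            time.sleep(DELAY_SECONDS)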