Scrapebox Newbie Questions

Discussion in 'Black Hat SEO Tools' started by enzo-seo, Jan 24, 2012.

  1. enzo-seo

    enzo-seo Newbie

    Jul 26, 2011
    I have been getting my feet wet with SB this past week. I have some questions I couldn't find clear answers to in other threads. I thought maybe some seniors could help.

    1. 95% of all the proxies I harvest are failing the proxy check, most with a "Timeout Error". Why? I also have a set of private proxies from proxybonanza, but I hate using them because SB eats my bandwidth up in, oh, say 4 hours!
    2. I am starting to get my head wrapped around footprints. I have about 300 I have gathered from previous threads, and in time I will start to write my own. My question is: do I have to harvest each footprint individually per keyword? Can footprints be combined, and if so, how many? I have been working with just one keyword these past few days across about 10 footprints... this is taking forever; there has to be a faster route than what I'm doing.
    3. My harvest results are showing lots of root-domain URLs for sites like Facebook, AddThis, MySpace and so on. I know I can't post to these, but they keep showing up. I did add a bunch of them to the blacklist, but is there a way to prevent ANY root-domain URLs from being harvested?
    4. In my comments I want to add one link. Is there a way to insert it dynamically into my pre-spun comments?
    5. I read a thread about using human names (preferably female) vs. keyword names. Does anyone have any comments on this?
    6. For the "Home Page" URL posted with a comment, is it better to use a targeted deep-link URL, or should this be the root domain?
    Thanks for clearing these up.

  2. Abstractus

    Abstractus Registered Member

    Jan 24, 2012
    my advice

    buy private shared proxies and an auto-approve list

    that should solve your problems on a low budget (<$50)
  3. TheMatrix


    Dec 20, 2008
    The proxies time out because they are slow or your timeout setting is too low. Pull the timeout slider in Settings up to the max value.

    As for private ones, get proxies from a reliable source that provides unlimited bandwidth, or slow down your posting. :D

    Actually, take 1 footprint, load a list of English words as your keywords, and scrape.

    This will generate a lot more URLs than a single keyword would.

    If you have enough bandwidth and a fast internet connection, I'd suggest scraping everything at once and running it through SB. It's not a great idea to work through 300+ footprints manually without knowing them thoroughly.
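    To make the "one footprint, many keywords" idea concrete, here's a rough sketch of the footprint/keyword merge in plain Python. This is illustrative only, not Scrapebox's actual code, and the %KW% placeholder convention is just an assumption for the example:

    ```python
    # Sketch of merging footprints with keywords into search queries.
    # The %KW% placeholder and example footprints are assumptions,
    # not Scrapebox internals.

    def build_queries(footprints, keywords):
        """Combine every footprint with every keyword into one query each."""
        queries = []
        for fp in footprints:
            for kw in keywords:
                if "%KW%" in fp:
                    # placeholder-style footprint: substitute the keyword in
                    queries.append(fp.replace("%KW%", kw))
                else:
                    # plain footprint: append the keyword in quotes
                    queries.append(f'{fp} "{kw}"')
        return queries

    footprints = ['"powered by wordpress"', 'inurl:blog %KW%']
    keywords = ["gardening", "fishing"]

    for q in build_queries(footprints, keywords):
        print(q)
    # → "powered by wordpress" "gardening"
    # → "powered by wordpress" "fishing"
    # → inurl:blog gardening
    # → inurl:blog fishing
    ```

    The point is the multiplication: 1 footprint against a 1,000-word keyword list yields 1,000 distinct queries in one scrape, instead of re-running each footprint by hand per keyword.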

    No, there's no way to stop these from getting harvested. However, blacklisted domains will automatically be removed.
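    That said, you can always post-filter a harvested list outside SB before posting. A minimal sketch (not a Scrapebox feature; the blacklist domains are just examples):

    ```python
    # Post-filter a harvested URL list: drop blacklisted domains and
    # bare root-domain URLs. The blacklist entries are examples only.
    from urllib.parse import urlparse

    BLACKLIST = {"facebook.com", "myspace.com", "addthis.com"}

    def keep_url(url):
        parts = urlparse(url)
        domain = parts.netloc.lower()
        if domain.startswith("www."):
            domain = domain[4:]
        if domain in BLACKLIST:
            return False
        # a root-domain result has no path beyond "/" and no query string
        if parts.path in ("", "/") and not parts.query:
            return False
        return True

    harvested = [
        "http://www.facebook.com/",
        "http://example.com/",
        "http://example.com/2012/01/some-post/",
    ]
    print([u for u in harvested if keep_url(u)])
    # → ['http://example.com/2012/01/some-post/']
    ```

    Run your harvested file through something like this, then import the cleaned list back into SB for posting.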

    I don't think I got this question. Can you please explain a little what exactly you want?

    If the blogs you are posting on are auto approve, go for keywords.

    Use either. Preferably, distribute your links: if you build 5 links to the homepage, build 5-10 to inner pages as well. That's what I do personally!