
Scrapebox Hack - Skip The Blog Analyzer

Discussion in 'Black Hat SEO Tools' started by HelloInsomnia, Oct 8, 2011.

  1. HelloInsomnia

    HelloInsomnia Jr. Executive VIP Jr. VIP Premium Member

    Joined:
    Mar 1, 2009
    Messages:
    1,817
    Likes Received:
    2,913
    Okay, so we all know that one great way to find more blogs to comment on is to download others' backlinks.

    One way to do this is to find an AA URL and extract all the outbound links on the page. Then simply download all their backlinks and you will have a ton of new places to comment on.

    The only downside is that you have to run all the URLs through the blog analyzer. And that takes forever!

    I'm still chugging through a massive project: I have been downloading the backlinks of nearly 2,000 sites, and I needed a quick way to filter the URLs.

    Now, this is not 100% accurate, but it can save you a ton of time.

    Simply import a massive URL list, go to "Remove / Filter", choose "Remove URLs not containing", and type "?p=" (without quotes).

    This will remove all the URLs that don't use the default WP permalink structure.

    Now, I'm not going to give you the keys to the kingdom; you have to work a little bit for it and figure out some other URL strings to use for this.

    But when you do figure them out, you can quickly pull out many URLs from a file to comment on them - or at the very least cut your list in half and then run it through the blog analyzer.
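
    If you'd rather do the same filter outside Scrapebox, here's a rough Python sketch of the idea (the file names are just placeholders, not anything Scrapebox produces):

    Code:
    # Keep only URLs containing the default WordPress permalink
    # query string "?p=" - same as "Remove URLs not containing".

    def keep_wp_default_permalinks(urls):
        """Return only URLs that look like default WP post permalinks."""
        return [u for u in urls if "?p=" in u]

    with open("backlinks.txt") as f:            # placeholder input file
        urls = [line.strip() for line in f if line.strip()]

    with open("wp_candidates.txt", "w") as f:   # placeholder output file
        f.write("\n".join(keep_wp_default_permalinks(urls)))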
     
  2. GoldenGlovez

    GoldenGlovez Moderator Staff Member Moderator Jr. VIP

    Joined:
    Mar 23, 2011
    Messages:
    701
    Likes Received:
    1,713
    Location:
    Guangdong, China
    Home Page:
    Why not just run them through the fast poster, export, and check links? That's faster than the blog analyzer, and not all pages on an AA domain are actually auto approve. Plus you won't miss as many AA pages as you would filtering with that technique (and you won't exclude non-WordPress sites).
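
    For anyone curious what the "check links" step boils down to, here's a rough standard-library Python sketch (the file name and target URL are placeholders; Scrapebox's own link checker is the real tool for this):

    Code:
    # Fetch each page you posted to and see if your URL shows up in the HTML.
    import urllib.request

    MY_URL = "http://example.com"  # placeholder: the link you dropped in comments

    def link_is_live(page_url, timeout=10):
        try:
            with urllib.request.urlopen(page_url, timeout=timeout) as resp:
                return MY_URL in resp.read().decode("utf-8", errors="ignore")
        except Exception:
            return False  # treat fetch errors as "link not found"

    with open("posted.txt") as f:  # placeholder: URLs the fast poster hit
        posted = [line.strip() for line in f if line.strip()]

    aa_pages = [u for u in posted if link_is_live(u)]
    print(f"{len(aa_pages)} pages still show the link")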
     
    • Thanks x 2
  3. loopline

    loopline Jr. VIP Jr. VIP

    Joined:
    Jan 25, 2009
    Messages:
    3,380
    Likes Received:
    1,801
    Gender:
    Male
    Home Page:
    Exactly, running them through the blog analyzer before the fast poster is POINTLESS. I'd write it in size 7 font, but I don't like to be too obnoxious. :yumyum:

    Anyway, the keys to the kingdom will tell you, if you split test, that the blog analyzer is a wonderful tool, but it gets fewer updates than the poster, since the poster is what produces. The blog analyzer's only advantage is over the slower poster.

    So back to the split test: you will find that the blog analyzer passes URLs that won't actually post, fails URLs that will, and discards entire platforms that work in the Scrapebox fast poster, etc...

    It's great for filtering malformed lists, problem lists, or lists for the slow poster, but running them through the fast poster is ideal. Also, I always change the permalink structure of my blogs on install, and I can only assume many others do as well, so you just lose out on those.
     
  4. HelloInsomnia

    HelloInsomnia Jr. Executive VIP Jr. VIP Premium Member

    Joined:
    Mar 1, 2009
    Messages:
    1,817
    Likes Received:
    2,913
    It just depends on your setup and how you want to do it. For example, if I download the backlinks from, say, 3,000 sites, I may have around 300k URLs - I can strip that down to, let's say, 25k just by filtering out everything but the standard WP post URLs.

    That may be the difference between running the fast poster for 30 minutes and running it for 6 hours.

    You, loopline (I see he thanked you), probably have much better setups and a lot more proxies than most people.

    On the flip side, if you do have the setup, then that is probably the best way to go: do that first, then filter out the different platforms afterwards (if you choose to), because it will be a smaller list.

    Edit: I guess I should clarify that I don't typically use the blog analyzer. This is just a way to filter a massive amount of backlinks. If you download 10 million backlinks, you probably do not want to post to all of them, or even attempt it, if you are using a standard setup. Using this method you can pull out the chunks of that original 10 million that you know are possible blogs to comment on. You may still be left with hundreds of thousands of URLs, but at least you know they are blogs and not forum profiles or anything else.
     
    Last edited: Oct 8, 2011
  5. eatingMemory

    eatingMemory Jr. VIP Jr. VIP

    Joined:
    Mar 23, 2010
    Messages:
    1,306
    Likes Received:
    189
    You could also try taking the backlinks of those sites you scraped, filtering out dupe domains, and posting to them instead. Then check links on the successfully posted ones. The ones that show a link are AA domains.

    What I do after getting this list of AA URLs (also domains) is trim them to root, add site: in front, and scrape Google for more pages. This gets me anywhere from 2,000 to 4,000 AA domains each time from an original list of 3-4 million. Instead of posting to all the dupe-free URLs, I post to unique domains only, so it saves me a lot of time and I can produce big lists in a day or so (yes, I have the proxies and servers to start with). Hope you guys with limited resources can use this, as it saves a lot of time.
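
    In script form, the dedupe-and-expand steps look roughly like this Python sketch (the function names are my own, not from any tool):

    Code:
    # Keep one URL per domain, then turn AA URLs into site: queries
    # for scraping more pages from the same domains.
    from urllib.parse import urlparse

    def unique_domains(urls):
        """Keep the first URL seen for each domain."""
        seen, out = set(), []
        for u in urls:
            host = urlparse(u).netloc.lower()
            if host and host not in seen:
                seen.add(host)
                out.append(u)
        return out

    def site_queries(aa_urls):
        """Trim AA URLs to root and build site: search queries."""
        roots = {urlparse(u).netloc.lower() for u in aa_urls}
        return [f"site:{root}" for root in sorted(roots) if root]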
     
    • Thanks x 1
  6. mrpega

    mrpega Regular Member

    Joined:
    Sep 19, 2008
    Messages:
    352
    Likes Received:
    88
    I used to diligently filter using the blog analyzer, but soon found that pointless and a waste of time. I used that time to blast through with the fast poster and run the link checker after that.. have to check the links anyway.
     
  7. HelloInsomnia

    HelloInsomnia Jr. Executive VIP Jr. VIP Premium Member

    Joined:
    Mar 1, 2009
    Messages:
    1,817
    Likes Received:
    2,913
    I do that, but first I'll filter down the list. Usually I am not targeting WP so I can use a footprint that will work with another platform. This will filter out everything except post pages on my target platform.

    Then I will remove duplicate domains and scrub out the domains I have already run by checking the list against my blacklist.

    This will leave me (pretty quickly) with a bunch of unique domains that I have not run before (or at least are not on my AA list for this platform).

    Then I link check the URLs after I run them through the fast poster. Obviously links found will be trimmed to root and then I'll scrape the whole site. Since I filtered the original list by using the footprint I can do this again to only get post pages.

    Then I post to all of those and link check and whatever comes up is my AA list.

    It's not the only way to do it but it is a quick way to do it.
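
    The dedupe and blacklist scrub from that workflow is easy to script too; here's a rough Python sketch (the blacklist file name is a placeholder):

    Code:
    # Drop duplicate domains plus any domain already on the blacklist
    # of domains you have run before.
    from urllib.parse import urlparse

    def scrub(urls, blacklist_file="blacklist.txt"):
        with open(blacklist_file) as f:
            done = {line.strip().lower() for line in f if line.strip()}
        seen, fresh = set(), []
        for u in urls:
            host = urlparse(u).netloc.lower()
            if host and host not in done and host not in seen:
                seen.add(host)
                fresh.append(u)
        return fresh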
     
  8. eatingMemory

    eatingMemory Jr. VIP Jr. VIP

    Joined:
    Mar 23, 2010
    Messages:
    1,306
    Likes Received:
    189
    Yeah, it's basically the same method, but it works great! The only time-consuming part is posting to all the URLs scraped with the site: operator, as a portion of those are not postable blog URLs.

    But still, this definitely saves you a lot of time compared to posting to millions of URLs, many of which are from the same domain.
     
  9. HelloInsomnia

    HelloInsomnia Jr. Executive VIP Jr. VIP Premium Member

    Joined:
    Mar 1, 2009
    Messages:
    1,817
    Likes Received:
    2,913
    If this list is very large, you can cut it down a little bit by filtering out pages you know don't accept comments.

    To do this, go to "Remove / Filter", choose "Remove URLs containing", and filter out pages such as:

    ?cat=
    ?page_id=
    ?page=
    /page/
    /cat/
    /tag/
    ?tag=
    ?m=

    and so on..
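
    The same filter in rough Python form, using the patterns above (extend the list with whatever else you figure out):

    Code:
    # Drop archive/category/tag pages that won't take comments.
    BAD = ["?cat=", "?page_id=", "?page=", "/page/",
           "/cat/", "/tag/", "?tag=", "?m="]

    def drop_non_post_pages(urls):
        return [u for u in urls if not any(b in u for b in BAD)]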
     
    • Thanks x 1
  10. loopline

    loopline Jr. VIP Jr. VIP

    Joined:
    Jan 25, 2009
    Messages:
    3,380
    Likes Received:
    1,801
    Gender:
    Male
    Home Page:
    Hmmm, so you're saying on a backlink extraction then?

    I like your post above this one as a better idea, though. Rather than stripping the list down to URLs that are blog posts, you could strip out what you know won't post.

    My initial point was that using the blog analyzer prior to posting was pointless, not that your idea in general was pointless; it has merit.

    I would take it a step further, though. I would put together a list of things you know you can't post to, including your list above, but also things like:

    Forum footprints in URLs - as we know, many people use profiles etc...
    Standard web 2.0 properties - e.g. Squidoo etc...
    Things like Digg and other common sites
    The sky is the limit here...

    Furthermore, I would strip out home/root pages. I would also remove duplicates as mentioned above, but I wouldn't work with just 1 URL from a domain: if you happen to land on a page that's not caught by one of the above filters and it doesn't take the comment, then you lose that whole domain. I would do 3-5 URLs from a given domain; see the sketch below.
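
    A rough Python sketch of that "keep a few per domain, drop home pages" idea (my own illustration, not the domain cleaner tool below):

    Code:
    # Keep up to n URLs per domain and skip bare home/root pages.
    from urllib.parse import urlparse

    def keep_n_per_domain(urls, n=3, drop_home=True):
        counts, out = {}, []
        for u in urls:
            p = urlparse(u)
            if drop_home and p.path in ("", "/") and not p.query:
                continue  # skip root/home pages
            host = p.netloc.lower()
            if counts.get(host, 0) < n:
                counts[host] = counts.get(host, 0) + 1
                out.append(u)
        return out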

    My free Scrapebox Classroom domain cleaner tool will let you keep only, say, 3 URLs from a given domain (or any number up to 25), and it can also optionally strip out the home page URLs. (You do have to sign up for a mailing list to get the tool, but you can unsubscribe after you download it; if you stay signed up, you will get more Scrapebox tools, exclusive videos, and free reports emailed out.) Tool here:
    http://scrapeboxmarketplace.com/scrapebox-classroom-domain-cleaner


    All that done, you could save yourself some nice time for sure.