
[SCRAPEBOX] My Easy way of turning 1 autoapprove link into 100s or 1000s quick and easy

Discussion in 'Black Hat SEO' started by rnc505, Dec 26, 2010.

  1. rnc505

    rnc505 Regular Member

    Joined:
    Oct 28, 2008
    Messages:
    229
    Likes Received:
    109
    Hey, I know there's plenty of guides and tutorials about getting all of the postable links from within a single site, etc. but this way has worked best for me and lets you do more than one domain at a time.

    1) Scrape/Post/Check and filter out all non-autoapproved links, so you only have a list of autoapproved links and import them into your Harvest box.
    2) Click "Trim to Root"
    3) Click "Remove Duplicates" --> "Remove Duplicate Urls" (Doing it in this order is just easier so you don't get duplicate domains)
    4) Click "Export URL List" --> "Export to Excel (.xls)"
    5) Open the file (either in Excel, OpenOffice, etc - any spreadsheet program).
    6) Now, with all of the domains filling Column A down to the last row, type the following into cell B1 and press Enter:
    Code:
    ="site:"&A1&" 'name (required)'"
    This formula builds a "site:" search for the domain that looks for 'name (required)' on the page, which is ALWAYS on blog comment pages and out of place on any other page.
    7) Then click the tiny black square at the bottom-right corner of cell B1 and drag it down until you reach the last filled row of Column A. This autopopulates the cells with the formula from the first cell while incrementing A1 --> A2 --> A3 ... --> A##, so each row always grabs the domain from the cell to its left.
    8) Now copy the entire second column, choose "Custom Footprint" in ScrapeBox, and paste all of the site:http://domain.com 'name (required)' lines into the keyword field.
    9) Then select ONLY Yahoo, use scraped (not private) proxies, and click Harvest.
    10) Clean up your list a bit with Remove Duplicates, and you now have a list of what should be auto-approve posts.
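    For anyone who would rather skip the spreadsheet, steps 6-8 above can be sketched in Python. This is only a sketch: the sample domains are made-up placeholders, and it assumes your auto-approve list is already trimmed to root.

    ```python
    # Build one "site:" footprint keyword per domain, mirroring the Excel
    # drag-down formula ="site:"&A1&" 'name (required)'".

    def build_footprints(domains):
        """Return one ScrapeBox keyword line per non-empty domain."""
        return [f"site:{d.strip()} 'name (required)'" for d in domains if d.strip()]

    # Hypothetical sample domains standing in for your auto-approve list.
    sample = ["http://example-blog1.com", "http://example-blog2.net"]
    for line in build_footprints(sample):
        print(line)
    ```

    Paste the printed lines into the keyword field with "Custom Footprint" selected, exactly as in step 8.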

    ENJOY!!,

    rnc505
     
    • Thanks Thanks x 19
  2. wannabie

    wannabie Elite Member

    Joined:
    Mar 11, 2009
    Messages:
    3,807
    Likes Received:
    2,954
    Occupation:
    SEO and Marketing, surprisingly
    Location:
    Your bedroom window
    Home Page:
    Took me a couple of reads, but cheers dude :)

    I'll try this tomorrow after beer
     
    • Thanks Thanks x 1
  3. dabbie

    dabbie Newbie

    Joined:
    Jul 24, 2009
    Messages:
    7
    Likes Received:
    0
    Yeah same here, looks really nice after the 1st read. Thanks.
     
  4. Jesperj

    Jesperj Power Member

    Joined:
    Sep 10, 2010
    Messages:
    502
    Likes Received:
    347
    Occupation:
    Web Designer
    Location:
    Far, Far away
    Home Page:
    Took some time to understand, but ain't it just easier to:

    Put "site:" (without the quotes) into the custom footprint field, and remember to tick Custom Footprint.

    Then import your auto approve domains into the harvester, trim to root, and export->copy to clipboard.
    Then paste them into the keyword window.

    Now you can harvest URLs using the domains as keywords, and since you use a custom footprint it will put site: in front of them, so it only finds pages from that domain.
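    A rough sketch of the queries each approach ends up sending, assuming ScrapeBox simply joins the custom footprint with each keyword (the example domain is a placeholder):

    ```python
    # Jesperj's shortcut: footprint "site:" + domain keyword.
    # Matches every indexed page on the domain.
    def site_only_query(domain):
        return f"site:{domain}"

    # The OP's version: the comment-form text 'name (required)' acts as a
    # filter, so only pages with a comment form should match.
    def op_query(domain):
        return f"site:{domain} 'name (required)'"

    print(site_only_query("example.com"))
    print(op_query("example.com"))
    ```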
     
    • Thanks Thanks x 1
  5. Nerevar

    Nerevar Jr. VIP Jr. VIP

    Joined:
    Jun 30, 2010
    Messages:
    421
    Likes Received:
    167
    @Jesperj: This way you harvest all pages, not just commentable ones. The above method harvests only commentable pages.
     
  6. sargerevenge

    sargerevenge Newbie

    Joined:
    Nov 17, 2010
    Messages:
    33
    Likes Received:
    8
    Can someone who comprehends the OP's post rewrite it for the layman? I'm left utterly confused.
     
  7. rnc505

    rnc505 Regular Member

    Joined:
    Oct 28, 2008
    Messages:
    229
    Likes Received:
    109
    I personally don't see what everyone's having an issue with....
    1) Find autoapprove links (using whatever method).
    2) Delete duplicate domains from the autoapproves
    3) Copy+Paste all domains into excel.
    4) Type:
    Code:
    ="site:"&A1&" 'name (required)'"
    into B1 and then drag it down alongside all the filled cells to its left (Column A).
    5) copy and paste all of the second column (Column B) into the keyword list and choose "Custom Footprint"

    Voila, done.
     
    • Thanks Thanks x 1
  8. Nerevar

    Nerevar Jr. VIP Jr. VIP

    Joined:
    Jun 30, 2010
    Messages:
    421
    Likes Received:
    167
    I got 1 million auto-approve URLs from 12k unique domains. How useful are they really? The first thing that comes to mind is that you need to randomize and split the list. But still - how many daily posts do you do with such a list?
     
  9. syngenetic

    syngenetic Guest

    I tried this method a few days back and it works perfectly fine. I thanked you. :)
     
  10. shinjukusamurai

    shinjukusamurai Newbie

    Joined:
    Dec 19, 2010
    Messages:
    2
    Likes Received:
    0
    Could you tell me where I could get ScrapeBox?
     
  11. HoNeYBiRD

    HoNeYBiRD Jr. VIP Jr. VIP

    Joined:
    May 1, 2009
    Messages:
    5,881
    Likes Received:
    7,122
    Gender:
    Male
    Occupation:
    Geographer, Tourism Manager
    Location:
    Ghosted
    you can get it here with the BHW discount:
    Code:
    http://www.scrapebox.com/bhw
     
  12. scriptomania

    scriptomania Junior Member

    Joined:
    Dec 28, 2010
    Messages:
    127
    Likes Received:
    249
    Occupation:
    A full time pirate at sea
    Location:
    The European capital of politics
    Pretty cool method actually... Thanks!

    I'd say that would depend on your domain. Is it established or whatnot. Age and stuff like that. I personally never use Scrapebox or for that matter any kind of blackhat tools on my money sites. Just channel them through buffersites of all sorts and it should be fine. Autoapprove links are pretty useful in my experience, provided you have a quality list.
     
  13. mikey9991

    mikey9991 Regular Member

    Joined:
    Dec 10, 2010
    Messages:
    321
    Likes Received:
    90
    Occupation:
    Marketing Consultant
    Location:
    Heavenly
    Thanks!

    Will this choose the first post on their blog, one random post, or all of the comment-able blog posts?

    mike
     
  14. morehits

    morehits Junior Member

    Joined:
    Nov 24, 2010
    Messages:
    163
    Likes Received:
    31
    Looks like a sweet method to me too. Will work on this one later.
     
  15. s4nt0s

    s4nt0s Jr. VIP Jr. VIP Premium Member

    Joined:
    Jul 10, 2009
    Messages:
    3,659
    Likes Received:
    1,940
    Location:
    Texas
    I'm trying to understand what the Excel part is needed for. Is it just adding the 'name (required)' to the end of each URL, or is it doing something more?

    If it's only adding the 'name (required)' to the end, why not just use the merge feature in ScrapeBox? If it's doing something more, can you please clarify. Thanks
     
  16. HardAssets

    HardAssets Newbie

    Joined:
    Jul 19, 2010
    Messages:
    30
    Likes Received:
    3
    Very nice method...thanks!
     
  17. HardAssets

    HardAssets Newbie

    Joined:
    Jul 19, 2010
    Messages:
    30
    Likes Received:
    3
    It should pick up every comment page within the site.
     
    • Thanks Thanks x 1
  18. Nerevar

    Nerevar Jr. VIP Jr. VIP

    Joined:
    Jun 30, 2010
    Messages:
    421
    Likes Received:
    167
    Sorry, wasn't clear. I meant in a way that you don't get banned by spamming the site from the same IP. Or doesn't it matter if it's autoapprove?
     
  19. bakxos

    bakxos Regular Member

    Joined:
    Aug 8, 2010
    Messages:
    498
    Likes Received:
    292
    Location:
    Scotland
    Yes, you can do that, but there is a problem with non-English WordPress websites :) so you will miss some blogs...
    Better to just use site:www.domain.com etc. and then comment to get what you want, or just harvest all the pages from a website and use Notepad++ to take off the tag URLs. You will have a high % anyway :)
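    That Notepad++ cleanup can also be sketched in Python. The archive-path substrings below are common WordPress defaults and the URLs are placeholders; adjust both to your own list.

    ```python
    # Drop WordPress archive pages (tag/category/author/pagination URLs)
    # from a harvested list, keeping only likely post pages.

    JUNK_PATHS = ("/tag/", "/category/", "/author/", "/page/")

    def drop_archives(urls):
        """Keep only URLs containing none of the archive path fragments."""
        return [u for u in urls if not any(j in u for j in JUNK_PATHS)]

    harvested = [
        "http://example.com/some-post/",
        "http://example.com/tag/seo/",
        "http://example.com/category/news/",
    ]
    clean = drop_archives(harvested)  # only the post URL survives
    ```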
     
    Last edited: Dec 29, 2010
  20. marrrko

    marrrko Jr. VIP Jr. VIP

    Joined:
    Oct 10, 2009
    Messages:
    3,895
    Likes Received:
    1,610
    Location:
    Good Old Europe
    Home Page: