[GET] Huge keyword list for scraping with scrapebox

Discussion in 'Black Hat SEO Tools' started by sbw27, Feb 16, 2012.

  1. sbw27

    sbw27 Regular Member

    Joined:
    Jan 6, 2008
    Messages:
    390
    Likes Received:
    441
    This list consists of just under half a million keywords. I built it myself using a smaller list of high-paying keywords, then expanded it using the Scrapebox keyword scraper.

    It's a general-niche list, very useful for large scrapes. Just load your footprint up top, load this list into the keyword box, and scrape.

    Code:
    http://www.mediafire.com/?nk7qxll0cw779l7
     
    • Thanks Thanks x 47
  2. djstanley

    djstanley Newbie

    Joined:
    Sep 20, 2010
    Messages:
    23
    Likes Received:
    3
    Nice, but what if I only want to use 300 or so words from there? Do I have to do it via Notepad?
     
  3. manjeet036

    manjeet036 Elite Member

    Joined:
    Dec 9, 2011
    Messages:
    2,103
    Likes Received:
    410
    Gender:
    Male
    Occupation:
    Government Job
    Location:
    Dad's Home
    Nice share, dude.
     
  4. GoldenGlovez

    GoldenGlovez Moderator Staff Member Moderator Jr. VIP

    Joined:
    Mar 23, 2011
    Messages:
    701
    Likes Received:
    1,713
    Location:
    Guangdong, China
    Home Page:
    I'd recommend using a better utility such as Notepad++; it will be much faster at loading and organizing large text files like this one.
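    Or, if you're comfortable with a script, here's a minimal Python sketch that pulls the first 300 keywords out of the big list (the filenames and the 300 cutoff are just examples, assuming one keyword per line):

    Code:
    # grab the first N keywords from the big list; change N for a different slice
    N = 300

    with open("keywords.txt", encoding="utf-8") as src, \
         open("keywords_300.txt", "w", encoding="utf-8") as dst:
        for i, line in enumerate(src):
            if i >= N:
                break
            dst.write(line)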
     
  5. coolsolution

    coolsolution Registered Member

    Joined:
    Oct 26, 2011
    Messages:
    94
    Likes Received:
    14
    Thanks a lot.
     
  6. sbw27

    sbw27 Regular Member

    Joined:
    Jan 6, 2008
    Messages:
    390
    Likes Received:
    441
    If you only want 300 keywords, use the built-in keyword scraper: put in a seed keyword and scrape.
     
  7. sameer5762

    sameer5762 Elite Member

    Joined:
    Sep 23, 2009
    Messages:
    5,228
    Likes Received:
    1,468
    Occupation:
    Software engineer
    Location:
    http://sameer5762.com
    Home Page:
    Thanks mate for sharing the awesome list...:)
     
  8. drvijay

    drvijay Newbie

    Joined:
    Dec 31, 2011
    Messages:
    35
    Likes Received:
    1
    Please upload it as a zip or rar file.
     
  9. Markthedude

    Markthedude Power Member

    Joined:
    Feb 26, 2010
    Messages:
    572
    Likes Received:
    266
    Occupation:
    Entrepreneur
    Location:
    United States
    Thanks for the list! Ironically, Scrapebox now keeps crashing. I had to cut things down so that, combined with my footprints, Scrapebox was only running about 700K combined searches.

    My system has plenty of power too:

    i7-3610qm
    16gb ram
     
  10. Scritty

    Scritty Elite Member Premium Member

    Joined:
    May 1, 2010
    Messages:
    2,807
    Likes Received:
    4,496
    Occupation:
    Affiliate Marketer
    Location:
    UK
    Home Page:
    I put in 5000 keywords at a time. If SB gets close to 1 million scraped URLs, it crashes.
    I rarely use more than 25,000 keywords (5 lists of 5000).
    I only use 100 private proxies and 5 threads, so 5000 keywords often gets me more or less the full 1 million URLs, though with some footprints it can be a lot more, so I have to split the list further. Often a "remove dupes" pass is enough to carry on with.
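    If you'd rather script the splitting than do it by hand, here's a minimal Python sketch of the dedupe-then-split-into-5000 step (filenames are just examples, assuming one keyword per line):

    Code:
    # dedupe a keyword list, then split it into files of 5000 lines each
    CHUNK = 5000

    with open("keywords.txt", encoding="utf-8") as f:
        # dict.fromkeys drops duplicates while keeping the original order
        keywords = list(dict.fromkeys(line.strip() for line in f if line.strip()))

    for n in range(0, len(keywords), CHUNK):
        part = n // CHUNK + 1
        with open(f"keywords_part{part:02d}.txt", "w", encoding="utf-8") as out:
            out.write("\n".join(keywords[n:n + CHUNK]) + "\n")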

    Scritty
     
    • Thanks Thanks x 1
  11. Rua999

    Rua999 Power Member

    Joined:
    Jun 25, 2011
    Messages:
    630
    Likes Received:
    407
    A better idea, I find, is to use a program like Keyword Researcher, which you can set to search for all phrases on Google that end in, for example, "* in usa".

    Then use the find-and-replace feature in Notepad to remove the "in usa" from all your results, and you're left with a pile of random phrases/words. This way your list of keywords is far more diverse than what Scrapebox gives you, since that seems to repeat the same keywords over and over with just a slight change, meaning you're basically scraping the same URLs over and over again.

    What I like to do then is head over to the Google Translate tool (http://translate.google.com/) and convert these keywords into some top languages like Spanish, German, French, and Italian, delete all dupes, and scrape using those, which gives much better IP diversity and generally finds less-spammed URLs to post on :)
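    The find-and-replace step is easy to script too. A minimal Python sketch that strips the "in usa" seed off each scraped phrase and drops dupes as it goes (filenames and the seed phrase are just examples):

    Code:
    # strip the seed phrase off scraped results and drop duplicates
    SUFFIX = " in usa"

    seen = set()
    with open("scraped_phrases.txt", encoding="utf-8") as src, \
         open("clean_keywords.txt", "w", encoding="utf-8") as dst:
        for line in src:
            kw = line.strip().lower()
            if kw.endswith(SUFFIX):
                kw = kw[:-len(SUFFIX)].strip()
            if kw and kw not in seen:  # skip empties and dupes
                seen.add(kw)
                dst.write(kw + "\n")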
     
    • Thanks Thanks x 4
    Last edited: Aug 15, 2012
  12. Markthedude

    Markthedude Power Member

    Joined:
    Feb 26, 2010
    Messages:
    572
    Likes Received:
    266
    Occupation:
    Entrepreneur
    Location:
    United States
    Wow, Scritty and Rua999, great tips!

    I was actually looking for you, Scritty, after trying to scrape and having SB crash, to see if you sell any lists. PM me if you do or would like to trade; I'm only after a couple of platforms.

    And Rua999, I never thought of that before. It's true what you said: I get kind of tired of having the same URL scraped over and over in one session. I've started to dread deleting dupe domains, since a huge number of the scraped URLs always end up deleted.
     
    • Thanks Thanks x 1
  13. Real78

    Real78 Regular Member

    Joined:
    Apr 19, 2012
    Messages:
    371
    Likes Received:
    26
    @Rua999, great idea. I am going to try that out tomorrow, as my SB is scraping right now.
     
  14. Roland32

    Roland32 Regular Member

    Joined:
    Sep 16, 2011
    Messages:
    237
    Likes Received:
    45
    Good tip. Scraping in foreign languages is a little-known secret in the IM world. However, you can't just translate the words; that really won't work (I mean it can, but...). You need the actual foreign characters and words.
     
  15. Rua999

    Rua999 Power Member

    Joined:
    Jun 25, 2011
    Messages:
    630
    Likes Received:
    407
    There are addons for Scrapebox to handle the trickier foreign languages like Arabic and Chinese, but that's why I only mentioned Spanish, German, French, and Italian: it keeps things simple, since there's no need to convert those :)
     
  16. dotgirish

    dotgirish Registered Member

    Joined:
    Mar 12, 2010
    Messages:
    99
    Likes Received:
    13
    For anyone who wonders what to do with this keyword list: it can be used effectively with Scrapebox or GSA for better results.

    Nice share.
     
  17. kajzersoze

    kajzersoze Registered Member

    Joined:
    Jun 7, 2009
    Messages:
    51
    Likes Received:
    4
    Location:
    Demons Gate
    Great share. I was just wondering what to do with this list, and dotgirish's post above answers exactly that. May I ask what GSA is? Thanks :)
     
  18. keith

    keith Junior Member

    Joined:
    Jan 26, 2010
    Messages:
    177
    Likes Received:
    127
    Occupation:
    Web Development & SEO
    Location:
    Outside of Chicago
    Home Page:
    This list is also great for HRefer. Just throwing that out there.

     
  19. Scritty

    Scritty Elite Member Premium Member

    Joined:
    May 1, 2010
    Messages:
    2,807
    Likes Received:
    4,496
    Occupation:
    Affiliate Marketer
    Location:
    UK
    Home Page:
    Still a great list.

    Worth noting: use SB to split the list into blocks of 10,000, then use the Automator plugin that is now available for SB to run each of the 40-odd lists one at a time.
    Also worth shuffling the lists, because I've noticed Google is quicker to ban you if you run similar searches too often, and in some places on this list there are many hundreds of very similar terms in a row. Even with 100 proxies, you're going to fire off several very similar searches quite quickly.

    Again, SB itself can mix these up.
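    For anyone scripting that instead, here's a minimal Python sketch of the shuffle-then-split-into-10,000 step (filenames are just examples, assuming one keyword per line):

    Code:
    import random

    # shuffle the list so similar terms aren't searched back to back,
    # then cut it into blocks of 10,000
    CHUNK = 10000

    with open("keywords.txt", encoding="utf-8") as f:
        keywords = [line.strip() for line in f if line.strip()]

    random.shuffle(keywords)

    for n in range(0, len(keywords), CHUNK):
        block = n // CHUNK + 1
        with open(f"block{block:02d}.txt", "w", encoding="utf-8") as out:
            out.write("\n".join(keywords[n:n + CHUNK]) + "\n")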

    Scritty
     
  20. FallenSkywalker

    FallenSkywalker Registered Member

    Joined:
    Nov 12, 2009
    Messages:
    83
    Likes Received:
    10
    Two questions regarding this:

    1) In the Automator plugin... would that be 40 different "harvest URLs" steps?
    2) Export harvested URLs... can the same file be specified in all 40 steps? More specifically, would SB merge the newly scraped URLs into that file, or would it overwrite them?

    Thanks for the help.