
[SB] Spamming same Domain But different URLs

Discussion in 'Black Hat SEO' started by echaz, Sep 5, 2010.

  1. echaz

    echaz Regular Member

    Joined:
    Feb 5, 2009
    Messages:
    236
    Likes Received:
    77
    Hey!

    I always read people saying that it's extremely important to have Scrapebox remove all duplicate URLs and domains from the harvested lists.

    Now I'm checking several competitor websites, and every single one of them has 15k - 40k backlinks according to Yahoo Site Explorer.

    I noticed that every competitor has dozens, even hundreds, of backlinks from one single domain.

    So why do people recommend removing all dupes? My competitors seem to be doing quite well with them - they always outrank me in Google ^^
     
  2. boo blizzi

    boo blizzi Regular Member

    Joined:
    May 28, 2009
    Messages:
    361
    Likes Received:
    267
    Yeah, I noticed this too, so now I just remove duplicate URLs... each page has its own PR too, so I guess it all counts.
     
  3. PandaMeat

    PandaMeat Newbie

    Joined:
    Jun 30, 2010
    Messages:
    34
    Likes Received:
    34
    Yahoo is not Google.

    Yahoo Site Explorer is not Google.

    The backlink count in Site Explorer doesn't mean anything beyond giving you a VERY rough idea of the volume of backlinks someone has, and it certainly doesn't mean that Google gives them any weight.

    Besides, Google also only counts one link from a page. So you'll need to find another excuse for losing out to competitors ;)
     
  4. wagooza

    wagooza Newbie

    Joined:
    Jan 8, 2008
    Messages:
    39
    Likes Received:
    5
    These links are built over time. So 10 links from a site may be built on 10 different days.
    Also, most of them could be Blogroll links (or some sitewide links).

    The reason to remove duplicate urls and domains is simple - to increase the comment acceptance rate in moderation.
    Imagine you have a blog, and see 1 comment "nice post" linking to urblog(dot)com
    Now imagine seeing 5 comments, all saying "nice post" all linking back to urblog(dot)com -> You will definitely mark all of them as spam...

    It's important to understand why you are doing what you are doing. Hope this helps :)
     
    Last edited: Sep 5, 2010
  5. xiphre

    xiphre Regular Member

    Joined:
    Jun 9, 2007
    Messages:
    290
    Likes Received:
    84
    Location:
    EU
    IMHO it doesn't matter whether it does or not. If a guy wants to match the sites with the best SERPs, his best bet is of course to start out mimicking their methods and then try to improve on them.

    Take a closer look at the most spammed queries - generic viagra, adult dating, porn and so on - and you will notice that 99% of the sites have tons of duplicate domains in their backlinks. The guy discussed in the big "33k backlinks a month...." thread has 100k+ backlinks, but they come from only 3-4k different domains. Google only counting one link from a page? I don't think so; that's just misinformation spread by Google themselves.
     
  6. crazyflx

    crazyflx Elite Member

    Joined:
    Nov 9, 2009
    Messages:
    1,674
    Likes Received:
    4,825
    Location:
    http://CRAZYFLX.COM
    Home Page:
    SB is an incredibly powerful tool. A tool that I've just recently started to FULLY realize the power of.

    There is NO reason to remove duplicate domains. The original reason people removed them was that SB used to automatically alphabetize the URLs (that is, sort them alphabetically).

    This meant that you'd be commenting on the same domain right in succession.

    NOTHING looks more like spam to a blog owner than somebody commenting on their blog 50 times in 10 seconds, with every comment linking back to one site... which is exactly what happens when you comment through an alphabetized list that still has its duplicate domains in it.

    HOWEVER, now that SB has the feature to randomize your list of blogs on import (open up settings & check the last option), this doesn't have to happen.

    So let's say you have a list of 30k unique domains and a second list of 20k URLs spread across 5k domains. That means you've got 50k unique URLs total coming from a total of 35k domains.

    That means the second list averages roughly 4 URLs per domain. When randomized into the total 50k list, you won't be commenting on the same blog domain back to back. Depending on how far one entry for a domain sits from the next, it could be a 30 minute span from one comment to the other... which looks fine to a blog owner.
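
    If you want to sanity-check that spacing yourself, here's a rough Python sketch of the idea (the file names and the gap check are placeholders of mine, not anything SB does internally):

    Code:
    import random
    from urllib.parse import urlparse

    def load_urls(path):
        with open(path) as f:
            return [line.strip() for line in f if line.strip()]

    # Hypothetical file names: one list with duplicate domains removed,
    # one list that keeps multiple URLs per domain.
    urls = load_urls("unique_domains.txt") + load_urls("dupe_domain_urls.txt")
    random.shuffle(urls)  # roughly what the "randomize list on import" option does for you

    # How close together do two URLs from the same domain end up after shuffling?
    last_seen = {}
    min_gap = len(urls)
    for i, url in enumerate(urls):
        domain = urlparse(url).netloc
        if domain in last_seen:
            min_gap = min(min_gap, i - last_seen[domain])
        last_seen[domain] = i

    print(f"{len(urls)} URLs, closest same-domain pair is {min_gap} positions apart")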

    AND GOOGLE WILL COUNT THEM!! If you have a link from one blog URL and another one from another blog URL (sharing the same domain), google will count them, I can assure you.

    Why do this? Well, there are a lot of reasons, but I'll name the main two (in my opinion):

    Outbound Link Number &
    Each Page's PR

    One of those URLs you commented on might have 500 outbound links and the other might have 5. You get to put yours on both.

    One page might have N/A PR & the other might be a PR 4. You get your link on both.

    Google counts both links.

    Both links will be found.

    Both links contribute to the anchor text percentages search engines are calculating for your site.

    Don't remove duplicate domains IF your list is large enough to fully scatter and randomize how often, and how far apart in time, you end up commenting on the same domains.

    I've got 25k backlinks to a site in YSE & 21k backlinks in Google Webmaster Tools... all inside of 1 month using SB and one other tool EXCLUSIVELY, and you would seriously not believe how many in BOTH of those (YSE & GWT) are from the same domain but different URLs.

    Trust me, it's worth it to comment on different URLs of the same domain. You wouldn't believe my jump in rankings by doing this, particularly by commenting on EVERY indexed URL of every autoapprove blog I could find.

    Read this thread: http://www.blackhatworld.com/blackh...re-comments-approved-backlinks-scrapebox.html
     
    Last edited: Sep 5, 2010
  7. HackZu

    HackZu Junior Member

    Joined:
    Nov 6, 2009
    Messages:
    115
    Likes Received:
    53
    Crazyflx, thanks for posting, very insightful.

    I've got a question. Over the last few days I collected a list of 130k unique URLs and probably about 3,000 auto-approve blog URLs. I would then trim the auto-approve ones to root and harvest each whole site. Say I end up with 10 auto-approve URLs per blog - that would give me around 30,000 backlinks. Is this possible? Or am I misunderstanding something? :D


    Salute
     
  8. PandaMeat

    PandaMeat Newbie

    Joined:
    Jun 30, 2010
    Messages:
    34
    Likes Received:
    34
    I thought you meant multiple links from the same page to one domain. I'm really sorry.

    Yes... ideally, if you could get a link from every page of a domain, that would be much better - it would be the same as getting a sitewide link. Links are about pages, not domains; that's how PageRank works. Links from pages, and the value they pass, are what matter. The more links from more pages, the better. I don't know why anyone would recommend filtering out duplicate domains, but it is a good idea to filter out duplicate pages so you aren't linking to your domain from the same scraped page over and over.

    As someone pointed out, however, you don't want a blog owner to wake up to a comment on every page of their site. It's better to find all the pages on a lot of target domains, use Excel or some other method to randomize them all, and then work through that list so nobody wakes up to 100 new comments from one guy all linking to one site.
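
    For anyone who wants to script that filtering step, here's a minimal Python sketch (the example URLs are made up): drop exact duplicate pages, but keep every distinct page even when several of them sit on one domain.

    Code:
    from urllib.parse import urlparse

    def dedupe_pages(urls):
        """Drop exact duplicate pages, but keep every distinct page per domain."""
        seen, kept = set(), []
        for url in urls:
            page = url.strip().rstrip("/")
            if page and page not in seen:
                seen.add(page)
                kept.append(page)
        return kept

    urls = [
        "http://example-blog.com/post-1",
        "http://example-blog.com/post-1/",  # same page again - dropped
        "http://example-blog.com/post-2",   # same domain, different page - kept
        "http://other-blog.com/post-1",
    ]
    pages = dedupe_pages(urls)
    domains = {urlparse(u).netloc for u in pages}
    print(f"{len(pages)} pages across {len(domains)} domains")  # 3 pages across 2 domains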
     
  9. xiphre

    xiphre Regular Member

    Joined:
    Jun 9, 2007
    Messages:
    290
    Likes Received:
    84
    Location:
    EU
    Yes exactly :)

    Hackzu, you are taking things a bit too far :rolleyes:
     
  10. echaz

    echaz Regular Member

    Joined:
    Feb 5, 2009
    Messages:
    236
    Likes Received:
    77
    Wow, thanks for these amazing replies!

    You guys really helped me understand SB much, much better!

    Especially crazyflx... I really enjoy reading your posts about SB! Thanks so much!
     
  11. PandaMeat

    PandaMeat Newbie

    Joined:
    Jun 30, 2010
    Messages:
    34
    Likes Received:
    34
    Well, you WANT a link from every page of a site - you're just unlikely to get them all in one run. Crawl those 3,000 domains, filter the URLs so you keep only the blog posts (get rid of static pages, /feed/, /rss/, tags, etc.), put them into Excel, fill the next column with the RAND() function to generate a random number beside each URL, sort by that column to randomize the pages, and then run them XXX at a time or whatever.
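
    If you'd rather not do it in Excel, here's a loose Python equivalent of the same workflow (the skip patterns, batch size and file names are just examples):

    Code:
    import random

    SKIP = ("/feed", "/rss", "/tag/", "/category/", "/page/")  # tweak to taste
    BATCH_SIZE = 500  # the "XXX at a time" part

    with open("crawled_urls.txt") as f:  # hypothetical input file
        urls = [line.strip() for line in f if line.strip()]

    # Keep only post-style URLs, then shuffle - same effect as sorting on a RAND() column.
    posts = [u for u in urls if not any(p in u for p in SKIP)]
    random.shuffle(posts)

    # Split the randomized list into batch files to run a chunk at a time.
    for i in range(0, len(posts), BATCH_SIZE):
        with open(f"batch_{i // BATCH_SIZE + 1:03d}.txt", "w") as out:
            out.write("\n".join(posts[i:i + BATCH_SIZE]) + "\n")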
     
  12. quadratic

    quadratic Registered Member

    Joined:
    Oct 26, 2009
    Messages:
    69
    Likes Received:
    46
    Scrapebox has the ability to take a list and randomize it, so you can use that for this process without needing Excel.
     
  13. Dragonz

    Dragonz Registered Member

    Joined:
    Feb 22, 2008
    Messages:
    51
    Likes Received:
    6
    I was reading another thread suggesting you should remove duplicate domains for WordPress blogs in SB because Akismet will pick up on your domain and blacklist it - is that true?
     
  14. «ßH» ¢Ödê ▬ Wâ®®îÖ®

    «ßH» ¢Ödê ▬ Wâ®®îÖ® Newbie

    Joined:
    Sep 12, 2010
    Messages:
    3
    Likes Received:
    5
    OP - as long as you have fed the commenter a varied list of names, email addresses and URLs (Web 2.0 pages & cheap .info buffers in a linkwheel structure ideally, which will also fool Akismet if done right) - and, very importantly, well-spun, coherent and topical comments - you won't have a problem. Obviously, if you scrape a good domain (auto-approve, high PR, etc.), you want to get proper mileage out of it - don't limit yourself to just one URL from it.

    As has already been mentioned in numerous other SB threads and guides, one easy way to scrape all of the potential URLs from those high-PR, auto-approve domains you have already found is to trim that list of good domains to root, remove duplicates, copy and paste them into the keywords box with a 'site:' prefix (i.e. 'site:www.good-domain.com'), choose the corresponding footprint, and harvest away.
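
    If that list of good domains is long, a few lines of Python can build the 'site:' entries for you (the file names here are placeholders):

    Code:
    from urllib.parse import urlparse

    # Assumes each line is a full URL like http://www.good-domain.com/some-post
    with open("auto_approve_urls.txt") as f:
        lines = [line.strip() for line in f if line.strip()]

    # "Trim to root" and de-dupe in one step: keep just the domain part of each URL.
    roots = {urlparse(u).netloc or u for u in lines}

    with open("keywords_for_sb.txt", "w") as out:
        for domain in sorted(roots):
            out.write(f"site:{domain}\n")  # paste these into the keywords box with your footprint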

    Google's algorithm for identifying domains to deindex is dynamic and takes into account a lot more factors than just the total number, frequency, and domain and PR distribution of backlinks. When you get time, conduct your own research into this with some expendable domains of varied age, TLD and content to see how far you can push things before you get deindexed - a lot of BH'ers never really do this and are overly cautious after their first G deindexing, IMO.

    Shadow - Akismet is pretty vigilant these days and will generally be wise to your devious WP spamming ways a lot sooner than G will be. Once again, setting up buffers in a structured linkwheel / satellite arrangement and using that list of sites to vary your backlink URLs will get you around this. WP is generally the best prepared with anti-spam measures (default WP setups have comments set to admin-approve, etc.), so start looking into custom footprints, research 'learn mode' in the SB manual, or use PHP scripts / iMacros / Roboform for that custom-footprint commenting.

    "Give a man a match and he will be warm for a minute - but set him on fire, and he will be warm for the rest of his life"

    :firedevil
     
    • Thanks x 1