
Negative SEO vs. MattCutts.com

Discussion in 'Black Hat SEO' started by danailo, May 14, 2013.

  1. danailo

    danailo Senior Member

    Joined:
    Apr 1, 2011
    Messages:
    1,190
    Likes Received:
    766
    Location:
    http://fortunelords.com
    Home Page:
    Negative SEO has been a hot topic for a number of years now, and we all know the typical routes to demoting your competition:

    1) buy loads of crappy links (e.g. Xrumer/Fiverr blasts)
    2) duplicate the target site across hundreds of other domains (Squidoo lenses, for instance)
    3) breach GWMT security and de-index the site
    4) manipulate mass spam annotations (email reports/DMCAs/Spamhaus etc.)
    5) link takedowns (email the people linking to a competitor and ask, or threaten, them to remove the links)

    Any of those might work, but none are particularly reliable.
    Most actually stand a good chance of positively influencing the target's results instead.
    Crucially, it's not hard to notice this stuff going on, as long as you know where and how to look.

    A savvy SEO would probably take another route.*


    [​IMG]
    (*by recruiting Pandas & Penguins, and directing
    them towards problems you can create).

    If I were going to look for SEO vulnerabilities, I would be more inclined to carry out a full SEO audit on a competitor's website, look for weak spots, and then try to exploit those weaknesses.
    Example: MattCutts.com

    Let's just say I have a site that I want to rank for the term "iPhone User Agent".
    While doing some competitive research, I stumble upon the domain mattcutts.com, which ranks 4th:
    [​IMG]
    So after checking out his site, we can see that it's powered by WordPress and therefore *could* have all sorts of vulnerabilities, but let's assume the blog owner keeps everything up to date and doesn't have any rogue plugins installed.
    Manufacturing Duplicate Content

    Let's be clear: if someone copies your content, there is very little you can do.
    Let's also be clear: Google is really bad at identifying the original source of any piece of content.

    BUT - it is YOUR RESPONSIBILITY to make sure you haven't allowed a technical problem that might result in duplicate content issues on your own site.
    There are two main ways of manufacturing on-site duplicate content, and both share one major dependency:
    (if this is not in place, this kind of negative SEO will not work)

    ** a page with a DIFFERENT URL containing the same content MUST return an HTTP 200 response **
    You could just add a query string, for instance: www.mattcutts.com/blog/iphone-user-agent/?variable=dupecontent - that would serve up the page again with the same content, thus creating duplicate content.
    UNLESS the target site has rel=canonical correctly implemented (which it does).
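    For clarity, here is a rough sketch (my own, in Python - not part of the original post) of how you could check whether a query-string variant of a URL still returns a 200 and whether its rel=canonical points back at the clean URL; the example URL is the iPhone post above, and the regex is deliberately naive:
    Code:
    # Quick duplicate-content check: does a query-string variant of a URL
    # return HTTP 200, and does rel=canonical point back to the clean URL?
    # (urlopen raises for 4xx/5xx responses, which also means "no 200 here".)
    import re
    import urllib.request

    CLEAN_URL = "http://www.mattcutts.com/blog/iphone-user-agent/"
    DUPE_URL = CLEAN_URL + "?variable=dupecontent"

    def fetch(url):
        """Return (status_code, html) for a URL."""
        req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
        with urllib.request.urlopen(req) as resp:
            return resp.status, resp.read().decode("utf-8", errors="replace")

    def canonical_href(html):
        """Pull the href out of the first <link rel="canonical"> tag, if any."""
        for tag in re.findall(r"<link[^>]+>", html, re.I):
            if re.search(r'rel=["\']canonical["\']', tag, re.I):
                m = re.search(r'href=["\']([^"\']+)["\']', tag, re.I)
                return m.group(1) if m else None
        return None

    status, html = fetch(DUPE_URL)
    canonical = canonical_href(html)
    if status == 200 and canonical != CLEAN_URL:
        print("Potential duplicate-content hole:", DUPE_URL)
    else:
        print("Protected - status", status, "canonical ->", canonical)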
    But that's not all we can do when screwing around with URLs...
    Wildcard Subdomains

    Basically, a really bad idea 99% of the time. You can spot a site that allows wildcard subdomains by inserting a random character string in front of the fully qualified domain; let's take a look at what happens when you go to:
    blahblah.martinmacdonald.net
    [​IMG]
    That is behaving EXACTLY as it's meant to: it returns a straight 404 Not Found response code. But let's look at a similar page on MattCutts.com:
    [​IMG]
    Using ANY subdomain apart from www.* results in a full duplicate copy of the Matt Cutts blog, with each page returning an HTTP 200 response. Pretty handy if you wanted to, for instance, de-index it for a specific keyword term. [IMG]
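
    As an aside, a quick way to test for this behaviour yourself is something along these lines (a minimal Python sketch of my own, not from the post): a 200 on a random subdomain suggests a wildcard setup, while a DNS error or 404 means the site behaves like martinmacdonald.net above.
    Code:
    # Probe a random subdomain: wildcard configs answer with 200,
    # properly configured domains fail DNS lookup or return 404.
    import random
    import string
    import urllib.error
    import urllib.request

    def check_wildcard(domain, path="/"):
        label = "".join(random.choices(string.ascii_lowercase, k=12))
        url = "http://%s.%s%s" % (label, domain, path)
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                return url, resp.status
        except urllib.error.HTTPError as e:
            return url, e.code                            # e.g. 404 Not Found
        except urllib.error.URLError as e:
            return url, "no response (%s)" % e.reason     # e.g. NXDOMAIN

    for domain in ("mattcutts.com", "martinmacdonald.net"):
        print(check_wildcard(domain))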

    So let's pick on that iPhone post again. For Google to see the duplicate, I need to link to it; something like this would work: iPhone User Agent. Now Google is going to see the page and index it - unless that rel=canonical is set up correctly again, so let's check:
    [​IMG]

    As you can see in the screenshot, that page is correctly canonicalised to the correct page on the REAL subdomain.
    DAMMIT. The blog is protected. Isn't it?

    Well, not quite. Knowing that the server is accepting wildcard subdomain requests, we know there is an SEO vulnerability - and that motivates us like nothing else to find a route through the protection, so let's fire up Screaming Frog!
    The first thing is to configure the spider to check for the inclusion (or omission) of a rel=canonical tag (handy guide available here from SEER) and find pages that DO NOT HAVE a rel=canonical in the source code.
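    If you don't have Screaming Frog handy, a rough scripted stand-in (my own sketch, not the configuration described above; the URL list is purely illustrative) would simply flag pages whose source contains no rel=canonical tag at all:
    Code:
    # Flag pages with no rel=canonical in their HTML source.
    import re
    import urllib.request

    URLS_TO_CHECK = [
        "http://www.mattcutts.com/",
        "http://www.mattcutts.com/blog/",
        "http://www.mattcutts.com/blog/iphone-user-agent/",
    ]

    def has_canonical(html):
        for tag in re.findall(r"<link[^>]+>", html, re.I):
            if re.search(r'rel=["\']canonical["\']', tag, re.I):
                return True
        return False

    for url in URLS_TO_CHECK:
        req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
        with urllib.request.urlopen(req) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        if not has_canonical(html):
            print("No rel=canonical:", url)  # candidate for duplicate-content abuse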
    Surprisingly, the crawl turned up a few URLs; crucially, the homepage and the /blog/ homepage are on that list. So now we know we can mess with the rankings of those pages by creating on-domain duplicate content, and using Searchmetrics we can pretty quickly work out what those pages rank for:

    [​IMG]
    I particularly like the rankings for "SEO Blog" & "webmaster blog", so let's start with those. Simply put, the links in this line of text should be enough to use the Panda algorithm update to confuse Google into not ranking the site, by creating domain-level duplicate content.
    By linking to the duplicate pages with exact-match anchor text, we are theoretically sending Google a signal that those pages are significant for the query used in the anchor text, and because MattCutts.com serves the same page as it would on the 'www.' subdomain, we're creating duplicate content.
    Sweet!

    So we've found a vulnerability and taken advantage of it to hurt rankings on the target site, but what about that iPhone user agent post?
    Well, unfortunately the above only really works on pages where you can create duplicates without a canonical tag, and that post has one - so this will not directly impact that ranking. It will, however, negatively impact the site as a whole, and if it were replicated hundreds of thousands or millions of times, it is likely to cause significant crawl equity problems and subsequent ranking problems for the main site.

    Hey, I'd love to help test this:

    If you would like to help out with this negative SEO test, please just link to the following pages with these anchor texts:
    http://seo-blog.mattcutts.com/blog/ Anchor text: "SEO Blog"
    http://webmaster-blog.mattcutts.com/blog/ Anchor text: "Webmaster Blog"

    From any sites that you have direct access to. As with all negative SEO tests, I strongly recommend only doing this on domains that you can immediately remove the links from in future should the need arise.
    Hey, I'm Matt Cutts and I'd like to prevent this:

    Two ways: either correct the server config to disallow wildcard subdomains,
    OR
    make totally sure that every page on your site has the correct fully qualified URL within the rel=canonical tag.
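
    If you own the site, the second fix is easy to audit with a short script. Here is a sketch of my own (the host name and URL list are placeholders, not anything from the post) that checks each page's canonical is a fully qualified URL on the one host you actually want indexed:
    Code:
    # Owner-side audit: every canonical should be absolute and on CANONICAL_HOST.
    import re
    import urllib.request
    from urllib.parse import urlparse

    CANONICAL_HOST = "www.mattcutts.com"   # the only host you want indexed
    PAGES = [
        "http://www.mattcutts.com/",
        "http://www.mattcutts.com/blog/",
    ]

    def canonical_href(html):
        for tag in re.findall(r"<link[^>]+>", html, re.I):
            if re.search(r'rel=["\']canonical["\']', tag, re.I):
                m = re.search(r'href=["\']([^"\']+)["\']', tag, re.I)
                return m.group(1) if m else None
        return None

    for url in PAGES:
        req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
        with urllib.request.urlopen(req) as resp:
            href = canonical_href(resp.read().decode("utf-8", errors="replace"))
        if not href or urlparse(href).netloc != CANONICAL_HOST:
            print("FIX:", url, "-> canonical is", href)
        else:
            print("OK: ", url)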

    Footnote:

    A quick way of finding which pages have been linked to externally but do not carry a rel=canonical (and hence are indexed) is by doing this:


    [​IMG]
    which reveals some other interesting stuff:

    1) there is a duplicate installation of WordPress on mattcutts.com under the /blog1/ folder. My guess is that if you wanted to brute-force a WP installation on that domain, choosing this one would probably be a good idea.

    2) the second orange arrow points out the subdomain seomofo.mattcutts.com, so I assume that Darren Slatten has also noticed this vulnerability.

    Source
     
    • Thanks Thanks x 7
    Last edited: May 14, 2013
  2. ComputerEngineer

    ComputerEngineer Senior Member

    Joined:
    Apr 25, 2012
    Messages:
    833
    Likes Received:
    70
    Well, negative SEO definitely works, and I am another victim of it.
    I got spammed with thousands of unrelated adult keywords as blog comments.
    I don't know who did it, but it definitely affected me.
     
  3. Smeems

    Smeems Regular Member

    Joined:
    Apr 29, 2012
    Messages:
    425
    Likes Received:
    417
    • Thanks Thanks x 3
  4. UrsuAke

    UrsuAke Power Member

    Joined:
    Sep 28, 2011
    Messages:
    700
    Likes Received:
    978
    Occupation:
    SEO Specialist.
    Location:
    Romania, land of choice
  5. UrsuAke

    UrsuAke Power Member

    Joined:
    Sep 28, 2011
    Messages:
    700
    Likes Received:
    978
    Occupation:
    SEO Specialist.
    Location:
    Romania, land of choice
    • Thanks Thanks x 1
  6. kryler

    kryler Junior Member Premium Member

    Joined:
    Jun 14, 2012
    Messages:
    103
    Likes Received:
    53
    Location:
    Wales, UK
    I've always wondered: if negative SEO really were such a massive way of getting a domain down in the rankings, then surely using Matt Cutts' website as a target would be the perfect chance to test it out?

    If Matt's site drops, monitor what changes he makes to get it back up in rankings and apply that to your own if you get hit.

    Obviously he has certain "advantages" over the average SEO dude though.
     
    • Thanks Thanks x 1
  7. bobred

    bobred Registered Member

    Joined:
    Dec 21, 2011
    Messages:
    98
    Likes Received:
    63
    Unfortunately, we mere mortals don't have the ability to flick an I-win button ;)
     
  8. Ste Fishkin

    Ste Fishkin "I'm watching you.." - Apricot Jr. VIP Premium Member UnGagged Attendee

    Joined:
    May 14, 2011
    Messages:
    1,829
    Likes Received:
    8,688
    Occupation:
    Rands Sex Slave
    Location:
    England
    It's in his interest not to let it drop, because if it does drop from NSEO, he gets fired.
     
  9. kryler

    kryler Junior Member Premium Member

    Joined:
    Jun 14, 2012
    Messages:
    103
    Likes Received:
    53
    Location:
    Wales, UK
    TBH I wouldn't be surprised if certain sites (including his) have MUCH more weight in the algorithms than they let on.
     
  10. roaldd

    roaldd Registered Member

    Joined:
    Aug 26, 2012
    Messages:
    62
    Likes Received:
    4
    lol, sickening
     
  11. omnipotent$

    omnipotent$ Regular Member

    Joined:
    Mar 23, 2013
    Messages:
    493
    Likes Received:
    288
    I remember a group of webmasters tried taking down SEOmoz in the SERPs with negative SEO. If people can't neg SEO a site like that, they're going to have a tough time trying to take down the head of Google webspam's personal site.

    If anything, all the links thrown at his site are just going to be devalued.
     
  12. UrsuAke

    UrsuAke Power Member

    Joined:
    Sep 28, 2011
    Messages:
    700
    Likes Received:
    978
    Occupation:
    SEO Specialist.
    Location:
    Romania, land of choice
    Once upon a time, amazon.com got the ban hammer for about 24 hours. The work of some great SEOs. I know one individual who took part in that epic challenge.

    True story.

    Anything is possible; you just have to do it.
     
  13. perfectgirls.net

    perfectgirls.net Newbie

    Joined:
    Apr 17, 2013
    Messages:
    26
    Likes Received:
    5
    As far as negative SEO on SEOmoz goes, that's a pretty massive site to affect. Agreed, anything is possible...
     
  14. danailo

    danailo Senior Member

    Joined:
    Apr 1, 2011
    Messages:
    1,190
    Likes Received:
    766
    Location:
    http://fortunelords.com
    Home Page:
    We have some discussion here :)
     
  15. omnipotent$

    omnipotent$ Regular Member

    Joined:
    Mar 23, 2013
    Messages:
    493
    Likes Received:
    288
    I think the real challenge is negative SEO that results in a permanent de-index or a long-term penalty for big brand sites. We all know negative SEO works on sites not yet fully trusted by Google, which leaves all of us attacking each other while the big brands take all the money.

    And I wouldn't be surprised if some random guys decided to use blog networks/valuable link sources to trip some kind of anchor filter against SEOmoz, which then allowed SEOmoz to report those networks/sites/links to Google. Thus, devaluation/de-indexing/penalties for the webmasters using those sources.

    Talk about a massive backfire in the attempt to take down SEOmoz. Again, this is just a theory, but who knows.

    On another note, I think Google's disavow tool is their crowd-sourcing method for devaluing link sources, not a way to help webmasters out. Reconsideration requests have worked for many in the past, so in my mind that leaves the disavow tool as little more than false hope with deeper intentions on Google's part.

    ...the BH mind wanders at times...
     
    Last edited: May 15, 2013