
SEO Relativity & Ranking in Big G

Discussion in 'Black Hat SEO' started by cybersage, Nov 5, 2011.

  1. cybersage (Regular Member)

    Okay, so this probably isn't the world's greatest 100th post (in fact it's only my 99th!), but hey, it's better than some people's (this sentence is not an excuse to stop reading either!). Lately I have seen a couple of threads regarding Big G, how they look at SEO, Big G's algorithm, etc. Instead of writing a tutorial/guide on how I do this or that, I figured I'd talk about SEO and hopefully clear things up for some, or get some people who know more than I do to tell me I'm an idiot (but back it up, because otherwise it has no learning value attached to it). I want to say that to me this is a top-down look, as I don't get into many specifics about ranking a particular site, just a philosophy/perspective on how to go about it.

    My definition of SEO is basically trying to rank my page number 1 in the SEs, by any methods available to me. You hear people in IM say "SEOs" to refer to people that perform SEO services; all this makes me think that the term SEO itself is relative and people attach their own meaning to it. I personally draw no distinction between whitehat and blackhat, except to say that the only thing I don't condone would be building rank through hacking. To me just about everything I do is blackhat, because I am trying to game the system; the key, as I will mention below, is keeping things looking as natural as possible.

    Let's talk a little history, and why we do the things we do. Most of what people know and do comes from Big G's 2004 patent:

    Code:
    hxxp://appft1.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PG01&p=1&u=%2Fnetahtml%2FPTO%2Fsrchnum.html&r=1&f=G&l=50&s1=%2220050071741%22.PGNR.&OS=DN/20050071741&RS=DN/20050071741
    That patent was published 6 years later in 2010; the document helped confirm methods that people were using to rank. In fact many of the items that we talk about every day are contemplated in this patent, and this is one of the primary reasons why we focus on many of these factors. The patent mentions numerous factors, including things like CTR, frequency of content changes, freshness of links, age of domain, anchor text, and anchor-text changes. Now it is well known that Big G is constantly updating its algorithm, so things are constantly in flux, but the foundation itself would be extremely hard to change (meaning that things like the number of backlinks will be massively important until the algo is completely reworked). If we keep in mind that Big G is in the business of providing the most relevant search results, then we can assume that algorithm updates are targeted toward achieving that goal.

    What most people don't realize is that when Big G refers to SEO, they ascribe a different meaning to it. There is a reason Matt Cutts said that they don't think of SEO as spam; that reason is simple: Big G's definition of SEO is what most BHW members would consider on-site SEO. This is also why Big G likes WordPress sites: they make the content easily digestible by the search engines. A WP site with a couple of plugins can have amazingly optimized on-site SEO.

    According to Big G, any form of unnatural backlinking is blackhat. Again, according to Big G, whitehat is merely on-site optimization that makes your content easily digestible for the search engines. Thus, Big G puts a lot of effort into devaluing links that are built unnaturally. The problem is that going viral is very unnatural looking (so there must be exceptions to the general rule; I believe this is what can trigger manual reviews). Big G's algorithm therefore MUST look for unnatural links being built. This begs the question: how should one build links?

    Here are some of the things we know:

    • Backlinking can get you to the first page (even with crap content), but we know that Big G does manual reviews and the content must hold up. I can't think of many [profitable] sites that rank #1 without on-site SEO.
    • Big G has a preference for original content; duplicate content gets put in its supplemental index, which has some impact as far as ranking is concerned (at least most believe this, though some may disagree).
    • Every type of link matters. This statement might be a little exaggerated, but people's rankings improve even with nofollow links!
    • After reading many threads, you will notice that the more expert people start alluding to things such as "link velocity" (how fast links are built) and "link diversity" (where links are built). This is really a no-brainer: if you have a site in WMT, you can see how your links are growing, and a spike looks completely unnatural (for example, when over 5 years your site accumulates 5k links, then you go and do an Xrumer/Scrapebox blast and create another 5k links overnight; see the sketch after this list).
    • The above point and this one are crucial: IT IS ALL RELATIVE, because depending on certain factors it might be appropriate to create tons of new links a day. But as many have said, you want to be consistent with your link building. Another way of putting it: if you are building 10k links per week and then you stop building links, this raises a red flag.
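    To make the "link velocity" point concrete, here is a rough sketch. Nobody outside Big G knows the real signals or thresholds, so the function name, the 8-week window, and the 5x spike ratio below are purely hypothetical numbers chosen for illustration; the only idea it demonstrates is that a sudden jump in new links stands out against a site's own history.

    Code:
    # Hypothetical illustration of the "link velocity" idea; the thresholds are made up.
    def flag_link_spikes(weekly_new_links, window=8, spike_ratio=5.0):
        """Flag weeks whose new-link count dwarfs the trailing average.

        weekly_new_links: new backlinks gained each week, oldest first
        window:           how many previous weeks to average over
        spike_ratio:      multiple of the trailing average that counts as a spike
        """
        flagged = []
        for i, count in enumerate(weekly_new_links):
            history = weekly_new_links[max(0, i - window):i]
            if not history:
                continue
            baseline = sum(history) / len(history)
            if baseline > 0 and count >= spike_ratio * baseline:
                flagged.append((i, count, baseline))
        return flagged

    # Example: ~20 new links/week for five years, then a 5k blast in one week.
    history = [20] * 260 + [5000]
    for week, count, baseline in flag_link_spikes(history):
        print(f"week {week}: {count} new links vs ~{baseline:.0f}/week before it")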

    What types of backlinks should I build?

    Certain types of backlinking LOOK more legitimate, but the truth is that they are still blackhat (because you are creating these links for yourself). Web 2.0s and social bookmarking are the types of links that look most natural, however, and thus you are less likely to have those links devalued. Sure, forum profiles and blog comments CAN look natural, but when you make 100s or 1000s of these links in a day, it becomes unnatural looking. The problem in the IM game is that few, if any, are quite sure of the threshold between natural and unnatural link building (this is why people play it safe and point Scrapebox and similar links at lower-level tiers; it is less likely to get their main site penalized).

    Basically it seems to me that people's sites get hit either by having crappy content or by unnatural linking (usually both, as the latter can trigger a review, and once a site with crappy content is reviewed, it can kiss its rank goodbye).

    Perhaps this is just a rant, or an attempt to educate. I can't express just how awesome this community is, and this was an attempt to give back. My hope is that this post allows some to see SEO in a new light and shows how to continue "gaming" the system to achieve good rankings.

    In a nutshell:
    Sometimes knowing the bigger picture allows one to draw logical conclusions about how to game the system.

    Feel free to hit thanks or rep if this helped you.
     