[Theory] On-page density

Discussion in 'Black Hat SEO' started by gman777, Apr 19, 2017.

  1. gman777

    gman777 Jr. VIP Jr. VIP

    Joined:
    Apr 7, 2016
    Messages:
    641
    Likes Received:
    487
    Not sure if this has been discussed before, but I want to talk about it.

    Many people say that the tolerated keyword density differs across keywords from different niches.

    IMO, keyword density is applied at a global level, so it's not like the sciatica niche should tolerate a higher % than casino, for instance.

    I think it has to do with the authority of the site, or, to be more precise, with the number of unique backlinks (referring domains).

    The higher the number of generic/brand/naked links, the higher the keyword density you can get away with.

    But two questions popped into my head:

    1. Is it actually because the higher number of unrelated-anchor (generic etc.) backlinks dilutes the keyword target?

    2. Or is that particular website in fact penalized, but, due to its enormous backlink profile, you just don't notice?

    Because I see it like this: depending on how much you optimize for a particular keyword, your site gets a score. If you increase the density up to a certain point, the on-page score will increase, but if you pass that threshold, your score will get smaller, which results in the penalization.

    But because you compensate with a huge, diversified profile, you won't notice it.
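    The threshold idea described above can be sketched as a toy scoring function. Everything here is invented for illustration (the 2.5% cutoff, the linear slopes, the additive link score); it's just the shape of the theory, not anything Google has confirmed:

```python
# Toy model: on-page score rises with keyword density up to a threshold,
# then decays past it (the over-optimization penalty). All numbers invented.

def onpage_score(density_pct: float, threshold: float = 2.5) -> float:
    """Score grows linearly up to the threshold, then shrinks past it."""
    if density_pct <= threshold:
        return density_pct / threshold            # 0.0 .. 1.0
    return max(0.0, 1.0 - (density_pct - threshold) / threshold)

def visible_rank_signal(onpage: float, link_score: float) -> float:
    """A big link profile can mask an on-page penalty, as the post suggests."""
    return onpage + link_score

print(onpage_score(1.25))   # 0.5 - under the threshold
print(onpage_score(5.0))    # 0.0 - fully penalized
print(visible_rank_signal(onpage_score(5.0), 10.0))  # 10.0 - links hide the penalty
```

    Under this toy model, two sites with the same bad on-page score look completely different once link scores are added, which is exactly why the penalty would be invisible on big sites.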

    If I'm not mistaken, even phpbuilt said that your keyword can get penalized by just a few positions.

    That'd make my on-page study futile. I'd have to test at different backlink counts to make it worthwhile.

    What do you think?
     
  2. davids355

    davids355 Jr. VIP Jr. VIP

    Joined:
    Apr 25, 2011
    Messages:
    9,831
    Likes Received:
    7,436
    Home Page:
    Personally I think the opposite - I think it can vary across niches and types of keywords.

    I think Google looks for anomalies rather than a specific density percentage.

    But I'm only guessing, plus basing it a little on the fact that I see lots of different densities in different niches when looking at the top 5 results for different terms.

    Also, it's more complicated, because I think Google now won't even separate exact match / partial match / LSI, because they will group keywords together as having either the exact same meaning or a similar meaning.

    I'm sure Ahrefs have done some studies on this stuff, and that's probably the most accurate info you will get, because it needs to be tested on a fairly big scale.
     
  3. elavmunretea

    elavmunretea BANNED BANNED

    Joined:
    May 14, 2016
    Messages:
    1,579
    Likes Received:
    2,090
    This is the way I see it:

    Google has some massive, complex algorithm with millions of stages.

    That algorithm likely attaches a certain number of points - or some numerical identifier - to different metrics. For example (literally just an example), your page may get 15 points from the links coming in and -5 points for keyword stuffing. You would still have 10 points.

    This would mean that, like you suggested, doing one thing wrong (not as well as you should) doesn't mean the article won't rank.
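    A minimal sketch of that additive points idea, using the exact numbers from the example above (the signal names and weights are made up):

```python
# Each signal contributes positive or negative points; the page ranks on the
# sum, so one bad area doesn't zero out the whole page.
signals = {
    "inbound_links": 15,     # points from the links coming in
    "keyword_stuffing": -5,  # penalty points
}

total = sum(signals.values())
print(total)  # 10
```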
     
  4. gman777

    gman777 Jr. VIP Jr. VIP

    Joined:
    Apr 7, 2016
    Messages:
    641
    Likes Received:
    487
    Yes, I'm also quite sure that they don't use a % as part of the algo. It's more a guideline that we use to get an idea of what should be good.

    But do you think they actually wasted their time creating a "rule" for every single niche? I don't think so.

    As for the exact match/LSI thing, well, we can isolate that down to just one.
     
  5. aidenhera

    aidenhera Elite Member

    Joined:
    Nov 30, 2016
    Messages:
    1,595
    Likes Received:
    277
    Gender:
    Male
    I used to think the density rule applies at a global level, but now I think it doesn't apply at all.

    It's keyword proximity that matters. I ranked sites with 20% keyword density on the first page of G, and couldn't rank pages with 0.50% density at all (N/A).

    I bulk-fixed keyword proximity in certain articles and the next day they were all on the first page of G.

    Believe it or not, your own conclusions are the most important anyway.

    However, if I can, I still try to keep 0.50% density, as the pages with that performed best in my bulk ranking test, while keeping reasonable keyword proximity of course.

    Keyword proximity recrawls apply very quickly; in my case the next day, without using indexers.

    Not sure about the keyword density recrawl rate; that might work this fast as well.
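    For reference, here is roughly how the two metrics being compared are usually computed. This is just the textbook word-counting definition of each, not how Google measures anything:

```python
# Keyword density: occurrences of a term as a share of total words.
# Keyword proximity: smallest word gap between the parts of a multi-word keyword.

def keyword_density(text: str, term: str) -> float:
    words = text.lower().split()
    return 100.0 * words.count(term.lower()) / len(words)

def keyword_proximity(text: str, term_a: str, term_b: str) -> int:
    """Smallest distance (in words) between the two terms; bigger = weaker."""
    words = text.lower().split()
    pos_a = [i for i, w in enumerate(words) if w == term_a]
    pos_b = [i for i, w in enumerate(words) if w == term_b]
    return min(abs(a - b) for a in pos_a for b in pos_b)

text = "best washing machine reviews for a quiet washing machine"
print(round(keyword_density(text, "washing"), 1))     # 22.2 (2 of 9 words)
print(keyword_proximity(text, "washing", "machine"))  # 1 - the terms are adjacent
```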
     
    Last edited: Apr 19, 2017
  6. gman777

    gman777 Jr. VIP Jr. VIP

    Joined:
    Apr 7, 2016
    Messages:
    641
    Likes Received:
    487
    What is that?

    Anyway, guys, on-page is tightly connected to the keywords you use on that page. If you don't add the keyword in the title, or somewhere in the content, then you won't rank, unless of course other keywords that are used happen to be correlated.

    But even then, you more likely won't rank, as the more distant the keywords are from the main keyword, the less emphasis will be placed on that keyword.

    So if you have "busty girls fucks in the fake taxi car" but you write it without "car", then you'll get less power; if you remove "taxi", even less, to the point where you may not even rank (at least that's what happened with my website, I guess).

    Let's not complicate this too much. It's really not that difficult; Google may just want us to believe it is. There should be some kind of connection in between, and finding it would make me a demi-god in SEO.
     
  7. aidenhera

    aidenhera Elite Member

    Joined:
    Nov 30, 2016
    Messages:
    1,595
    Likes Received:
    277
    Gender:
    Male
    @gman777 For example: if you repeat a specific word - the keyword, say - too often in a single paragraph.
     
  8. davids355

    davids355 Jr. VIP Jr. VIP

    Joined:
    Apr 25, 2011
    Messages:
    9,831
    Likes Received:
    7,436
    Home Page:
    Well, I don't know. I just think about it from their point of view -
    They are thinking, first of all: how do we determine what a page is about? We determine this from the anchor text people use when linking to it.
    So if everyone links to a page using the text "fishing", then it's about fishing.

    But wait a minute - now this particular site has "fishing" as the anchor text for every one of its external links; that seems engineered.
    So how do we determine what the real, natural inbound anchor text distribution is?
    Would it be the same for every keyword? Probably not - the famous Adobe Acrobat Reader page that was ranking number 1 for "download here" (or something like that) is a good example - it gets a much higher percentage of anchors for that term than any other site does.

    Some random terms, like "what is the best way to clean out your ears", might not have any direct-match anchors at all.

    So if I think like Google, I would probably be looking for anomalies. For example: if there are a thousand sites with on-page text related to fishing, and on average 2% of their inbound links use the direct-match anchor "fishing", but one site has 25% direct-match anchors, I would devalue that one site because it is an anomaly - in other words, it sticks out like a sore thumb.

    Furthermore, if I were Google I would probably be looking at the entire picture - not just the keyword density as one metric and the backlink anchor text as another. I would look at all of it together: the on-page text, the interlinking from other pages on the same site, and the external backlinks. How many times is the word "fishing" mentioned? And how does that compare to other websites that also compete for that keyword?

    Then I have a group of sites - the lower extreme has a combined keyword density of 0.001%, the upper extreme has a combined keyword density of 20%. What is the mean/average keyword density?

    And perhaps there is a calculation they can use - for example, the most relevant sites for any particular keyword might have a keyword density that is X points above the average density for that keyword as a whole. So it is a fixed formula, but not a fixed percentage, because it depends on what the average density is for that particular niche.
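    That "compare against the niche average" idea can be sketched with a simple z-score check. The data and the cutoff are invented; this only shows the shape of the calculation being described, not anything Google actually runs:

```python
# Flag a site whose exact-match anchor percentage sits far from the niche average.
from statistics import mean, stdev

niche_anchor_pct = [2.1, 1.8, 2.4, 2.0, 1.9, 2.2, 25.0]  # last site stands out

def is_anomaly(value: float, sample: list, cutoff: float = 2.0) -> bool:
    """True if the value is more than `cutoff` standard deviations from the mean."""
    return abs(value - mean(sample)) / stdev(sample) > cutoff

print(is_anomaly(25.0, niche_anchor_pct))  # True  - sticks out like a sore thumb
print(is_anomaly(2.0, niche_anchor_pct))   # False - in line with the niche
```

    A real version would use a robust statistic (median and MAD, say), since the outlier itself inflates the mean and standard deviation, but the idea is the same: the threshold comes from the niche, not from a fixed percentage.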

    I can't back this up with hard proof and it is only a theory of mine, but I can say for sure that I have seen plenty of niches where the top 10 sites have very low (or in fact non-existent) exact-match density in both their anchor profile AND on-page, and equally I have seen a lot of niches where the keyword density of the top 10 is much higher.

    There is a good study from ahrefs about anchor text -

    https://ahrefs.com/blog/anchor-text/

    Of course it finds that anchor text does influence results but it also mentions that some niches are very different to others and produce skewed results.

    Having said all of that, it would be great if you ran some experiments here - I'd really be interested to hear how you would go about it as well.
     
    • Thanks Thanks x 1
  9. Zatoichi

    Zatoichi Junior Member

    Joined:
    Nov 10, 2015
    Messages:
    101
    Likes Received:
    20
    This topic is one of the things I am currently trying to figure out regarding on-page SEO.

    I would like to add to the discussion by saying that Relevancy and Information Density should also be considered when evaluating the effects of keyword density.

    Relevancy, in my opinion, is the number and closeness of related terms to the main keyword (or keywords, if you try to optimize a page for several different keywords). Example: to rank for the main term "best washing machine", you need to talk about technical specifications, noise levels, overall size, principle of operation, quality of laundering...

    The unknown here is: how does Google determine relevancy? Does it use its own index to extract closely connected terms from ALL related websites, and then reward web pages for how many of the closely related topics they cover? Or do they use an external database? The latter seems unrealistic, even for Google, having in mind how comprehensive such a database would have to be to be of any use.

    Information Density I would define as semantically determining the quantity of information provided on a page. Example: "This washing machine is of largest capacity with quiet operation and good quality of washing." There are 15 words in that sentence and it gives us 3 distinct pieces of information about the product. That is an example of an information-rich sentence, in my opinion. I know that classifying information and making it computable (or at least, in this case, calculable) is the subject of serious scientific study. How far Google could have gone with this, I have no idea.
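    Taking the washing machine sentence literally, the metric would look something like distinct facts per word. The "facts" are hand-labelled here; automating that labelling is exactly the hard research problem mentioned above:

```python
# Naive information density: hand-counted facts divided by word count.
sentence = ("This washing machine is of largest capacity with quiet "
            "operation and good quality of washing.")
facts = ["largest capacity", "quiet operation", "good quality of washing"]

word_count = len(sentence.split())
info_density = len(facts) / word_count
print(word_count)              # 15
print(round(info_density, 2))  # 0.2
```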

    I would also lean towards on-page SEO as a whole being calculated by giving +/- points for each separately evaluated part, and then calculating the overall score for the page as it relates to the main keyword.

    I'll write more if I can remember other considerations regarding this that I have.
     
  10. gman777

    gman777 Jr. VIP Jr. VIP

    Joined:
    Apr 7, 2016
    Messages:
    641
    Likes Received:
    487
    OK, here's a better way to test this relevancy thing: write 10 articles without the keyword appearing anywhere, but written the way you would write about that keyword.

    Will you rank for that keyword? If yes, there's a correlation; if not, then we're just over-complicating things. Chances are it won't rank. Why? Because for one of my articles I checked a keyword that was related to the subject, and it didn't rank at all.

    Easy peasy.

    Google is just a machine, that's it. No need to ass-lick.
     
    Last edited: Apr 19, 2017
  11. dbanjo

    dbanjo BANNED BANNED

    Joined:
    Dec 4, 2016
    Messages:
    120
    Likes Received:
    48
    It might just be a machine, and I love how people throw that phrase around... but that doesn't somehow make this all easy...

    Have you ever played chess against a machine? That thing will kill most people, and it's fucking dumb in comparison to the algo.

    Just because it's a machine doesn't automatically mean a human can beat it. You can, but mostly because it allowed you to, rather than because you somehow tricked it.
     
  12. Nut-Nights

    Nut-Nights Jr. VIP Jr. VIP

    Joined:
    Jun 20, 2013
    Messages:
    5,009
    Likes Received:
    3,198
    Location:
    Hell
    Home Page:
    "Many people say that there are different keyword densities across keywords from different niches" - it really depends on lots of other things.
     
  13. littlewebdragon

    littlewebdragon Jr. VIP Jr. VIP

    Joined:
    Dec 30, 2007
    Messages:
    1,667
    Likes Received:
    816
    Occupation:
    Occupation
    Location:
    Location
    Here is one example of exactly what you have just described. :)