How can Google DECIDE if that content is high or low quality?!

Discussion in 'Black Hat SEO' started by Mr.Ilyass, Feb 8, 2013.

  1. Mr.Ilyass

    Mr.Ilyass Newbie

    Joined:
    Apr 29, 2012
    Messages:
    36
    Likes Received:
    5
    In recent months, after the Google updates, so many people have been complaining that Google can penalize your site if you have bad content, and that you won't rank well if you have low-quality or EVEN spun content!!

    The question here is: HOW can Google DECIDE whether content is high or low quality?!!
     
  2. hootsparta

    hootsparta Jr. VIP Premium Member

    Joined:
    Aug 23, 2012
    Messages:
    1,404
    Likes Received:
    328
    Occupation:
    Unique and Spun Blogpost, Real Guest Posting.
    Location:
    Sparta World to BHW
    It doesn't decide like that... it mainly checks for duplicate content, mate: whether the post has already been published somewhere else.
     
  3. axtolip

    axtolip Newbie

    Joined:
    Dec 7, 2012
    Messages:
    38
    Likes Received:
    5
    Does it look duplicate?
    Does it look spun?
    Does it look over optimized for a certain keyword?
    Does it have images and embedded videos?

    These are the kinds of things Google checks. That's the whole "quality" algorithm.
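
    Nobody outside Google knows the real implementation, but the "does it look duplicate" check is the easy part. Here's a minimal sketch in Python using word shingles and Jaccard similarity; the shingle size and threshold are arbitrary guesses, not anything Google has published:

```python
# Near-duplicate detection sketch: word n-gram shingles + Jaccard
# similarity. Shingle size and threshold are invented for illustration;
# Google's real system is unknown and far more sophisticated.

def shingles(text: str, n: int = 3) -> set[str]:
    """Return the set of n-word shingles (n-grams) in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: set[str], b: set[str]) -> float:
    """Jaccard similarity: |intersection| / |union|."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def looks_duplicate(doc: str, other: str, threshold: float = 0.5) -> bool:
    """Flag doc as a near-duplicate if shingle overlap exceeds threshold."""
    return jaccard(shingles(doc), shingles(other)) >= threshold

original  = "the quick brown fox jumps over the lazy dog near the river"
lazy_spin = "the quick brown fox leaps over the lazy dog near the river"
print(looks_duplicate(original, lazy_spin))  # True: swapping one word isn't enough
```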
     
  4. *zap*

    *zap* Regular Member

    Joined:
    Apr 7, 2008
    Messages:
    349
    Likes Received:
    98
    Occupation:
    none
    Location:
    UK
    Google this (remove the space after the colon): filetype: pdf quality rater guidelines
     
  5. kindarthur

    kindarthur Jr. VIP

    Joined:
    Nov 27, 2011
    Messages:
    2,212
    Likes Received:
    332
    I have one doubt here!

    If Google doesn't check the quality of the content, then I could write unique content without worrying about grammar and all that shit.

    Say the content is not spun and has no keyword stuffing.

    Then how does Google decide whether the content is quality or not?
     
  6. futureland

    futureland Registered Member

    Joined:
    Jan 26, 2013
    Messages:
    53
    Likes Received:
    18
    There are many ways to estimate the value of content:
    - Does your content cover all the related keywords (from Google Suggest)? (See the sketch below.)
    - Is it narrative and personal?
    - Is the grammar correct?
    - Do people come back to Google after reaching your page?

    There are plenty of criteria, but I would definitely bet everything on bounce rate. If people stay, they found what they wanted; and if so, there is no rule: even one sentence and a few good outbound links could do it. So it's not so much about structure as about visitor satisfaction.
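
    On the first point, coverage of related keywords is something you can measure yourself. A rough sketch, assuming you have already collected the suggest phrases by hand or with a scraper (the list here is made up):

```python
# Rough coverage check: what fraction of related keyword phrases
# (e.g. collected from Google Suggest) appear in the article?
# Purely illustrative; nobody outside Google knows how, or whether,
# such coverage is actually scored.

import re

def keyword_coverage(article: str, related: list[str]) -> float:
    """Fraction of related phrases that appear verbatim in the article."""
    text = article.lower()
    hits = sum(1 for kw in related
               if re.search(r"\b" + re.escape(kw.lower()) + r"\b", text))
    return hits / len(related) if related else 0.0

suggests = ["lose weight fast", "lose weight diet", "lose weight exercise"]
article = "To lose weight fast you need both a sensible diet and exercise."
print(f"{keyword_coverage(article, suggests):.0%}")  # 33%: only one phrase matches verbatim
```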
     
  7. twilightofidols

    twilightofidols Power Member

    Joined:
    Nov 11, 2011
    Messages:
    675
    Likes Received:
    223
    The Googlebot does not possess human-level AI, so it cannot distinguish quality content from poor content without the help of a manual reviewer. However, there are some very important clues that help Google determine whether there is a high probability that the content is valuable. I doubt anyone outside of Google knows them all, but the factors most people point to include...

    1. Uniqueness of the content
    2. Bounce rate
    3. Internal Linking
    4. Links from high-authority sites (e.g. Wikipedia, WebMD)
    5. Social Buzz

    etc... there are many more that could be used to infer that a site might be high quality. I also believe there are many ranking factors that are not indicative of quality content but nonetheless help with SEO; using headers, keyword density, and so on would be such factors.
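
    Since nobody outside Google knows the factors or their weights, treat the following as a toy model only: a weighted sum of the guessed signals above, with every weight invented for illustration.

```python
# Toy quality score: a weighted sum of the guessed signals listed above.
# Signal names and weights are invented; Google's real factors, weights,
# and model shape are unknown.

SIGNAL_WEIGHTS = {
    "uniqueness":       0.30,  # 1.0 = fully unique content
    "low_bounce":       0.25,  # 1.0 = visitors rarely bounce straight back
    "internal_linking": 0.15,
    "authority_links":  0.20,  # links from sites like Wikipedia, WebMD
    "social_buzz":      0.10,
}

def quality_score(signals: dict[str, float]) -> float:
    """Weighted sum of signals normalized to the 0..1 range."""
    return sum(weight * signals.get(name, 0.0)
               for name, weight in SIGNAL_WEIGHTS.items())

page = {"uniqueness": 0.9, "low_bounce": 0.6, "internal_linking": 0.8,
        "authority_links": 0.2, "social_buzz": 0.4}
print(f"{quality_score(page):.2f}")  # 0.62 on this made-up scale
```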
     
  8. Mr.Ilyass

    Mr.Ilyass Newbie

    Joined:
    Apr 29, 2012
    Messages:
    36
    Likes Received:
    5
    Personally, I don't think Google's bots are that clever. They cannot tell whether an article is spun, or duplicated, or even whether it has grammar mistakes... that's too hard for a bot!! I agree that it can detect whether content is relevant to your keywords, or look at bounce rate and internal linking, but UNIQUENESS OF CONTENT... hmm, I don't think so. For example, news sites have the same titles and the same content and they still rank high.
     
  9. LakeForest

    LakeForest Supreme Member

    Joined:
    Nov 11, 2009
    Messages:
    1,269
    Likes Received:
    1,802
    Location:
    Location Location
    It will be tremendously exciting when a bot/crawler can determine the factual accuracy, grammatical correctness, and readability of an article, and check it for nonsensical over-optimization.

    Until then, they mostly just check for repeated characters. It's kind of stone-age and sad if you think about it. Poor bots.
     
  10. artizhay

    artizhay BANNED

    Joined:
    Nov 21, 2010
    Messages:
    1,867
    Likes Received:
    1,335
    Lmfao. Even if you don't consider Google, there are tons of bots/websites/crawlers out there whose sole purpose is to find duplicate content, such as the websites teachers use to determine if anything from a student's essay was lifted from the web. Don't be dumb now and distort factual information with your assumptions and opinions.

    Cannot tell if serious...
     
    • Thanks x 1
  11. gorang

    gorang Elite Member

    Joined:
    Dec 6, 2008
    Messages:
    1,891
    Likes Received:
    1,650
    Occupation:
    SEO Consultant - Marketing Strategy
    Location:
    UK
    They employ people to look at your site.
     
  12. tombourlet

    tombourlet Registered Member

    Joined:
    May 9, 2012
    Messages:
    83
    Likes Received:
    12
    Could not tell if this was serious at all! Of course Google is clever enough to do this; the bots are highly sophisticated, and it is really simple to tell whether content is duplicated. They will check for good grammar, bounce rate, etc. Just make sure your articles perfectly match the search terms and are of a decent length. I have found my better results come from comprehensive guides on a subject (very long how-tos).
     
  13. twistedtrick

    twistedtrick Power Member

    Joined:
    Aug 21, 2009
    Messages:
    654
    Likes Received:
    376
    Location:
    United States
    Slightly off topic, but if I were creating a content-reading crawler, one of the things I would look for is the word "kindly", which would auto-flag the page for further scrutiny; if enough flags went off, the page would eventually get a manual review.
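
    In code, that heuristic is only a few lines. This is just the idea above made literal; the word list and the threshold are made up:

```python
# The flagging heuristic described above, made literal: count tell-tale
# words and queue the page for manual review once enough flags pile up.
# The word list and threshold are invented for illustration.

FLAG_WORDS = {"kindly"}   # add other tells as you find them
REVIEW_THRESHOLD = 3      # flags before a human looks at the page

def count_flags(page_text: str) -> int:
    """Count occurrences of flag words in the page text."""
    words = page_text.lower().split()
    return sum(words.count(w) for w in FLAG_WORDS)

def needs_manual_review(page_text: str) -> bool:
    return count_flags(page_text) >= REVIEW_THRESHOLD

sample = "Kindly visit our site. Kindly share. Kindly subscribe."
print(needs_manual_review(sample))  # True: three hits on 'kindly'
```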
     
    • Thanks x 1
  14. Techxan

    Techxan Elite Member

    Joined:
    Dec 7, 2011
    Messages:
    3,093
    Likes Received:
    3,585
    Occupation:
    Local SEOist
    Location:
    TEXAS (you have to yell, it's the law.)
    In addition to the bots and a manual review, there are thousands of website raters employed to examine and rate the quality of websites. I used to do this for a company called Lionbridge, and Leapforce is one too, I believe.

    When rating a website, we would rate how on-topic it was for a certain term, and the quality of the site. I typically would do a few hundred a week, and when you multiply this by the number of people doing it, a great number of websites get rated this way each month.

    Usually, when I was doing it, each search term and results page would be rated by 6 raters. If all 6 raters give the website a low quality score, that site will be considered low quality.

    When I was doing this, the main thing pushed by Google was quality. We were constantly getting training on the difference between low and high quality, and if you think that grammar and spelling mean nothing, you are wrong. Ditto for unreadable spun content.

    We also had to label sites we felt were spam.

    I can tell you from experience there are a lot of useless crap websites out there....
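
    Going by that description (six raters per query/result pair, unanimous low scores meaning low quality), the aggregation could be as simple as the sketch below. The 1-5 scale and the cutoff are assumptions, not anything Google has published.

```python
# Sketch of aggregating human rater scores, following the description
# above: six raters per query/result pair, and a page that every rater
# scores low is treated as low quality. The 1-5 scale and the cutoff
# are assumptions.

LOW_CUTOFF = 2  # scores at or below this count as "low quality"

def verdict(rater_scores: list[int]) -> str:
    """Unanimously low scores -> low quality; otherwise report the average."""
    if all(score <= LOW_CUTOFF for score in rater_scores):
        return "low quality"
    avg = sum(rater_scores) / len(rater_scores)
    return f"average rating {avg:.1f}"

print(verdict([1, 2, 1, 2, 2, 1]))  # low quality
print(verdict([4, 3, 5, 4, 2, 4]))  # average rating 3.7
```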
     
    • Thanks x 1
  15. seo-addict

    seo-addict Regular Member

    Joined:
    Jun 10, 2012
    Messages:
    360
    Likes Received:
    196
    Occupation:
    Slaying Penguins
    Location:
    meh
    I rank sites with dupe shit content all the time ^^. What...? It's not 2008 anymore; you have to be smart about it. Optimize the crap content, make sure it passes Copyscape, etc...

    If Google currently has this "quality algorithm"... it's pretty shitty right now.
     
  16. Mr.Ilyass

    Mr.Ilyass Newbie

    Joined:
    Apr 29, 2012
    Messages:
    36
    Likes Received:
    5
    Thank you guys for all this USEFUL information!! :D
     
  17. viantea

    viantea Newbie

    Joined:
    Jul 15, 2011
    Messages:
    9
    Likes Received:
    11
    Occupation:
    nothing
    Location:
    not useful to you
    SEO is just speculation. No one really knows the exact answers to this. It is a secret Google will never show anyone. All we can hope for is for other search engines to spring up so there will be competition. At present Google has over 95% of search, which is not good for any of us.
     
  18. Deco89

    Deco89 Newbie

    Joined:
    Jan 24, 2013
    Messages:
    41
    Likes Received:
    3
    A slightly off-topic question, but if I take an article in my own language, translate it to English, and upload it, is it possible for Google to find out?
     
  19. takeachance

    takeachance Power Member

    Joined:
    Jul 31, 2009
    Messages:
    557
    Likes Received:
    412
    Location:
    The UK of A
    There's much talk here about bounce rates. It's worth remembering that the cookie which records bounce rates only fires in two circumstances.

    1. When a user clicks upon another internal link on your website.
    2. After a period of inactivity that surpasses 30 minutes.

    In particular, #2 warrants further comment. It means that if you have compelling content and the reader takes 20 minutes to read it (i.e. they find what they are looking for) and then leaves the site contented, you will still register a 100% bounce rate. This is an acknowledged shortcoming of analytics, which G itself admits is a pity.

    How do you get round this? You may have seen many more websites (good ones) paginate their content. This means you split your pages into two or more sections, and the user needs to click 'more' to see the rest. When they click to the next page, the cookie fires, and thus a reduced bounce rate is recorded.

    There's little doubt that G is using this data, albeit not an entirely accurate picture, to assess the value of websites and their content. So, in order to take advantage, you first need to understand how the system works and adapt your design accordingly.
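
    The mechanics are easy to see in a toy model. Here a session is just the list of tracked interactions; a single-interaction session counts as a bounce regardless of how long the visitor actually read. All the numbers are made up:

```python
# Toy model of the bounce mechanics described above: a session with a
# single tracked interaction is a bounce no matter how long the visitor
# actually stayed, while a 'next page' click adds a second interaction
# and removes it. All numbers are invented for illustration.

def bounce_rate(sessions: list[list[str]]) -> float:
    """Each session is the list of tracked hits (pageviews, clicks)."""
    bounces = sum(1 for hits in sessions if len(hits) <= 1)
    return bounces / len(sessions)

# One long article on a single page: every visitor reads it and leaves.
single_page = [["view /guide"]] * 10
print(f"{bounce_rate(single_page):.0%}")  # 100%: the long reads never register

# Same article paginated: 7 of 10 visitors click through to page two.
paginated = [["view /p1", "view /p2"]] * 7 + [["view /p1"]] * 3
print(f"{bounce_rate(paginated):.0%}")    # 30%: the clicks register engagement
```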
     
    • Thanks x 1
  20. youtalkmedia

    youtalkmedia Senior Member

    Joined:
    Dec 5, 2011
    Messages:
    830
    Likes Received:
    375
    Occupation:
    Web Developer
    Location:
    Toronto
    I think they look at 5 things...

    1) The length of the text on the page
    2) Whether it is 100% unique
    3) Whether it includes images or videos
    4) When it was last changed
    5) Your bounce rate
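
    Most of those are at least measurable from the outside. A minimal sketch pulling the measurable ones (text length, media count) out of a page's HTML with the Python standard library; uniqueness needs a corpus to compare against, freshness you would read from the Last-Modified header, and bounce rate only your own analytics knows:

```python
# Sketch extracting the measurable items from the list above: length of
# the visible text and the number of images/videos. Uniqueness requires
# a corpus to compare against, and bounce rate lives in your analytics,
# not in the page itself.

from html.parser import HTMLParser

class PageStats(HTMLParser):
    """Counts visible text characters and embedded media elements."""

    def __init__(self) -> None:
        super().__init__()
        self.text_chars = 0
        self.media = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("img", "video", "iframe"):  # iframes cover embeds
            self.media += 1

    def handle_data(self, data):
        self.text_chars += len(data.strip())

page = "<html><body><h1>Guide</h1><p>Some long article text...</p><img src='a.jpg'></body></html>"
stats = PageStats()
stats.feed(page)
print(stats.text_chars, "text characters,", stats.media, "media elements")
```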