
Is Google going after article sites?

Discussion in 'White Hat SEO' started by almir012, Jan 26, 2011.

  1. almir012

    almir012 BANNED Jr. VIP Premium Member

    Joined:
    Feb 1, 2010
    Messages:
    556
    Likes Received:
    104
    http://googleblog.blogspot.com/2011/01/google-search-and-search-engine-spam.html
     
  2. imperial109

    imperial109 Regular Member

    Joined:
    Jan 19, 2009
    Messages:
    499
    Likes Received:
    361
    Yearly Google public-relations scare. It's going to be an update that cleans up some of the crap out there. Then we find ways around it, and the cycle starts over. The good thing is, a lot of the newbies are going to be weeded out, so it's good for us.

    What I suspect is that they'll go after grammar, and those keywords that the noobs have screwed up with SB.

    This is also why you don't cheap out when doing SEO. Buy some well written articles and get some good tools. Also, spend more time working on your campaigns than asking why your site isn't getting indexed.
     
    • Thanks x 3
  3. ShiftySituation

    ShiftySituation Power Member

    Joined:
    Apr 15, 2010
    Messages:
    621
    Likes Received:
    314
    Occupation:
    Having fun
    Location:
    Jacksonville, FL
    I was just doing some KW research tonight on a couple of KWs I'm targeting. I wanted to see if I could hop on some of the pages my competitors are linking from. Well, I discovered a site sitting at #3, less than a year old, where 95% of the links were PR0 forum post links and the other 5% were from Web 2.0 sites, bookmarks, and a blogroll or two. There were about 1,000 links total, so nothing special. The keyword gets about 30k searches a day, so this site is getting some nice traffic, all from spammy BH methods. Also consider that the competition isn't impossible, but it is tough: some high-PR sites that are Fortune 500 companies. This goes to show that Google doesn't go after link farms like they should. I really don't believe too much of what I've read about how Google handles links, given what I saw earlier today.
     
  4. homenet

    homenet Power Member

    Joined:
    Jan 5, 2009
    Messages:
    790
    Likes Received:
    338
    Location:
    Dimension X
    This is nothing new really when you think about it. Ever wondered why an ezine article will rank a lot easier than a goarticle one? Google knows ezine content is generally of much better quality when compared to a spam sink like Goarticles.
     
  5. WayneRoberts

    WayneRoberts Registered Member

    Joined:
    May 10, 2010
    Messages:
    59
    Likes Received:
    52
    Lol, here we go again.

    All Google is going to do is devalue the power of links from the article directories, but it can't punish the site that's linked to unless the links come from a seriously suspect site. Think about how easy it would be to sabotage competitors if you could create a ton of shit links using poor content and just spam the web with them.

    Which means that article sites with a high PR will probably become even more valuable.

    My gut feeling at the moment is as follows, but maybe you guys can add to this or tell me if you are experiencing something different.

    1 - Onsite SEO is becoming more important again, and internal linking structure says a lot about a site. Some basic tweaks I have made to my sites have made a world of difference to my rankings, not to mention adding more links to my own pages within the content itself.

    2 - Blog comments are showing little value these days, while blog posts are still worth their weight in gold when you get them. It seems all the blog spam really does add little value. I am no longer going to point blog comments at my money sites, bar a few hundred just to provide some link variety.

    3 - Forums, both profiles and posts, are as important as they ever were.

    4 - Directories have become a waste of time.

    5 - Outbound links to high-PR sites are actually beneficial and add some level of credibility to your site as an information resource.

    6 - More weight is going to credible Web 2.0 and social networking sites.

    It's simple really, folks. If you are going to submit articles, take the time to make sure that even when you spin them, they are good enough to clear manual checks.
     
    • Thanks x 6
  6. WayneRoberts

    WayneRoberts Registered Member

    Joined:
    May 10, 2010
    Messages:
    59
    Likes Received:
    52
    It's important to understand that most websites, if they engage in any SEO at all, are normally paying a fortune for manual work: manual blog comments, crappy profiles, the odd article here and there. Have a look at the average SEO company and the advice being offered and you will see the same thing.

    Lame-ass white hat techniques that DO work. They just don't work as well as our rampant spam does.

    Well-researched niches should not have much competition in the first place, and if it's there, it should not be spamming the web to the degree the average user here is.

    The web is a big place, bro.
     
  7. dannyhw

    dannyhw Senior Member

    Joined:
    Jul 16, 2008
    Messages:
    980
    Likes Received:
    462
    Occupation:
    Software Engineer
    Location:
    New York City Burbs
    To be fair to Google, they are extremely effective at fighting spam. They do try their best, but an anti-spam measure can't have a measurable negative effect on SERP quality across the board, so a lot of stuff that seems easy to filter falls through the cracks. It's not nearly as much as it used to be, though. It gets significantly harder every year.
     
  8. locknload007

    locknload007 Jr. VIP Premium Member

    Joined:
    Apr 14, 2010
    Messages:
    475
    Likes Received:
    67
    I am sort of glad to see it. You used to be able to Google something and get meaningful results; nowadays it is pure crap.
     
  9. Radog

    Radog Registered Member

    Joined:
    Nov 21, 2009
    Messages:
    77
    Likes Received:
    44
    I don't believe a single word from that search company; it's all propaganda.
     
    • Thanks x 1
  10. cbnoob

    cbnoob Senior Member

    Joined:
    Sep 27, 2010
    Messages:
    967
    Likes Received:
    455
    I don't care about this much. As long as you choose the keywords right, any type of link can be effective.
     
  11. Virus1

    Virus1 Supreme Member

    Joined:
    Dec 13, 2010
    Messages:
    1,326
    Likes Received:
    1,409
    Occupation:
    destroyer of worlds...
    Location:
    Welcome to Black Hat World........................
    Home Page:
    Google wants... what people searching with google want..... solid relevant information.

    If you have a bullsh!t autoblog.... with no solid unique relevant content... and its main focus is to push your products... then sooner or later... google will change something that will hurt you.

    but

    if you have unique solid content... then google actually wants you to rank high.

    same thing with backlinks..... if your site is a real site... you can do BH backlinking to your heart's content and not get sandboxed...
    but if your site is just a CLONE autoblog... you will get sandboxed.
    Google has real people that visit the sites....
    and when a site offers no real value... google tries to find a fingerprint or
    pattern that best describes this site.... and then adjust their algorithm to drop
    the ranking of that particular site and any site that matches that same fingerprint.

    google makes money by bringing relevant results to users...
    when we use BLACK hat methods....
    it fuks with their ability to bring the best relevant results.....
    so they adjust their algorithms to circumvent us circumventing them.
     
  12. aldragon

    aldragon Power Member

    Joined:
    Aug 5, 2010
    Messages:
    688
    Likes Received:
    192
    Location:
    ^^
    Looks like this is also the death of link exchanges.
    After checking the new algorithm in detail, I came out after 3 months with an amazing result; I have been testing it and it works 100%. I'm building a website little by little until I finish with all the written material.
    What I can tell you is that I have copied articles from wikip*dia onto some of my sites, with all the links intact, and it has boosted my SERPs for many more long-tail keywords.
     
    Last edited: Jan 26, 2011
  13. BENNY8877

    BENNY8877 Supreme Member

    Joined:
    Jan 4, 2010
    Messages:
    1,278
    Likes Received:
    1,087
    Occupation:
    Wallet Inspector
    Location:
    In my mom's basement
    It makes a lot of sense. Every time I try to do research for an article I am writing, I have to weed through all of the affiliate review sites, and because of SEO, these sites are usually in the top 10. It's hard to find an honest review of a product or service anymore.
     
  14. dannyhw

    dannyhw Senior Member

    Joined:
    Jul 16, 2008
    Messages:
    980
    Likes Received:
    462
    Occupation:
    Software Engineer
    Location:
    New York City Burbs
    Well they still can't detect duplicate content so clone autoblogs are still safe. They do know exactly what's going on with black hat but it's rare that they actually identify something that they can deal with by tweaking the algorithm. The stuff that's used today, blog comments and forum profiles, these things have been around for many years. They seem like they should be detected, but implementing an algorithm change to do away with this sort of thing just isn't feasible.

    And as far as BH link building on a legit site without getting sandboxed goes, that's possible if you've also got quality links, but it's not just a matter of having a unique site.
     
  15. Virus1

    Virus1 Supreme Member

    Joined:
    Dec 13, 2010
    Messages:
    1,326
    Likes Received:
    1,409
    Occupation:
    destroyer of worlds...
    Location:
    Welcome to Black Hat World........................
    Home Page:

    lol.... you are severely underestimating google...
    there are thousands of sites that have been deranked and deindexed for using methods not in line with google.

    google can detect duplicate content in a heartbeat.

    google has almost unlimited resources..... keep that in mind.

    in real estate... there are 3 rules.... location... location.. location..

    for google the rules are.... unique content... unique content.. unique content....
     
  16. dannyhw

    dannyhw Senior Member

    Joined:
    Jul 16, 2008
    Messages:
    980
    Likes Received:
    462
    Occupation:
    Software Engineer
    Location:
    New York City Burbs
    Believe what you want. I've had hundreds of domains deindexed, but for major, major stuff.

    The thing about duplicate content is that they know there are multiple copies, but they don't know which is the original. In this case, the stronger page all around gets the good ranking. Happens all the time.

    And there's no such thing as unlimited resources, especially in computing. To algorithmically combat some black hat techniques without adversely affecting the overall quality of SERPs would be an engineering nightmare.
     
  17. meakerseeker

    meakerseeker Regular Member

    Joined:
    Jan 3, 2010
    Messages:
    385
    Likes Received:
    23
    Wayne, what do you do for onsite SEO to make a world of difference like you mentioned?

     
  18. Virus1

    Virus1 Supreme Member

    Joined:
    Dec 13, 2010
    Messages:
    1,326
    Likes Received:
    1,409
    Occupation:
    destroyer of worlds...
    Location:
    Welcome to Black Hat World........................
    Home Page:
    Well... they are going to start deindexing for minor stuff now... you can believe that.

    yes... this is right... but from now on... the plagiarizer is not going to receive any ranking "points" for having the article.

    Well, comparing Google's resources to blackhatters'.... you might as well label them "unlimited" for the sake of this discussion.

    and it's not that hard to do....

    this is all google has to do to wipe out most autoblogs...
    simple steps as follows:

    1) find blogs with a lot of traffic - google knows which ones already
    2) cross-match articles to find duplicate content
    3) if a site has more than 20 (or some mathematically calculated threshold) articles whose ownership is questionable... then it's fair to assume that the site is hosting duplicate content as part of its business model
    4) lower the rank of that site or deindex it until further investigation, or until they complain about rankings and an actual google employee verifies it...
    or whatever google feels is the best way to go about fixing this problem.

    quite simple.... no nightmare.
    no one is saying they are going to do it overnight... that would just be dumb. But it can be done.
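    The steps above can be sketched as a toy filter. This is only an illustration of the threshold idea in the post, not Google's actual pipeline: the function names, the crude fingerprinting, and the threshold of 20 are all hypothetical.

```python
# Toy sketch of the threshold-based duplicate filter described above.
# All names and the threshold are hypothetical illustrations.

def is_duplicate_farm(site_articles, known_fingerprints, threshold=20):
    """Flag a site when too many of its articles match content seen elsewhere.

    site_articles:      list of article texts found on the site
    known_fingerprints: fingerprints of articles already seen on other sites
    """
    def fingerprint(text):
        # Crude fingerprint: lowercase, normalized whitespace.
        # A real system would use shingling or hashing instead.
        return " ".join(text.lower().split())

    questionable = sum(
        1 for article in site_articles
        if fingerprint(article) in known_fingerprints
    )
    return questionable > threshold

# Usage: a site carrying 25 scraped articles trips the 20-article threshold.
known = {f"copied article {i}" for i in range(25)}
farm = [f"Copied  Article {i}" for i in range(25)]
print(is_duplicate_farm(farm, known))  # True
print(is_duplicate_farm(farm[:5], known))  # False: only 5 matches
```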



    This reminds me heavily of the days when everyone and their mothers used to hack DIRECTV....
    I told them that DIRECTV could easily wipe all the hackers out....
    people told me the same thing... it would be an engineering/financial/etc. nightmare... they don't have the resources... etc... etc...

    well, they did it... they stopped the pirating of their signal cold.
    this was because it was their territory... their playground .... their rules.

    with google its the same thing....
    this is google's playground... its their search engine... their rules.

    don't underestimate them.

    from your posts... it seems that you make money from autoblogs or whatever.... and me saying google can wipe out these sites is the worst thing you could read about...

    so ... i have said what i needed to say...
    I will not be answering nor posting to this thread anymore.

    take care
     
  19. dannyhw

    dannyhw Senior Member

    Joined:
    Jul 16, 2008
    Messages:
    980
    Likes Received:
    462
    Occupation:
    Software Engineer
    Location:
    New York City Burbs
    That's actually a pretty serious nightmare. Any algorithm that does that by comparing every page against every other has a big-O runtime of O(n^2), meaning that as the data set grows, the number of comparisons required grows quadratically. Put very simply, to check a set of 10 pages for duplicates, 45 pairwise comparisons have to be made. To check a set of 100 pages, 4,950 comparisons are needed. This sort of thing gets out of hand very quickly.
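    To make the quadratic growth concrete, here is a quick sketch counting naive all-pairs comparisons (every page checked against every other page once); nothing here is specific to Google, it is just the standard n*(n-1)/2 pair count.

```python
# Naive all-pairs duplicate checking: each page is compared against
# every other page exactly once, so the count is n*(n-1)/2 -> O(n^2).

def pairwise_comparisons(n_pages):
    """Comparisons needed to check every unordered pair of pages once."""
    return n_pages * (n_pages - 1) // 2

for n in (10, 100, 1000):
    print(f"{n} pages -> {pairwise_comparisons(n)} comparisons")
# 10 pages   ->      45 comparisons
# 100 pages  ->    4950 comparisons
# 1000 pages ->  499500 comparisons
```

    Multiplying the page count by 10 multiplies the work by roughly 100, which is why this approach does not scale to a web-sized index without smarter techniques.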

    I also don't do BH these days other than for research, but I know why some things still work after ten years or so. They've done a great job overall, but there are some things that remain just out of their reach.
     
  20. ShadeDream

    ShadeDream Elite Member

    Joined:
    Nov 27, 2008
    Messages:
    2,209
    Likes Received:
    5,230
    Location:
    He who laughs last, laughs longest.
    With all of this crap around the Internet, I personally don't know what to believe. I just trust myself. Screw what others have to say and make your own judgements.
     
    • Thanks x 1