
Has Your Site Been Panda Slapped?

Discussion in 'Black Hat SEO Tools' started by innovalist, Mar 30, 2011.

  1. innovalist

    innovalist Supreme Member Premium Member

    Joined:
    Aug 19, 2010
    Messages:
    1,327
    Likes Received:
    194
    Hey guys,

    I thought I would post this as I am seeing just bits and pieces of this everywhere and a lot of people losing their shirts overnight. It's the most comprehensive article I've seen so far and I just wanted to share it with everyone here as a way of giving back to BHW, which has done so much for me.

    This is not new stuff for those of us at the forefront. But for many of us and the noobs - it might make for interesting reading. So bear with me if you've read it all before, ok? ;)

    At first glance, all this strikes me as Google going after black hatters everywhere. Personally I don't think we should all start changing the color of our hats overnight (that would be the end of this forum as we know it!). Rather, if we know what the heck is going on, we can be a little more careful and keep our aggressive stuff to small sites, avoiding it on our big money sites until we see that the test site hasn't been mauled by the Panda.

    Also - since we are on this subject - I would like to throw this question onto the floor and ask whether the Panda striking one or more of our sites actually affects all our other sites on the same server. I ask because I've seen many people screaming that they have lost ALL their sites. Is it because they did the same spam on all their sites, or did the Panda search out the lair and kill everything in it? :eek:

    The article is written by Mark Nunney and can be found here: http://www.wordtracker.com/academy/google-panda-farmer

    What I've posted here are some excerpts which hopefully shed some light on this matter:


    What in the name of Google is going on?

    The aims of Panda are noble: to remove poor quality sites from the top of Google's results pages. Or as Matt Cutts, Google's head of spam, puts it in a blog post announcing Panda:

    "This update is designed to reduce rankings for low quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on."

    The last thing Google wants is searchers being unhappy with what they find. They might try another search engine if that happens.

    Few people other than the low quality sites' owners and their investors will have a problem with that.

    But all major Google updates leave ‘collateral damage' behind them: sites that just don't match the target or deserve to be penalized. Google are aware of this and so have asked those with "a high quality site that has been negatively affected by this change" to let them know about it here.

    So if you have a high quality site that's been adversely affected by Panda Farmer then let Google know.

    The site used as an example on this page is a high quality site hurt by Panda. Its core content is hundreds of long, in-depth specialist articles plus a Q and A based forum for readers' problems.

    Perhaps the Q & A pages are the problem (those pages could look like thin content to Google's robots). But then I know of two similar sites in different markets that have also been hit but don't have the Q & A based forum. No, it won't be that easy to work out why an innocent site has suffered.

    What factors make a site vulnerable to Panda?

    Google like to keep these things secret but the two engineers at the heart of Panda, Matt Cutts and Amit Singhal, gave us some strong clues in an interview with Wired.

    Cutts and Singhal revealed their process which I'll summarize as:

    • Conduct qualitative research (that's speaking with individuals and not a big questionnaire) to find out which of a sample of sites they considered to be low quality and why.

    • Use the results to define low quality sites with the factors that Google can measure. This gives Google a mathematical definition of low quality.

    If we start here, we can think of a number of factors that Google might be able to measure to define low quality, including:

    • A high % of duplicate content. This might apply to a page, a site or both. If it's a site measure then that might contribute to each page's evaluation.

    • A low amount of original content on a page or site.

    • A high % (or number) of pages with a low amount of original content.

    • A high number of inappropriate adverts (ones that don't match the search queries a page does well for), especially high on the page.

    • Page content (and page title tag) not matching the search queries a page does well for.

    • Unnatural language on a page including heavy-handed on-page SEO (‘over-optimization' to use a common oxymoron). Eg unnatural overuse of a word on a page.

    • High bounce rate on page or site.

    • Low visit times on page or site.

    • Low % of users returning to a site.

    • Low clickthrough % from Google's results pages (for page or site).

    • High % of boilerplate content (the same on every page).

    • Low or no quality inbound links to a page or site (by count or %).

    • Low or no mentions or links to a page or site in social media and from other sites.

    If any of these factors are relevant to Panda, it is unlikely that they act on their own.

    Multiple factors will likely be required to earn ‘Panda points' (and points do not mean prizes in this game). Panda points will be added up. Cross a threshold (the Panda Line) and all the pages on your site seem to be affected. This includes quality original pages being ranked well below useless scraper sites that have stolen your content.
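
    To make that threshold idea concrete, here is a minimal sketch of how a multi-factor ‘Panda points' score could work. Everything in it is an assumption made purely for illustration: the signal names, weights, limits and cut-off are invented, since Google has never published the real factors or numbers.

    Code:
        # Hypothetical illustration only: the signals, weights and thresholds
        # below are invented for this example, not taken from Google.

        # Per-page signals we pretend we can measure for our own site.
        page = {
            "duplicate_ratio": 0.65,    # share of the page's text found elsewhere
            "original_words": 120,      # original words on the page
            "ads_above_fold": 3,        # adverts visible before scrolling
            "bounce_rate": 0.82,        # share of visitors leaving immediately
            "quality_inbound_links": 1, # decent links pointing at the page
        }

        # Each rule awards 'Panda points' when a signal crosses a made-up limit.
        RULES = [
            ("mostly duplicate content", lambda p: p["duplicate_ratio"] > 0.5, 3),
            ("thin original content", lambda p: p["original_words"] < 300, 2),
            ("ad-heavy above the fold", lambda p: p["ads_above_fold"] >= 3, 2),
            ("very high bounce rate", lambda p: p["bounce_rate"] > 0.75, 1),
            ("few quality inbound links", lambda p: p["quality_inbound_links"] < 3, 1),
        ]

        PANDA_LINE = 5  # invented cut-off: cross it and the whole site suffers

        def panda_points(p):
            """Sum the points for every rule the page trips."""
            hits = [(name, pts) for name, test, pts in RULES if test(p)]
            return sum(pts for _, pts in hits), hits

        score, reasons = panda_points(page)
        print(f"Panda points: {score} (the Panda Line is {PANDA_LINE})")
        for name, pts in reasons:
            print(f"  +{pts}  {name}")
        if score >= PANDA_LINE:
            print("Over the Panda Line: expect the whole site to be affected.")

    The only point of the sketch is that no single factor needs to be fatal on its own; it is the combination that pushes a site over the line.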

    Google have said that "low quality content on part of a site can impact a site's ranking as a whole."

    It's important to define the difference between an algo change and a penalty.

    A penalty must be served (if it has a time limit) or lifted (if it is to be removed).

    An algo change just exists, and its results will continue until the algorithm is changed, your site changes (or your site gets whitelisted).

    Panda is an algo change but no ordinary one. It's an algo change that works like a penalty because if your site crosses the Panda Line then the whole site is affected, quality pages too.

    Panda is penalty by algo.

    What to do if you've been hit by a Panda

    Google suggest:

    "If you believe you've been impacted by this change you should evaluate all the content on your site and do your best to improve the overall quality of the pages on your domain. Removing low quality pages or moving them to a different domain could help your rankings for the higher quality content."

    Let's add a bit more to that, put it into practical actions and make a process ...

    • Find the pages and page types hit worst on your site.

    • Isolate differences between those hit and those not.

    • Test changing those factors on hit pages but use this method of analysis with caution because the pages hit most might not be the pages earning you the penalty.

    • Make a list of your different types of pages. Eg, forum, quality article, low quality article, light category, quality category, product, blog post, etc. Put the list in a column in a spreadsheet and start building a table.

    • Add columns for relevant factors like ‘lots of ads', little content, some dupe, all dupe, etc, and also the number of pages and the % drop in Google US organic visits. Fill in the values for each type of page (a minimal scripted version of this table is sketched after this list).

    • Look at how much of your site (% of pages) is taken up by your lowest quality pages and improve that.

    • If you are scraping or otherwise copying other sites' content, replace it with quality original content, or test removing some (or even all) of those pages (and adding 301s from them to relevant pages higher up your site's hierarchy).

    • If you have a large number of pages with dupe (of your own copy), weak or almost no content, improve them or remove (and 301) them or block them from Google with robots.txt.

    • If you have lots of pages that dupe your own copy (eg, as happens with some content management systems and on a lot of ecommerce sites that build new URLs for ‘faceted' pages) then add rel=canonical tags to the ‘duped' pages. This stops Google seeing those pages as dupes.

    • Edit any ‘over-optimized' pages.

    • Improve anything that might make the user's experience better.

    • Offer users more when they first enter a page. Eg, images, videos, attractive text and pages linking to your best, related editorial content.

    • If possible, make your content's language more accessible and more real.

    • Promote your content on social media including Twitter and Facebook.

    • Build your brand awareness across the web wherever you can.

    • If you're sure your site is ‘Google clean' and worthy, let Google know about it but don't expect this to have much effect.

    • Make as many of these changes as you can at once in the hope of shaking off the penalty quickly. With editorial content improving, you can then add back any marketing you are missing, in steps, checking to see you don't get slapped again.
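
    For the spreadsheet step above, here is a minimal scripted sketch of the same table. It assumes you can export a CSV of per-URL Google US organic visits for equal periods before and after the update, and that you can tell page types apart from the URL; the file name, column names and labelling rules are hypothetical, so adjust them to your own site.

    Code:
        # Sketch only: assumes a hypothetical visits.csv export with columns
        # url, visits_before, visits_after. The page-type rules are examples;
        # replace them with whatever distinguishes your own templates.
        import csv
        from collections import defaultdict

        def page_type(url):
            """Crude labelling by URL pattern - adjust to your own site structure."""
            if "/forum/" in url:
                return "forum"
            if "/category/" in url:
                return "category"
            if "/blog/" in url:
                return "blog post"
            return "article"

        pages = defaultdict(int)
        before = defaultdict(int)
        after = defaultdict(int)

        with open("visits.csv", newline="") as f:
            for row in csv.DictReader(f):
                t = page_type(row["url"])
                pages[t] += 1
                before[t] += int(row["visits_before"])
                after[t] += int(row["visits_after"])

        print(f"{'page type':<12} {'pages':>6} {'before':>8} {'after':>8} {'drop %':>7}")
        for t in sorted(pages):
            drop = 100.0 * (before[t] - after[t]) / before[t] if before[t] else 0.0
            print(f"{t:<12} {pages[t]:>6} {before[t]:>8} {after[t]:>8} {drop:>6.1f}%")

    The page types with the biggest drops (and the biggest share of your pages) are the ones to improve or remove first, with the caveat from the list above that the pages hit hardest are not necessarily the ones earning the penalty.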
     
    • Thanks Thanks x 6
    Last edited: Mar 30, 2011
  2. J0kerz

    J0kerz Supreme Member

    Joined:
    Nov 2, 2009
    Messages:
    1,414
    Likes Received:
    435
    Occupation:
    IM
    Location:
    There
    Great Post!

    I am still experiencing stuff regarding this new Google Update. I will post my results in a couple of weeks.
     
  3. YouFoundIt

    YouFoundIt Registered Member

    Joined:
    Mar 31, 2011
    Messages:
    74
    Likes Received:
    4
    Location:
    U.S.A.
    What a good read. :)
     
  4. losille

    losille Junior Member

    Joined:
    Feb 22, 2011
    Messages:
    109
    Likes Received:
    95
    Does anyone know if Google would count a quote against you?
    It is common for press releases to have a quote from the actor, singer, or whoever. We all use the quote.
     
  5. cody41

    cody41 Power Member

    Joined:
    Jun 18, 2009
    Messages:
    682
    Likes Received:
    274
    Location:
    Texas
    In my experience, i've been hit the hardest with my cookie cutter xfactor themed sites. BUT, my other sites on the same host haven't been affected..still making money. So I just wanted to throw that out there.
     
    • Thanks Thanks x 1
  6. britcpa

    britcpa Power Member

    Joined:
    Mar 25, 2010
    Messages:
    509
    Likes Received:
    1,384

    ive got 200+ sites on the same host so when i read the op's comments referring to ALL sites being discovered by G and them 'killing the nest' - i was like wtf??!! :eek:

    glad to hear your own experience as that kind of goes against the op's theories because YES, i want YOUR experience to be the same outcome for me & im sure everybody else will feel the same about their own sites
     
  7. qxxxp

    qxxxp Junior Member

    Joined:
    May 3, 2009
    Messages:
    185
    Likes Received:
    82
    Occupation:
    President of Planet Earth
    Location:
    /index.php
    Home Page:
    all my autoblogs and amazon shops are dead... i guess its time for something new.
     
  8. Flurbuff

    Flurbuff Regular Member

    Joined:
    Jun 17, 2010
    Messages:
    227
    Likes Received:
    94
    none of my sites have been hit by panda (cross fingers). this is how I make a site:

    - premium theme i've never used before; alter it a bit here and there

    - 100% unique content written and designed by me; if the topic is something i'm not familiar with then i research it and put stuff on the site that's true

    - i generally have 2-3 ad links per page; 1 (or two small) visual ads and 1 text link

    - the site has at least 15 pages of content; pages with articles are 400-800 words

    - i have several outbound links to related sites on nearly every page

    even though i put a lot of work into making my sites look nice and provide visitors with relevant information, i'm still paranoid that i'll wake up one day and they'll be deindexed. but, i guess that's something we all have to be prepared for.
     
  9. nycdude

    nycdude Jr. VIP Jr. VIP Premium Member

    Joined:
    Oct 1, 2009
    Messages:
    485
    Likes Received:
    560
    Location:
    Mazatlán
    I've been semi-hit with the Panda. I have a website that I first built for educational purposes (and it still is). It's mostly about what I do as a profession, and the idea behind it was to do well in search and get clients. Later, I decided to monetize the site more by promoting CB products and Adsense. I built an extension to the site with an autoblog and grew it to a whopping 800 pages over several weeks. My main site, before I built the autoblog part, had and still has a PR of 4, and it sent all that juice to the autoblog; my traffic increased two-fold and fast.
    Now, autoblog content sucks and no one wants to read it. The purpose of those pages was to draw those visitors into the main site, never back to the autoblog, and it sort of worked. After a while, though, the bounce rate from the autoblog pages was so obvious that G stopped ranking those pages, but it still gave me credit for my good pages. My traffic, although a bit higher than before the autoblog, has dropped about 30% since G stopped recognizing them.
    My experience is that G will partially Panda slap a website for those garbage pages but will let you keep your good standing with the good pages that people will actually read.
    My two pennies.