On the 27th of September, my client's website got hit by the Google Panda refresh. Previously, it had avoided any (obvious) penalty from the updates, including Penguin. The slap was not a huge one: the loss in traffic was about 15%. The website recovered in the next Google Panda refresh on the 5th of November. It is a strong brand that is an authority in its niche, and the website is 5+ years old. During the recovery, I had no link building going on (but have since focused on building quality links). This post takes you through the steps of the recovery.

View from Google Analytics (red arrows show updates)

Prior to the Panda update on the 27th of September, this website had been steadily increasing in traffic and was reaching about 750-800 search engine hits each week. After the update, it went down to about 630-700. The red arrows show the different Panda updates. This website's traffic dips dramatically on weekends; as you can see, all of the updates have happened around the weekend. When the website recovered, it went back to around 750-800 hits each week. In the next update it did even better, and we were pulling in about 780-850 hits each week. Since the latest update, despite it being a holiday weekend, we're getting 1,000+ hits from search engines. This is hugely significant for the website. The Google Analytics data doesn't look significant at a glance, but before the recovery the lows were lower, the peaks were smaller, and the high points were spikes rather than consistent day-to-day numbers.

Conclusion: in this case study, a recovery from Google Panda had dramatic consequences. It resulted in the website being rewarded, and all SEO efforts were magnified afterwards.

View from Google Webmaster Tools

This snapshot starts in early October and goes forward into the following months. As you can see, search engine views have dramatically increased. In the past, the low was always 1,000-1,300. Now the low never goes below 1,600.
What I did to recover

Website Audit

I did a website audit of all of the pages on the website - hundreds of pages. I put "site:http://websiteurl.com" into Google and scraped all of the pages indexed from there. I identified pages that were low quality, narrowing in on three types of low-quality content: pages with no content, pages with duplicate content, and pages with low-quality content clearly written by someone with little knowledge of the subject. I checked the speed of the pages. I checked for major bugs, such as http:// not redirecting to http://www and things like that. On that front, the website came up clean. I also identified pages that, while they included good content, were somewhat all over the place: they weren't targeted towards good keywords, or their keywords overlapped with other pages.

Once I had done all this, I discovered the website had the following faults:

- A very low-quality blog. It had 50+ posts of ~400-word articles. These were practically unreadable (yet written by humans). The person who organized this clearly thought that unreadable articles stuffed (and I mean beyond stuffed) with the same keywords, INTERLINKED, was the best idea in the world.
- Each blog post had 10+ tags that, again, were keyword stuffed.
- Some pictures had their own pages that had been indexed by Google, each creating its own non-content page.
- On the main website, there was a lot of duplicate meta data (titles and descriptions).
- There were a handful of non-content pages.
- There were about 10 core pages that, while they had good content, were not optimized for particular keywords. Instead, a lot of the focus overlapped between pages. Think of it like this: the website was about "spotted widgets", so each page was about a unique spotted widget. Instead of each page being about, say, a "zebra widget", they all just talked about being a variation of a spotted widget.
- Some pages were over-optimized.
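The thin-content part of an audit like this can be automated rather than eyeballed. Below is a minimal sketch (not the tool I used) that strips the visible text out of each page's HTML and flags anything under a word-count threshold; the 500-word threshold and the page URLs are illustrative assumptions.

```python
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def word_count(html):
    """Count words in the visible text of an HTML document."""
    parser = TextExtractor()
    parser.feed(html)
    return len(re.findall(r"\w+", " ".join(parser.parts)))

def flag_thin_pages(pages, minimum=500):
    """Return the URLs whose visible text falls below the minimum word count.

    `pages` maps URL -> raw HTML (fetched however you like).
    """
    return [url for url, html in pages.items() if word_count(html) < minimum]
```

You would feed this the HTML of the URLs scraped from the "site:" query; anything it flags is a candidate for deletion or a rewrite.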
Once I identified these faults, I quickly went to work with the intention of fixing this mess before the next refresh. Luckily for me, this was one of the biggest waits between refreshes - six weeks or so.

Fixing the Problems

Low-Quality Content

I went through and mass-deleted almost all of the blog articles. Boom - hundreds of dollars deleted in one swoop. I then deleted all of the tags, and deleted all of the images that had their own pages. I then went to work MANUALLY writing my own articles. I got about 20-25 in before the next update. These were a minimum of 600 words each; most had a minimum of 3 pictures, and several have a relevant video. They all looked nice with pleasant formatting, and they all read well. For each article I picked a niche subject and made that the focus of the article, using my knowledge of the subject to write authoritative articles. I didn't have time to mass-write 25 fresh articles, so I came up with a solution to get content fast - something I won't share here. But obviously, you could just purchase articles from a legitimate writer. The algorithm seems to be able to detect when articles use lots of verbs and empty words; it looks for LSI keywords.

On the main website, I rewrote the problem pages to be about completely different things and added new content to them in the same manner. Where possible, I made sure each page has 500+ words. I add about 5-10 blog posts each month. Since I understand this niche, I usually pull on topics that I know about personally, which makes them much higher quality. These articles rank very well in Google.

Duplicate Meta Data

I discovered that the meta descriptions that were duplicates were always attached to pages with duplicate meta titles - clearly the unique data just never got entered. So I collected the pages with duplicate meta titles in Microsoft Excel and manually went through and edited each one. A lot of work, but it was worth it.
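Finding which pages share a meta title can also be scripted instead of sorted by hand in a spreadsheet. A small sketch, assuming you have already collected each URL's title and description into a dict (the sample data here is made up):

```python
from collections import defaultdict

def duplicate_meta_titles(pages):
    """Group URLs by normalized meta title; keep only titles used more than once.

    `pages` maps URL -> (meta title, meta description).
    """
    by_title = defaultdict(list)
    for url, (title, _description) in pages.items():
        by_title[title.strip().lower()].append(url)
    return {title: urls for title, urls in by_title.items() if len(urls) > 1}

# Hypothetical sample data: URL -> (meta title, meta description)
pages = {
    "/zebra-widgets": ("Spotted Widgets", "All about widgets"),
    "/pink-widgets":  ("Spotted Widgets", "All about widgets"),
    "/contact":       ("Contact Us", "Get in touch"),
}
```

The output groups every set of pages that needs a unique title written for it, which is exactly the list I worked through by hand.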
I stayed away from reusing the same or similar content across pages.

Overlapping Content

I picked a phrase for each page. The website is about spotted widgets, so I couldn't get around reusing the word "widget", but I did everything I could to use the word "spotted" as little as possible on these pages. Instead, I focused on each page's particular variation - say, "Zebra Widgets", "Pink Circle Widgets", etc.

In some cases that was tricky. Say I had two pages relevant to Zebra Widgets: one about the small version, the other about the big version. In this case I came up with a solution: I picked a different keyword altogether for the less popular page. So, let's say the large widget was the most popular. On the small one I still mentioned that it was a Zebra Widget, but I emphasized and optimized it for "Unique Widget" instead. One great thing about this was that I was able to increase my long-tail keyword traffic from these extra, highly relevant keywords.

Over-Optimized Content

I removed as many instances of the over-optimized keyword as I could. For instance, my homepage was about "Spotted Widgets", so I removed as many instances of that phrase as I could. I was still left with a lot, but there was nothing I could do about that. In some cases, the only way to remove it was to replace it, so I used lots and lots of variations of "spotted widgets". I had 2 H1 headers in the past; I removed one and kept one. In the past I used the word "Widgets" 3 times in my title and description, so I varied it up with different but similar keywords and made sure to use the word "Widgets" a maximum of 2 times.

Social Media

As a bonus, this website has an active social media presence. I leveraged it more and included the URL of the website in many posts designed to get likes and shares. We had some big posts that got 200+ likes and 40+ shares.

Results

With each Google update, the website goes from strength to strength.
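The over-optimization problem described above can be caught before a refresh does it for you by measuring how much of a page's text a single phrase accounts for. A rough sketch - the phrase and sample text are examples, and there is no official "safe" density:

```python
import re

def keyword_density(text, phrase):
    """Fraction of the page's words taken up by occurrences of the phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    if not words or n == 0:
        return 0.0
    # Count non-overlapping-by-position matches of the phrase as a word sequence.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return hits * n / len(words)
```

Running this over the visible text of each page makes it easy to rank pages by how hard they lean on one phrase and de-optimize the worst offenders first.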
I have link building going on in the background (lots of variations of keywords for the anchor text), and the website is benefiting greatly from this (no, I will NOT go into my backlink strategy).

How long will this last? Hard to say. I am nervous about a Penguin refresh. This website has spammy links created by the same guys that created the spammy blog (they get paid a tonne; go figure). So it could be slapped again - we will see. For now it is doing well. The website got slapped on its major keywords, particularly in one country. Interestingly, those major keywords are still slapped, but the website is pulling in tonnes of traffic from other keywords - it gets traffic from around 50-100 keywords a day.

Conclusions

Quick recoveries are possible. In this case, the website recovered by removing low-quality pages, changing duplicate meta titles, de-optimizing pages, and optimizing non-optimized pages. Fresh, varied content worked very well in this case. Each time there is a Panda update, this website does better and better. I hope this has been of some help to people!