Here are some highlights from John Mueller's (Google Webmaster Trends Analyst) February 14th and February 24th hangouts on Google+.

Panda and Content Spidering: If you are adding new content to your website daily, make sure that spidering is increasing in Webmaster Tools. If you are adding content and Google's spidering drops, it is likely a Panda quality problem.

HTTP and HTTPS: If you use both versions, you need to have a robots.txt file for each version, or it causes duplicate content.

Disavowed Links: If you remove links from your disavow file that have been ignored by Google, those links will count again, good or bad. The disavow file works like a robots.txt file, with a list of instructions to follow. Whether the link is suspect or good, it will get its link juice back.

Don't Use HTML Sitemap Pages: Google ignores these as low-quality pages, will not trust them, and they could be a Panda issue.

Does Google Pass Juice to Links in Files: John Mueller says that Google will read links in any file (PDF, XLS, DOC, etc.) but will not follow them with link juice. Only proper HTML anchor-tagged links will pass link juice.

Does Google Use CTR Tracking: Mueller denied it, but if you read between the lines you can figure out that they do. Click-through analysis, from SERP to site and back to SERP, is mentioned in the book "In the Plex". If visitors are bouncing from a page, add some fresh content, video, or links from internal pages that are ranking.

Panda: They use internal usage metrics; it comes down to how the page looks in the SERPs and the quality of the on-page content. "Have to have site optimized before sending traffic of any kind" implies that they may watch paid traffic in terms of bounce rate for Panda, though they have denied this in past hangouts.

301 Redirects: Google treats a 301 as a directive to follow, but if you forward A to B and they like A, they will ignore the page you are redirecting to.
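To illustrate the HTTP/HTTPS point above: each protocol serves its own robots.txt at its own root, so the two files can differ. A minimal sketch, where example.com and the Disallow rules are placeholders, not recommendations from the hangout:

```text
# Served at http://example.com/robots.txt
User-agent: *
Disallow: /

# Served at https://example.com/robots.txt
User-agent: *
Disallow: /checkout/
Sitemap: https://example.com/sitemap.xml
```

Here the insecure HTTP version is blocked entirely while the HTTPS version stays crawlable, which is one way to avoid the duplicate-content problem described above.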
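For reference, the disavow file mentioned above is a plain text file with one instruction per line, either a full URL or a whole domain; lines starting with # are comments. The domains below are placeholders:

```text
# Sites we contacted for link removal with no response
domain:spam-directory.example
domain:paid-links.example
# A single bad page rather than a whole domain
http://some-blog.example/spun-article.html
```

Removing a `domain:` or URL line from this file is what, per Mueller, makes those links count again, good or bad.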
New Site Honeymoon: Make sure your traffic is responding to your pages. For the "new site honeymoon", run A/B split tests to see whether a page performs well. This implies that the algorithm puts your site in a category for user metrics, specifically "sharing". Social proof is something they look at, and if your website is not getting social traffic it will disappear. Google has algorithms to rank, trust, and de-rank according to social signals.

Site Errors: 500, PHP, and MySQL errors will be "Panda" quality factors. In our testing, do not let errors persist for more than 48 hours.

Google Is Machine Learning: It is alive, it is SkyNet. It is equivalent to a three-year-old child.

Geo-Targeting Algorithm Works on a Best-Guess Basis: Even if you have a .com or .uk domain, your content has to reflect your geo-targeting hreflang. Based on quality or trust, they will decide whether or not you are relevant to that specific geo-location, not just by the URL. Everything is proportion-based and machine learning; that is what they are using on the fly to rank websites.

Share Your Disavow File Publicly: John Mueller stated that when you supply a Google spreadsheet for your disavow list, it must be set to "Public"; otherwise the web spam team cannot access it. The webspam team does not have access to your Google account backend; Google's terms and conditions prevent them from getting into your account.

Review Snippets: John Mueller implied that the review snippet is a quality factor. It's a great way to share the site.

Cloned Sites: This is when you have multiple URLs with the same design and topic, just trying to rank for keywords. You cannot use clones, but you can have exact-match domains if the site is trusted. Pointing non-exact-match keywords at the site is preferable.
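A quick sketch of the hreflang markup the geo-targeting point refers to: annotations in the page head that tell Google which regional version of a page to show. The example.com URLs are placeholders:

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/us/" />
<link rel="alternate" hreflang="en-gb" href="https://example.co.uk/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

This is why the domain ending alone is only a best guess: the hreflang annotations and the content itself have to back up the geo-target.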
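The review snippets mentioned above come from structured data on the page. A minimal sketch using schema.org microdata, with a made-up product name and made-up rating numbers:

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Widget Pro</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5
    based on <span itemprop="reviewCount">89</span> reviews
  </div>
</div>
```

Marked up this way, the rating can appear as stars under the result in the SERPs, which is the snippet Mueller implied acts as a quality signal.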
Google can detect that the IP is the same, the domains are registered by the same person, and the design, backlinks, keywords, and content topic are the same; they will then choose the version they determine to be the main site and canonicalize it for you.

Does Forwarding from a Penalized Domain to a New One Reset a Penalty? If you try to kill a domain that was hit by Penguin, create a new domain, and forward everything over, Google will "help" by automatically treating it as a 301 redirect. They look at the old site's information and forward all the signals, good or bad, to the new site. They have two indexes, a historical index and a ranking index, which they compare to make a decision based on the mined data. "They don't want spammers to get away with cloning sites," John Mueller said. They keep a changelog and can tell whether you are maintaining the pages. They can figure out that these are all the same site and that you are trying to manipulate the SERPs.

Site Navigation Is a Ranking Factor: Internal navigation needs to be logical. For the page on your site that you want to rank highest, point your internal links to that page to "sculpt" your PageRank internally.
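For reference, the domain-level forwarding discussed above is usually a 301 redirect configured on the old domain. A minimal Apache sketch, with old-domain.com and new-domain.com as placeholders:

```apacheconf
# .htaccess on the old domain: permanently redirect every URL
RewriteEngine On
RewriteRule ^(.*)$ https://new-domain.com/$1 [R=301,L]
```

Per the hangout, doing this (or even just abandoning the old domain) forwards the old site's signals, good or bad, rather than resetting the penalty.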