Thin Content Penalty on Redirected Domain

Russian425

Regular Member
Joined
Jun 7, 2011
Messages
339
Reaction score
153
Looks like Google has completely lost the plot. I have a domain (domain A) that currently redirects to another domain (domain B), and Google issued a thin content penalty to both domains.
Has anyone experienced this before? I don't see why they would issue a penalty to domain A as well, since in reality it's just a redirect and doesn't have any content on it.
Also, has anyone ever recovered from a thin content penalty? If so, did you scrap all of your content?
 
Here we go again, mister "content is king" SEO specialist is going to spread his knowledge on SEO; this oughta be good. 0_o
Content isn't king but it does play its part. Sure, you can rank spun, copied, garbage content, but when Google notices a high bounce rate and low CTR they will move you down.

Useful content will help with CTR and bounce rate and maintain rankings.
 
Content isn't king but it does play its part. Sure, you can rank spun, copied, garbage content, but when Google notices a high bounce rate and low CTR they will move you down.

Useful content will help with CTR and bounce rate and maintain rankings.

That is simply not true. I use a lot of crap content that I redirect with a cloaker, with really high bounce rates, and I've had steady rankings for years on the sites I monitor. If bounce rate were really a big factor, there would be tools all over the place mimicking visitors that stay longer on your website.

Look..... just try to think logically about this bounce rate thing, like a business. Let's say, for argument's sake, that Google wants to use bounce rate as an important metric. Some smart marketers then build a script/tool (I have been told it is quite easy) that mimics users with the use of proxies. Now Google has to maintain a database of all the free, shared, and private proxies to check and compare whether a visitor is really a visitor.

That means huge investments, plus the certainty that they are never going to collect all the free, shared, and private proxies. You see the vast scale of things when you want to do this on a global level. Google isn't going to spend tens or hundreds of millions just to have the use of ONE metric that isn't even 100% reliable. Google is smarter than that; they just tell some Google disciples on MOZ and Search Engine Land that they use it, those people spread the word to people like you and, well..... not me :) and here we are, arguing about bounce rates.
 
Bounce rate isn't a metric on its own. It indirectly affects things, though. Think about Google and how it tracks data, and I think you can come up with ideas on how data could be collected when someone goes to a site, leaves the site, and clicks another link or performs another similar search.

Think about users....

Think about how users see a site when it's a junk site. They start purposely skipping the site (let's assume it's #1) and clicking the following link. They start to say "oh, I know that site. I don't like it. I won't even click it even though it's #1." I'm sure you've seen sites like that. I'm sure there are sites you don't like that you skip purposely if you see them. Gather enough data that users are purposely skipping a site, and Google figures out that no one wants that site at the top.

How about similar queries? Let's say I search "red houses in miami." As an example, let's say #1 is red-homes-miami.com and I click on it. I don't like it. I go back and click on another link. Hmm don't like that one either. So, I try to search again with a different query. I search "small red houses in miami." Oh! I like this result better! If enough users don't like the site and try to tweak their query because they don't like #1, the site will naturally drop.

Google has massive data collection capabilities. When you pile on millions of people doing that same thing, of course Google will pick up that the sites it's currently showing aren't what users want, so it devalues them and moves them down. You can't simulate that with a few dozen people in India clicking links. You need millions of users across several geographic locations with private IPs and real profiles.
 
Bounce rate is one of the factors that keeps you on the first page. If you ran your own tests you would know this.

It's not as easy as you think to manipulate bounce rate with software, as you claim.

You're assuming I get my information from those blogs. I get my information from tests. Bounce rate is a real part of the recipe.

You're totally wrong about costs... it wouldn't cost Google millions to implement a bounce rate factor.
 
and here we are, arguing about bounce rates.

It's not bounce-rate per-se. It's more like Google is saying "Hmm, I just gave this surfer your website and now they're back on Google again in a nanosecond, with the same search looking for different sites. Yours must not have been what they were looking for."

They don't need the stats from your site, they don't care if someone is actually even visiting your site. It is monitored by the person returning to Google, searching for the next result. That data is right there at their fingertips, and they can do split-testing on it by shuffling the SERPs.
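The "returning to Google in a nanosecond" signal described above is often called pogo-sticking, and it's easy to illustrate. Here is a hypothetical sketch, not Google's actual implementation: the log format, the field names, and the 10-second threshold are all invented for the example.

```javascript
// Hypothetical pogo-sticking detector. Given a log of SERP clicks with
// click/return timestamps (seconds), flag results whose visitors came
// back to the search page almost immediately. All names and the
// threshold are illustrative assumptions, not a published algorithm.
function flagQuickReturns(clickLog, thresholdSeconds = 10) {
  return clickLog
    .filter((entry) => entry.returnedAt !== null) // user came back to the SERP
    .filter((entry) => entry.returnedAt - entry.clickedAt < thresholdSeconds)
    .map((entry) => entry.url);
}

const log = [
  { url: "red-homes-miami.com", clickedAt: 0, returnedAt: 3 },   // bounced back fast
  { url: "miami-homes.example", clickedAt: 10, returnedAt: 95 }, // stayed a while
  { url: "houses.example", clickedAt: 100, returnedAt: null },   // never returned
];

console.log(flagQuickReturns(log)); // → ["red-homes-miami.com"]
```

The point of the sketch: the search engine needs only its own click and return timestamps, nothing from the destination site, which matches the argument above.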
 
OP, you need to tell us more.

How much content was on the page? was it unique? was it garbage or good quality content?
 
OP, you need to tell us more.

How much content was on the page? was it unique? was it garbage or good quality content?

I agree with this. OP, please give us more insight into your content. Also, it doesn't matter if it's a 500-word article if it's loaded with affiliate links. A "thin content" penalty is a manual penalty, which means a real person (yes, there are still real people working at Google) reviewed both of your sites.
 
It's not bounce-rate per-se. It's more like Google is saying "Hmm, I just gave this surfer your website and now they're back on Google again in a nanosecond, with the same search looking for different sites. Yours must not have been what they were looking for."

They don't need the stats from your site, they don't care if someone is actually even visiting your site. It is monitored by the person returning to Google, searching for the next result. That data is right there at their fingertips, and they can do split-testing on it by shuffling the SERPs.


If that were really a factor, a lot of sites on the first page wouldn't even be there. It's also a metric that's very easy to manipulate (just like social signals). Seeing the data from my (in WH gurus' eyes) crappy thin-content sites that keep ranking year after year, I'll stick with "authority trumps bounce rate" every time.
 
Actually the articles were pretty good, all a minimum of 700 words, Copyscape-checked, etc.
Around 20 articles in total.
What's weird is they also did the same thing to my leadgen sites in the SEO niche. I have a few 4-5 page leadgen sites that actually had a good conversion rate and good design. Those got hit as well.
 
It is not bounce rate (as in viewing one page and then leaving); maybe the rate of returning to the SERPs, if anything. Bounce rate can be gamed just by firing an event in GA. As for the other part, you could disable the back button via JS.

Just my 2 c here
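For context on the GA trick mentioned above: in Universal Analytics, any interaction hit sent after the pageview marks the session as non-bounced, so a delayed event is enough to flatten the reported bounce rate. A minimal sketch follows; the 15-second delay and the event category/action names are my own illustrative choices, not anything from the GA docs.

```javascript
// Hypothetical sketch: build an interaction event that, once sent via
// ga("send", ...), stops Universal Analytics counting the session as a
// bounce. nonInteraction: false is what makes the hit an "interaction".
function dwellEvent(secondsOnPage) {
  return {
    hitType: "event",
    eventCategory: "engagement", // illustrative name
    eventAction: "dwell",        // illustrative name
    eventValue: secondsOnPage,
    nonInteraction: false,       // an interaction hit cancels the bounce
  };
}

// In the browser you would schedule it after a delay, e.g.:
// setTimeout(function () { ga("send", dwellEvent(15)); }, 15000);
```

Note this only changes the bounce rate GA reports to the site owner; it does nothing about a user returning to the SERP, which is the point the post above is making.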
 
Thin content is the Gorg's way of saying it is a "thin" affiliate site, not that the "text" on the page (the content) is the problem. That is the term they use to ban affiliates and confuse most others.

Nah, that's not true. Google doesn't hate affiliate sites; Google only hates sites without any real value. The way to build an affiliate site Google loves is really simple: offer some value users won't find on any of the merchant sites you link to. The problem is that many people mix up "unique" and "useful/helpful/compelling/exciting/interesting" when it comes to content.

What you see in the toilet before flushing is unique, too. You flush it anyway...and so does Google.
 
Look..... just try to think logically about this bounce rate thing, like a business. Let's say, for argument's sake, that Google wants to use bounce rate as an important metric. Some smart marketers then build a script/tool (I have been told it is quite easy) that mimics users with the use of proxies. Now Google has to maintain a database of all the free, shared, and private proxies to check and compare whether a visitor is really a visitor.

That means huge investments, plus the certainty that they are never going to collect all the free, shared, and private proxies. You see the vast scale of things when you want to do this on a global level. Google isn't going to spend tens or hundreds of millions just to have the use of ONE metric that isn't even 100% reliable. Google is smarter than that; they just tell some Google disciples on MOZ and Search Engine Land that they use it, those people spread the word to people like you and, well..... not me :) and here we are, arguing about bounce rates.

You really think a company like Google is not able to detect whether a visitor is really a visitor? You are just mentioning proxies... Please, dude, give me a break... Proxies are only ONE piece of a really huge puzzle; the puzzle contains so many other things that can be traced, not just the IP. Google does not need to build a huge infrastructure and database to hold every proxy on the planet; it just needs to detect user behavior that falls outside of normal patterns. It does that by tracing those 100 other things about a visitor, not just its IP. And I know a couple of those smart marketers you are talking about: mimicking NORMAL user behavior is not as easy as you think. Also, Google needs this metric to make the user experience better and better.
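To make the "behavior beyond the IP" point concrete, here is a hypothetical sketch of how a session could be scored on behavioral signals alone. The signals, weights, and field names are all invented for illustration; real detection systems are obviously far more elaborate.

```javascript
// Hypothetical bot-likeness score based on behavior, not IP address.
// Every signal and weight here is an assumption made up for this example.
function botScore(session) {
  let score = 0;
  if (!session.mouseMoves) score += 2;      // no pointer activity at all
  if (session.dwellSeconds < 2) score += 2; // left almost instantly
  if (!session.cookiesEnabled) score += 1;  // no persistent identity
  if (session.headlessUserAgent) score += 3; // e.g. "HeadlessChrome" in the UA
  return score;
}

const suspicious = botScore({
  mouseMoves: 0,
  dwellSeconds: 1,
  cookiesEnabled: false,
  headlessUserAgent: true,
});
console.log(suspicious); // → 8 (very bot-like)
```

A proxy changes none of these signals, which is why a proxy list alone wouldn't be the whole detection story.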
 