
Preventing G from seeing affiliate links AS affiliate links?

GoogleAlchemist
Regular Member
Joined: Nov 25, 2009 · Messages: 309 · Reaction score: 45
I know this has been discussed before, but dammit if various related searches aren't bringing up any results, so I apologize for the redundancy.

Besides, I wanted very current info...

I've started poking around for something like this recently. I saw a mention on another forum of WP Traffic Tools for these purposes... is that all that's necessary to effectively stop search engines from seeing affiliate links as affiliate links? I also saw some talk about G being able to, or soon being able to, read JavaScript or something along those lines, which would make a lot of the current techniques invalid. I remember the post referenced some sort of redirect plugin or script from Brad Callen (I think) that was a valid fix at the time but was on its way to being defunct.

Again, I'm not claiming the info is accurate, but I gotta throw it out there to get some updated discussion going so I'm not sitting in a false sense of security.

And finally, if a site gets a manual review and the reviewer sees affiliate links with any of this sort of thing being done to them, is the site going to get penalized or deindexed for breaking TOS?

Thanks y'all
 
If it gets a manual review, yes, your site will get penalised for trying to hide affiliate links.
Worse, probably ALL the sites in your Google Webmaster Tools account will get hit as well.
It's happened to me, and I had to work hard to get the other sites back to anywhere near where they were. Lesson learnt: I have one AdSense site now, I don't touch it, it's not how I make my money but it trickles some money in.
It is a clean site with nothing hidden, but it doesn't have affiliate links.

I'm sure someone can advise you further but if you get caught expect the worst.
 
Maybe a structure like... /recommends/xxxxx with a PHP or .htaccess redirect. This hides your affiliate links and could possibly pass a quick manual review.
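
For what it's worth, here's a minimal sketch of that idea, assuming an .htaccess rule that rewrites /recommends/whatever to a redirect.php script. The file name, rewrite rule, and the $links map are made up for illustration, not taken from any particular plugin:

<?php
// redirect.php
// Assumes a rewrite rule along the lines of:
//   RewriteRule ^recommends/(.+)$ /redirect.php?slug=$1 [L]

// Hypothetical slug-to-affiliate-URL map; store yours however you like.
$links = array(
    'blue-widget' => 'https://example.com/aff?id=12345',
    'red-widget'  => 'https://example.net/track/67890',
);

$slug = isset($_GET['slug']) ? $_GET['slug'] : '';

if (isset($links[$slug])) {
    // 302 so the hop reads as a temporary redirect rather than a permanent move.
    header('Location: ' . $links[$slug], true, 302);
} else {
    header('HTTP/1.1 404 Not Found');
}
exit;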
 
Hey GA,

I saw this post over at BLF but I'm letting some others talk it up over there if they want to.

How to slice into this pie...

There isn't as much light on the issue as people would like to pretend. There are a few ways to do bot/spider detection:

  1. User-agent detection
  2. IP detection
  3. Browser feature detection

A spider can fake a user agent, and in a battle to discover possible cloaking, it's a safe bet to assume Google has rogue spiders with faked user agents. But it's not a certainty... it's just a safe assumption.
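
Just to make method 1 concrete (and only method 1), a bare-bones user-agent check might look something like this; the pattern list is a guess for illustration, not what any particular tool actually ships:

<?php
// Naive user-agent based spider detection (method 1 above).
// A spoofed UA walks straight past this, which is exactly the weakness described.
function looks_like_spider($userAgent) {
    $patterns = array('googlebot', 'bingbot', 'slurp', 'baiduspider', 'crawler', 'spider');
    $userAgent = strtolower($userAgent);
    foreach ($patterns as $p) {
        if (strpos($userAgent, $p) !== false) {
            return true;
        }
    }
    return false;
}

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (looks_like_spider($ua)) {
    // Spider: serve the plain, non-affiliate version of the link.
} else {
    // Probably human: serve the affiliate redirect.
}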

A spider can fake IP addresses through proxies just like humans can. There are services that use honeypots (very large networks) to detect and sort spiders from humans. These services can be hooked into plugins and other software to auto-update an IP list. The thing is that these services have to have a way to make that determination, so how do they do it? And even if you are signed up with an IP collection service such as Fantomaster's, you are still relying on a lagging delivery service that could be outmaneuvered by fresh IPs and other tricks I have not considered.
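
Hypothetically, the consuming side of one of those IP services (method 2) boils down to a lookup against whatever list gets pushed down to you; the file name and one-IP-per-line format here are invented:

<?php
// Method 2: check the visitor IP against a locally cached spider IP list.
// spider_ips.txt is a hypothetical file some external service refreshes periodically;
// by definition it always lags behind fresh spider IPs.
function ip_is_known_spider($ip, $listFile = 'spider_ips.txt') {
    if (!is_readable($listFile)) {
        return false; // no list available yet, so fail open
    }
    $known = file($listFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    return in_array($ip, $known, true);
}

if (ip_is_known_spider($_SERVER['REMOTE_ADDR'])) {
    // Treat as a spider: show the clean link.
} else {
    // Treat as a human: show the affiliate redirect.
}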

So how does WP Traffic Tools do it? It has a double layer of protection, firstly checking the user agent to determine spider or human, and then checking browser features.

It's said that modern bots have the capability to trigger AJAX. How? I don't know. So checking whether the visitor's browser has AJAX support is not a foolproof safeguard. So WP Traffic Tools uses some other tricks that appear, in theory, to be more foolproof. If Fantomaster can do it with the available technologies, then so can WPTT.
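
One common way to do a browser-feature check, and this is a generic sketch rather than a claim about what WPTT actually does, is to have JavaScript set a cookie and only treat visitors who send it back as real browsers. The cookie name here is made up:

<?php
// Browser feature check: JavaScript sets a cookie, and only visitors that send it
// back on a later request get treated as full browsers.
if (!empty($_COOKIE['js_ok'])) {
    // JS-capable browser: affiliate behaviour allowed.
} else {
    // First visit or no JS: set the cookie client-side and serve the plain link.
    echo '<script>document.cookie = "js_ok=1; path=/";</script>';
}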

Plus, we're assuming Google is engaged in a high-tech detection war against cloakers. Even if they are, it might be safe to trust the grey heart at the core of Google, which understands that some things deserve to go unchecked because they serve a good purpose.

I personally think that, if there are very special rogue bots designed to flag certain sites, then there isn't a huge trove of them, and they can only be as smart as technology allows them to be. I also do not think the Google spam team is that large and committed, or we would have seen massive devaluations of sites that use easily traceable footprint link networks, along with some other counter-spam measures that never manifested. This leads me to think that they (the spam team) see more than we do about the value that spamming helps create on their own system, so they let it continue without warring heavily against it. But that's political theory, and I'm not certain about anything!

What flags can trigger a manual review? I don't know. Lots of 307 and 301 redirects, possibly. I bet they do not have the capacity to manually review the thousands upon thousands of pages that might trip an algorithmic detection system.

They could target commercial tools like WP Traffic Tools, but the tool is really more of a complement to internet activity than a drag or a threat.

It's not like WPTT is the leading cloaking utility either. We focus more on link masking, redirection, advanced ad placement, and other nifty things. Spider differentiation is just an extra feature available for the link masking and redirects. And even these additional features were created out of fear of unfair search engine over-regulation. The existence of cloaking represents a contention of wills between content providers and content indexers.
 
There's nothing wrong with using redirects for affiliate links. Many people use them for tracking purposes.
 