That's the question I ask myself whenever I start some blackhat-ish operation (link building, content, social signals) using automated software. Most of the time, the answer comes down to "looking for patterns" and "looking for footprints" in order to tell humans and robots apart.

The concept of this thread is to post some basic theories, starting with "If I was Google, I would..." and followed by your thoughts. Some examples:

If I was Google, I would compare the user agents coming from a single IP; if more than 20 different user agents go through that IP, I would consider it a proxy.

If I was Google, I would browse BHW to discover the new techniques and figure out how to fight back! =)

Your turn!
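The first theory above is basically a counter over server logs. Here's a toy Python sketch of it — the log format, the function name, and the 20-UA threshold are just assumptions for illustration, not anything Google is known to use:

```python
from collections import defaultdict

# Hypothetical threshold from the example above, not a known Google value.
UA_THRESHOLD = 20

def flag_proxies(requests, threshold=UA_THRESHOLD):
    """Return the set of IPs that sent more than `threshold` distinct user agents.

    `requests` is an iterable of (ip, user_agent) pairs, e.g. parsed access logs.
    """
    agents_per_ip = defaultdict(set)
    for ip, user_agent in requests:
        agents_per_ip[ip].add(user_agent)
    return {ip for ip, agents in agents_per_ip.items() if len(agents) > threshold}

# Example: one IP cycling through 25 fake user agents, plus one normal visitor.
log = [("1.2.3.4", f"Bot-UA/{i}") for i in range(25)]
log += [("5.6.7.8", "Mozilla/5.0")]
print(flag_proxies(log))  # → {'1.2.3.4'}
```

In practice a search engine would presumably weight this against traffic volume (a corporate NAT or mobile carrier gateway legitimately mixes many user agents behind one IP), which is exactly why a raw threshold like this is easy to dodge and easy to false-positive on.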