As many of you know, I've been beta testing a SERP CTR service of mine with BHW members and have had some good results. I'm going to share, for free, what I've learned over the past ~20 months.

Hypothesis: if the CTR of your SERP listing is higher than the expected CTR for its position, Google will rank you higher than your current position.

Expected CTRs by position are as follows. Find the CTR for your position:

Based on your keyword's search count and the expected CTR percentage of the position you want to achieve, you can guesstimate how many clicks you need. Google Webmaster Tools reports your CTR, but the problem is that it doesn't account for searchers who don't click at all. For example, my site is sitting at #2 for its keyword, and GWT claims the CTR is 6.58%. It will be a similar case with your site. That's because the expected CTRs in the image above don't include the percentage of searchers who don't click, or who click through to Google Images, etc. If they did, those percentages would be diluted.

This topic has seen both successes and failures when tested. I've been testing it since summer 2014. I'm a programmer, so I've been running simulations in Python via Selenium, and I've found out what works and what doesn't. Rand Fishkin ran the best-known test, on Twitter, and has offered the most accurate information in another video. His findings match up exactly with mine: he ran a CTR test and put himself at #1 in just 3 hours. That makes manipulating SERP CTR the quickest way to move up page #1.

So let's talk about what works, both as stated by Rand Fishkin and as I've tested:

Using proxies? DOESN'T WORK
Using fake clicks? DOESN'T WORK
Using real people? WORKS

Face it: accurately simulating operating systems, screen resolutions, internet browsers and a number of other factors is not easy. When I tried doing that initially, it didn't work. I'm known to use my own products and build them around my own needs, which is why Keyword Scout was such a hit back in its day.
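The click guesstimate I described earlier can be sketched in a few lines of Python. The CTR table and the uplift factor below are placeholder figures of mine, not gospel — plug in the numbers from your own CTR-by-position chart:

```python
# Rough sketch of the "clicks needed" guesstimate.
# EXPECTED_CTR values are illustrative placeholders -- substitute
# the figures from your own CTR-by-position source.
EXPECTED_CTR = {
    1: 0.30,
    2: 0.15,
    3: 0.10,
    4: 0.07,
    5: 0.05,
}

def clicks_needed(monthly_searches, target_position, uplift=1.5):
    """Guesstimate the monthly clicks needed to look 'better than
    expected' at the target position. `uplift` (my assumption) is
    how far above the expected CTR you want to sit."""
    expected = EXPECTED_CTR[target_position]
    return round(monthly_searches * expected * uplift)

print(clicks_needed(20000, 1))  # 20k searches/month, aiming at #1 -> 9000
```

So a 20k-search keyword would need thousands of clicks a month to beat the expected CTR at #1 — which is exactly why you have to weigh how lucrative the position is.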
When I was building simulations, I had to read market share reports to generate realistic configurations. For example, say you randomly select the resolution 1334x750. Now you need to figure out which device and OS that corresponds to. That screen size belongs to the iPhone 6 — so which OS version are you going to choose? The iPhone 6 runs both iOS 8 and iOS 9, and likely iOS 10.

Honestly, the above may be overkill. The really advanced ways Google can determine whether a click is real are technical. Take a look at this presentation: http://www.slideshare.net/SergeyShekyan/shekyan-zhang-owasp But my favorite examples are here: http://engineering.shapesecurity.com/2015/01/detecting-phantomjs-based-visitors.html

Unless you can BE SURE that the software you're using to manipulate clicks has taken care of vulnerabilities like these, you probably shouldn't be using it. With real clicks, you don't have to worry about these technical faults. There are just too many of them, and it's too risky to put in the hands of some offshore developer. I'm a late-year computer science student at a top research university in Canada, and even I don't feel confident enough to fake it. So: USE REAL CLICKS.

Now you're wondering: will SERP CTR manipulation work on my site and keyword? Real USA clicks are not cheap, so if your keyword has a search count of 20k, you're going to need thousands of clicks to move it, and you have to consider how lucrative the top position really is. However, let's look at a proper example. Say you have a keyword with a low search count but high profitability, something like "injury lawyers in st petersburg." Say the search count is 500 and the CPC is $20. Sending clicks to it wouldn't require many, and you'd still turn a profit.

Questions? Ask away.
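To make the "injury lawyers" math concrete, here's a back-of-the-envelope sketch. The expected CTR at #1 and the price per real click are placeholder assumptions of mine; the 500 searches and $20 CPC are from the example above:

```python
# Back-of-the-envelope profitability check for a low-volume,
# high-value keyword. CTR and click price are my placeholder
# assumptions -- swap in your own figures.
searches = 500               # monthly search count (from the example)
value_per_click = 20.00      # CPC used as a rough proxy for click value
expected_ctr_pos1 = 0.30     # assumed expected CTR at position #1
cost_per_real_click = 0.50   # assumed price of one real USA click

clicks_to_send = round(searches * expected_ctr_pos1)
campaign_cost = clicks_to_send * cost_per_real_click
monthly_value = searches * expected_ctr_pos1 * value_per_click

print(f"clicks to send per month: {clicks_to_send}")   # 150
print(f"campaign cost: ${campaign_cost:.2f}")          # $75.00
print(f"rough value of holding #1: ${monthly_value:.2f}")  # $3000.00
```

Even with pessimistic numbers, a 500-search keyword only needs a hundred-odd clicks a month, which is why low-volume, high-CPC keywords are the sweet spot for this.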