Here is a method of doing keyword research that helps find new niches to enter and keywords to go after in the post-Panda/Penguin world we are in now. A warning: it's an involved process, so for those who see the potential and are willing to put in the work, I think you'll be very satisfied. Plus, because it is not a one-click fix, I don't think you will have any problems with this method becoming saturated or overused.

OK, so this method involves finding keywords and niche ideas by exploiting the power of sites I call Rank Magnets. This is a fancy term for sites that rank across hundreds of thousands or even millions of keywords: sites like HubPages, Squidoo, EzineArticles, etc. While they rank for tons of keywords across many, many niches, they aren't an authority on any one topic, which makes them, generally speaking, easy to outrank. So if one of these sites ranks in the top 10 for a keyword, that can be a sign that top rankings for that keyword are possible without tons and tons of work.

Step 1: Start With Some Fancy Google Searches

To start off, we want to use some advanced Google queries to find products, niches, and keywords that are in demand but not too competitive or (hopefully) too hard to rank for. Here are some advanced search queries you can use.

For Squidoo.com:

Code:
"20..1000 Comments" "Ranked #500..1000 in Healthy Living" site:squidoo.com

This will find all Squidoo lenses that rank between 500 and 1000 in the "Healthy Living" category and have between 20 and 1000 comments. I like to use lower-ranked lenses instead of the very top ones because the top lenses often have a lot more SEO work done on them (more backlinks), so by aiming lower you can often find less-optimized lenses. You can modify the query to change the rank range, the comment range, and the category.
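If you end up generating lots of variations of these queries, a small script saves typing. Here's a minimal Python sketch (the function name and the alternate ranges are my own, for illustration) that assembles a site-restricted query from quoted filter phrases:

```python
def range_query(site, filters):
    """Join quoted filter phrases and a site: operator into one Google query."""
    return " ".join('"%s"' % f for f in filters) + " site:" + site

# Rebuild the Squidoo query from above, then a variation with a
# different comment range and category:
base = range_query("squidoo.com",
                   ["20..1000 Comments", "Ranked #500..1000 in Healthy Living"])
alt = range_query("squidoo.com",
                  ["50..500 Comments", "Ranked #200..800 in Sports & Recreation"])
print(base)
# "20..1000 Comments" "Ranked #500..1000 in Healthy Living" site:squidoo.com
```

Loop that over a list of categories and ranges and you have a query list ready for the scraping tips in the bonus step.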
Some other things you can try: restricting results to a certain number of "Likes" or "Pins". In the screenshot above, the result with an avatar shows "240 Like", so you could limit results by likes using this:

"50..100 Like"

This would limit results to those with 50 to 100 likes. Basically, anything visible on the page that indicates the quality or popularity of the page can be used as a search filter!

For Amazon.com:

Code:
"100..500 customer reviews" "Rank: #5000..10000 in Cell Phones & Accessories" "out of" site:amazon.com

This one is a bit more complex. Here is what it does: it looks for all products with between 100 and 500 customer reviews, ranking between 5000 and 10000 in the "Cell Phones & Accessories" category. The "out of" part makes sure the average star rating (e.g. 3.9 out of 5) is shown for each listing, so you can easily see the average rating of each product, which helps you tell whether it is a quality product people like.

The number of reviews is a sign that the product has a lot of customer interest, the "out of" part lets us quickly see whether those reviews are good or not, and the rank range lets us define how popular a product we search for, the theory being that the higher the rank, the more competitive the keywords will generally be to rank for.

As with the Squidoo query, you can modify all of the values to make the ranges smaller or larger, change categories, or add new parameters. One I would add is limiting results by the average review rating, which can be done by adding:

"3.9..5 out of"

This will only return products with an average review of 3.9 stars or higher. The catch is that if you add this to the query above, Google will say the query is too long to process, so you will need to remove something else to make the query work.
For Hubpages.com:

Code:
"Sports and Recreation" "by * 300..1000 followers" "level 6..8 commenter" "50..1000 comments" site:hubpages.com

Building on the previous queries, we get a little more complex with this one. There is a lot going on here. This will search hubpages.com for Hubs in the "Sports and Recreation" category, where the author has 300 to 1000 followers, the number of comments is between 50 and 1000, AND there is a Level 6-8 commenter in the comments.

The number of followers helps find authors who have a following, which would indicate they write about quality products, services, and topics of interest. The number of comments shows that people have an interest in or opinion about whatever was written. Now, I'm not naive: these metrics can be and are manipulated by some, so there is that to consider, but they are still a decent gauge of quality. And since manipulation does exist, I've thrown in the search for a specific "Level" of commenter. The higher the level of a commenter, the more trust and influence that person has. So if they are commenting on a Hub, chances are the Hub is legit, as a person with a high comment level is less likely to comment on pure spam Hubs. Like the others, all of these parameters can be tweaked and swapped out for others you might discover.

For Ezinearticles.com:

Code:
"Internet and Businesses Online" "Submitted On * *, 2012. Viewed 100..1000 times" site:ezinearticles.com

This is an old one that has been around in some form for a long time. I first saw it (or a version of it) in a post by Jack Duncan on the WF, which was my inspiration for finding similar kinds of queries for other sites. So props go to Jack for this. This query looks for articles in the "Internet and Businesses Online" category, submitted this year, that have been viewed 100 to 1000 times. You can get a lot of modification ideas just by looking at the result snippets.
For instance, you could search for articles with a certain word count:

"Word count: 500..600"

Or by how many articles an author has written, where more articles might indicate someone who is successfully ranking articles:

"500..2000 Articles"

Or you could look for authors who have been around a long time:

"Joined December *, 2009"

(This would find authors who joined in the month of December 2009.) Finally, you can find articles posted on a specific date:

"Submitted On August 22, 2012."

Step 2: Find Keywords They Rank For

This step is pretty straightforward. Once you perform the searches and find some specific pages you think have potential, take the URL of each page and dump it into SEMRush.com to see which keywords, if any, the page ranks for. This can be a bit of a time-consuming process, as you have to check one URL at a time, but if you stick with it you can find some worthwhile rank targets. Here is an example using the first result from our Squidoo example: as you can see, this lens ranks in the top 10 for multiple phrases with over 1000 searches a month.

Another thing to point out is that SEMRush uses the LOCAL exact match numbers from Google's Keyword Tool and not the Global numbers, as the screenshot shows. So it is entirely possible that SEMRush under-reports potential traffic. Oh, and another benefit: you don't need to be a paid member of SEMRush. They will show the top 10 keywords for free, and since those are generally sorted by highest search volume first, and a single page rarely ranks for more than 10 high-volume keywords, you are going to see the "meat" of that page's rankings for free!

Step 3: Check Out Backlinks

This could actually be Step 2, as the two steps are pretty interchangeable. Once you find a page that is ranking well, you want to check out its backlinks to make sure you haven't stumbled across a well-optimized page.
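For paying SEMRush members, the keyword check can be scripted. The sketch below only builds the request URL for SEMRush's organic-keywords report; the endpoint and parameter names reflect their API as I understand it, so treat them as assumptions and verify against the official API docs before spending credits:

```python
import urllib.parse

API_BASE = "https://api.semrush.com/"  # SEMRush analytics API (verify in their docs)

def semrush_request_url(api_key, page_url, database="us", limit=10):
    """Build the request URL asking which keywords a given page ranks for."""
    params = {
        "type": "url_organic",         # organic keywords for an exact URL (assumed name)
        "key": api_key,
        "url": page_url,
        "database": database,          # regional database, matching the LOCAL numbers
        "display_limit": limit,
        "export_columns": "Ph,Po,Nq",  # phrase, position, search volume
    }
    return API_BASE + "?" + urllib.parse.urlencode(params)

print(semrush_request_url("YOUR_KEY", "http://www.squidoo.com/some-lens"))
```

Fetching that URL (with `urllib.request` or similar) returns delimited rows you can write straight into a spreadsheet, one request per URL from your list.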
Looking at our example, we can see that it doesn't have too much going on as far as backlinks are concerned: a total of 47 links from 9 different domains is not too tough to crack. Also, as you can see from this screenshot, the top links to this page are all from other Squidoo lenses. What you can't see is that further down, once you get past the Squidoo links, the remaining links are very weak and lack authority. (Hop on over to OSE and check for yourself to see what I mean.) So overall, it appears this is a page that, with a little bit of work, you could outrank... perhaps with another Squidoo lens. Further analysis of the top 10 would obviously be required, but I think you get the gist of this research method by now.

Bonus Step: Ways To Further Automate This Process

If you're like me, you want to take a process like this and automate it as much as possible. So here are a few tips to bring more automation to the table. I'm not going to mock it up with screenshots, as I'm about exhausted putting this together.

1) You could load up a few hundred or a few thousand of these advanced search queries and scrape the top 1000 results with Scrapebox, giving you a huge list of URLs to research.

2) Using a free program called Netpeak Checker, you can load up all the URLs you collected and check them all for PR, backlinks, etc. to get an idea of which ones are using SEO and which are not.

3) Once you've eliminated the obviously SEO'd pages, you can either manually check the ones of interest in SEMRush.com as above, or, if you are a paying member of SEMRush, have a programmer put together a script that uses the SEMRush API to automatically check which keywords the URLs rank for and output everything to a spreadsheet. You can easily get something like this done for less than $50... I use vworker.com.

Well, that's it! It's not the prettiest of methods, but it can really help you uncover some new ideas and identify keywords that you can go after.
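To connect the scraping and filtering steps, here's a rough Python sketch (the filenames, toy URLs, and the 50-link cutoff are made up for illustration) of deduplicating a Scrapebox-style URL dump and keeping only low-backlink candidates for the manual SEMRush checks:

```python
import csv

def filter_candidates(urls, backlink_counts, max_links=50):
    """Keep unique URLs whose backlink count (e.g. exported from Netpeak
    Checker) is at or under max_links -- likely under-optimized pages."""
    seen, keep = set(), []
    for u in urls:
        if u in seen:
            continue
        seen.add(u)
        if backlink_counts.get(u, 0) <= max_links:
            keep.append(u)
    return keep

# Toy data standing in for a scraped URL list and exported link metrics:
scraped = ["http://a.example/1", "http://a.example/1", "http://b.example/2"]
links = {"http://a.example/1": 47, "http://b.example/2": 900}
candidates = filter_candidates(scraped, links)  # only the 47-link page survives

with open("candidates.csv", "w", newline="") as f:
    csv.writer(f).writerows([[u] for u in candidates])
```

The cutoff is deliberately crude; the point is just to shrink a few thousand scraped URLs down to a shortlist worth checking one at a time.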
What I really like about SEMRush, or any similar service, is that it shows you what is ranking AFTER the Panda and Penguin updates. Sure, the results are typically a month behind, but since we are getting well past the initial Panda and Penguin rollouts, we're at a point where we can see what is still ranking and use that to our advantage. Hope this sparks some new ideas!