Starblazer
Elite Member
When I was new to SEO, I didn't bother much about backlinks or authority and just kept posting helpful content in order to rank. Sometimes I spent 3-4 days researching & writing a single article. But my pages rarely ranked, while trash pages from news or general-niche sites ranked effortlessly. I started this case study when I realized how extremely easy it is for high-authority sites to rank on Google, even without relevant content. The thread title is clickbait, but in a nutshell, that's exactly what I'm going to do. I've learned a lot from this forum and I hope this case study helps other members identify opportunities to improve their sites.
A newbie, Little Johnny, joins BHW and asks, "Dear sir, I created a coupon site 1 year ago but couldn't find my site on Google. Someone told me that I need backlinks to rank, so I bought 1 million backlinks on Fiverr for $5. When will my site rank?" An SEO expert, Tom Cruise, replies, "Hi Little Johnny, you won't rank with $5 backlinks. You need to spend at least $1,000 on backlinks and $1,000 every month on quality articles. Google's senior webmaster, Mr. MoFo, said that content is king."
This is a common occurrence on the forum. Although I don't disagree with any of those claims, there is no tailor-made solution for your site. You have to find what works for your specific keyword instead of following a generic recipe. Moreover, I believe Google has a domain authority metric of its own (not Moz DA) that helps high-authority sites rank without relevant content. So, I want to confirm whether I can rank web pages purely on the back of a domain's trust & authority.
Requirements
- High difficulty, high competition and high traffic keywords (otherwise there is no point in this case study)
- Spun or irrelevant content (to check if only quality content ranks)
- No backlinks (to confirm that Google has a domain authority metric)
- Multiple parasites with varying difficulty & traffic (to confirm that the ranking is not beginner's luck)
Steps
- Choose a niche
- Choose a parasite
- Create content
- Optimize content
- Reverse engineer Google algorithm to rank
Disclaimer
Tom Cruise doesn't need this disclaimer, but Little Johnny may waste his time wondering whether this is a magic pill. Although I'm sharing my own results here, I don't guarantee that you can easily rank by repeating the process. There are also some problems associated with it, which I'll discuss later in this thread.
Niche
Mainstream niches like weight loss & finance are tough to rank in (just an opinion, I haven't tested them yet). Moreover, I'm doing CPA and thought it would be better to test something relevant to that. I've selected spammy niches like game hacks, freebies, dating, etc. I won't reveal the exact niche for obvious reasons, but you can get an idea of it from the examples.
Parasite
There are tens (if not hundreds) of parasites, and it's really difficult to find one that ranks easily. I've literally checked hundreds of keywords to find parasites with consistent rankings. Some parasites rank for specific keywords but don't appear anywhere else in the results. Although those may work for some of you, I specifically need a parasite that can rank in any niche, so that I can use the same technique for all my projects. I shortlisted Google Sites and Medium, because both are preferred by Google and rank easily. But Medium has more control over its platform and can delete whatever it wants, whenever it wants. So I chose Google Sites, which can also be tracked in Google Search Console. Let me show you something:

This is the trend showing the growth of organic keywords for Google Sites (data from Semrush). There are two things you need to understand from it:
- After the April core update, Google Sites started showing in search results for a lot more keywords (almost three times as many)
- The growth is not uniform across result pages. Almost all of these keywords show Google Sites within the top 50 results, which is strange.
In other words:
- Google started preferring authority sites over new sites or random blogs
- Google started preferring sites covering a lot of topics over those covering only a few keywords
Content
A disciple once asked Lord Buddha, "What's the secret of success?" Buddha smiled and replied, "Keywords."
Well, you can't post random stuff on the internet hoping that someone will search for it. You need to write content that already has demand. I've found a few keywords with high traffic. Spiderman once told me that with huge traffic comes huge competition, which is apparently true. Many news sites hunt keywords just like a lion hunts down a deer on the Discovery Channel. So the chosen keywords have high traffic, high difficulty and high competition.
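If you want to scale the keyword digging beyond typing into the search box, here is a rough sketch that pulls suggestions from Google's unofficial autocomplete endpoint. The endpoint, its client=firefox parameter and the response format are undocumented and can change at any time, you'd still need to check volumes in your keyword tool of choice, and the seed keyword below is just an example.

```python
# Rough sketch: collect keyword ideas from Google's unofficial autocomplete
# endpoint. Undocumented and rate-limited, so treat it as a starting point.
import json
import string

import requests

SUGGEST_URL = "https://suggestqueries.google.com/complete/search"


def autocomplete(seed: str) -> list[str]:
    """Return autocomplete suggestions for a single seed phrase."""
    resp = requests.get(
        SUGGEST_URL,
        params={"client": "firefox", "q": seed},  # 'firefox' client returns plain JSON
        timeout=10,
    )
    resp.raise_for_status()
    # Response shape: ["<seed>", ["suggestion 1", "suggestion 2", ...]]
    return json.loads(resp.text)[1]


def expand_keyword(seed: str) -> set[str]:
    """Append a-z to the seed, the scripted version of 'alphabet soup' research."""
    ideas = set(autocomplete(seed))
    for letter in string.ascii_lowercase:
        ideas.update(autocomplete(f"{seed} {letter}"))
    return ideas


if __name__ == "__main__":
    for kw in sorted(expand_keyword("netflix poster")):  # example seed keyword
        print(kw)
```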
As I said earlier, the Google algorithm can't read, so it doesn't matter what I write. Moreover, I'm not building an authority site for dedicated users. So, I decided to just post spun content. The question is still where to get the content and how to spin it. Since authority sites can rank with horrible content, it's not always a good idea to spin theirs. So I checked 5-6 result pages to find a few low-authority sites or parasites and picked content that is fairly long, human-readable and has subheadings. I searched Google for "free article spinner" and spun it using the site in the first result. That's enough for me, as I'm too lazy to write and Google is too blind to read.
After this step, add your site to GSC and submit it for indexing. It may take 7-10 days to get indexed. There is an urban myth about a guy who submitted his site to GSC, went into a coma after an accident, and woke up to find his site still not indexed. So don't complain after 10 days that your site hasn't been indexed yet. I left the sites alone for 3 months to gain momentum (another way of saying I was busy with other things). Some sites ranked on their own, but some required optimization.
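Rather than refreshing GSC every day, you can ask the Search Console URL Inspection API whether a page made it into the index. A minimal sketch follows; it assumes you have a verified property and a service account key that has been added as a user on that property, and the key file path, property and page URL are placeholders.

```python
# Minimal sketch: check a page's index status via the Search Console
# URL Inspection API. Requires the google-api-python-client package and a
# service account added to the verified property (placeholders below).
from google.oauth2 import service_account
from googleapiclient.discovery import build

SERVICE_ACCOUNT_FILE = "service-account.json"   # placeholder key file
SITE_URL = "sc-domain:example.com"              # placeholder GSC property
PAGE_URL = "https://example.com/some-page"      # placeholder page to inspect

credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE,
    # readonly scope is normally enough for inspection; widen it if the call is refused
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=credentials)

result = service.urlInspection().index().inspect(
    body={"inspectionUrl": PAGE_URL, "siteUrl": SITE_URL}
).execute()

status = result["inspectionResult"]["indexStatusResult"]
print(status.get("verdict"), "-", status.get("coverageState"))
```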
Optimize content
Don't mistake this step for rewriting all the content. I'm only optimizing the content for more keywords, so that I can get more traffic. I have 2 strategies here:
- Find all keywords with medium-high volume (the cutoff varies between 1k-10k depending on the volume of the main keyword)
- Find all keywords that I'm already ranking for (using GSC; see the sketch below)
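If you'd rather pull those already-ranking queries with a script than export them from the GSC UI, here's a minimal sketch using the Search Console Search Analytics API. Credentials work the same way as in the indexing check above, and the property URL, date range and key file path are placeholders.

```python
# Minimal sketch: list the queries a property already gets impressions for,
# via the Search Console Search Analytics API (placeholder property & dates).
from google.oauth2 import service_account
from googleapiclient.discovery import build

SERVICE_ACCOUNT_FILE = "service-account.json"   # placeholder key file
SITE_URL = "sc-domain:example.com"              # placeholder GSC property

credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE,
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=credentials)

report = service.searchanalytics().query(
    siteUrl=SITE_URL,
    body={
        "startDate": "2019-08-01",   # placeholder date range
        "endDate": "2019-08-31",
        "dimensions": ["query"],
        "rowLimit": 1000,
    },
).execute()

# Sort by impressions so the keywords worth optimizing for come first.
for row in sorted(report.get("rows", []), key=lambda r: -r["impressions"]):
    print(f'{row["keys"][0]:<40} pos {row["position"]:.1f}  '
          f'{row["impressions"]} impressions, {row["clicks"]} clicks')
```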
Reverse engineer Google algorithm
We're finally at the clickbait part. Fun fact: I've learned from YouTube videos to save the clickbait for the end, as it encourages more people to watch (read) everything. So, we have a high-authority site that can rank if we figure out the requirements to rank. Pun fact: no one knows how Google ranks sites. Since we know nothing more than Jon Snow does, let's start with the basics.
Google is a search engine that wants to provide a list of websites when a user searches for a keyword. Like every business, if they don't provide what the user is looking for, they will go out of business. Imagine you walk into a bar and order a beer, but the waiter brings hot milk instead. That's exactly how users feel when they don't find what they are looking for. Google has to understand where the user is from, what kind of results he clicks on, when he closes his tab/browser, etc. This helps them optimize the results for each user group & keyword combination.
Now we know that we can rank for any keyword using high-authority sites if we serve what the user is looking for. But how can we know what the user is looking for? The answer: we can get this information from Google itself.
Google provides all the tools required to understand search intent for a specific keyword.
Have you ever searched for meaning/synonym of a word on Google?
If not, do it quickly. If yes, then you already know that Google displays the answer right on the results page, so you don't even need to open any of the results.
Isn't it so nice of them that they are saving a lot of time for us?
Not really. They've actually ruined many businesses: a big chunk of the traffic that dictionary sites used to get now never leaves Google. They're basically scraping web content and repurposing it to keep users on their own pages longer. This lets them show more ads to users and, in turn, make more profit. I know most of you know this, but I still have to say it to avoid any confusion.
Google understands what the user is looking for and tries to provide it on the results page itself, so that the user doesn't have to visit any web pages.
What we will do is look for the snippets in our results and provide the same content on our parasites. If our keyword doesn't have any snippet, then we look at 10-15 related keywords and collect all their snippets. For example, if the user is looking for "netflix poster," Google will show an image carousel at the top of the search results:

The user intent is to find images, so Google prefers sites with images at the top of the results. And that's not the end of it. They also know which images to display (they collect this data from user behaviour). Most, if not all, of the images in the carousel will be relevant to the user's search intent. So, which sites will rank for this query?

The results include wallpaper sites, image galleries, image slides, etc. Google knows what kind of results will appeal to its users and which images should be on the results page. So if you have a mix of Netflix, Amazon Prime, HBO, etc. images, your site may not rank, because Google knows that not all of your images are Netflix posters.
This is a simple example, but you must've got the idea of what Google is looking for and what it wants to show its users. If you haven't forgotten what this thread is about, let's return to our original project.
I found FAQ snippets and copied both the questions & answers from the results. Being lazy again, I only lightly re-wrote the content and posted it pretty much as is. Finally, I have a high-authority site that provides comprehensive information (it covers a lot of keywords) and meets the user intent requirements.
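Collecting those question snippets can also be scripted. Below is a rough sketch that scrapes the "People also ask" questions for a keyword; the div.related-question-pair selector is an assumption based on the SERP markup at the time of writing, Google changes it often, and aggressive scraping will get you captcha-blocked, so treat it as a starting point rather than a production scraper.

```python
# Rough sketch: grab "People also ask" questions for a keyword so they can be
# answered on the parasite page. The CSS class below is an assumption and the
# SERP markup changes frequently; expect to adjust the selector.
import requests
from bs4 import BeautifulSoup

HEADERS = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}


def people_also_ask(keyword: str) -> list[str]:
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": keyword, "hl": "en"},
        headers=HEADERS,
        timeout=10,
    )
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    # Each PAA block is (or was) wrapped in a 'related-question-pair' div.
    return [div.get_text(" ", strip=True)
            for div in soup.select("div.related-question-pair")]


if __name__ == "__main__":
    for question in people_also_ask("netflix poster"):  # example keyword
        print(question)
```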
The million-dollar question: "Will my parasite reach page 1?" See for yourself.

The traffic is not falling at the end; that dip is just the weekend. The site managed to reach an average of 250k-300k impressions per day. The August traffic looks insignificant on the graph, but it was already 3k-10k impressions per day; it only looks tiny next to the final numbers. However, the CTR is low because people tend to avoid Google Sites in my niche. I've achieved all of this without backlinks and with average content (some would call it horrible, but in my opinion that's enough for parasites).
Wondering whether I rigged the results by sending bot traffic? Let me clear that up for you.

Are the results reproducible?
Yes. I've used the same technique to rank a few other sites. In fact, this was the most unexpected site to rank, as it had a lot more competition and I ignored it for the whole of August thinking it wouldn't rank. Check the results of a few other sites:

This parasite has ranked for 638 keywords and the traffic remained consistent.

This has ranked for 477 keywords.

This has ranked for 360 keywords.
I think I've covered everything required to explain the entire case study. But, there is a darker aspect to it. Let's get into that.
Problems
As you may have already noticed, the last 3 screenshots are from Aug 1. I've posted old screenshots because the sites are now dead. They remained active until mid-August and then someone reported them between mid and late August. The problem with Google Sites is that if someone reports your site, it's most probably dead. You can't easily resurrect a dead site. If you're lucky, you may pass the manual review and they'll ignore reports against you for some time. But if you're stuck in manual review, your site will stay blocked, without any penalty notice or review, for eternity.
The other problem is that there can only be 1 Google Sites page in the search results. I've never seen multiple Google Sites ranking within the top 10 pages. They simply replace the currently ranking Google site when they want to bring a new one into the results. So it's a tough competition to survive and rank.
Conclusion
This case study proves that high-authority sites are easy to rank if they meet user intent. Although the Google algorithm is tough to decode, we can extract enough information from it to rank our parasites.
I think that's a fairly long thread and I don't know if I've missed anything. Feel free to ask in the thread if you have any questions, and please don't send private messages asking about this thread. I'm not selling anything; this case study is only to share information.