Strategy for ranking Web 2.0 sites with GSA SER


Hi there!

I bought GSA SER and Captcha Breaker 3 months ago and have been trying to rank some Web 2.0 sites, but so far my tests have failed pretty badly.
I target fairly low-competition long-tail keywords in various niches, with each Web 2.0 site built around one keyword.
So far every one of my GSA SER campaigns has resulted in a filter (I hope that's the right term) on my site: it's still indexed by Google, but nowhere
to be found when searching for its main keyword.

My strategy for creating Web 2.0 pages:
1. First I pick a niche and scrape the 100 best (in my opinion) long-tail keywords in that niche (monthly searches: 100-20,000).
2. Let's say the Web 2.0 (PR 9) site is something like wordpress.com, with my page URL at www(dot)wordpress.com/this-is-my-long-keyword.
3. I choose this kind of site because, with some cloaking methods, I can make 100 great-looking (to humans) pages, one per keyword, with spun, readable, yet unique content in less than a day (see the spintax sketch after this post). 90% of my pages manage to survive for over a year :)
4. After going live, 80% of the pages land in Google's SERPs somewhere between positions 40 and 80 for their particular keyword.

Question:
What would you do to try to rank them? I don't care if it's churn and burn and my sites only last 3 months, because I can create them by the hundreds.
Do you think I should buy social signals? For example, the $120 full package from leets.co, spread across all 100 pages of one niche?
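The "spun, readable, but unique content" mentioned above is usually produced from spintax templates, where each {option A|option B} group is replaced by one of its alternatives. A minimal sketch of that mechanism (the template and counts here are made up; the OP's actual tooling scrapes and spins whole articles):

```python
import random
import re

def spin(text: str) -> str:
    """Expand spintax such as {hello|hi|hey} by picking one option per group.

    Resolves the innermost braces first, so nested groups also work.
    """
    group = re.compile(r"\{([^{}]*)\}")
    while True:
        match = group.search(text)
        if match is None:
            return text
        choice = random.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

template = ("{Finding|Choosing|Picking} a {good|solid} long-tail keyword "
            "{is|can be} the {easiest|fastest} part of the whole process.")
variants = {spin(template) for _ in range(200)}   # the set keeps only distinct variants
print(len(variants), "unique variants generated")
```

Each page then gets one variant, which is how 20 source articles become 100 "unique" ones; how unique they actually look to Google is exactly what the replies below question.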
 
Social signals won't do shit for you; Web 2.0 backlinks backed up by a 2nd tier, or just powerful blog posts, will though.
 
One thing: Blogspots are moving much faster and better than wordpress.coms these days.

You don't need social signals for a project like this (churn and burn). Even though they've mentioned that social signals will become a ranking factor, that's still in the future and a fairly long-term strategy; for now they could only send you some social traffic.
 
Make a multi-tier GSA campaign, where the upper tiers are on higher-quality sites (higher PR) and the lower tiers are spammy blog comments and guestbooks.
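For anyone picturing the pyramid being described, here is a rough sketch of the shape of such a campaign. The platform mix and link counts are purely illustrative (the thread never gives numbers); in GSA SER itself the chaining is normally set up by pointing each lower-tier project at the verified URLs of the project above it.

```python
# Illustrative tier plan; every number below is hypothetical.
campaign = {
    "money_page": "http://example.wordpress.com/this-is-my-long-keyword",
    "tiers": [
        {"level": 1, "platforms": ["web 2.0", "article", "wiki"],
         "content": "readable, mostly unique", "links": 20},
        {"level": 2, "platforms": ["social bookmark", "social network",
                                   "do-follow blog comment"],
         "content": "spun", "links": 200},
        {"level": 3, "platforms": ["blog comment", "guestbook", "trackback"],
         "content": "mass spam, indexing push only", "links": 2000},
    ],
}

for tier in campaign["tiers"]:
    print(f"Tier {tier['level']}: ~{tier['links']} links "
          f"({tier['content']}) from {', '.join(tier['platforms'])}")
```

The higher the tier, the fewer and cleaner the links; the lower the tier, the more volume and the less it matters if those links die.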
 
Your content isn't unique enough.
 
Maybe your tier 2 links are not getting indexed. I hope you are also diversifying your anchors while building links.
 
Better to write 100-200 words yourself, add an optimized image, and use the keyword in the header tags, title and meta tags, and it will work.
 
Just pick a few strategies and experiment.
 
Thank you guys for the replies, really appreciate it!

When I started with SER I applied a tiered linking strategy, but abandoned it when, 3 weeks into the campaign, all the sites got delisted.
Obviously I made some newbie mistakes... Now, reading your replies makes me think I should stick with it and try to make it work.

I know sites tend to "dance" in the SERPs while links are being built, but for how long, or for how many links, do you keep building before you know whether a campaign has a chance to succeed?

The strategy I'm currently testing is throwing everything I've got, as fast as possible, through a 301 redirect (URL shortener) at my Web 2.0 site.
Surprisingly, I've gotten the closest to the first page ever (21st place), and after 20 days it's still not deindexed... :)
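For context on the 301 trick: the spam links point at an intermediate URL that answers every request with a permanent redirect to the money page, so the Web 2.0 never receives the raw link blast directly. The OP uses a public URL shortener for this; a minimal self-hosted sketch of the same mechanism (the destination URL is hypothetical) looks like:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical money page; a URL shortener does exactly this on your behalf.
DESTINATION = "http://example.wordpress.com/this-is-my-long-keyword"

class Redirect(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)                     # permanent redirect
        self.send_header("Location", DESTINATION)   # where the link equity is pointed
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), Redirect).serve_forever()
```

Whether Google still passes much through redirects like this is exactly the kind of thing the later replies disagree about.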
 
OK, now let me tell you what I did. I built a micro niche site targeting a relatively easy long-tail keyword. I used spun articles and tried to rank it with GSA and other SEO tools. I didn't know SEO and I'm still learning it, but my knowledge increases as I keep reading other threads. Four months passed and my site was on the 4th page, but it was stuck there, so I simply left the site and moved on to the next one.

After 2-3 months, when I got free of some personal problems, I thought about trying once more. The site was still on the 4th page of google.com for my main keyword. I built backlinks with GSA (high PR only), and I also purchased a few SEO services on this forum, which were tiered links and other packages. My site was still not going up, so I knew something was wrong.

I simply deleted all the articles from my site, went to Google Analytics and demoted the indexed pages and all the backlinks pointing to my site, then wrote a 100% unique, hand-written article for the main keyword, taking care of on-page SEO (internal linking, keyword density, bold/italic, length, etc.). After that I manually created around 25 Web 2.0 blogs and posted 100% unique content on each. I forgot to mention: as soon as I demoted the indexed pages and backlinks, my site went within a few days from page 4 to nowhere. It simply disappeared.

After creating the Web 2.0s I pinged those links through pingfarm.com. Two days later my site showed up on page 9, and the next day page 8. After that I used GSA (articles, directories, social bookmarks and wikis) together with Wicked Article Creator to build tier 2 links to those Web 2.0s. Now I have started a tier 3 campaign on those tier 2 links with GSA, using more spammy links and articles generated by Wicked Article Creator. My site is dancing between pages 5 and 12; today I checked and, according to serpfox.com, it's in 66th place. It has been no more than 10 days since I demoted the links and started the Web 2.0s with unique content. I'm still trying to rank it and it's moving up slowly :). I hope this helps you and others who are still confused about SEO.
 

Thanks! I will leave social signals for other ventures :)



GSA SER is ideal for generating backlinks in tier 2 or 3 of the pyramid model.


Do you mean creating tier 1 manually or with SER? Because honestly, doing 10 tier 1 sites for every 100 Web 2.0 pages by hand is a little overwhelming :)




Could you say some more about why it's not unique enough? How can I check whether it is?

I treat my content as my own twist in this whole SEO business; I use it because it lets me generate hundreds of Web 2.0 sites quickly. Of course I'm not saying
I'm right... I'm just testing and checking how far it can take me.

My Web 2.0 pages have (a rough skeleton is sketched below):
For search bots:
- 300 words of spun content (I scrape 20 articles for the niche and spin them into 100 articles)
- a unique, niche-related image
- the keyword in the headers, title and URL

And for users:
- 1 long, high-quality article with a call to action (served as an image, so users can read it but bots can't)
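To make the bot-facing part of that layout concrete, here is a hypothetical template for the on-page elements being described (keyword in the title, slug, heading and image alt; the cloaked, image-only article for humans is left out of this sketch):

```python
from string import Template

# Hypothetical page skeleton; the keyword, niche and file names are placeholders.
page = Template("""\
<html>
<head>
  <title>$keyword | $niche guide</title>
  <meta name="description" content="A short intro mentioning $keyword.">
</head>
<body>
  <h1>$keyword</h1>
  <img src="/$slug.jpg" alt="$keyword">
  <p>$body</p>
</body>
</html>""")

keyword = "this is my long keyword"
print(page.substitute(
    keyword=keyword,
    niche="example niche",
    slug=keyword.replace(" ", "-"),
    body="~300 words of spun content would go here.",
))
```

This is the part the earlier replies are judging when they say the content isn't unique enough: everything a crawler sees here is spun text plus keyword placement.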




About diversifying my anchors: what do you guys think is the best solution when my site targets only one particular keyword? So far my anchors have been the
basic GSA options, i.e. main keyword + partial keyword + "CamelCase" + domain + some misspellings + generic (click here...). Do you think I should add more?
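One common way to think about that mix is as a weighted pool where exact-match anchors are a small minority and generic/URL anchors carry most of the volume. A small sketch of such a sampler (the weights are illustrative only, not advice from this thread; SER's own anchor-text percentage options do the same job):

```python
import random

URL = "http://example.wordpress.com/this-is-my-long-keyword"
KEYWORD = "this is my long keyword"

# (label, weight, anchor builder) - the weights are made-up examples.
ANCHOR_POOL = [
    ("exact",   0.10, lambda: KEYWORD),
    ("partial", 0.20, lambda: f"tips about {KEYWORD}"),
    ("domain",  0.20, lambda: URL.split("/")[2]),
    ("naked",   0.20, lambda: URL),
    ("generic", 0.30, lambda: random.choice(
        ["click here", "read more", "this site", "here"])),
]

def pick_anchor() -> str:
    labels, weights, builders = zip(*ANCHOR_POOL)
    i = random.choices(range(len(builders)), weights=weights, k=1)[0]
    return builders[i]()

for _ in range(5):
    print(pick_anchor())
```

The misspelled and CamelCase variants the OP mentions would simply be extra entries in the same pool.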

I'm using the free version of GSA SEO Indexer for now. Can you recommend a free service where I can upload 1,000+ URLs to ping? I've searched but came up with
nothing good.
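For the pinging part, most of those free services (pingfarm-style) ultimately fire the old weblogUpdates XML-RPC ping at a list of ping servers. A minimal sketch of doing it yourself, assuming the Ping-O-Matic endpoint is still up (endpoints like this come and go, so treat it purely as an example):

```python
import xmlrpc.client

PING_ENDPOINT = "http://rpc.pingomatic.com/"   # example ping server

def ping(name: str, url: str) -> bool:
    """Send a weblogUpdates.ping; returns True when the server reports no error."""
    server = xmlrpc.client.ServerProxy(PING_ENDPOINT)
    result = server.weblogUpdates.ping(name, url)
    return not result.get("flerror", True)

urls = [
    "http://example.wordpress.com/this-is-my-long-keyword",
    # ...the other 1000+ backlink URLs would normally be read from a file
]
for u in urls:
    print("OK " if ping("my web 2.0", u) else "FAIL", u)
```

Pinging only nudges crawlers to visit; it doesn't guarantee indexing, which is why the tier 2/3 "indexing push" advice below keeps coming up.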
 
Tier 1 should always be unique, readable content; tier 2 and tier 3 can be mass spam, but again, you will have to make sure every single link you create gets plenty of indexing pushes.
 

@ "300 words of spun content (I scrape 20 articles for the niche and spin them into 100 articles)"
Plus you add the long tails into the article?

This should work as long as you do not put all the articles on the same account/subdomain.
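The "don't put everything on one account" point is just even spreading: if 100 articles go out, rotate them across accounts so no single subdomain holds the whole footprint. A trivial sketch (the account names and counts are hypothetical):

```python
from itertools import cycle

accounts = [f"account{i}.wordpress.com" for i in range(1, 11)]   # hypothetical accounts
articles = [f"article_{i}" for i in range(1, 101)]

# Round-robin assignment: article 1 -> account 1, article 2 -> account 2, ...
assignment: dict[str, list[str]] = {}
for article, account in zip(articles, cycle(accounts)):
    assignment.setdefault(account, []).append(article)

for account, batch in assignment.items():
    print(account, "->", len(batch), "articles")
```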
 
To rank a site you have to gain trust from the links that point to it. You can't link from a non-trusted source (another new spam blog or other weak links) to your new blog or website and expect it to rank. You need to get a trusted link, or many of them. Once you do, everything changes; it's like being moved from the wait list to the VIP list.

Now the trick is to create this trust and authority yourself by running 3-4 tiers to "flip" the trust/authority switch.

This video seems to explain what I'm talking about:

http://www.youtube.com/watch?v=spkWijiKY5U

Some people say you can build authority by using many tiers, with tiers 3 and 4 creating the authority. See Matthew Woodward's tutorials for more on this.
 
The point is not to generate tons of Web 2.0s that you will never touch again; Google won't take them seriously or rank them.
Your goal should be to create about 30-40 high-quality Web 2.0s. Add at least 5+ unique articles to each one at the beginning. Build tier 2/3 links to those Web 2.0s. Diversity is the key: do-follow high-PR blog comments, Web 2.0s, wikis, bookmarks and social networks as your tier 2, and Xrumer and blog comments as tier 3.

One of the most important parts of keeping your Web 2.0s strong is updating them at least once a month!
 
Hi, nice journey. I've found my old license. Is Captcha Breaker still working well enough, or do I need a captcha-solving service?
What kind of proxies did you use?
 
It's super hard to rank anything with GSA nowadays. It's a pretty useless tool now; maybe it's useful for backlink indexing only.
 
Are you sure? Maybe in the US market, but from what I've seen in other countries, pure AI content ranks easily, 301 redirects still work, etc. I think there is still a good chance to rank with GSA.
 
This thread is from 2014.

SER is nice for growing the number of backlinks shown in Ahrefs and Semrush if you are selling guest posts.
 