Stuck with GSA SER for Tier 1

OrdoAbChao

Junior Member
Joined
Jan 31, 2014
Messages
172
Reaction score
58
I remember the days of churn and burn. I would scrape tens of thousands of links with SB and then blast links with GSA.

Now, I'm trying to tighten things up a lot.

I'm trying to use GSA for tiered link builds. My Tier 1 links are Article, Web 2.0, and Wiki: strictly contextual links in the US and UK. I've got 25 private Squid proxies, and I'm also using Serocket verified links piped right into GSA.

Now, I've been running my campaign for 3 hours and I only have 3 verified links, all of them on Pagebin.

I'm set to not post on sites with more than 50 links on the page; is this what's slowing me down?
Are there additional options I should be using for Tier 1?

I'm just trying to build 40 verified links a day in my T1, with each campaign building links for about 6-10 pages, but this isn't working so well yet and I can't help but think I have SOMETHING off.

Thanks for any advice!
 
You need a good GSA source list. Where are you getting the sites to post links to? From an external list, or are you letting GSA scrape?
 
Serocket is garbage. Your OBL setting won't matter for contextual engines, since you're creating a new page; the OBL filter is mostly used for blog comments and guestbooks, where a single page carries a lot of outbound links. Web 2.0 doesn't really work with SER. I don't ever use those engines.

You probably need a better list provider, like Loopline or 1linklist. Either that, or you didn't import the list properly, or you didn't tell SER to pull the list from the correct folder. You might also just be hitting a bunch of ReCaptcha without a good service to solve ReCaptcha/ReCaptcha 2. Maybe watch CB and see what kind of captchas you're hitting, or check whether SER is saying it has nothing to post to. Without more info on what SER or CB is doing, it's hard to help you.

Maybe most of the links in the list just aren't from the US or UK.
 
I use GSA Captcha Breaker with DeathByCaptcha as a backup.
I changed my Serocket input from the Identified list to the Verified list, and I'm up to 15 verified links on one of my projects. Things are looking better. I'll check out those other list providers.
 
You shouldn't use GSA SER for Tier 1. Web 2.0 doesn't really work in it; only a few sites succeed. I have GSA SER, but I only use it for Tier 2 or Tier 3.

These are the GSA engines I tested:
Ampedpages
Edublogs
Onesmablog
Rediff
Webgarden Blog
Wikidot
Wordpress
FC2
Unblog
iamspost
grab.lv
BCZ

- Working: Ampedpages (90% success), Onesmablog (80%), iamspost (60%), Wikidot (50%), Rediff (40%), Webgarden (40%)

- Failing: Edublogs (80% failed), Wordpress (100% failed), Unblog (100% failed), FC2 (100% failed), BCZ (100% failed)

In my opinion, RankerX is the way to go for Web 2.0 creation currently. It's a very powerful backlinking tool.
 
Hmm, so RankerX has gotten better? I tried it, I think when it first came out, and found it a bit cumbersome to use. I may give it another shot.
If I were paying $50 a month for that, I wouldn't need Serocket for the Tier 2s/3s and could just harvest those links with Scrapebox.

I'd need some others to tell me that RankerX has gotten more intuitive and easier to set up and use, though.

I'm just trying to get a system down so that if I pick up a local client, I can plug them in, build their links and social signals, and just monitor progress in Semrush.
 
I don't use it. I use Autofill Magic for Web 2.0s on T1. I did try RankerX, but I didn't like it either. It's been a while though, so maybe it has improved. I'm waiting on SERE 2 for SER to finally be released, but I will probably still use Autofill Magic for T1. I have just gotten used to it over the years and know it well enough that Web 2.0 building is fast, and it's obviously not a bot making them.
 