GSA "Premium" Lists - What You Are Really Getting

Shaunm

Morning Guys,

Two days ago this thread popped up on BHW, and after reading some of the replies I decided to do a little case study to show people what you are really getting when you pay for a "premium" GSA SER list. I used to be a big fan of these lists, but since then I have carried out various private case studies and have come to the conclusion that it is actually cheaper and more efficient to build your own.

I have two hopes for this post: first, to give people as much information as possible when they are considering purchasing a premium list, and second, that the list sellers will up their game and create a product worth the price they ask.

For this case study I used Loopline's Auto Approve Marketplace, but it is important to note that in my experience every list provider has this exact same problem. It is also important to note that some people may have working methods that do not need to be as strict about the platforms they use, but where I feel it is necessary I will explain the reasoning behind my decisions.

Also please note that myself and a few others are currently reporting bugs in SER that cause some platforms that do verify not to be added to the verified folder. In this instance the main platform that will affect the end result is BuddyPress. In addition, I am currently unable to carry out the final step of this case study, as every public Google proxy I get my hands on is put straight to work on other tasks, but I will explain further when we get to that.

[Screenshot: the initial "verified" list loaded into SER]

This is the initial "verified" list synced right from Dropbox; it starts with 339,810 target URLs. My first step in processing a list is to remove duplicate URLs as well as duplicate domains. For the benefit of anyone unfamiliar with GSA SER, its duplicate-domain removal has a built-in option to make sure you do not accidentally remove valid blog and image comments that sit on the same domain.

[Screenshot: After_Dup_Removal.png - the list after duplicate removal]

This is the list once duplicates have been removed; it now has 334,359 targets, an initial loss of 5,451 target URLs (almost 2%) from the list.
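
If you are curious what that dedup step boils down to outside of SER, here is a minimal Python sketch of the same idea. It assumes a plain text file with one target URL per line; the file name is just a placeholder, and SER obviously does this internally with its own engine data rather than anything like this.

Code:
from urllib.parse import urlparse

def remove_duplicate_urls(urls):
    """Keep the first occurrence of every exact URL."""
    seen, kept = set(), []
    for url in (u.strip() for u in urls):
        if url and url not in seen:
            seen.add(url)
            kept.append(url)
    return kept

def remove_duplicate_domains(urls):
    """Keep one URL per domain - what you would apply to everything
    EXCEPT blog comments, image comments and guestbooks, where several
    valid targets legitimately sit on the same domain."""
    seen, kept = set(), []
    for url in (u.strip() for u in urls):
        domain = urlparse(url).netloc.lower()
        if domain and domain not in seen:
            seen.add(domain)
            kept.append(url)
    return kept

# Placeholder usage on one exported list file:
with open("verified_list.txt") as f:
    targets = remove_duplicate_urls(f.readlines())
print(len(targets), "targets after removing duplicate URLs")
print(len(remove_duplicate_domains(targets)), "after also removing duplicate domains")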

Next I move on to removing specific categories. It has long been agreed that RSS, Pingback, Exploit, Indexer and Referrer are a waste of time as they pass very little link juice. In addition, because these platforms are easier to post to, SER tends to prioritise them and leave out the more valuable platforms. If they are included in your projects they will also clog up SER, your threads, your proxies and your system resources.

I also remove any Web 2.0 targets; they take up a massive amount of resources within SER for a very low success rate. If you want to use Web 2.0 platforms in your link building, use a dedicated service such as RankerX.

Since the Google indexing patch this year (read about it here) that knocked out what was, in my opinion, the best indexing service around, I also now remove Forum, Microblog, Trackback and URL Shorteners. Although these platforms do pass some link juice, my personal testing has shown that they are now a waste of time: again they clog up SER, your threads, your proxies and your system resources, and they also need a fair amount of effort to get indexed. I am not willing to put that effort in when I can use Blog Comments, Image Comments and Guestbooks that are already indexed as my T2/T3, which push link juice and help with indexing for me.

In addition to these categories I also remove Wikis. My reasoning is that my personal testing has shown both profile and actual article wikis die extremely quickly compared to Article Directories and Social Bookmarks, so I scrapped them from my T1 a while back; I do not want to waste resources building out T2 and T3 to a link that will probably be gone within a month or two.

I also remove the Directory category. Although many of these are already indexed for you, similar to the Blog Comments, Image Comments and Guestbooks, there seem to be far fewer auto-approve directories compared to the other three T2/T3 categories that I use. My testing has shown that T2/T3 projects within SER that have directories included build significantly fewer URLs over a set time period than those without. If you are just starting out and have a small list, leaving directories in should be fine.

Next I remove individual platforms from the remaining categories. My testing has shown that these platforms are mainly no-follow, so I would rather remove them completely to stop them taking up system resources. Even with the remaining platforms and categories you will still be getting a 10-30% no-follow rate, so in my opinion this does not leave a footprint. The platforms I remove for this reason are:

esoTalk
Question2Answer
ShowNews
JComments
KeywordLuv
Blogspot
SPIP
Shoutbox
KideShoutbox
DRB Guestbook
Guestbook Reloaded
Lazarus
DatsoGallery
Gallery 2
Plogger
Pixelpost
Pixelpost 2

In addition to those platforms I also remove Joomla K2. I know there is a post over on the GSA forum condemning the use of Joomla K2, and some of my own testing suggests its reasoning is accurate, but that is not my only reason for dropping it. When I used it, around 90% of my T1 ended up being Joomla K2, and I feel Google may adapt to that soon.

I also remove the General Blog platform at this stage, as so many of them are no-follow. Again, if you are new and your list is small you can leave them in if you wish and carry out the General Blog step later in the post to make it worth your time.
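
All of the pruning above can be done by simply unticking categories and engines inside SER, but if you would rather trim a synced site-list folder on disk before it ever touches a project, a rough Python sketch might look like the one below. The folder paths are placeholders, the keyword list is only a partial illustration of the categories and platforms named above, and the file-name matching assumes the usual one-file-per-engine layout, so check it against your own folder before moving anything.

Code:
import os
import shutil

SITE_LIST_DIR = r"C:\GSA\dropbox_verified"          # placeholder: the synced list folder
BACKUP_DIR = r"C:\GSA\dropbox_verified_pruned_out"  # pruned files get parked here

# Partial, illustrative keywords taken from the categories/platforms above;
# match them against the actual file names in your own folder first.
DROP_KEYWORDS = [
    "rss", "pingback", "exploit", "indexer", "referrer", "web 2.0",
    "forum", "microblog", "trackback", "url shortener", "wiki", "directory",
    "general blogs", "joomla k2",
]

os.makedirs(BACKUP_DIR, exist_ok=True)

for fname in os.listdir(SITE_LIST_DIR):
    if not fname.lower().endswith(".txt"):
        continue
    if any(key in fname.lower() for key in DROP_KEYWORDS):
        # Move rather than delete: plain substring matching can over-match
        # (e.g. "directory" could also catch article-directory engines),
        # so keep everything you prune until you have checked it.
        shutil.move(os.path.join(SITE_LIST_DIR, fname),
                    os.path.join(BACKUP_DIR, fname))
        print("pruned", fname)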

[Screenshot: Gash_Platforms_Removed.png - the list after pruning categories and platforms]

After pruning the list to the above specifications we have 35,732 target URLs left. That is a total reduction of almost 90% of the initial list, and this is before the list has even touched GSA SER! I know this list is sold and marketed as a verified list, but all that means is that at one point it managed to get through SER into a verified folder. Since that time the webmaster could have taken any number of steps to either prevent that happening again or at the very least make it harder for you to push that target through again.

My next step is to load the list into my identified folder in SER. I set up four different types of project to verify my targets: two of them target Articles and Social Networks (contextuals), and two of them target Blog Comments, Image Comments and Guestbook posts (non-contextuals). One of each pair is then put through the following filtering system. The standard project loads its target URLs from the identified folder and has the "continue to post to previously failed targets" box ticked; these two are my main verifying projects and will keep trying to post to failed links on each cycle to get as many as possible. The other is a basic project with no ability to pull target links itself. I load this second project type with its target links manually by right-clicking the project -> Import Target URLs -> From Site Lists -> Identified.

The purpose of this second project is to do one run through the targets and then give the "no more targets to post to" message, so I know roughly how long it takes my system to go through the full list. The message appeared after around an hour for the contextual project and three hours for the non-contextual project. This means that over the 24-hour test period both of the main projects will loop the list a fair number of times, bearing in mind the list gets smaller on each loop as more and more links are pushed through.

I did not carry out this step for the case study as I find it takes up too much time and resources, but if you decided to keep General Blogs in the above filtering then you should carry it out now. Open the non-contextual project that is still running and turn off the General Blogs platform. Set up a new project, select only the General Blogs platform for it, and have it pull from your verified folder, NOT the identified folder; this means these target links have been pushed through your system at least once, so you know they work. Let that project run for a few hours. Next go into your verified folder, open up the General Blogs file and delete all the URLs in it, then right-click the General Blogs project -> Show URLs -> Verified.

This loads up all of the verified General Blog URLs. Sort them by the do/no-follow column, select all of the do-follow ones (normally less than 10%), copy the target URLs, paste them into your General Blogs file and save it. You now have only the do-follow General Blogs, verified by yourself, in your verified folder for your ranking projects to work with. As your list gets bigger you may wish to drop this step, as you may find it takes up too much time and resources for the number of URLs you keep.
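
SER already shows the do/no-follow flag in that Show URLs window, so no code is needed for this step, but if you ever want to spot-check a page outside SER, a rough sketch of the idea is below. It assumes you know which of your URLs should appear on each page, the example URLs are placeholders, and it needs the requests and beautifulsoup4 packages installed.

Code:
import requests
from bs4 import BeautifulSoup

def link_is_dofollow(page_url, my_link):
    """Return True if the anchor pointing at my_link is do-follow,
    False if it carries rel="nofollow", None if the page or link
    could not be checked."""
    try:
        html = requests.get(page_url, timeout=15).text
    except requests.RequestException:
        return None
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        if my_link in a["href"]:
            rel = " ".join(a.get("rel") or []).lower()
            return "nofollow" not in rel
    return None  # our link was not found on the page

# Placeholder usage: keep only the pages where our link is do-follow.
verified_pages = ["http://example-blog.com/some-post#comment-12"]
kept = [p for p in verified_pages
        if link_is_dofollow(p, "http://my-money-site.com") is True]
print(len(kept), "do-follow pages kept")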

[Screenshot: First_Run_Through.png - verified counts after the first run through]

After the initial run through, this is what I have that I consider usable: 1,431 verified URLs. That is around 0.4% of the original list.

[Screenshot: After_24_hours.png - verified counts after 24 hours of running]

This is what I have after 24 hours of running the projects: 3,838 verified URLs. That is around 1.13% of the original list! I honestly think that if this was left to run for a week it would probably get to around the 6,000-7,000 mark. As I said at the start, I currently believe there is a bug in SER, so out of the platforms I keep, mainly BuddyPress and Alto CMS, none of their links have made it into the verified folder; my guess is there would be around 50-75 additional URLs added to this number at this stage.
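
To put the attrition in one place, here is the funnel using the numbers from this case study (the ~7,000 figure is my week-long estimate, not a measurement):

Code:
# Funnel of the premium list through the steps above.
initial        = 339_810   # raw "verified" list synced from Dropbox
after_dedup    = 334_359   # duplicate URLs and domains removed
after_pruning  = 35_732    # unwanted categories and platforms removed
first_pass     = 1_431     # verified after the first run through
after_24_hours = 3_838     # verified after 24 hours of posting
week_estimate  = 7_000     # rough estimate after a week (not measured)

steps = [("after dedup", after_dedup),
         ("after pruning", after_pruning),
         ("first pass verified", first_pass),
         ("verified after 24 hours", after_24_hours),
         ("estimated after a week", week_estimate)]

for label, count in steps:
    print(f"{label:>24}: {count:>7,} ({count / initial:.2%} of the original list)")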

[Screenshot: Screen_Shot_2015_12_26_at_12_36_07.png - GSA Captcha Breaker statistics for the test]
Additionally, I use an OCR ReCaptcha-solving service. The above screenshot shows the number of ReCaptcha and Drupal captchas GSA CB has had to send to my OCR service for this test (13,654 captchas sent in total). If you do not pay the monthly fee for an OCR service then you will get fewer URLs throughout this process.

On my actual list I also carry out one final step that I am unable to do for this case study, because my public Google proxies are in use on other projects: I take my verified list and run it through Scrapebox to check whether the root domain is indexed for the contextuals and whether the exact URL is indexed for the non-contextuals. If they are not indexed, that site probably has a penalty, meaning any links you build on it will never get indexed in Google and may pass a negative effect up your chain.

From my experience of running this final step, around 30-40% of the URLs turn out to be deindexed and get scrapped, but as I am unable to do it for this case study I will stick with the presumption that the list will provide around 7,000 total usable links.
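
For anyone wondering what that Scrapebox step actually checks, here is a very rough Python sketch of the logic: query the root domain for contextual targets and the exact URL for non-contextual ones. Google will throttle or captcha plain requests like this very quickly, which is exactly why it normally gets run through Scrapebox with rotating public proxies; the no-results string check is only a heuristic, so treat this as an illustration rather than a working index checker.

Code:
import requests
from urllib.parse import urlparse, quote_plus

HEADERS = {"User-Agent": "Mozilla/5.0"}

def looks_indexed(query):
    """Heuristic index check via a Google site: query.
    Expect throttling and captchas very quickly without rotating proxies."""
    url = "https://www.google.com/search?q=" + quote_plus(query)
    html = requests.get(url, headers=HEADERS, timeout=15).text
    return "did not match any documents" not in html

def target_indexed(target_url, contextual):
    # Contextuals: is the root domain indexed at all?
    # Non-contextuals (comments/guestbooks): is the exact page indexed?
    query = "site:" + (urlparse(target_url).netloc if contextual else target_url)
    return looks_indexed(query)

# Placeholder usage:
print(target_indexed("http://example-article-site.com/my-profile", contextual=True))
print(target_indexed("http://example-blog.com/some-post", contextual=False))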

I know this is a bit of a long post, but if anyone is still reading, it is time to ask yourself: is it worth spending almost $50 per month for 7,000 usable URLs?

I know people all have their own methods, so I am expecting some to jump in with counter-arguments or to try and shoot me down, but this is the method I use and I hope it helps a few people.
 
Very nice post. Thanks for taking the time n have a great Christmas!
 
haha awesome post. Save your money, folks.
 
That's a long ass read, but really good. Thanks for pointing that out for us man. Awesome!
 
This thread may harm list sellers' business, but it's 100% true. Thanks for the nice share.
 
Thanks for sharing this. I've seen people get pressured into removing threads critical of BSTs before, but I think this type of transparency really makes people step up their game and benefits BHW as a whole.
 
Nice post dude. Thanks.
 
I told you guys. Those "premium lists" are meant to rip everyone off. What you basically need to do is stop being lazy and buy ScrapeBox, Gscraper or SEO List Builder, and you can scrape hundreds and hundreds of contextual-only links daily.

I have had many premium lists, and mostly they consist of pingbacks, indexers and blog comments. That's a joke. Why the hell would you spend 30-50 dollars on a single list which is just a shitty raw scrape with all the shitty engines?

These shitty premium lists usually gave me 200-1,000 contextual links. With basically any scraper I can gather fresh contextual-only links in a few minutes and get my 1,000 verified contextual links in GSA.

And once you get a new "fresh" list you will find more duplicate domains and URLs than you could dream of.

But yeah, they found a loophole to squeeze money out of GSA noob users.

*DO NOT BUY GSA VERIFIED LISTS. THEY ARE ALL BULLSHIT. SPEND YOUR HARD-EARNED MONEY ON A SCRAPER. YOU WILL GET A SIMILAR LIST IN A FEW HOURS.

http://www.blackhatworld.com/blackh...er-lists-unique-url-lists-100k-links-2-a.html - This guy was the only legit seller at a proper price, but sadly he stopped his BST.

Merry Christmas. Save your money! :)
 
Cheers guys :).

This thread may harm list sellers' business, but it's 100% true. Thanks for the nice share.

In all honesty I doubt it will do anything; best-case scenario, they up their game to the required standard and their business becomes stronger.
 
I think you bring up some interesting points.

The duplicate URLs and domains are due to a discrepancy between how SER removes them and how I do. If you load the same list into Scrapebox you will get slightly different results. I remove dupe URLs and domains on the fly, but I am forever tweaking things as platforms change. At any rate, GSA also takes care of this on the fly depending on your settings, but it is worth noting for the total count.


I think the most important thing is what you noted yourself, but what others likely won't register. If you study neuroeconomics - a good book, for example, is Thinking, Fast and Slow - you will see that your brain has two systems/types. Some people call them System 1 and System 2, some call them Type 1 and Type 2; call them whatever you want, I'll call them System 1 and 2. System 1 takes what it sees, in this case reads, and tries to make it true and coherent. For System 1, "what you see is all there is", meaning it never considers that it might not have all the information or that there even is any more information.

In your post, System 1 will read it through and think, "Oh, this is great, it covers 100% of possible scenarios and I'm going to take it to heart." People aren't even aware of how their own minds work. But enough of the neuroeconomics lesson; the point is that for your methodology, buying lists doesn't make sense. However, information that might not be noted, and that System 1 won't even consider, is for example:

How many of the platforms you target, that are actually auto approve, even exist?

How many methods are there for SEO today, and how many of those does your method pertain to? (Think churn and burn, think different niches working differently, think: are we only targeting Google? I know people who are only after Yahoo and Bing because of the lower-hanging fruit. Are we only targeting search engines, or trying to influence something else? The list goes on and on.)

Should you have actually removed all of the platforms you did, or are there actually platforms there that are hidden gems that you tossed out? Should you remove them for all possible SEO methods on the planet?

Is your method even the best method?

You said "in my testing" a lot in there, so the first question I would ask myself is: "OK, is his testing correct? Was it done with full accuracy? Does it even apply to my methods? Since he didn't state how he tested, or what he tested, or show proof, can I assume it's correct, and will assuming that help me rank my site better?" It all goes back to System 1 assuming the information provided is all there is and that it is accurate.

I'm not saying your testing is bad, or your method is bad; I'm just saying that the fact you say "in my testing" doesn't actually prove anything correct or wrong.

The list goes on, but with any review a question anyone should ask is: "OK, this was solid, it was excellent; now what information is not here?" What is not in a review is often as useful or more useful than what is there for putting the review into perspective.

Also, the verified list only uses GSA Captcha Breaker. So even though your GSA CB sent those captchas to OCR, if you let it run with enough retries it will eventually solve those captchas. I know you are going to say "well, it didn't", but I could show you screenshots; the only captcha service set up and running is literally CB. (Of course that is for the main verified list. I do offer a ReCaptcha/text-captcha list that does require these services, but it is created on an entirely separate server and synced into its own folder, and the two never cross. I will also say that after a given domain hits my list, I have seen sites change to ReCaptcha from the captcha they were using, because once a site is in my list it gets hammered.)

I do think you were on point with your statement that the user should ask themselves whether a list is worth X dollars. They should. I would say, for example, that in your case, with your method, in the niches you tested, using your testing methods and your usage, a list makes no sense. What does all that mean for everyone else? Well, that is ambiguous.

To the user who said all lists are worthless (in so many words): blanket statements make no logical sense. I know people who specifically use my list and make good cash with it, so both can't be true. Again it is System 1 at play, assuming the limited information you have is all the information there is.

To the OP, thanks for the review. I'll always take any review and any criticism, because it always gives me someone else's perspective, and the more information I have, the more I can keep my own System 1 from making assumptions that would wind up hindering my users.

In your case I don't feel too bad though; you stopped paying for the service in September and I just failed to remove you as an oversight, so at least you didn't overpay. hehe
 
Some good points there, Loopline. I would also like to apologise for naming your service, as this post is aimed at all list sellers; yours was simply the one mentioned in the original thread I linked to and, like you said, I had easy access to it. I rephrased the part where I reference your service a few times before publishing as I felt it was too harsh.

Regarding the neuroeconomics example, SER list sales letters (as well as many others) are in a similar situation: people read them and think it is the best thing ever, so I am just showing the other side of the story, where there is no financial gain to be had on my part.

People seem to brag about LPM when it comes to GSA SER. I used to do it myself, but until they get some experience behind them people don't understand that pulling 300 LPM will in many cases be less effective than pulling 30-50 LPM on solid, indexed platforms.

I fully admit my current method is not the best, far from it. It evolves at least monthly, probably even weekly, as I get results from my testing and adapt, but I feel the core will be similar in many Google-based methods, as you need platforms that will pass link juice, and if they are easier to index or already indexed then it helps a bunch.
 

Oh, no worries, by all means mention my service and link to it; it's more links and more mentions. It is a mixed bag: while people will take a first impression and apply the halo effect in reverse, there is also the element of cognitive ease.

There are many crazy interesting studies on cognitive ease. For example, one used three words from a foreign language published in a university newspaper over the course of multiple weeks. Each was published in a box by itself with no explanation or definition, something along the lines of 7, 14 and 21 times respectively. Then people who read the paper were surveyed. They consistently rated the word published 7 times as somewhat negative, the word published 14 times as neutral, and the word published 21 times as friendly.

Many other studies show that the more impressions of something you get, the closer it moves to cognitive ease, and System 1 consistently associates cognitive ease with truth and positivity. So if you were to just go and post my business name around BHW over and over again and say nothing about it, when someone found my sales page they would be predisposed to believe it was true and better than others they had heard of less often, and they would be more likely to buy.

So I'll take the exposure, but I do thank you for rereading your words to make sure they didn't sound overly harsh; I appreciate that. :)

Yes, you are correct: my sales page was professionally written (not by me, but by Copy Army - they are top notch in quality), and SERlists faces the same thing you noted, as does pretty much any other sales page. That is what sales pages do; they depend on how the mind and the buying cycle work.

Yes, LPM drives me nuts. Most of my buyers at the moment have moved past it or don't really query me about it any more, but new buyers, and everyone back when I started, before I spent good time educating people, asked and talked about LPM. It is like driving a car, really: getting on the interstate and doing 80mph (128kph) really makes you feel like you are making serious tracks, until you find out that you are supposed to be going north and you got on an interstate that runs mostly east-west and only slightly north-east. You are going really fast, but you are not really going where you want to go. Except in SEO it is further complicated by people not understanding a clear path, which would be like doing that 80mph on the interstate in a country where you can't read any of the signs because you don't speak the language, with no compass, driving at night. lol

Testing is what it is all about. You have to keep testing because the engines change the rules all the time anyway, so even if you had the perfect setup today and did nothing, in three months it would probably be far from perfect.

Thanks for the review.
 
@Shaunm Sorry to "hijack" your thread but are you saying you use public proxies when using GSA SER?

No mate, I run semi-dedicated proxies in SER.

To index-check my verified folder I run it through Scrapebox's Google index checker using public proxies, as they burn out quickly on Google anyway; right now, though, my public proxies are at work scraping Google for target URLs and expired Web 2.0s/domains.
 
I'm buying lists and I will keep on buying. I know exactly how shite most lists are. Of course I don't buy shite to be served at my dinner table, but I know it is good for fertilising my garden. I buy lists fully understanding that I am buying shite, simply because what do you really expect from auto-approve lists?

Even if I spent the time, money, effort and resources to scrape them myself, it would be the same shite. The same shitty list. That's a fact. If there is something wrong here it is not with the shite (the lists) but with the people: how we utilise them, what we use them for, and how we get the best out of them.

Sure, we can tweak the settings to apply a "quality" filter, but just embrace the truth that it is still shite, albeit "quality" shite. These days we need to shift our paradigm from 'automate everything, including the posting job' to 'automate the process prior to posting and tweak the posting process heavily'.

The list is not the problem. Our mindset is. The golden days were over a long time ago. Distant memories, which I miss dearly.
 

I think you missed the point. I wasn't trying to say that the lists are shite; there are some total gems in there. I explained my filtering process for getting to the gems and usable links that will help you out. You can pay around $50 a month if you like, or you can scrape a larger list of stronger platforms more cheaply in the same time.

I just got sick of people thinking premium lists for SER are an investment.
 
Good post. Thanks for sharing something so useful.
 