GSA SER save hit list

justmeseo (Power Member, joined Nov 16, 2017)
Hello everyone, I recently purchased a verified list online. However, I am having difficulty creating my own list from it. My goal is to save only the links that successfully create backlinks. Currently, not every link on the list is successful, so I cannot submit to all of them, and not all of them verify the backlink. Would anyone be willing to share a screenshot or video demonstrating how to save only the URLs that successfully create backlinks, rather than saving all of the backlinks created?
 
First of all, thanks for the reply. I have actually done that; the only issue is that I see only the backlinks built, not the URLs used to create them. My ultimate goal is to generate my own list of AA URLs. I hope that makes sense. If that method works for you, could you show me where the URLs are saved and under what name?
 
I think you are right: for most lists it saved the last backlink, but this list will still be used for other projects. Maybe I am wrong, but my list supplier gave me these settings.
 

Attachments

  • gsa2.png (69.3 KB)
  • gsa3.png (54.9 KB)
You can do that by ticking "Verified" under the global options. Create a new folder there and you are ready to save all the verified URLs. But I don't think it's a good idea, as URL metrics change often. Nowadays almost all link-list providers offer synced lists, so most providers scan and remove a link whenever it exceeds the OBL count or fails other metrics. It's like collecting dust for me. I never had success creating my own link list (you can surely use it for a few days, but not for months), and if your server has a SATA HDD it becomes more laggy. This works with identified links, as they are not verified, but it's not worth it if you already purchased a link list.
 
I think you are right: for most lists it saved the last backlink, but this list will still be used for other projects. Maybe I am wrong, but my list supplier gave me these settings.
That's exactly what I was thinking. Unfortunately, no matter how much effort I put into learning and mastering GSA, it seems that many people have neglected the software and are unwilling to share their knowledge. However, I believe there is a good reason for this. Ultimately, it will be essential to create your own list so that you can filter and determine which backlink types are suitable for parasite sites or should be saved for future use on money sites, based on their DA. I'm curious if there are any e-courses or comprehensive guides available on how to effectively utilize GSA with ScrapeBox.
 
You can do that by ticking "Verified" under the global options. Create a new folder there and you are ready to save all the verified URLs. But I don't think it's a good idea, as URL metrics change often. Nowadays almost all link-list providers offer synced lists, so most providers scan and remove a link whenever it exceeds the OBL count or fails other metrics. It's like collecting dust for me. I never had success creating my own link list (you can surely use it for a few days, but not for months), and if your server has a SATA HDD it becomes more laggy. This works with identified links, as they are not verified, but it's not worth it if you already purchased a link list.
I agree with you too. I think that the dirt they're cleaning up could actually be valuable for building links on parasite sites. While it's true that everything can be saved in the "verified" section of the GSA settings, there isn't an option to save the URLs used for the backlinks. This is problematic because it means the links cannot be reused when needed, even though they change every day. I find it frustrating that we cannot keep something we paid for; in this case, the successful URLs.
 
That's exactly what I was thinking. Unfortunately, no matter how much effort I put into learning and mastering GSA, it seems that many people have neglected the software and are unwilling to share their knowledge. However, I believe there is a good reason for this. Ultimately, it will be essential to create your own list so that you can filter and determine which backlink types are suitable for parasite sites or should be saved for future use on money sites, based on their DA. I'm curious if there are any e-courses or comprehensive guides available on how to effectively utilize GSA with ScrapeBox.
I didn't use it for a long time, but for a few months now I have been running it again. It is still useful for some projects. May I ask you: do you think private proxies are still needed if you don't scrape search engines for new targets? Or will a proxy scraper be fine, in your experience?
 
I didn't use it for a long time, but for a few months now I have been running it again. It is still useful for some projects. May I ask you: do you think private proxies are still needed if you don't scrape search engines for new targets? Or will a proxy scraper be fine, in your experience?
I do think so; some sites do end up banning your IP. I'm not sure if the authentication provider bans your IP as well.
 
That's exactly what I was thinking. Unfortunately, no matter how much effort I put into learning and mastering GSA, it seems that many people have neglected the software and are unwilling to share their knowledge. However, I believe there is a good reason for this. Ultimately, it will be essential to create your own list so that you can filter and determine which backlink types are suitable for parasite sites or should be saved for future use on money sites, based on their DA. I'm curious if there are any e-courses or comprehensive guides available on how to effectively utilize GSA with ScrapeBox.
Visit the forum https://forum.gsa-online.de/ and you will find a plethora of information and answers.
 
At this point I would really prefer to be spoon-fed the answer; I have already searched the entire site. If you have a link to the answer, please feel free to share it.
Suggest you hire a freelancer then. I am not going to browse answers for you just because you are too lazy to search for them.
 
Right click your project > Show URLs > Verified
Right click > Select all > Copy | Make a text file > Paste
Done
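If you want to clean up the pasted file afterwards, here is a minimal Python sketch (the function and the example list are my own, not anything GSA SER provides) that removes duplicate URLs while keeping their original order:

```python
def dedupe_urls(lines):
    """Return non-empty URLs in first-seen order, duplicates removed."""
    seen = set()
    unique = []
    for line in lines:
        url = line.strip()
        if url and url not in seen:
            seen.add(url)
            unique.append(url)
    return unique

# Example: the kind of list you get after "Select all > Copy > Paste"
raw = [
    "https://example.com/page-1",
    "https://example.com/page-1",   # duplicate entry
    "https://example.org/guestbook",
]
print(dedupe_urls(raw))
# ['https://example.com/page-1', 'https://example.org/guestbook']
```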
 
Right click your project > Show URLs > Verified
Right click > Select all > Copy | Make a text file > Paste
Done
Trust me, I have done that. Let me explain again.
When you use an AA list, for example, or a purchased verified list, and you run it with GSA, every backlink that gets verified is saved to that list. But that's not what I'm looking for. What I'm looking for is the URL that was used to create each backlink, and that's the list I want to save. Here's why: the verified URL contains your post/article/comment/whatever, so you can't create anything new from it, because it isn't the right URL at which to repeat the same process. I hope that explains it. The ultimate goal is to create my own approved list. If I'm wrong about any of this, please feel free to let me know.
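One rough way to rebuild a target list from verified backlinks is to strip each verified URL down to its root domain and dedupe. A minimal Python sketch, assuming one absolute URL per entry; note this only recovers the domain, not the exact submission page that was originally used:

```python
from urllib.parse import urlparse

def targets_from_verified(verified_urls):
    """Reduce verified backlink URLs to unique root-domain targets."""
    seen = set()
    targets = []
    for url in verified_urls:
        parsed = urlparse(url.strip())
        if parsed.scheme not in ("http", "https") or not parsed.netloc:
            continue  # skip malformed or non-web entries
        root = f"{parsed.scheme}://{parsed.netloc}/"
        if root not in seen:
            seen.add(root)
            targets.append(root)
    return targets

# Hypothetical verified backlinks (two on the same domain)
verified = [
    "https://example-blog.com/2023/05/my-post#comment-42",
    "https://example-blog.com/guestbook/entry/9",
    "http://forum.example.org/profile/12345",
]
print(targets_from_verified(verified))
# ['https://example-blog.com/', 'http://forum.example.org/']
```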
 
I agree with you too. I think that the dirt they're cleaning up could actually be valuable for building links on parasite sites. While it's true that everything can be saved in the "verified" section of the GSA settings, there isn't an option to save the URLs used for backlinks. This is problematic because it means that the links cannot be reused if needed, even though they change every day. I find it frustrating that we cannot keep something we paid for, in this case, the successful URLs.

It saves the backlink you made inside the verified folder, which is exactly the target root domain, so there is no difference. But there is no point in, and it is even dangerous, reusing the same target without filtering. This is the reason many users use syncing verified link lists, where the provider filters out the junk. If you are using services like SERLinks, SERPowerLists and so on, you need to use them directly from their Dropbox/storage while they still exist in the provider's database. Otherwise you must filter them out yourself.
 
Your screenshot settings seem to be fine. No issues. That's how it saves (the current backlink from the root domain).
 