
GSA low efficiency: one year of fighting with this software

Discussion in 'Black Hat SEO Tools' started by betterlife3, Feb 16, 2016.

  1. betterlife3

    betterlife3 Newbie

    Joined:
    Feb 16, 2016
    Messages:
    6
    Likes Received:
    0
    Hey, can you help me figure out my problem with GSA? I have tried just about everything: creating my own list of footprints, changing settings, etc., and nothing helps. I want to link to my Web 2.0 properties with do-follow links. Whether I use my own fresh list or a purchased list, the results are equally low; it makes no difference which list I use.

    Where is the problem in my GSA setup? Maybe you can help me overcome this low efficiency.

    I use 100 threads, 10 semi-dedicated proxies, and 3 captcha services, plus private emails on my own hosting without SpamAssassin. I run Windows Server 2008 with .NET Framework 4.5, without Flash or Java. Here are my main settings:

    XxoMu3en.png
    LBmB49SP.png
    Captcha Tronix: 5 threads, but I have CB and use CT only for ReCaptcha

    Other settings

    cVYepBoY.png
    Engines: only do-follow + blog/general, if < 400 outgoing links per page (this is tier 2)

    I2LpqTuq.png
    I use my own list for the verified folder

    Z4LnCV4e.png
    and use 3 of my private emails, without SpamAssassin


    My efficiency

    j1l5D5ds.png
    Only 1239 links (very poor) in 15 hours

    In Captcha Tronix I see 6556 captchas solved today, and in CB I see 8000 successes today
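    (Editor's note: the figures above translate into the "links per minute" metric the rest of the thread discusses. A quick back-of-envelope sketch, not GSA's internal calculation:)

```python
# Back-of-envelope arithmetic for the numbers reported above: 1239 verified
# links in 15 hours, with 6556 + 8000 captchas solved across two services.

def links_per_minute(verified_links: int, hours: float) -> float:
    """Verified links divided by elapsed minutes (the LpM stat GSA shows)."""
    return verified_links / (hours * 60)

print(round(links_per_minute(1239, 15), 2))       # -> 1.38 LpM

# Captcha-to-link conversion rate: how many solved captchas became links.
print(round(1239 / (6556 + 8000) * 100, 1))       # -> 8.5 (percent)
```

    So roughly 1.4 LpM, and only about one link per twelve solved captchas, which is why the OP calls the efficiency poor.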


    I don't know what I am doing wrong. The same problem happens with my list scraped using GSA footprints and with purchased lists: success is very low. What is wrong? Can you help me, please?



    2. I mainly use GSA to link to FCS blogs (200 blogs per project, set to 50 links per URL per day), and I have a question: is it a good idea to build only do-follow links for tier 2 and to use the engines shown in my screenshots? Or maybe you have another idea for me?

    I don't have any ideas ;/
     
  2. loopline

    loopline Jr. VIP Jr. VIP

    Joined:
    Jan 25, 2009
    Messages:
    3,727
    Likes Received:
    1,995
    Gender:
    Male
    Home Page:
    If you're using a purchased list, then you want to uncheck "use URLs linking on same verified URL (supported by some engines only)", as that is going to flood your target URL list with URLs. If you're trying to build a list, then that feature is fine.

    Some of your engines, like comment engines, etc., may get skipped by your filters, like the 400 OBL limit. That's a decent number, to be sure, but it will slow things down. Do you have word filters?

    Also, I would try spending a few bucks to get some emails from somewhere like Blazing SEO, or wherever you like. In my experience it's easy to get a domain banned for emails, and then you're out in the cold. But your error log would give the real clues. Maybe you're just hitting bad URLs with a bad list, or maybe there is some other issue. If you want to put a bit of your error log in Pastebin and post the link, that would be helpful.
     
    • Thanks x 2
  3. betterlife3

    betterlife3 Newbie

    Joined:
    Feb 16, 2016
    Messages:
    6
    Likes Received:
    0
    Thanks for your answer, loopline.

    I don't have any word filters. I check the log fairly often and see problems with captcha solving, plus "download failed (Aborted)" and "download failed Return SockError Software caused connection abort". It is strange to have so many captchas solved but so few links.

    Hmm, maybe I need more proxies. I will try to find some fast proxies and see how that works.
     
  4. asap1

    asap1 BANNED BANNED Jr. VIP

    Joined:
    Mar 25, 2013
    Messages:
    4,961
    Likes Received:
    3,179
    1. Change "if a form can't be filled" to "choose random". If GSA comes across a form it can't complete, it will put random info in, and that might work. If you have it set to skip, it won't even try. I used GSA for over a year, and it wasn't until a few months ago that I figured this out; you will be surprised how many more links you build.

    2. Check "skip for identification".

    3. Uncheck "URLs linking on same verified URL".

    4. Uncheck "add stop words to query".

    5. Uncheck "skip sites with x outbound links". This slows GSA down because it checks every URL's OBL. If you are building tier 3 links, the OBL does not matter, and if you are building tier 2 links, you should be building all contextual links, which don't need an OBL check. All in all, this feature is useless unless you want to build tier 1 blog comments, which you can get for $7 in the marketplace.

    6. When I'm building contextual links, I check "allow posting on same site again".

    Make these changes and you will build way more links.
     
  5. betterlife3

    betterlife3 Newbie

    Joined:
    Feb 16, 2016
    Messages:
    6
    Likes Received:
    0
    Hmm, I applied your tips and have 0.13 LpM now. In the log I constantly see "successfully added to site", but these links don't show up in the stat bar. Hmm.

    I really don't understand this program :D
     
  6. dreamsoftware

    dreamsoftware Newbie

    Joined:
    Jun 27, 2014
    Messages:
    30
    Likes Received:
    3
    Ugh where to start..

    First, unpark the CPU cores on your server using UnparkCPU.
    Then download TCP Optimizer and use these settings: BHW /blackhat-seo/black-hat-seo-tools/337648-windows-server-2003-vs-windows-server-2008-a.html#post3095473

    Restart your server once you have done the top two.

    Now enable non-blocking socket mode and uncheck the two settings under threads.
     
  7. betterlife3

    betterlife3 Newbie

    Joined:
    Feb 16, 2016
    Messages:
    6
    Likes Received:
    0
    2265JqZ.png

    I set everything as asap1 recommended, and I really don't know what I am doing wrong.
     
  8. betterlife3

    betterlife3 Newbie

    Joined:
    Feb 16, 2016
    Messages:
    6
    Likes Received:
    0
    l9q9wmM.png
    Nobody has an idea? This is a fresh list scraped using GSA footprints.
     
  9. arc323

    arc323 Regular Member

    Joined:
    Sep 23, 2015
    Messages:
    200
    Likes Received:
    69
    Location:
    Denver, CO
    Home Page:
    Only use your proxies for submissions - nothing else. I think someone mentioned that earlier but I want to reiterate.

    For me, it's hit and miss. My average LPM is in the low thirties, and I'm using 30 private proxies and lists from Serverifiedlists. I have the Dropbox folder added to the global site lists. Sometimes I can get up to 50 or 60 LPM without using Ping, Trackback, Exploit, etc., but not all the time.
     
  10. arc323

    arc323 Regular Member

    Joined:
    Sep 23, 2015
    Messages:
    200
    Likes Received:
    69
    Location:
    Denver, CO
    Home Page:
    I tried to send this to you via PM, but your inbox is full...

    Hey Loopline,

    I'm using Serverifiedlists.com at the moment, but I think I might be able to do better in terms of LPM and the quality of the URLs. How does your list compare to SER Verified? Do you have more target URLs? Is your list still updated as often as your website says? Do you offer coupons?

    I'm really interested in trying your list so any info you can provide is appreciated.

    Thanks!
     
  11. netomarquis

    netomarquis Junior Member

    Joined:
    Jan 15, 2016
    Messages:
    117
    Likes Received:
    9
    32 LPM is a good number, IMO.
     
  12. sylonious

    sylonious Junior Member

    Joined:
    Jul 4, 2011
    Messages:
    141
    Likes Received:
    43
    Yeah, I'm getting 3 LPM using Serverifiedlists.com. It looks like they sold their list to way too many people.
     
  13. redarrow

    redarrow Elite Member

    Joined:
    Apr 1, 2013
    Messages:
    4,321
    Likes Received:
    985
    Maybe it's time to learn to use Scrapebox and build your own lists in the millions...
     
  14. sylonious

    sylonious Junior Member

    Joined:
    Jul 4, 2011
    Messages:
    141
    Likes Received:
    43
    I wish it were that simple. That was one of the first things I tried back in the day. I was really shocked at how few usable targets I got for all the time I spent scraping, and I was burning through my proxies way too quickly. I was looking at Gscraper because of the proxy service that comes with it, but I can't afford a better VPS right now.
     
  15. loopline

    loopline Jr. VIP Jr. VIP

    Joined:
    Jan 25, 2009
    Messages:
    3,727
    Likes Received:
    1,995
    Gender:
    Male
    Home Page:
    Don't forget that Bing exists and gives a lot of targets, as do the Google API and DeeperWeb. The server proxies that come free with Scrapebox seem to work pretty well with these engines when set to around 10 threads.
     
    • Thanks x 1
  16. sylonious

    sylonious Junior Member

    Joined:
    Jul 4, 2011
    Messages:
    141
    Likes Received:
    43
    Looks like I figured out my issue. I dropped the timeout from 120 down to 10, 20 max.
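    (Editor's note: a hedged sketch of why the timeout mattered. With a fixed thread pool, every thread stuck waiting out the full timeout on a dead URL is a thread not submitting anywhere else. Assuming each thread works through targets sequentially, the numbers below are illustrative, not GSA internals:)

```python
# Simple model: upper bound on submission attempts per minute for a fixed
# thread pool, where some fraction of list targets hang until the HTTP
# timeout expires. All parameters are hypothetical for illustration.

def max_attempts_per_minute(threads: int, timeout_s: float,
                            dead_fraction: float, avg_good_s: float) -> float:
    # Average seconds a thread spends per target: dead targets cost the
    # full timeout, live targets cost the average request time.
    avg_seconds = dead_fraction * timeout_s + (1 - dead_fraction) * avg_good_s
    return threads * 60 / avg_seconds

# 100 threads, half the list dead, live targets taking ~5 s each:
print(round(max_attempts_per_minute(100, 120, 0.5, 5)))  # -> 96  (120 s timeout)
print(round(max_attempts_per_minute(100, 20, 0.5, 5)))   # -> 480 (20 s timeout)
```

    Under these assumptions, cutting the timeout from 120 s to 20 s raises the throughput ceiling roughly fivefold, which matches the kind of LPM jump described above.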