
ScrapeBox backlink checker

Discussion in 'Black Hat SEO Tools' started by shadow2015, Jul 22, 2016.

  1. shadow2015

    shadow2015 Junior Member

    Joined:
    Jan 17, 2015
    Messages:
    112
    Likes Received:
    14
    Occupation:
    L2 Security Engineer (Network)
    Location:
    London UK
    All,


    I have a query; maybe someone could advise me. I am trying to check the backlinks on scraped links (around 201 links), but it gets stuck after a while. Meaning:

    It will check most of them and then get stuck on around 4 links with the message “Reading”.

    If I remove those 4 links, it will get stuck on another 4 links with the same “Reading” message.

    However, I was able to check 36 links.

    I have read ScrapeBox’s FAQs and cannot confirm whether it’s related to any of the issues listed there.


    E.g. CPU = around 20% usage

    RAM = above 3 GB available

    Running on Windows 10


    SB = 2.0.0.69 (64-bit)


    I let it run for over 24 hours and it still wasn’t completed.


    I cannot say the app is freezing; there is no black/white screen.

    Once I press Stop, the button goes grey (no “app is not responding” message in the title bar).

    Also, if I check CPU usage, the app is still active.


    Any idea on this?

    Thanks in advance.

    PS: this also occurs on other plugins too, like the Alexa rank checker.
     
  2. loopline

    loopline Jr. VIP Jr. VIP

    Joined:
    Jan 25, 2009
    Messages:
    3,851
    Likes Received:
    2,046
    Gender:
    Male
    Home Page:
    It sounds like it’s hitting a threshold with some security software, and the connections are then being scanned/hijacked.

    Try whitelisting the entire addons folder (which is in your ScrapeBox folder) in all security software, such as anti-virus, malware checkers and firewalls.

    Does it always happen after 30–40 URLs?

    You’re using the backlink checker addon? With how many APIs? Are you using proxies?
     
  3. shadow2015

    shadow2015 Junior Member

    Joined:
    Jan 17, 2015
    Messages:
    112
    Likes Received:
    14
    Occupation:
    L2 Security Engineer (Network)
    Location:
    London UK
    Hi Loopline


    Thanks for the reply,


    With regard to your questions, I do not have any AV/malware apps; the one I had, I removed to test (I just forgot to mention it, apologies).


    If I try to run more than 36 URLs, I see this. As a test I exported those 201 URLs in blocks of 30, and noticed that when it got stuck on “Reading”, going into the file and removing those URLs would work. This meant all files ended up with fewer than 30 URLs.


    You’re using the backlink checker addon? = yes

    With how many APIs? = I checked with 30 connections first, then 200, then 1 connection; all behave the same

    Are you using proxies? = yes, the public ones from SB random scraping


    I know I am going off track by asking you something different; I hope you don’t mind.


    As you know, you have the proxy scraping (loop premium plugin); this is very useful when you are harvesting, as we have the option to auto log proxies.

    Can this be done on your other free plugins too?


    Thanks in advance
     
  4. shadow2015

    shadow2015 Junior Member

    Joined:
    Jan 17, 2015
    Messages:
    112
    Likes Received:
    14
    Occupation:
    L2 Security Engineer (Network)
    Location:
    London UK
    Yes I am.

    Two mistakes in my reply:

    Are you using proxies? = yes, the public ones from SB loop scraping

    As you know, you have the proxy scraping (loop premium plugin); this is very useful when you are harvesting, as we have the option to auto load proxies.
    Can this be done on your other free plugins too?
     
  5. loopline

    loopline Jr. VIP Jr. VIP

    Joined:
    Jan 25, 2009
    Messages:
    3,851
    Likes Received:
    2,046
    Gender:
    Male
    Home Page:
    You want to use only as many connections as you have APIs. So if you have 1 API, use 1 connection. If you use more, you go too fast and violate the Moz TOS, so they may well be doing something with your accounts.

    It is weird though; you’re saying you can check up to 30 URLs just fine, but after that you get the issue, or did I misunderstand?

    Also, I’m confused about what you mean by “can that be done with the other plugin?” Can what be done, and with what plugin?
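    The connections-to-APIs rule above can be sketched in code. This is a hypothetical illustration only (not ScrapeBox’s implementation, and the key names are made up): a semaphore sized to the number of API keys caps concurrency, so you never have more requests in flight than you have credentials.

    ```python
    import threading

    # Hypothetical sketch of the "connections = number of APIs" rule.
    # API_KEYS is a placeholder list; a real setup would hold Moz credentials.
    API_KEYS = ["key-a", "key-b"]
    slots = threading.Semaphore(len(API_KEYS))  # at most len(API_KEYS) in flight

    def check_backlinks(url, lookup):
        """Run lookup(url, key) while never exceeding len(API_KEYS) connections."""
        with slots:  # blocks when every key is already busy
            key = API_KEYS[hash(url) % len(API_KEYS)]
            return lookup(url, key)
    ```

    With 1 key the semaphore degenerates to a lock, which matches the advice: 1 API, 1 connection.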
     
  6. shadow2015

    shadow2015 Junior Member

    Joined:
    Jan 17, 2015
    Messages:
    112
    Likes Received:
    14
    Occupation:
    L2 Security Engineer (Network)
    Location:
    London UK
    You want to use only as many connections as you have APIs. So if you have 1 API, use 1 connection. If you use more, you go too fast and violate the Moz TOS. = OK, I was not aware of this, thank you.

    It is weird though; you’re saying you can check up to 30 URLs just fine, but after that you get the issue, or did I misunderstand? = What I mean is, if I have 30 links to check, there are always a couple of links stuck in the “Reading” state. If I remove those (leaving fewer than 30 links), it works perfectly.

    Also, I’m confused about what you mean by “can that be done with the other plugin?” Can what be done, and with what plugin? = Apologies for not being clear.

    Let me elaborate:



    In the harvester window, you have the (proxy) option to select a file, enable it, and load it every X minutes. So if I am using the “auto loader” to scrape/test proxies with SB and save them to a file, I can then have the harvester load this file every X minutes I specify!


    On the external link checker or expired domain plugin, you have the option to load proxies via file, cloud or harvester. To my understanding this is only a one-time upload. Meaning:

    If I select 10 proxies and load them into the expired-domain plugin, after the 10 proxies are burned out, nothing new will be loaded? Am I correct in thinking this?



    If yes, could we not have the option to load newly scraped proxies, just like the harvester? Or am I completely wrong in my thinking?



    Thanks in advance
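    The auto-reload behaviour being asked about — re-reading a proxy file every X minutes, as the harvester already does — could be sketched like this. A minimal illustration only, assuming a plain one-proxy-per-line file; `load_proxies` and `start_proxy_reloader` are made-up names, not ScrapeBox functions.

    ```python
    import threading
    import time
    from pathlib import Path

    def load_proxies(proxy_file):
        """Read one proxy per line, skipping blank lines."""
        return [ln.strip() for ln in Path(proxy_file).read_text().splitlines()
                if ln.strip()]

    def start_proxy_reloader(proxy_file, interval_sec, on_reload):
        """Re-read proxy_file every interval_sec seconds and hand the fresh
        list to on_reload(), mimicking the harvester's 'load every X minutes'."""
        def loop():
            while True:
                on_reload(load_proxies(proxy_file))
                time.sleep(interval_sec)
        t = threading.Thread(target=loop, daemon=True)
        t.start()
        return t
    ```

    A plugin with this hook would swap in the fresh proxy list each cycle instead of running on a one-time upload until the proxies burn out.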
     
  7. loopline

    loopline Jr. VIP Jr. VIP

    Joined:
    Jan 25, 2009
    Messages:
    3,851
    Likes Received:
    2,046
    Gender:
    Male
    Home Page:
    Let me shift your paradigm: you do not want to use public/cloud proxies with the expired domain finder or backlink checker, so this is moot.

    Using public proxies would dramatically lower your success rates (and that’s an understatement).

    The only thing public proxies are good for is scraping the engines, because there they work in mass without accuracy mattering. Expired domain finding and backlink checking are about accuracy, and thus should only be done with private proxies. Private proxies aren’t going to die and they aren’t going to get blocked, as long as you’re not being crazy with too-high connection counts, so all this boils down to it being a moot point.
     
  8. shadow2015

    shadow2015 Junior Member

    Joined:
    Jan 17, 2015
    Messages:
    112
    Likes Received:
    14
    Occupation:
    L2 Security Engineer (Network)
    Location:
    London UK
    Thank you sir, much appreciated.
     
  9. loopline

    loopline Jr. VIP Jr. VIP

    Joined:
    Jan 25, 2009
    Messages:
    3,851
    Likes Received:
    2,046
    Gender:
    Male
    Home Page: