
Question about ScrapeBox Harvester Limit.

Discussion in 'Black Hat SEO' started by spectrejoe, Jun 18, 2016.

  1. spectrejoe

    spectrejoe Jr. VIP

    Joined:
    Sep 25, 2013
    Messages:
    2,105
    Likes Received:
    439
    Home Page:
    What's the limit on the harvester before ScrapeBox crashes on me when exporting and such?

    I've been running it for 7 hours now and have harvested 1 million URLs
     
  2. legoknekt

    legoknekt Junior Member

    Joined:
    Dec 26, 2015
    Messages:
    159
    Likes Received:
    124
    Gender:
    Male
    Occupation:
    Propaganda professor
    Location:
    127.0.0.1
    It happens from time to time, but not that often. Have you tried splitting the links into smaller batches?

    And check the Harvester Sessions folder, by the way; your list might be there.
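    Splitting an exported list outside ScrapeBox is straightforward; here's a minimal Python sketch (the file naming scheme and batch size are just examples, not anything ScrapeBox itself uses):

    ```python
    # Split a large harvested URL list into smaller numbered files.
    # Adjust BATCH_SIZE to whatever your machine / next tool handles comfortably.
    BATCH_SIZE = 100_000

    def split_urls(path, batch_size=BATCH_SIZE):
        """Write path.part1.txt, path.part2.txt, ... and return the part count."""
        batch, part = [], 1
        with open(path, encoding="utf-8") as f:
            for line in f:
                url = line.strip()
                if not url:
                    continue  # skip blank lines
                batch.append(url)
                if len(batch) == batch_size:
                    _write_batch(path, part, batch)
                    batch, part = [], part + 1
        if batch:  # flush the final partial batch
            _write_batch(path, part, batch)
        return part

    def _write_batch(path, part, batch):
        with open(f"{path}.part{part}.txt", "w", encoding="utf-8") as f:
            f.write("\n".join(batch) + "\n")
    ```

    Streaming line by line like this keeps memory flat even on multi-million-URL exports.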
     
  3. spectrejoe

    spectrejoe Jr. VIP

    Joined:
    Sep 25, 2013
    Messages:
    2,105
    Likes Received:
    439
    Home Page:
    I can only split after exporting from the harvester, not while I'm still harvesting.

    I don't know at what number I should stop the harvester. It's only at 7% and has already scraped 1 million URLs
     
  4. The SEO

    The SEO Jr. VIP

    Joined:
    Dec 14, 2011
    Messages:
    4,215
    Likes Received:
    3,178
    Gender:
    Male
    Occupation:
    SEO/SMM
    Location:
    BHW
    Home Page:
    When you filter down to the unique URLs, they will often be just 50,000 to 100K out of 1,000K.

    Don't worry, all the harvested URLs are also stored in a file in your ScrapeBox folder, which should be named Harvester Sessions; retrieve the list from there.
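    That dedupe step is the same idea as ScrapeBox's "remove duplicate URLs / remove duplicate domains" options. A rough Python equivalent (the function name and sample URLs are mine, not ScrapeBox's):

    ```python
    from urllib.parse import urlsplit

    def dedupe_urls(urls, by_domain=False):
        """Drop duplicates, keeping first occurrence.

        With by_domain=True, keep only one URL per hostname, which is
        usually what cuts 1M harvested URLs down to the 50-100K range.
        """
        seen, out = set(), []
        for url in urls:
            key = urlsplit(url).netloc.lower() if by_domain else url
            if key not in seen:
                seen.add(key)
                out.append(url)
        return out
    ```

    For example, `dedupe_urls(urls, by_domain=True)` on a list of blog-comment URLs keeps one page per blog.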
     
  5. spectrejoe

    spectrejoe Jr. VIP

    Joined:
    Sep 25, 2013
    Messages:
    2,105
    Likes Received:
    439
    Home Page:
    So I can harvest 10M and it's OK?
     
  6. The SEO

    The SEO Jr. VIP

    Joined:
    Dec 14, 2011
    Messages:
    4,215
    Likes Received:
    3,178
    Gender:
    Male
    Occupation:
    SEO/SMM
    Location:
    BHW
    Home Page:
    If they are Web 2.0 sites, then let it go beyond 10M until it has reached at least 70%.
     
  7. spectrejoe

    spectrejoe Jr. VIP

    Joined:
    Sep 25, 2013
    Messages:
    2,105
    Likes Received:
    439
    Home Page:
    Blog comment links, using GSA footprints
     
  8. loopline

    loopline Jr. VIP

    Joined:
    Jan 25, 2009
    Messages:
    3,726
    Likes Received:
    1,994
    Gender:
    Male
    Home Page:
    It depends on your machine's memory, assuming you're using 64-bit. Then I would say maybe a billion or two billion, yes BILLION, should be fine.

    I definitely work with hundreds of millions at times.

    ScrapeBox v2 is 64-bit, so as long as you're using the 64-bit version and your machine is 64-bit, the limit is the amount of memory in your machine. But yes, ScrapeBox can handle 10 million no problem, unless you're running really low on memory, then maybe not.

    Plus, even if you run out of memory, the results are saved to the Harvester Sessions folder in real time, so you don't actually lose them.
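    If you ever do need to recover a crashed run, the session files can be stitched back together with a few lines of Python. A sketch, assuming the sessions folder just holds plain-text URL lists (the folder layout and function name here are my assumption, not documented ScrapeBox behavior):

    ```python
    import glob
    import os

    def merge_session_files(session_dir, out_path):
        """Concatenate every .txt file in a Harvester Sessions folder
        into one deduplicated list; return how many unique URLs were kept."""
        seen = set()
        with open(out_path, "w", encoding="utf-8") as out:
            for path in sorted(glob.glob(os.path.join(session_dir, "*.txt"))):
                with open(path, encoding="utf-8") as f:
                    for line in f:
                        url = line.strip()
                        if url and url not in seen:
                            seen.add(url)
                            out.write(url + "\n")
        return len(seen)
    ```

    Point it at the sessions folder and you get back one clean file, duplicates already removed.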
     
    • Thanks x 1
  9. spectrejoe

    spectrejoe Jr. VIP

    Joined:
    Sep 25, 2013
    Messages:
    2,105
    Likes Received:
    439
    Home Page:
    Jesus those are some scary numbers o.o

    But thanks :D
     
  10. loopline

    loopline Jr. VIP

    Joined:
    Jan 25, 2009
    Messages:
    3,726
    Likes Received:
    1,994
    Gender:
    Male
    Home Page:
  11. seedhano

    seedhano Newbie

    Joined:
    Jun 20, 2016
    Messages:
    8
    Likes Received:
    2
    Gender:
    Male
    ScrapeBox is powerful, and the fact that it automatically stores all your harvested URLs on your computer means you don't lose anything in the event of a crash.