
Filtering 20M URLs? How?

Discussion in 'Black Hat SEO' started by metalice, Feb 2, 2012.

  1. metalice

    metalice Junior Member

    Joined:
    Apr 12, 2010
    Messages:
    125
    Likes Received:
    4
    I have a quick question, and if you're willing to assist me it would be superb!

    It's a ScrapeBox question. Let's say I harvest 20M URLs. All 20M are saved in my harvester_session folder, each file holding 1M URLs, right?
    OK, so I would like to remove duplicates across all of the files.
    Is there any way to load more than 1M URLs into ScrapeBox?
    Is there other software that can do this filtering?

    Thanks!!
    M.
     
  2. ivanpenev

    ivanpenev Regular Member

    Joined:
    Jan 25, 2012
    Messages:
    310
    Likes Received:
    90
    Try the ScrapeBox DupRemove addon; they say it's made for massive amounts of URLs.
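    If you'd rather handle it outside ScrapeBox, a small script can merge every file from the harvester_session folder and drop duplicate URLs in one pass. This is just a minimal sketch (Python assumed; the folder and output file names are placeholders, adjust them to your setup):

        # Minimal sketch: merge all harvested files and keep only unique URLs.
        # Assumes the 1M-per-file lists live in a "harvester_session" folder.
        import glob

        seen = set()
        with open("deduped_urls.txt", "w", encoding="utf-8") as out:
            for path in sorted(glob.glob("harvester_session/*.txt")):
                with open(path, "r", encoding="utf-8", errors="ignore") as f:
                    for line in f:
                        url = line.strip()
                        if url and url not in seen:
                            seen.add(url)
                            out.write(url + "\n")

        print("kept", len(seen), "unique urls")

    20M URLs should fit in memory on most machines; if RAM is tight, you could instead sort the merged file on disk and drop adjacent duplicates line by line.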
     
    • Thanks x 1
    Last edited: Feb 2, 2012
  3. metalice

    metalice Junior Member

    Joined:
    Apr 12, 2010
    Messages:
    125
    Likes Received:
    4
    Exactly what I wanted! Thanks man!!