
Need a URL/line filter tool with lots of functions

Discussion in 'Black Hat SEO Tools' started by KfgR2, Jan 2, 2012.

  1. KfgR2

    KfgR2 Registered Member

    Joined:
    Jan 16, 2011
    Messages:
    98
    Likes Received:
    41
    Hi,

    I need a URL/line filter tool with lots of functions.

    One function I need is to extract unique domains while keeping at most 3 URLs per domain (the 3 should be selectable). That way I can scrape good URLs and cut out the flooded domains.

    Is there any program, or a Scrapebox module, that does this?
     
  2. TheMatrix

    TheMatrix BANNED BANNED

    Joined:
    Dec 20, 2008
    Messages:
    3,444
    Likes Received:
    7,279
    So you mean you need a maximum of 3 (selectable) unique URLs per domain, right?

    With SB, you'd have to do that manually by comparing lists again and again.

    Alternatively, you could do it in Notepad++ with a macro or something.
     
  3. KfgR2

    KfgR2 Registered Member

    Joined:
    Jan 16, 2011
    Messages:
    98
    Likes Received:
    41
    Hi,

    yes, that's what I mean. Thank you for your reply.

    Do you mean the "Split duplicate domains" filter in Scrapebox?

    Yes, I work with Notepad++ too. But about the macro: can anybody here help and write a small macro for Notepad++?
     
  4. Aremys

    Aremys Regular Member

    Joined:
    Sep 13, 2008
    Messages:
    307
    Likes Received:
    72
    You can do that with Scrapebox using its Split Duplicate Domains feature.

    1. Paste all URLs into the harvester.
    2. Use Split Duplicate Domains and save the removed domains as "Removed.txt".
    3. Copy what's left and save it as something like "Main List.txt".
    4. Import and replace the current list with "Removed.txt".
    5. Go back to step 2 until you have at most 3 URLs for each domain.
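    If you'd rather skip the repeated Scrapebox passes, the same "max 3 URLs per domain" filter can be done in one pass with a small script. A minimal sketch, assuming your harvested URLs sit in a plain text file with one URL per line (the filename "urls.txt" is just an example, not something from Scrapebox):

    ```python
    # Keep at most MAX_PER_DOMAIN URLs for each unique domain,
    # preserving the original order of the list.
    from urllib.parse import urlparse
    from collections import Counter

    MAX_PER_DOMAIN = 3  # "selectable" limit, change as needed


    def filter_urls(urls, limit=MAX_PER_DOMAIN):
        seen = Counter()   # how many URLs we've kept per domain
        kept = []
        for url in urls:
            domain = urlparse(url).netloc.lower()
            if seen[domain] < limit:
                seen[domain] += 1
                kept.append(url)
        return kept


    if __name__ == "__main__":
        # assumed input file: one URL per line
        with open("urls.txt") as f:
            urls = [line.strip() for line in f if line.strip()]
        for url in filter_urls(urls):
            print(url)
    ```

    Run it with your list saved as "urls.txt" and redirect the output to a new file; everything past the third URL for any domain is dropped.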