
Interested in a web 2.0 expired finder?

Discussion in 'Black Hat SEO Tools' started by yenerich, May 13, 2017.

  1. yenerich

    yenerich Jr. VIP Jr. VIP

    Joined:
    May 26, 2013
    Messages:
    1,181
    Likes Received:
    185
    Home Page:
    Hi,

    For my own use I developed a tool that searches Tumblr and other web 2.0 platforms to find expired accounts that can be re-registered.

    I started asking myself: could other people be interested in such software?

    The advantage of getting expired web 2.0 accounts is obvious and powerful.

    Please provide feedback.

    Thanks in advance
     
  2. creativeshuvo

    creativeshuvo Senior Member

    Joined:
    Apr 10, 2016
    Messages:
    827
    Likes Received:
    109
    Gender:
    Male
    Occupation:
    Web Developer
    Location:
    Dhaka Division, Bangladesh
    I'm interested in testing your software.
     
  3. SEO

    SEO Jr. VIP Jr. VIP

    Joined:
    Jan 6, 2017
    Messages:
    822
    Likes Received:
    612
    Eh, if the price were right, you actually supported it well and it was a lot easier than Scrapebox (not that Scrapebox is hard, but that's the tool I use), I'd probably pick a copy up. Good luck to you!
     
  4. chobohobodobo

    chobohobodobo Regular Member

    Joined:
    Jan 6, 2017
    Messages:
    385
    Likes Received:
    65
    Gender:
    Male
    I'd like to try it
     
  5. XxUnivaxX

    XxUnivaxX Jr. VIP Jr. VIP

    Joined:
    Jan 15, 2013
    Messages:
    2,531
    Likes Received:
    1,223
    Gender:
    Male
    Location:
    Basement
    We have some tools that do that, like Scrapebox, DHG and a few more.
    But if yours works well, then yeah, we want it.
     
  6. yenerich

    yenerich Jr. VIP Jr. VIP

    Joined:
    May 26, 2013
    Messages:
    1,181
    Likes Received:
    185
    Home Page:
    Thanks for your comments.

    What are the pros and cons of doing it in Scrapebox? How fast is it?
    Mine works great; I'm in the development stage, not ready for beta.

    Also, any wishes or ideas are welcome :)
     
  7. SEO

    SEO Jr. VIP Jr. VIP

    Joined:
    Jan 6, 2017
    Messages:
    822
    Likes Received:
    612
    Scrapebox has a lot of tools in it. Primarily, it lets you handle huge data sets, trim/filter URLs, check stats. For crawling, I use Xenu Link Sleuth, although Scrapebox does have some crawling functionality built in.
     
  8. yenerich

    yenerich Jr. VIP Jr. VIP

    Joined:
    May 26, 2013
    Messages:
    1,181
    Likes Received:
    185
    Home Page:
    Thanks. Yes, I know Scrapebox; I've used it myself. But I want to know, from users' point of view, how useful it is and how well it works for finding expired web 2.0 accounts.

    AND I want to receive suggestions for things users would like me to incorporate into my tool (if it's possible to do!)
     
  9. SEO

    SEO Jr. VIP Jr. VIP

    Joined:
    Jan 6, 2017
    Messages:
    822
    Likes Received:
    612
    I guess I would need to know what discovery method your tool plans to use. Does it crawl looking for 404s, or use some other method for discovery? The biggest issue with Scrapebox is that because it wasn't designed for this specific purpose, you're constantly having to export URLs and import them into various addons. If I were to buy a tool like the one you are making, these are some of the things I would expect it to have/support:
    • A consistent interface. I don't want buttons and options jumping all over the place etc. Just place things consistently.
    • Proxy support (including importing proxies from file and clipboard)
    • The option of single thread and multi-thread.
    • Project support (so you can switch between projects)
    • The ability to handle a lot of data without freezing like crazy.
    • Multiple discovery options would be cool (Scraping for URLs, and crawling seed URLs)
    • If scraping, then multiple engine selection (although you do have to consider how much work supporting scraping search engines would be for you)
    • If crawling is an option, then also the crawl depth.
    • If crawling is an option, then control over which HTTP responses are considered "dead"
    • Metrics checking of some variety (this has gotten hard since all of the free metrics are basically disappearing)
    • Filtering data based on keywords (including providing individual keywords and filtering against a file loaded with keywords)
    That's all I can think of off the top of my head.
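    [Editor's note] Several points in the wishlist above (a configurable set of "dead" HTTP responses, multi-threading, pluggable discovery) can be sketched roughly like this. All names, defaults, and the injected-fetcher design are illustrative assumptions, not the OP's actual tool; the fetcher is passed in so the logic runs without touching the network.

```python
# Sketch of one discovery pass: classify URLs as expired based on a
# user-configurable set of "dead" HTTP status codes, checked in parallel.
from concurrent.futures import ThreadPoolExecutor

DEFAULT_DEAD = {404, 410}  # assumption: "not found"/"gone" mean expired


def is_dead(status, dead_codes=DEFAULT_DEAD):
    """Decide whether an HTTP status counts as an expired account."""
    return status in dead_codes


def find_expired(urls, fetch, dead_codes=DEFAULT_DEAD, threads=10):
    """Return the subset of urls considered dead.

    `fetch` is any callable url -> status code (e.g. a requests.head
    wrapper); injecting it lets proxies/retries be layered on top.
    """
    with ThreadPoolExecutor(max_workers=threads) as pool:
        statuses = pool.map(fetch, urls)
    return [u for u, s in zip(urls, statuses) if is_dead(s, dead_codes)]


# Offline demo: a dict stands in for real HTTP responses.
statuses = {"http://a.tumblr.com": 200, "http://b.tumblr.com": 404}
expired = find_expired(list(statuses), statuses.get, threads=2)
# expired == ["http://b.tumblr.com"]
```

    A real implementation would wrap `requests.head` (with proxy and timeout options) as the fetcher, and could expose `dead_codes` in the UI per the wishlist.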
     
  10. pressrelease

    pressrelease Power Member

    Joined:
    Jan 6, 2016
    Messages:
    668
    Likes Received:
    239
    Location:
    Disneyland
    I will try that, if it can scrape with defined TF/DA/PA criteria.
     
  11. DEFINE

    DEFINE Regular Member

    Joined:
    Jan 4, 2017
    Messages:
    274
    Likes Received:
    85
    Gender:
    Male
    There are already a few out there. Not really that interested. What would be cool is if it could check whether the domain is actually available, like a Tumblr username checker or something.
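    [Editor's note] A username checker along those lines could start with a cheap offline pre-filter before any HTTP check. The validity rules below (lowercase letters/digits/hyphens, no leading or trailing hyphen, max 32 characters) are assumptions for illustration, not Tumblr's documented rules.

```python
# Pre-filter candidate names before spending an HTTP request on each one.
# Actual availability would still need a network check, e.g. whether a
# request to https://<name>.tumblr.com comes back 404 (an assumption about
# Tumblr's behavior, not verified here).
import re

NAME_RE = re.compile(r"^[a-z0-9](?:[a-z0-9-]{0,30}[a-z0-9])?$")


def plausible_name(name):
    """Return True if `name` looks like a valid subdomain-style username."""
    return bool(NAME_RE.match(name.lower()))
```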
     
  12. jimbobo2779

    jimbobo2779 Jr. VIP Jr. VIP

    Joined:
    Sep 17, 2008
    Messages:
    3,726
    Likes Received:
    2,657
    Occupation:
    Software Engineer
    Location:
    UK
    Home Page:
    You basically just described Domain Hunter Gatherer.

    See my reply to @seo
     
  13. wawka13

    wawka13 Newbie

    Joined:
    May 8, 2017
    Messages:
    4
    Likes Received:
    0
    Gender:
    Female
    Do you have a Mac version?