
Cleaning up ARM database

Discussion in 'Black Hat SEO Tools' started by fad3r, Nov 9, 2011.

  1. fad3r

    fad3r Power Member

    Joined:
    Sep 17, 2011
    Messages:
    733
    Likes Received:
    115
    Location:
    nyc
    Hi,
    While I was learning Article Marketing Robot, I gathered various lists from here, from Fiverr, etc. A lot of the directories in my database are crap. Does anyone know how I can go about cleaning them up? Also, does anyone know which files to back up if I wanted to make a backup?
     
  2. benwalton

    benwalton Newbie

    Joined:
    May 8, 2011
    Messages:
    30
    Likes Received:
    5
    Do a test submit, and blacklist anything you can't submit to. Same for sign-ups.
     
    • Thanks Thanks x 1
  3. fad3r

    fad3r Power Member

    Joined:
    Sep 17, 2011
    Messages:
    733
    Likes Received:
    115
    Location:
    nyc
    Will definitely give this a shot. Do you know if it will remove them from the database? The app is definitely moving slower now. I am curious whether I could just open it up in MS Access and alter it, since it is a .mdb file, but I was wondering if something was built into the tool. Thanks, will try right now.
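Since the data lives in .mdb files, a simple safety net before editing anything is to copy them aside first. A minimal sketch, assuming the database files sit in a single install folder (the paths and file names here are placeholders, not AMR's documented layout):

```python
import shutil
from datetime import datetime
from pathlib import Path

def backup_mdb(install_dir: str, backup_root: str) -> Path:
    """Copy every .mdb file under install_dir into a timestamped backup folder."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = Path(backup_root) / f"amr-backup-{stamp}"
    dest.mkdir(parents=True, exist_ok=True)
    for mdb in Path(install_dir).glob("*.mdb"):
        shutil.copy2(mdb, dest / mdb.name)  # copy2 preserves file timestamps
    return dest
```

Restoring is just copying the files back with the app closed.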
     
  4. criticalmess

    criticalmess Regular Member

    Joined:
    Feb 7, 2009
    Messages:
    237
    Likes Received:
    210
    Hey, I'm working on a large AMR list now. If you want to join forces, PM me; I can filter and run the checks.

    critical
     
  5. fad3r

    fad3r Power Member

    Joined:
    Sep 17, 2011
    Messages:
    733
    Likes Received:
    115
    Location:
    nyc
    That did it. Rep given. Thanks for the prompt response and the right solution.
     
  6. fad3r

    fad3r Power Member

    Joined:
    Sep 17, 2011
    Messages:
    733
    Likes Received:
    115
    Location:
    nyc
    I don't mind sharing my list with you. I am still new, so to me it is not a big deal. I do not know how to export it, or the databases that people provide here. If you teach me how, I will share with you. There are only 6,700 greens. I am removing all the reds right now the way the 2nd poster taught me. Not sure if this will provide a performance increase, but man, I hope so. It is mad slow compared to how it used to be.
     
  7. jascoken

    jascoken Senior Member

    Joined:
    Nov 1, 2010
    Messages:
    1,135
    Likes Received:
    751
    Gender:
    Male
    Occupation:
    IT/Web Systems & Development...
    Location:
    Sussex:UK
    You CAN edit the DB files directly in Access, but you really need to know what you're doing and understand the relationships between the DBs. I often do this to clean out historical data and auto-import blacklisted directory IDs when rebuilding a clean installation. If you're a DB programmer, you won't have much trouble figuring the relationships out.

    It's slow partly because it is Access. Access has always been slow and inefficient, but it's a simple compromise and quick to code against for a desktop app.

    Remember that greens simply mean successful sign-ups, not successful submits. I VERY much doubt that you have 6,000 successful submits. The major problem with all the scraped dirs is the number of WordPress blogs that will allow sign-up but not submission. A dir only becomes useful after a successful submission. Remember that a bloated list will be costing a fortune in captcha sign-ups. It's worth getting rid of all dirs that don't allow 5-6 submissions in a row.
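The "5-6 submissions in a row" filter can be expressed as a query over a submission log. AMR's real Access schema isn't documented in this thread, so this is only an illustrative sketch against a hypothetical SQLite table `attempts(url, ts, success)`:

```python
import sqlite3

def dirs_with_streak(conn: sqlite3.Connection, n: int = 5) -> list[str]:
    """Return directory URLs whose last n submission attempts all succeeded.

    Assumes a hypothetical log table: attempts(url TEXT, ts INTEGER, success INTEGER).
    """
    keep = []
    for (url,) in conn.execute("SELECT DISTINCT url FROM attempts"):
        # Look at the n most recent attempts for this directory.
        rows = conn.execute(
            "SELECT success FROM attempts WHERE url = ? ORDER BY ts DESC LIMIT ?",
            (url, n),
        ).fetchall()
        if len(rows) == n and all(s for (s,) in rows):
            keep.append(url)
    return keep
```

Anything not returned by the filter is a candidate for the blacklist.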
     
    • Thanks Thanks x 1
  8. fad3r

    fad3r Power Member

    Joined:
    Sep 17, 2011
    Messages:
    733
    Likes Received:
    115
    Location:
    nyc
    Hi, this is a great response. In my younger years I messed around with Access a bit, but I am not sure I would feel comfortable doing that. I suppose I could back up the database and then mess around with a throwaway copy.

    If I said submits, I apologize; I have submitted zip. I will begin that next week; right now I am just learning the tools and getting up to speed. I wanted to get the DB cleaned up because with 40k directories it was very slow. I am not quite sure blacklisting fixes that; though cosmetically it is cleaner, performance-wise it still sucks.

    I agree. I was looking over the feature set for the next version today. He will be actively supporting CaptchaSniper, so that should help a little. He also mentioned supporting SQLite, I believe, or a lightweight version of SQL, so that should help too.

    I will take your advice on the 5-6 thing. Thank you so much.

    Best
     
  9. jascoken

    jascoken Senior Member

    Joined:
    Nov 1, 2010
    Messages:
    1,135
    Likes Received:
    751
    Gender:
    Male
    Occupation:
    IT/Web Systems & Development...
    Location:
    Sussex:UK
    Yes... blacklisting just adds an ID to a database of directory IDs. It doesn't actually remove anything; it actually creates MORE data to process. So each time you import and blacklist, there is more data to process. It has to do this for historical compatibility with all authors, though.

    The only way to speed it up is to blacklist it hard, then select all the successful submits, export that list of URLs, and import it into a fresh install (or copy back the original clean DBs, which, if you do this a lot, you'll start to keep!).
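The export-and-reimport workflow described above boils down to one query plus a flat file. Again, AMR stores its data in Access and the real table/column names aren't given in the thread, so this sketch uses a hypothetical SQLite table `directories(url, submit_ok)` purely to show the shape of it:

```python
import sqlite3

def export_submits(conn: sqlite3.Connection, out_path: str) -> int:
    """Write confirmed-submit URLs to a text file (one per line) for reimport.

    Assumes a hypothetical table: directories(url TEXT, submit_ok INTEGER).
    Returns the number of URLs written.
    """
    rows = conn.execute(
        "SELECT url FROM directories WHERE submit_ok = 1 ORDER BY url"
    ).fetchall()
    with open(out_path, "w") as f:
        f.writelines(url + "\n" for (url,) in rows)
    return len(rows)
```

The resulting text file is what you would feed into a fresh install's directory import.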
     
    • Thanks Thanks x 1
  10. fad3r

    fad3r Power Member

    Joined:
    Sep 17, 2011
    Messages:
    733
    Likes Received:
    115
    Location:
    nyc
    Ha, it is funny you say this, because I am currently searching for how to export directories. I planned to nuke the whole thing, then import my directories and be done with it (once I test next week to see which ones let me submit; no point in reimporting just because it is green. I want a database of pure submits).
     
  11. Danny1111

    Danny1111 Elite Member

    Joined:
    Jul 5, 2011
    Messages:
    2,096
    Likes Received:
    2,480
    I'm definitely up for sharing clean databases etc., if someone shows me how to export, clean up, and start over.

    Or maybe we can all submit what we are using to one person, who can take all the sites, filter them, and hand them back out.

    I am just not expert enough to be able to do it myself without instruction.

    --- I ScrapeBoxed a bunch of stuff, so maybe I will submit tomorrow and see what's working and what's not.