What's the easiest way to clean up an XRumer list?

Discussion in 'Black Hat SEO Tools' started by dgfalk, Aug 25, 2010.

  1. dgfalk

    dgfalk Power Member

    Joined:
    Apr 26, 2010
    Messages:
    687
    Likes Received:
    94
    I have a .txt list of all the sites my page was submitted to using XRumer. It's in this format:

    1. URL Result: invalid;used self-learning system data
    2. URL Result: chosen nickname "XXX";captcha decoded
    3. URL Result: used self-learning system data
    etc

    I want to get just the URLs so I can create an RSS feed with them and get some of them indexed quickly. What is the best/fastest way to get rid of all the numbers and results so I'm left with just the URLs?
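Since each log line follows the pattern "number, URL, then a Result: message", a short regex script can pull the URLs out. A minimal Python sketch (the sample lines and the exact pattern are assumptions based on the format quoted above):

```python
import re

# Each XRumer result line looks like:
#   "1. http://example.com/forum/ Result: invalid;used self-learning system data"
# Capture the first non-space token after the leading number.
LINE_RE = re.compile(r'^\s*\d+\.\s*(\S+)\s+Result:')

def extract_urls(lines):
    """Return only the URLs from XRumer result lines; skip non-matching lines."""
    urls = []
    for line in lines:
        m = LINE_RE.match(line)
        if m:
            urls.append(m.group(1))
    return urls

sample = [
    '1. http://example.com/forum/ Result: invalid;used self-learning system data',
    '2. http://example.org/board/ Result: chosen nickname "XXX";captcha decoded',
]
print(extract_urls(sample))
# → ['http://example.com/forum/', 'http://example.org/board/']
```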
     
  2. dgfalk

    dgfalk Power Member

    Joined:
    Apr 26, 2010
    Messages:
    687
    Likes Received:
    94
    Well, I found out how to get rid of the numbers by opening it up in Excel, but I can't quite figure out how to get rid of the "Result" part. Suggestions?
     
  3. dgfalk

    dgfalk Power Member

    Joined:
    Apr 26, 2010
    Messages:
    687
    Likes Received:
    94
    I'm still having a hell of a time trying to figure this out. I'm sure someone out there knows how to do it. Anyone?
     
  4. Lickalotpuss

    Lickalotpuss Registered Member

    Joined:
    May 11, 2010
    Messages:
    84
    Likes Received:
    8
    I think you're looking for the same answer I just asked about. I posted my thread and then yours came up..

    It's amazing how little support there is for this software.

    I'm sure you found the part of the help file that tells you what to do, but it didn't work, right? Are we missing the Filter.txt files, or are we supposed to make our own?

    I hope we can get this figured out soon!
     
  5. bezopravin

    bezopravin BANNED

    Joined:
    May 11, 2010
    Messages:
    461
    Likes Received:
    3,471
    To both of you: you can get rid of that junk text and keep just the posted URLs using Search and Replace in Notepad++.

    But I would STRICTLY recommend that you don't submit those URLs to an RSS feed, as this will severely affect your current rankings and may get your site de-indexed quickly... An XRumer blast is safer as long as you don't ping or feed the posted URLs...
     
  6. dgfalk

    dgfalk Power Member

    Joined:
    Apr 26, 2010
    Messages:
    687
    Likes Received:
    94
    Can you explain a little more about how to do that with Notepad? I don't even think I have ++, I'll have to look.

    Also, I'm not going to ping all of them at once. I had a guy do a small 500-site blast for me, and I was just gonna ping like 10 or so a day to speed things up a little, but not too much.
     
  7. bezopravin

    bezopravin BANNED

    Joined:
    May 11, 2010
    Messages:
    461
    Likes Received:
    3,471
    That's really funny! You really don't know what Notepad++ is? It's an advanced text editor, similar to the Notepad that comes with Windows, but with tons of handy features. It's also an open-source project!

    Download it here...

    Code:
    http://sourceforge.net/projects/notepad-plus/
    Let me know if you need any assistance...
     
  8. dgfalk

    dgfalk Power Member

    Joined:
    Apr 26, 2010
    Messages:
    687
    Likes Received:
    94
    Thanks for the link, I just got it and it seems like a cool little program. I'm playing with it now and still can't really figure out how to extract just the URLs. I'm assuming it's somewhere in the "TextFX" menu?
     
  9. Sweetfunny

    Sweetfunny Jr. VIP Premium Member

    Joined:
    Jul 13, 2008
    Messages:
    1,747
    Likes Received:
    5,039
    Location:
    ScrapeBox v2.0
    You own ScrapeBox; try importing them into the harvester and it should strip all the other data and keep just the URLs. Then you can export the URLs as an RSS feed.
     
  10. dgfalk

    dgfalk Power Member

    Joined:
    Apr 26, 2010
    Messages:
    687
    Likes Received:
    94
    I tried it in ScrapeBox; it didn't work.