What's the easiest way to clean up an XRumer list?

dgfalk

Power Member
Joined
Apr 26, 2010
Messages
687
Reaction score
95
I have a .txt list of all the sites my page was submitted to using XRumer. It's in the format of:

1. URL Result: invalid;used self-learning system data
2. URL Result: chosen nickname "XXX";captcha decoded
3. URL Result: used self-learning system data
etc

I want to be able to get just the URLs so I can create an RSS feed with them to get some of them indexed quickly. What is the best/fastest way to get rid of all the numbers and results so I'm left with just the URL?
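If you have Python handy, a short script can do this in one pass. This is a minimal sketch, assuming each log line has the form `1. http://example.com/page Result: ...` (the exact XRumer log layout may differ, so adjust the pattern to match your file):

```python
import re

def extract_urls(lines):
    """Strip the leading numbering and the trailing 'Result: ...' text,
    keeping only the URL from each XRumer log line."""
    urls = []
    for line in lines:
        # Match: optional spaces, a number and dot, the URL, then 'Result:'
        m = re.match(r"\s*\d+\.\s*(\S+)\s+Result:", line)
        if m:
            urls.append(m.group(1))
    return urls

# Hypothetical sample lines in the assumed format:
log = [
    '1. http://example.com/forum/thread-1 Result: invalid;used self-learning system data',
    '2. http://example.org/board/post-2 Result: chosen nickname "XXX";captcha decoded',
]
for url in extract_urls(log):
    print(url)
```

Read your .txt file line by line, feed the lines through `extract_urls`, and write the result back out to a clean URL list.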
 
Well, I found out how to get rid of the numbers by opening it up in Excel, but I can't quite figure out how to get rid of the "results" part. Suggestions?
 
I'm still having a hell of a time trying to figure this out. I'm sure someone out there knows how to do it, anyone?!?
 
I think you're looking for the same answer as what I just asked. I posted my thread and then yours came up.

It's amazing, the lack of support for this software.

I'm sure you found the part in the help file where it tells you what to do, but it didn't work, right? Are we missing the Filter.txt files, or are we supposed to make our own?

I hope we can get this figured out soon!
 
To both of you: you can get rid of that junk text and get a plain list of posted URLs using Search and Replace in Notepad++.

But I would strictly recommend that you don't submit those URLs to an RSS feed, as this can severely affect your current rankings and may get your site de-indexed quickly... Doing an XRumer blast is safer as long as you don't ping or feed those posted URLs...
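For the Notepad++ step: open Search > Replace, tick "Regular expression" under Search Mode, and run two replacements. This assumes each line looks like `1. http://example.com/page Result: ...`; tweak the patterns if your log differs:

```
Find what:    ^\d+\.\s*
Replace with: (leave empty)

Find what:    \s*Result:.*$
Replace with: (leave empty)
```

The first pass strips the leading number and dot, the second strips everything from "Result:" to the end of the line, leaving just the URL on each line.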
 

Can you explain a little more on how to do that with Notepad? I don't even think I have ++, I'll have to look.

Also, I'm not going to ping all of them at once. I had a guy do a small 500-site blast for me, and I was just going to ping 10 or so a day to speed things up a little, but not too much.
 
That's really funny! Do you really not know what Notepad++ is? It's an advanced text editor similar to the Notepad that comes with Windows, but with tons of handy features. It's also an open-source project!

Download it here...

Code:
http://sourceforge.net/projects/notepad-plus/

Let me know if you need any assistance...
 
Thanks for the link, just got it and it seems like a cool little program. I'm playing with it now and still can't figure out how to extract just the URLs. I'm assuming it's somewhere in the "TextFX" menu?
 
You own ScrapeBox, so try importing them into the harvester; it should strip all the other data and keep just the URLs. Then you can export the URLs as an RSS feed.

I tried it in ScrapeBox; it didn't work.
 