GoldenGlovez
Senior Member
- Mar 23, 2011
Does your Scrapebox crash while using fast poster? This can help!
I use Scrapebox A LOT; each day I harvest and post millions of URLs. Over the last few months, I've noticed an alarming increase in random crashes while using the fast poster to post large lists. Every time it happens, I have to reload Scrapebox, re-import my harvest, and clean it up again from the crash dump file. After getting tired of repeating this process 2-3 times per list, I started looking into the cause. What I found is that the crash occurs whenever fast poster comes across a very long URL string (1000+ characters in length).
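If you want to confirm the same thing is happening with your own lists, a quick Perl one-liner (run from a command prompt once Perl is installed, see step 1 below; 'harvest_to_clean.txt' just stands in for whatever your exported list is called) will count how many lines are over the 1000 character mark:
Code:
perl -ne "++$n if length($_) > 1000; END { print +($n || 0), qq( overly long lines\n); }" harvest_to_clean.txt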
So I began looking for ways to remove these long strings from my harvest before posting. After much searching, slow results, and failed attempts, I was about to give up. Then a friend (and moderator) here on BHW, Apekillape, sent me just the bit of info I needed to get the job done quickly and reliably.
If this problem affects you, here is the breakdown of what you need to do:
1. First, we need to install Perl on our machine. A free and lightweight option for Windows is 'Strawberry Perl'. You can find the download links on the main website here:
http://www.strawberryperl.com/
2. Once that is installed, we need to create the script for Perl to run against our lists. Create a new text document and paste in the following:
Code:
# Read URLs from STDIN and truncate each one to 500 characters.
while ( my $lRow = <STDIN> ) {
    chomp($lRow);                     # remove the trailing newline
    $lRow = substr($lRow, 0, 500);    # keep only the first 500 characters
    print "$lRow\n";
}
3. Save this new file as 'urltrim.pl'. (If you're using Notepad, make sure to select 'All Files' as the file type before saving; Notepad++ is recommended.)
4. Put the new 'urltrim.pl' file in the same folder/directory as the text files you would like to clean.
5. Open a command prompt (either Start > Run, or press Windows Key + R, then type CMD and press Enter).
6. Finally, CD to the directory containing your harvested URLs and the Perl script, then run this command:
Code:
type harvest_to_clean.txt | perl urltrim.pl > new_clean_harvest.txt
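(Side note: piping with 'type' works fine, but you can also let Perl read the file directly through standard input redirection; both forms should produce exactly the same output:)
Code:
perl urltrim.pl < harvest_to_clean.txt > new_clean_harvest.txt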
Within seconds, Perl will output a new, cleaned URL list named 'new_clean_harvest.txt'. You have now successfully trimmed any URLs over 500 characters in length, and the new clean file should run through fast poster without crashes.
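One more note: the script above trims long URLs down to 500 characters, which keeps them in the list but leaves them broken as links. If you'd rather throw anything over the limit away entirely, here's a small variation of the same script that skips long lines instead of trimming them (save it under a different name, e.g. 'urldrop.pl'; that filename is just my example):
Code:
# Drop any URL longer than 500 characters instead of truncating it.
while ( my $lRow = <STDIN> ) {
    chomp($lRow);
    next if length($lRow) > 500;    # skip overly long URLs entirely
    print "$lRow\n";
}
Run it exactly the same way, just swap 'urltrim.pl' for 'urldrop.pl' in the command.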
Hope this helps!
Regards,
GoldenGlovez