I have read many times that more than 50 links per RSS feed is too many, and that you should keep it at or below that. Now, my ScrapeBox blasts run me over 10k links, so if I uploaded them to links2rss and converted them in batches of 50, I'd end up with hundreds of feeds that I'd have to download one by one. Obviously that isn't realistic, so I have just been making RSS feeds of 1k links at a time. Will this significantly hinder the crawling? My thinking is that even if it isn't ideal, it's better than nothing.

For those of you who are turning huge lists into RSS, how do you get them converted conveniently? I tried the ScrapeBox RSS feature, but while it was checking every site it kept 404ing me after a few minutes, or throwing other errors. As a side note, what exactly is it doing when it contacts every single individual link? Other .xml creators don't have to do that. So what is the best way to do this?
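In case it's useful to anyone, here is a rough sketch of doing the 1k-per-feed split locally in Python instead of round-tripping through links2rss. It just chunks a URL list and wraps each chunk in minimal RSS 2.0, without contacting any of the links; the file names, feed title, and chunk size are all placeholders you'd change for your own setup:

```python
# Sketch: split a big link list into RSS 2.0 feeds of N items each.
# Feed title/description and output file names are just placeholders.
from xml.sax.saxutils import escape

def links_to_rss(urls, chunk_size=1000):
    """Return a list of RSS 2.0 feed strings, each holding at most chunk_size links."""
    feeds = []
    for start in range(0, len(urls), chunk_size):
        items = "".join(
            "<item><title>{0}</title><link>{0}</link></item>".format(escape(u))
            for u in urls[start:start + chunk_size]
        )
        feeds.append(
            '<?xml version="1.0" encoding="UTF-8"?>'
            '<rss version="2.0"><channel>'
            '<title>links</title>'
            '<link>http://example.com</link>'
            '<description>link feed</description>'
            + items +
            '</channel></rss>'
        )
    return feeds

# Example: 10,000 scraped links -> 10 feed files of 1,000 items each
urls = ["http://example.com/page%d" % i for i in range(10000)]
for i, feed in enumerate(links_to_rss(urls, 1000)):
    with open("feed_%03d.xml" % (i + 1), "w") as f:
        f.write(feed)
```

No idea if 1k items per feed crawls as well as 50, but at least this way generating the files takes seconds instead of downloading hundreds of them by hand.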