For a while now, a company called Link Detox has been offering a tool that speeds up the rate at which Googlebot recrawls pages with bad links. It's part of their Boost package and costs a fortune. The reasoning behind it makes sense, and it works. Basically, you upload a disavow file with all your bad links to Google, wait 48 hours to make sure Google has processed it, and then upload the same list to the Boost software, which somehow gets Googlebot to go and recrawl those pages. These can be terrible profile links created by XRumer and the like on very low-quality pages that Google would normally not recrawl for a very long time.

Getting such a quick recrawl has immediate results with the Penguin webspam algorithm (not a manual penalty), since Penguin suppresses results based on bad links. If those links are now disavowed and taken into consideration as such, you can free yourself from the suppression quickly.

So does anyone have any idea how they are able to get Googlebot to crawl these links so quickly? It has to be more than just pings, but I'm pretty sure they are not blasting those bad links with more bad links. A tool, or an understanding of how to replicate that service, would be amazing.
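One guess at how a service like this might work: collect the URLs from the disavow file and publish them somewhere Googlebot is told to look, e.g. as a sitemap it's asked to fetch. This is pure speculation about the mechanism, not anything Link Detox has confirmed. Below is a minimal sketch in Python; the function names (`parse_disavow`, `build_sitemap`) and the sample disavow text are my own hypothetical examples. It only handles plain URL lines, since `domain:` directives don't point at a single page to recrawl.

```python
import xml.etree.ElementTree as ET

def parse_disavow(text):
    """Pull plain URL lines out of a Google-style disavow file.

    Skips blank lines, '#' comments, and 'domain:' directives,
    which have no single URL to request a recrawl for.
    """
    urls = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or line.lower().startswith("domain:"):
            continue
        urls.append(line)
    return urls

def build_sitemap(urls):
    """Wrap the URLs in a minimal sitemap.xml string."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = u
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical disavow file content for illustration only.
disavow = """# bad profile links
http://spammy-forum.example/profile.php?id=123
domain:low-quality.example
http://another-bad.example/page"""

print(build_sitemap(parse_disavow(disavow)))
```

You would then have to get Google to fetch that sitemap for URLs you don't own, which is exactly the part that's unclear and presumably where their secret sauce is, so treat this as a sketch of the plumbing, not the trick itself.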