So I bought an expired domain for a new niche site. The name is relevant, and the domain has a few links from highly relevant websites. The backlink profile came up clean in Majestic, Ahrefs, and Moz, and everything looked clean in the Wayback Machine and other snapshot sites. I set up a one-page WP site and pinged Google and Bing, just to make sure it got indexed.

A day later I decided to check my Apache access logs. I use cloud VPS hosting, which means more messing around in Linux, but also lots of control, including access to all possible logs. I noticed that as soon as Bingbot came to my site, it started trying to fetch suspicious inner-page URLs containing stuff like "Rolex", "Nike shoes" and whatnot. To make sure it wasn't some form of referrer spam, I checked the IP, and the requests really did come from the Microsoft network. I thought this might be some Bing quirk, so I decided to wait. Unfortunately, the next day Googlebot started crawling the same URLs.

Presumably the domain was used as an affiliate site at some point and actually has a spammy backlink profile, just very well hidden. If I hadn't gone into the server access logs, I wouldn't even have known about this. The new website did get indexed quickly by both Google and Bing, though.

Question: given that this is meant to be a money site and that it did get indexed quickly, what's the best way to proceed?

a) Forget about it.
b) Hook it up to WMT and try to disavow the dodgy links.
c) Proceed with building the website.
d) Anything else?

Basically I just wanted to share this because it may save some people time and nerves. Before you start setting up a website on an expired domain, try to get it indexed and check your server logs (if you have access to them) to see which pages are actually being crawled by the spiders.
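For anyone who wants to run the same check, here's a minimal sketch of pulling bot-requested URLs out of an Apache combined-format access log. The sample log lines, IPs, and filenames below are made up for illustration; on a real server you'd point the `awk` line at something like `/var/log/apache2/access.log` instead:

```shell
# Build a tiny sample access log in Apache combined format (illustrative data only).
cat > sample_access.log <<'EOF'
157.55.39.10 - - [01/May/2024:10:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
157.55.39.10 - - [01/May/2024:10:00:05 +0000] "GET /cheap-rolex-watches HTTP/1.1" 404 211 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
66.249.66.1 - - [01/May/2024:11:00:02 +0000] "GET /nike-shoes-outlet HTTP/1.1" 404 211 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.7 - - [01/May/2024:12:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"
EOF

# Split on double quotes: $1 holds the client IP (first word), $2 the request
# line, $6 the user agent. Print IP + URL for every Googlebot/Bingbot request.
awk -F'"' 'tolower($6) ~ /bingbot|googlebot/ {
    split($1, ip, " "); split($2, req, " ");
    print ip[1], req[2]
}' sample_access.log
```

To verify a crawler's identity the way I did, do a reverse DNS lookup on the IP (e.g. `host 157.55.39.10`) and check that it resolves to a `*.search.msn.com` (Bing) or `*.googlebot.com` / `*.google.com` (Google) hostname, then do a forward lookup on that hostname and confirm it maps back to the same IP. Both Google and Bing document this reverse-then-forward DNS check as the supported way to confirm their bots, since user-agent strings alone are trivially spoofed.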