Most websites I work on have internal search result pages blocked from bots via robots.txt, either through the internal search plugin (WP) or added manually (flat HTML). I've worked on a few sites that had a couple hundred indexed search results despite being maybe <300 pages total; I later blocked those results from being indexed out of fear of a spam penalty.

I'm now working on a site that uses a plugin built by an MLS company. The website itself is 1,500 pages, but a site: search returns over 86,000 indexed pages, all due to the internal MLS search. The company claims this is intentional, to make the website look larger to search engines. This is also a company that many MLS REALTORS® use to build their internal search.

I've never dealt with an issue this large. Should I block these 85,000 pages from bots? Will I see huge ranking drops when a website loses 90% of its indexed pages? And if I keep the pages indexed, what's my risk of a penalty?
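For reference, the kind of robots.txt rule I'd normally use to block internal search results looks something like this (the `/?s=` pattern is WordPress's default search query string; the `/search/` path is just illustrative, since the MLS plugin presumably uses its own URL pattern):

```
User-agent: *
# WordPress default search results
Disallow: /?s=
Disallow: /search/
```

My understanding is that robots.txt only stops crawling, not indexing, so pages already in the index might need a noindex directive instead before they actually drop out.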