I'm working for a client who just launched two big websites. Both sites are platforms that were purchased from another company, so at the moment almost every page is an exact content duplicate of the other company's pages. All of the pages are blocked in robots.txt, so Google isn't indexing the duplicate content. The duplicate content is being rewritten, and as each page is rewritten, its entry is removed from robots.txt.

Will Google see this as new content being added to the website? Will this help keep the website 'fresh'? Or will it just see this as previously disallowed pages now being allowed? Any ideas?
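For reference, the setup looks roughly like this (the paths are hypothetical, just to illustrate the pattern):

    User-agent: *
    # Every inherited page starts out blocked:
    Disallow: /products/widget-a/
    Disallow: /products/widget-b/
    Disallow: /about/
    # Once /products/widget-a/ is rewritten, its Disallow
    # line is deleted so crawlers can fetch it again.

So over time the robots.txt shrinks, one Disallow line at a time, as pages are rewritten and unblocked.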