Hi. I have a situation about which I would like to ask your opinion. What do you think about generating content to make a webpage pass a duplicate content filter?

Here is an example. I have a website with hundreds of auto-generated pages. They all use the same template and the same content; only one word differs (and that word repeats a few times). For example:

Best seo service in France
Best seo service in US
Best seo service in China
Best seo service in India

So only the name of the country changes; everything else stays the same. To avoid duplicate content, I was thinking of auto-generating 300-400 words that would be placed in a <p> tag, but in a very small font, so they appear on the webpage while the visitor is unable to actually read them. Each webpage would then have its content split into two parts:

1. One part that is readable and delivers some useful information.
2. One part that is only auto-generated random words (related to the niche), not readable or visible, but that would still count in a content analysis.

Is there any way that G will detect this and penalize the website? My opinion is that this will work without any problem, but I would also like to hear your opinion. Thank you in advance.
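To make the idea concrete, here is a minimal sketch of the markup I have in mind. The class names and the exact font size are just placeholders, not something I have settled on:

    <!-- Visible part: readable, useful information for the visitor -->
    <p class="main-copy">
      Best seo service in France - readable information about this country
      goes here.
    </p>

    <!-- Hypothetical hidden filler: 300-400 auto-generated, niche-related
         words, rendered so small that visitors cannot realistically read
         them, but still present in the page's text content -->
    <p class="filler" style="font-size:1px;">
      ... auto-generated words related to the niche ...
    </p>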
Thank you for your responses, but I think some of you didn't understand exactly what my problem is. I'm asking whether it would be OK to use the 300-400 auto-generated words to stay under the duplicate content filter. Thank you.
If those auto-generated words are not spam but related content, they could help get your site indexed.