Google does not give extra value to small websites; you need small pages, and plenty of them. But just creating small pages will not do the job, so I tried something on one of my clients' sites.

The site had only 84 pages indexed in Google, which makes it look small, but it is actually a large website: most of its pages were simply not indexed. I could not go through the pages one by one and get them indexed, so here is what I did.

First, I scraped all of the site's pages using software called "Parameter", which gave me 5600 URLs. I had to filter the list manually, opening each URL in Firefox to see whether it really existed (the "copyallurls" add-on for Firefox helped here). After checking, I had a set of 2300 URLs that really existed but were not indexed.

Next, I made an RSS feed containing all of those URLs and then submitted the feed link to lots of sites using an RSS submitter. Within a week, the number of indexed pages on my client's site went from 84 to 1800.

Doing this earned the site more authority with Google. I thought it would just bring good rankings, and I was right: I got lots of traffic from the search engines, without even knowing how many pages were ranking or for how many keywords. The client was very happy with me.

Try this out, guys; you won't regret it. It is a very good way to bring visibility to your site without much work.
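I checked the URLs by hand in the browser, but that step can be scripted. Here is a minimal sketch, using only Python's standard library, of how you might test whether each scraped URL actually resolves; the function names (`url_exists`, `filter_live`) are my own, not part of any tool mentioned above:

```python
import urllib.request
import urllib.error

def url_exists(url, timeout=5):
    """Return True if the server answers the URL without an error status."""
    try:
        # HEAD keeps the check cheap: headers only, no page body.
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except (urllib.error.URLError, ValueError):
        # 4xx/5xx responses, DNS failures, and malformed URLs all count as dead.
        return False

def filter_live(urls):
    """Keep only the URLs that actually resolve."""
    return [u for u in urls if url_exists(u)]
```

With a list of 5600 scraped URLs, `filter_live(urls)` would give you the subset that really exists, replacing the one-by-one browser check. Note that some servers reject HEAD requests, so a fallback to GET may be needed in practice.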
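The RSS feed itself is just an XML file listing the URLs. A minimal sketch of generating an RSS 2.0 feed from the list of live URLs could look like this; the function name `build_rss` and the channel title/description are placeholders I made up for illustration:

```python
from xml.sax.saxutils import escape

def build_rss(urls, title="Site URL feed", link="https://example.com"):
    """Build a minimal RSS 2.0 document with one <item> per URL."""
    items = "\n".join(
        f"    <item><title>{escape(u)}</title><link>{escape(u)}</link></item>"
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<rss version="2.0">\n'
        '  <channel>\n'
        f'    <title>{escape(title)}</title>\n'
        f'    <link>{escape(link)}</link>\n'
        '    <description>All site URLs for search engine discovery</description>\n'
        f'{items}\n'
        '  </channel>\n'
        '</rss>\n'
    )
```

Save the output to a file, host it on the site, and that hosted feed URL is what you would submit to the RSS directories.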