Let's say that you were in charge of the Google algorithm that ranks sites: what would it look for? Maybe I am strange and just like to think about these things and how they may affect the way I plan my sites for a semi-sustainable future. The current SEO approach of writing X pages of keyword-optimized content, each Y words long and each with Z backlinks from high-PR sources, can't be the best way of doing things. There must be newer, more interesting metrics that will eventually make much of today's SEO irrelevant, though I am by no means preaching that SEO is dead.

Note: I don't work for Google or any other search engine provider.

Hi Google

In general, users want:

1 - High quality content (writing or multimedia)
2 - Correct content
3 - Information that is up to date
4 - Ease of use

So let's look at some talking points for how you might go about judging each of these.

1) High Quality Content

- Word count. You don't necessarily need 500 words to give the best answer in the world on a subject. Could you look at what the expected word count for a specific search string is? For example, if I search for "what is 1 + 1" then the best answer would be a page with a word count below 50. However, if I search for "What is every word in the Declaration of Independence?" then the expected word count would be significantly higher.
- Multimedia. Pictures, video and sound are good, but not in all circumstances. We need to identify which search strings should have multimedia and which should not. If I need an answer to a basic mathematical question then I don't need an image or video to go along with it.
- Automated spelling/grammar checks. If these aren't in place already, then they should be. Sure, we may miss out on the best-written paper ever because the author had loads of typos, but I can live with that.
- Language/location. To rule out the need for everyone to speak English as a first language, every language should be judged equally, though once again this will depend on the particular subject being searched for. If I want to find out the history of a small town in Japan, then someone writing in Japanese is likely to have higher quality information than someone writing in English.
- Industry-specific terminology (though you also need to cater for members of the general public who are new to the subject, so you can't rate only content that is full of jargon).

2) Correct Content

- You can be the best writer in the world, but if you are sending out the wrong answers then your information is useless.
- At present a high PR should, in theory, tell us whether a site can be trusted. Today's PR10 sites are, as far as we know, reputable large companies (save the discussion on whether or not the US Govt is reputable, please), but we have all seen numerous PR5, 6 and 7 sites that are built purely for link building and contain crap that end users don't want to read.
- Do we need a heavier reliance on something like DMOZ, or on manual reviews by subject matter experts? Then again, this would eventually be abused too.
- Domain age.

3) The Information Must Be Up To Date

- That said, there are time-critical subjects and there are evergreens. An article on physics written 50 years ago may still have standing.
- Could an algorithm identify evergreen subjects and ignore the date? A toy sketch of one possible approach follows this list.
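To make that last point concrete, here's a minimal sketch of how a freshness penalty could be scaled by how evergreen a query looks. To be clear, this is nothing but an assumption on my part: the hint lists, the weights and the penalty formula are all invented for illustration, and a real engine would presumably learn these from data rather than hard-code them.

import re
import time

# Toy heuristic: guess whether a query is evergreen (the answer doesn't
# change over time) or time-sensitive, and scale a freshness penalty
# accordingly. The hint lists below are made-up assumptions.
TIME_SENSITIVE_HINTS = {"news", "latest", "today", "price", "score", "weather"}
EVERGREEN_HINTS = {"history", "definition", "theorem", "recipe", "how", "why"}

def freshness_weight(query):
    """How much a page's age should matter for this query, from 0.0 to 1.0."""
    tokens = set(re.findall(r"[a-z0-9]+", query.lower()))
    if tokens & TIME_SENSITIVE_HINTS:
        return 1.0   # age matters a lot
    if tokens & EVERGREEN_HINTS:
        return 0.1   # a 50-year-old physics article can still stand
    return 0.5       # unsure, so apply a moderate weight

def age_multiplier(published_ts, query):
    """Score multiplier in (0, 1]; old pages only suffer on time-sensitive queries."""
    age_years = (time.time() - published_ts) / (365.25 * 24 * 3600)
    return 1.0 / (1.0 + freshness_weight(query) * age_years)

fifty_years_ago = time.time() - 50 * 365.25 * 24 * 3600
print(age_multiplier(fifty_years_ago, "history of general relativity"))  # ~0.17
print(age_multiplier(fifty_years_ago, "latest football score"))          # ~0.02

The same 50-year-old page keeps most of its score on the evergreen query and is all but disqualified on the time-sensitive one.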
4) Ease Of Use

- Site speed, though you can't penalize those with quality content.
- Easy to browse, though big sites still want enough levels that specific information can be found.
- Time spent on site, though if the information is well written then the answer may take only 5 seconds to find. How do you distinguish between end users who find the information quickly (because they know how the site is laid out) and those who come to a site and leave quickly because it is rubbish? There's a toy sketch of one idea at the very end of this post.
- You could look at return visitors, but you need to think about how easily this could be replicated artificially.

What are your ideas for how we can see the Internet populated with more relevant information?
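Postscript, for anyone who wants the dwell-time idea spelled out: the short-visit problem looks more tractable if the engine can also see what the searcher does after leaving the page, i.e. whether they come back to the results and click a competing result ("pogo-sticking"). The class fields, the scores and the 30-second threshold below are all invented for illustration; this is a sketch of the idea, not anyone's actual implementation.

from dataclasses import dataclass

# Toy model, assuming the engine can observe whether a user returned to
# the results page and clicked a rival result. All thresholds invented.

@dataclass
class Visit:
    seconds_on_page: float
    returned_to_results: bool
    clicked_another_result: bool

def satisfaction_score(v):
    """Crude per-visit score in [0, 1]; higher means the page likely answered the query."""
    if not v.returned_to_results:
        return 1.0    # the search session ended here; fast or slow, they were done
    if v.clicked_another_result:
        # A short stay followed by trying the next result suggests the page failed.
        return 0.1 if v.seconds_on_page < 30 else 0.4
    return 0.7        # came back but didn't try a rival; a weak signal either way

# A 5-second visit that ends the search beats a 5-second visit that doesn't.
print(satisfaction_score(Visit(5, False, False)))  # 1.0
print(satisfaction_score(Visit(5, True, True)))    # 0.1

Under this assumption, the 5-second "found it" visit and the 5-second "this is rubbish" visit are no longer the same event.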