Google was founded to help internet users find the information they want and need. The company's mission is to "organize the world's information and make it universally accessible and useful." To accomplish this goal, Google constantly reviews the results it returns for searches to make sure they stay relevant to what users are actually looking for.
Search results are machine-driven, generated by algorithms, but the reviews are performed by people collectively called Google Search Quality Raters, who work from detailed guidelines provided by Google. Now, for the first time, Google has released the full document containing its Search Quality Rating Guidelines, a 160-page PDF, to the general public. Partial versions of the guidelines have been released in the past, but this is reportedly the first time the complete, in-depth guidelines have been made public.
How To Use The Google Search Quality Rating Guidelines
The rating guidelines aren't a step-by-step guide, but they do give valuable insight into what the raters are looking for. The document also provides many concrete examples, so you can review your own website and gauge where it might fall on the rating scale.
In a nutshell, good ratings come down to providing high-quality content to your target audience. Once you are doing that, make sure Google and other search engines 'know' that your site offers that high-quality content by having all of your SEO ducks in a row.
How Does Google Decide Which Websites to Include in Search Results?
In order to return the best results for any given search, Google continually updates the algorithms it uses to scour the web for the best information related to that search. With an estimated 4.73 billion indexed webpages on the internet (as of Nov. 13, 2015), this can be a daunting task. So how does Google know when its algorithms are working as expected? That's where the human quality raters come in! They rate the quality of the websites returned for specific search queries, which helps Google engineers know whether their algorithm updates and tests are hitting the mark.
But why does anything have to change at all? Search engine companies often update their algorithms for several reasons, including:
- The amount of information on the World Wide Web grows exponentially every day, so what is relevant today may not be tomorrow because better information could be made available at any time.
- Users get older, move to new locations, go through various life stages, or simply change their minds. What someone finds useful now may not be what they find useful in the future.
- Less-than-honest website owners and developers try to "fool" search engines by making their sites seem relevant for certain search terms, when in fact their content has nothing to do with the user's intent. This is what's known as "black-hat SEO," and search engine companies like Google work to prevent these tactics from succeeding in order to give users the best experience possible.
In keeping with this tradition of continual improvement, the latest Google updates could go live before the end of this year. These updates will build on the most recent large-scale update, known as "Penguin," and are expected to deal with spammy links more quickly than in the past.