Google are very smart about their search engine. Everyone is trying to get to the top of the search results, and everyone claims to know how to do it, yet Google keep the algorithm behind it all about as secret as the formula for Coca-Cola. And why would they not - it is, after all, patented.
Very few organisations really know the whole truth, and no sooner do you think you know it than Google change the algorithm. They have to, both to deliver more meaningful results and to deal with all the “black hat” techniques used to fool the search engines.
This is not to say that we mere mortals cannot be successful at improving a website's ranking in the search results; it is just a persistent activity you have to keep working at, keeping abreast of the general principles of being successful.
I would say that not a month goes by without my hearing some claim about Google's current search engine algorithm. How much is truth and how much is conjecture I can't be certain, but there are, however, some general principles one can follow and still gain considerable success. How you implement the initiatives to deliver on these principles is another matter.
Understanding the basic principles of Google's ranking algorithm, however, will go a long way towards showing what you can do as a webmaster or website owner to improve your website's search engine ranking.
While reading CopyBlogger I came across an article
which discussed the components of Google’s search engine ranking algorithm – the basis of how Google decides on where and when to rank a certain webpage.
In this article I have also included two charts provided by SEOMoz, a great source of information on various SEO tactics.
So what is important? The answer (for today, but I will come back to that) is shown in the graphic below. So what does this all mean? Read on and you will understand what each component means.
I must say here and now that I do not know what testing or experimentation SEOMoz conducted to establish these results; however, for the most part they reflect what my understanding has always been, and they further corroborate the general opinions I hear from SEO “experts” whose opinions I respect.
So what does each of these sectors represent or mean?
1. Domain authority: This is by far the most important variable in the ranking algorithm. If you have a trusted domain with a lot of visitors, content pages and inbound links, you can actually get a junky, unimportant page to rank for very competitive terms. In essence you can beat many other low-authority websites this way, even if they have spent a lot more effort optimizing their pages for the search engines. It is for this reason that Wikipedia ranks between positions 1 and 3 for extremely competitive generic keyword phrases that other companies would spend thousands to rank for.
The lesson here is that Google loves long-term assets. If you can somehow turn your website into an authority domain, you can rank at the top of Google for very competitive phrases without a lot of effort. The problem, however, is the time and energy it takes to develop an authority domain in the first place.
2. Link popularity of the page: By this we mean the quality AND the quantity of inbound links to that page. You can have thousands of junky links to a page but they will never help in the long run. The idea here is to strike a balance between link popularity and link quality. Many successful SEOs can manipulate this factor (and the one mentioned below) to get ranked for competitive terms. New webmasters usually have a hard time attending to this point properly, either because they use too many junky links or because they do not vary their anchor text enough (among other mistakes).
3. External anchors to the page: This is the third most important factor. You can optimize the web page for as many terms as you want, but unless you get the right anchors from quality sources you will never rank for your keyword term. Why? Simply because Google does not know where to rank you unless anchored links are coming into your page.
4. On-page keyword usage: I must admit, this is not always easy: writing a page of content that makes heavy use of a series of keywords and keyphrases while still making sense to the reader. Interestingly, I have seen many webpages ranking for keywords which were not even mentioned on the page. As a guide, if you want to play it safe and cover all bases, aim for a keyword density of 1%-2.5% and limit the number of keywords you are optimising for on each page.
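To make the 1%-2.5% guide concrete, here is a rough sketch of how one might measure keyword density: occurrences of the phrase, weighted by the words it contains, as a share of the page's total word count. The tokenisation and the counting approach are my own simplifications for illustration, not anything Google has published.

```python
import re

def keyword_density(text, phrase):
    """Return the phrase's share of the page's word count, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Slide a window over the page and count exact phrase matches.
    hits = sum(
        1
        for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    # Each occurrence accounts for n of the page's words.
    return 100.0 * hits * n / len(words)

# A 100-word page mentioning "seo" once sits at 1% - the bottom of the range.
print(keyword_density("seo " + "word " * 99, "seo"))  # 1.0
```

A page scoring well above 2.5% on this kind of measure is the sort of thing that reads as keyword stuffing; below 1% the phrase may simply not register.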
5. Registration & hosting data: By this they mean the domain registration details. A domain which is registered for 10 years might be weighted more by Google, because spammers will almost never register a domain for more than a year. Also, .info domains might be devalued and find it harder to rank, because they are cheap to buy and have been exploited by spammers. With the price of domains these days, however, this has less significance than it used to, if any at all. By hosting data they presumably mean the bandwidth, the downtime of the website and any bottlenecks while accessing it. So in essence, there are many different factors to look out for.
6. Traffic & click-through data: There is a good argument that this component deserves more weight. I say that because I have seen many websites ranking between positions 2 and 5 for competitive keyword terms where the problem was that the information on them was not targeted, which led to visitors instantly hitting the back button in their browser. The result? Those websites now rank around #100-150. I feel that a horrible score on this factor can lead to a quick decrease in rankings even if all the other factors are adequately catered for. So make sure that you get good traffic and that your site has a low bounce rate.
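For clarity, bounce rate is simply the share of visits that viewed one page and left. A minimal sketch, assuming you can pull a per-visit pageview count out of your analytics package (the visit numbers below are invented for the example):

```python
def bounce_rate(pageviews_per_visit):
    """Percentage of visits that bounced (exactly one pageview)."""
    if not pageviews_per_visit:
        return 0.0
    bounces = sum(1 for pages in pageviews_per_visit if pages == 1)
    return 100.0 * bounces / len(pageviews_per_visit)

visits = [1, 3, 1, 5, 2, 1, 1, 4]  # pageviews recorded for eight visits
print(f"Bounce rate: {bounce_rate(visits):.1f}%")  # 4 of 8 bounced -> 50.0%
```

In practice you would read this figure straight out of Google Analytics rather than compute it yourself, but it is worth knowing exactly what the number represents when you set out to lower it.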
7. Social graph metrics: Google is slowly (but surely) incorporating social aspects into the ranking algorithm, although my personal opinion is that, with all the spam on the social sites, this may change. Until then, it means you need to start using Twitter, Facebook, LinkedIn and websites like Digg, Propeller etc. to gain traction for your content. Google knows when your content is liked by the masses and will give you a boost in rankings when that happens. The sad part is, I have only just started using this factor to my benefit, as it takes much more time and the results are a little uncertain.
If you want your webpage to rank high, you have to do much more than most people do. SEO takes time, but it can give you free traffic if you do it right.
Now, I did say at the beginning something about this being the algorithm for today, right? Take a look at the following chart, which plots the relevance of the above factors over time. It illustrates my earlier comment about Google changing the algorithm: whatever we do today and wherever we put our effort might be a good idea now, but tomorrow it may all prove to be a waste of time and effort. I will admit, however, that the chart is a little old, and today you would probably have to say that social metrics are on an ever-increasing curve (for today…).
Written by: Greg Tomkins