This article is a quick look at how search engine optimisation has changed over the years and how SEO “experts” and the search engines have played a game of cat and mouse. It is only a brief look at some of the ways better search engine rankings could be achieved and how the search engine algorithms were updated in response.
The keyword meta tag
In the beginning there weren’t very many websites and nobody regarded the internet as a way of making money. With the advent of search engines came keywords: the words you type into a search engine, and the words you want people to find your site with. For example, “cornwall web development” is a combination we like.
To make it easy for search engines, web pages have a hidden keyword field (the keywords meta tag) where you can put all the words and phrases you want to target that page with.
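To illustrate, the tag sits in the page’s head section, invisible to visitors (the phrases here are just examples):

```html
<head>
  <title>Cornwall Web Development</title>
  <!-- Hidden from visitors; once read by search engines, now widely ignored -->
  <meta name="keywords" content="cornwall web development, web design, seo">
</head>
```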
For a couple of years this worked well, and it would be nice if that were the end of the story. However, it is widely accepted that what you put in the keywords tag now makes absolutely no difference.
So why is this the case? It was simply too easy for a webmaster to put in whatever keywords they wanted, regardless of the quality of the rest of the page, and abuse became rife.
So what was the solution?
Keyword density and position – on page factors
As you’d imagine the obvious place for a search engine to find keywords would be in the text of your web page. So this is where they started to look.
Using this logic, if you wanted your web design site to do well you made sure “web design” appeared on your page (obviously!), or, even better, made it appear 150 times, or, if you were really cunning, made it the same colour as your background. Thus the era of keyword spamming was born!
So what was the search engines’ response? They decided there was an optimum keyword density: mention the keywords too often and you ranked lower; mention them too few times and you also ranked lower!
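Keyword density is just occurrences of the phrase as a share of the words on the page. A minimal sketch of the calculation (this is an illustration of the concept, not any search engine’s actual formula):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the page's words taken up by occurrences of the phrase."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    phrase = keyword.lower().split()
    n = len(phrase)
    # Count every position where the phrase appears word-for-word.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return hits * n / len(words)

page = "Web design in Cornwall. Our web design team builds sites."
print(keyword_density(page, "web design"))  # 2 hits x 2 words / 10 words = 0.4
```

A keyword spammer was simply pushing this number as high as possible; the algorithm change meant an unnaturally high value now counted against you.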
There were other improvements as well. Not only was the density of the keywords taken into account, but also where they appeared on the page and how they were formatted. It seems logical that keywords used in a title or in bold text are more important; it has even been suggested that making them red boosted their importance.
What this meant was that SEOs had to stuff their keywords into all the right places: near the beginning of the page, in the title tag, and in the right format.
For a while this worked, but eventually every other site on the web knew the secret and the search engines needed a new way to pick the most relevant sites. This is where Google extended its lead and search engine optimisation became hard again!
Link popularity and PageRank (PR) – off page factors
With hundreds of different sites competing for the same keywords using the same expertise, it became more a matter of how much money you could pay your SEO than whether you had the best information.
It was at this point that Google came up with the ingenious and seemingly foolproof idea of using a completely different factor: links, or more importantly, how many websites linked to yours.
It was a clever idea: every link to your website from another site was treated as a vote for your site’s quality. This tied in with Google’s “content is king” mantra, because the better your content, the more links your site would accumulate.
Overnight the results improved tenfold for the majority of sites, although, as with all updates, there were casualties.
Along with this new strategy came Google’s PageRank (PR), a value between 0 and 10 assigned to pages according to the quantity and quality of the links pointing to them.
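The core idea is that a page’s score is built from the scores of the pages linking to it, each divided by that linker’s number of outgoing links, so a vote from a heavily linked page is worth more. A toy sketch of the iteration (the damping factor of 0.85 matches the commonly cited value, the three-page graph is purely illustrative, and the real system runs at web scale before being mapped onto the public 0–10 toolbar value):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iterate PageRank for a dict mapping each page to the pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with rank spread evenly
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Each page q that links to p passes on a share of its own rank.
            incoming = sum(rank[q] / len(links[q])
                           for q in pages if p in links[q])
            new[p] = (1 - damping) / n + damping * incoming
        rank = new
    return rank

# 'a' and 'b' both link to 'c', so 'c' accumulates the most rank.
graph = {"a": ["c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # prints "c"
```

Notice that ‘b’ ends up with the lowest score: nothing links to it, so no one is voting for it. That asymmetry is exactly what made webmasters obsess over acquiring inbound links.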
For a while all seemed well. However, webmasters became obsessed with PR and how to get it. The solution was to run a reciprocal linking campaign and/or buy links from high PR sites.
For a few years link swapping became huge, with many sites having page upon page of links to their ‘link partners’. And it worked!
There was another factor Google implemented as well: anchor text, the actual words used in the link from the other website. In some cases this became more important than the content on the web page itself. Thus was born the Google Bomb, and no case illustrates it better than the “miserable failure” episode:
For quite some time, starting in 2003, if you typed “miserable failure” into Google the top result was George W Bush’s biography on the White House’s website. Whilst most of you are probably thinking that is quite appropriate, it is also true that the phrase appears nowhere on the page.
It was achieved by hundreds of pages linking to the biography with the anchor text “miserable failure”. As that is not a commonly used search phrase, it was relatively easy to get a high ranking for it.
This tactic was not lost on the serious marketers, who spent all their effort getting links containing their keywords by whatever method they could. Anybody who has run a website with public comments will be familiar with this kind of spam from pharmaceutical product vendors!
As always, the search engines had to stay one step ahead of the SEOs. Currently PageRank has far less influence than it used to: previously a PR5 website would always rank higher than a PR4 website, but now it could be either.