Search Engine Optimisation (SEO) has proven to be one of the fastest growing and most important industries for global ecommerce, and for commerce in general. Compared to the history of, say, the automotive industry, the history of SEO is short. But set alongside the growth (or rather the shrinkage) of the microchip, it sits in the upper echelons of rapidly growing digital sectors. As search engines have taken hold and their use has become common, so has the desire to appear at the top of the rankings, and as they have developed, so have SEO techniques. Here is a rundown of the most important SEO factors as they have adapted over the last 16 years.
1993 – Netscape Navigator launched…
In the beginning… when Yahoo launched its Directory in 1994, the only way to perform any kind of SEO was through on-page activities. This included making sure the content was good and relevant, that there was enough text, that your HTML tags were accurate, and that you had internal and external links, amongst other factors.
1994 – Excite and Lycos launched.
Search Engine Submission
On-page SEO was quickly joined by search engine submission. Webmasters could manually submit their pages to the Yahoo Directory for indexing, so that they would be there for Yahoo to find when someone performed a search.
1995 – Internet Explorer, Infoseek, Yahoo’s Open Text partnership and AltaVista launch. AltaVista’s dominance brings about the beginning of the decline of search engine submission as an SEO factor.
It wasn’t until 1998, around the time Google and the Open Directory Project (DMOZ) were launched, that new factors came into play. PageRank, developed by Google co-founder Larry Page, is a link analysis algorithm which assigns a numerical weighting to a web page with a view to judging its value within the web, based on the number, quality and relevance of the links pointing to it. PageRank is essentially what made Google what it is today: it helped Google deliver the most relevant search results and gave users much better returns on their searches. SEOs could now work on this factor by improving the number and quality of incoming links, the keywords used, and their page titles and headlines, and see the results in the Google rankings.
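The core idea behind PageRank can be sketched in a few lines of Python. This is a simplified power-iteration version for illustration only; the three-page link graph is made up, and Google's production algorithm is far more elaborate.

```python
# A minimal sketch of the PageRank idea: each page's score is
# redistributed along its outgoing links, softened by a damping factor.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal scores
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share        # each link passes on a share
        rank = new_rank
    return rank

# Invented example graph: C is linked to by both A and B,
# so it ends up with the highest score.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
scores = pagerank(graph)
print(max(scores, key=scores.get))  # prints "C"
```

Note how the score a page passes on is split between its outgoing links, which is why a link from a page with few links is worth more than one from a page with hundreds.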
1998 – MSN makes a debut in search with Inktomi. Internet Explorer claims victory over Netscape, which begins a rapid decline in popularity. DMOZ, from its launch, becomes a key place for SEOs to get their pages listed.
At the same time as PageRank became a factor, anchor text also became a key consideration for SEOs. Anchor text is the hyperlinked text which, when clicked, takes the reader to another web page. Search engine algorithms value this factor highly because hyperlinked text is usually relevant to the page the link points to. This is why, when someone links to your site, it is important that the anchor text they use is relevant to the context of your site: in the eyes of the search engines, each such link acts like a vote for your site.
1999 – Altavista loses so much ground to Google that it changes tack then fades from the limelight. November sees the first ever SEO Conference: SEO Strategies 1999, moderated by the now editor-in-chief at Search Engine Land.
2000 – The Google Toolbar becomes available, allowing SEOs to see their PageRank and ushering in a flood of unsolicited link exchange request emails. Google AdWords also launches this year.
2002 – AdWords re-launches with a much-improved model, becoming the most popular paid search platform.
Late 2002 saw the emergence of domain authority as a viable SEO factor. Domain authority is essentially how well known and respected your site is around the web, and judging it for the most part comes down to one thing: time. The longer a site is active online, the more authority it will gain. Over time a website should grow in size with new relevant content, steadily increase the number of valuable links to and from it, and avoid any negative SEO tactics. Sites that do this naturally come to be seen as authorities, much like the websites of big telecommunications companies, large media companies, government agencies, social media domains and universities. Just as these organisations have authority in ‘real life’, as sources of information and places many people frequent and interact with, authority websites grow their reputation in the same way.
With this new factor, SEOs were discouraged from spammy tactics and instead encouraged to grow the value of their site through content and links from authority websites, particularly since a good link from a strong authority domain rubs some of that authority off on to your own.
2003 – Google changes its algorithm to combat web spam.
2005 – The nofollow tag is created as a means to combat spam; SEOs can use it for ‘PageRank sculpting’. Google also unleashes two major updates this year: ‘Jagger’, which helped diminish the flood of unsolicited link exchanges and heralded a decline in the importance of anchor text as a factor, due to its corruptibility; and ‘Big Daddy’, which improved Google’s architecture to allow a better understanding of the worth of a link and the relationship between the sites it connects.
Unsurprisingly, Google’s update came around the time that Link Context became a valuable factor for SEOs. Link context is the text around a hyperlink, or even the terms that appear around the hyperlink within a webpage. Crawlers use link context to judge how valuable the hyperlink is, based on the text around it. So for example, if the hyperlink has absolutely no relation to the text surrounding it, the crawlers will value this link less. Crawlers can also make their predictions based on the entire parent page, while a combination of the two works best. This advancement encouraged SEOs to improve the content of the website as a whole so that their links would be valued more highly.
2006 – This year saw XML sitemaps garner universal support from the search engines. An XML sitemap allows webmasters to show the search engines every URL on their website that is available for crawling. It contains not only a list of URLs but a range of further information, such as when each page was last modified and how often it changes, which allows search engines to crawl more intelligently.
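A minimal sitemap can be generated with Python's standard XML library. The URLs and dates below are placeholders; `loc`, `lastmod` and `changefreq` are standard fields from the sitemaps.org protocol.

```python
import xml.etree.ElementTree as ET

# A minimal sitemap using the standard sitemaps.org namespace.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)

for loc, lastmod, freq in [
    ("https://example.com/", "2009-06-01", "daily"),
    ("https://example.com/about", "2009-01-15", "monthly"),
]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc            # the page's URL
    ET.SubElement(url, "lastmod").text = lastmod    # last modification date
    ET.SubElement(url, "changefreq").text = freq    # expected change frequency

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```

The resulting file is typically saved as `sitemap.xml` at the site root and its location submitted to the search engines.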
The most recent factor to become a tool for SEOs is user signals, once again initiated by a Google update: ‘Vince’, also known as ‘the Brand Update’. It was all about minimising the amount of time people need to spend searching in order to find what they want, creating a way of judging the satisfaction users feel as they search and learning from those results to work steadily towards perfect results.
For SEOs this means that if a user comes to your site and stays there, Google sees it as a plus in the satisfaction column, but if they arrive and immediately ‘bounce’ away, that is a minus. Similarly, if your visitor stays and gets what they want, whether that is information or a completed transaction, that is another plus, while if they leave empty-handed and need to go back and search again, that is a negative. A tally of these results becomes a percentage by which the search engines can judge your site. The incorporation of this factor has given SEOs yet more incentive to improve their sites and provide valuable content.
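The tally-to-percentage idea described above can be sketched as follows. To be clear, Google's actual signals and weightings are not public; both the scoring rule and the visit data here are invented purely to illustrate the plus/minus tally.

```python
# Hypothetical visit log: (stayed_on_site, completed_goal) per visitor.
# The scoring is purely illustrative -- real ranking signals are not public.
visits = [
    (True, True),    # stayed and got what they wanted: satisfied
    (True, False),   # stayed but left empty-handed
    (False, False),  # bounced straight back to the results page
    (True, True),
]

satisfied = sum(1 for stayed, goal in visits if stayed and goal)
satisfaction_rate = 100 * satisfied / len(visits)
print(f"{satisfaction_rate:.0f}% satisfied")  # prints "50% satisfied"
```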
2009 – Shortly after this update Google releases another, called ‘Caffeine’, which improved the speed of its indexing. Also this year, Microsoft Live Search becomes Bing, which later becomes the search results provider for that old dog Yahoo.
So search engine optimisation has come a long way, hasn’t it? And it’s only getting bigger. Now we have many factors, all worthy of your attention and all active at the same time. This is of course a sign of the importance of the industry, and of the amount of work the search engines have done to ensure that their users get the best results possible. Their work has rubbed off on the quality of websites and the internet in general, having fought the good fight to help eradicate spam and unsolicited link exchange requests, to name but two of their endeavours. Every step along the way has been filled with exciting turns as new search engines are born, old search engines die, new algorithms are unleashed and SEO conferences become regular occurrences. And remember, this is only the beginning.
What are the latest Factors to start considering?
Site Speed – Faster sites will be seen as better sites.
Social Media – It can be a valuable way to drive traffic in your site’s direction, and the search engines increasingly view YouTube video posts, a Facebook presence and tweeting as positive marks for your site. Being linked to from social media sites will of course also help you with some ‘link equity’. While many social media sites such as Twitter apply a nofollow rule to links, they are at the very least an arrow pointing in your direction from a social hub.
Personalised Search – A challenge for SEOs as search engines begin to provide those users that are ‘signed in’ to their services, more personalised results based on their profiles and search history.
Real Time Search – Twitter manages to produce practically instant results for things that are happening right now, while the search engines tend to take quite a bit of time to catch up. While Google and Bing have done deals with Twitter for now, real time search could be the next thing for them to tackle.
Universal Search and Digital Asset Optimisation – This is the optimisation of images, videos, podcasts, blogs, tweets and so on for search results. For example: it is now possible to get more traffic through a video you post on YouTube than from your actual website, meaning that optimising all your digital assets is becoming more and more important.
It’s all so exciting…