Saturday, November 15, 2008

Search Engine Algorithms

The first technology to be described as a 'search engine' dates back to 1990. It was named Archie and was created by Alan Emtage, a student at McGill University in Montreal. The commercial search engines we are familiar with today started to appear in the mid-1990s: Yahoo Directory in 1994, LookSmart and AltaVista in 1995, and Ask Jeeves in 1997. Google was launched in 1998.

It took website owners only a short time after the first appearance of commercial search engines to recognise the value of having their sites highly ranked in search engine results. The more visible the site, the more people clicked on it. Clicks turned into cash as the Internet became money-driven through advertising revenue, e-commerce and other commercial opportunities. Webmasters sought ways to make their sites appear at the top of search returns, and in so doing created what has since become the multi-million-dollar Search Engine Optimisation (SEO) industry.

Over the last decade, the SEO techniques used to secure top positions in search results have changed repeatedly in response to the search companies' evolving algorithms for determining which websites are relevant to any given query. This game of cat and mouse reveals the essential tension between the search engines and web marketers: the search companies work hard to stay ahead of ever-resourceful webmasters and SEO marketers, some of whom are less scrupulous and try to outwit the engines to achieve the highest possible rankings. In the early days, shrewd deployment of descriptive file names, page titles and meta descriptions proved to be effective SEO techniques. As these were abused, search returns choked up with irrelevant pages and spam; the algorithms advanced to combat this, and on-page factors such as keyword density grew more important.
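Keyword density is simply the share of a page's words taken up by a given term. A minimal sketch of the calculation follows; the sample text and the notion of a 'spammy' level are invented for illustration only.

```python
# A minimal sketch of the keyword-density measure mentioned above:
# occurrences of a term divided by the total word count.

def keyword_density(text, keyword):
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Invented example of the kind of keyword-stuffed copy that
# once pushed pages up the rankings.
sample = "cheap flights cheap hotels cheap deals on cheap travel"
print(f"{keyword_density(sample, 'cheap'):.0%}")  # 44% - spammy territory
```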

Off-page factors such as PageRank (based upon the number and strength of inbound links) became increasingly important. As each of these search criteria was used and abused, the algorithms evolved as the search companies worked to stay one step ahead. These days the search engines use sophisticated, secret and complex algorithms incorporating numerous criteria to determine site ranking and generate relevant results.

Google co-founder Larry Page defines the 'perfect search engine' as something that 'understands exactly what you mean and gives you back exactly what you want' - an ambition shared by all major search engines. Simply put, the job of a search engine is to index the web and return the best quality, most relevant content possible. Returns that fulfil the user's search intent provide the ultimate query/return match. While each search engine is different, they all share the same goal: to be the best.

In order to preserve the ideal of relevance, the search engine algorithms are very closely guarded secrets. These algorithms are the search companies' primary differentiator: the means by which they claim competitive advantage over each other, thriving or failing as organisations on their ability to deliver what users want, namely relevance.

If marketers knew the exact algorithms, they could manipulate rankings as they wished until search results became so irrelevant that the search engines - and the companies that provide them - became worthless. To combat this threat, the search companies strategically adjust their algorithms on a regular basis.

An SEO industry (some companies employing more legitimate methods than others) has evolved around trying to interpret the algorithms and present web information in the way the search engines dictate. Any organisation claiming to know the search engine algorithms is being dishonest. It is through experience, expertise and a holistic SEO approach that quality returns are generated, not through trying to rig the system or play tricks. At the professional level, effective SEO is not about quick fixes or smoke and mirrors. Any organisation serious about its search engine positioning should make a point of using a reputable SEO consultancy - like SEO Consult - for long-lasting and effective optimisation.

That Google appears to have won the hearts and minds of web users is, some say, down to its focus on relevance as the critical fact of Internet life - a focus sharper than that of companies who have compromised the integrity of their search returns for short-term financial gain. Google's philosophy states that if you focus on the user, all else will follow. The company claims that 'while many companies claim to put their customers first, few are able to resist the temptation to make small sacrifices to increase shareholder value'. It may have a point: Google drives the majority of web search traffic, handling approximately 70% of worldwide searches (75% in the UK), with over 7 billion searches having been conducted in the USA alone. WPP-owned research company Millward Brown reports that a combination of brand recognition and financial performance gave Google the top spot in a list of global brands, with new research estimating its value at $86bn (£43bn), a 30% year-on-year increase.

Here are two aspects of search ranking, increasingly important across all the search engines, each of which contains numerous signals that contribute to the overall algorithm.

PageRank: considers millions of variables and billions of terms to determine the importance of pages. Important pages receive a higher PageRank and are more likely to appear at the top of search results. Inbound links from sites of authority carry great weight, casting positive votes and giving recipient pages greater value.
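The voting idea can be sketched in a few lines of code. The following is a toy illustration of the published PageRank formula (the 0.85 damping factor comes from the original Brin/Page paper), not Google's production algorithm; the four-page link graph is invented for the example.

```python
# A minimal PageRank sketch using power iteration over a toy link graph.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}  # start from a uniform distribution

    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page with no outbound links: spread its rank evenly.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Each page passes its rank, diluted by its out-degree, to the
                # pages it links to - the 'positive vote' described above.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Toy web of four pages in which A attracts the most inbound links.
toy_web = {
    "A": ["B"],
    "B": ["A"],
    "C": ["A", "B"],
    "D": ["A"],
}
print(pagerank(toy_web))  # A ends up with the highest score
```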

Hypertext-Matching Analysis: search engines pay very close attention to page content. As well as scanning page-based text (which can be manipulated by site publishers through meta-tags), their technology analyses the full page content, including fonts, subdivisions and the exact location of each word. Neighbouring web pages are also scanned for content to ensure the results returned are contextual and the most relevant to a user's query.
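One way to picture this is a score that weights a query term by where it appears on the page. The field names, weights and sample page below are invented purely for illustration - the real signals and their weights are among the secrets discussed above.

```python
# An illustrative sketch of on-page scoring: the same word counts for more
# in emphasised placements (title, headings) than buried in body text.

FIELD_WEIGHTS = {"title": 3.0, "heading": 2.0, "body": 1.0}

def on_page_score(page_fields, query_term):
    """page_fields maps a field name (e.g. 'title') to its text content."""
    term = query_term.lower()
    score = 0.0
    for field, text in page_fields.items():
        occurrences = text.lower().split().count(term)
        score += occurrences * FIELD_WEIGHTS.get(field, 1.0)
    return score

page = {
    "title": "Search Engine Algorithms",
    "heading": "How PageRank works",
    "body": "Search engines rank pages by relevance to the query.",
}
print(on_page_score(page, "search"))  # title hit (3.0) + body hit (1.0) = 4.0
```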
