The Process Of Search Engine Optimization
Search engine optimization is the process of using natural (organic) search methods to take advantage of search engine algorithms. The goal is for the web site in question to appear earlier on the page of results returned to the person making a query; sites that appear earlier on the results pages are said to have a higher rank. Specialists in revising web pages to achieve a higher rank usually work in one of two ways. The first group is known as 'white hat' optimizers, while the second group uses 'black hat' methods. These labels suggest that white hat methods are always good, while black hat methods are shady and perhaps even illegal. In reality, this overstates the characteristics of both sides.
Both groups attempt to determine, from anecdotal evidence, what the secret algorithms of the major search engines are. Interestingly, the number one search engine, Google, uses a different premise than the next two most popular search engines, MSN and Yahoo. Google's PageRank process assumes that the web page in a certain grouping or subject that receives the most links from other pages on the same subject is the one recognized as the 'authority' on that subject. Yahoo and MSN are more inclined to look at how effectively, and how often, certain search terms, known as 'keywords', are used in a particular site. From that starting point, the algorithms each search engine uses diverge.
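The link-based 'authority' idea behind PageRank can be illustrated with a small sketch. Google's actual algorithm is not public; this is only a simplified version of the published intuition, and the tiny link graph (page names included) is invented for demonstration.

```python
# Simplified PageRank illustration: a page's score grows with the
# links it receives from other pages, weighted by the linking
# pages' own scores. Real search engines use far more signals.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                # Dangling page: spread its rank evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical graph: "authority.html" is linked to by both other
# pages on the subject, so it ends up with the highest rank.
graph = {
    "authority.html": [],
    "pageA.html": ["authority.html"],
    "pageB.html": ["authority.html", "pageA.html"],
}
ranks = pagerank(graph)
```

Running this, the page that receives the most incoming links ends up with the highest score, which is the sense in which it becomes the 'authority' on the subject.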
For example, one search engine may look at how early in the page the keywords appear. Google, in theory, looks at the value of the incoming links to a web site: a page with good incoming links and few outgoing links will rank higher than one with many incoming links and many outgoing links. In practice, then, both the white hats and the black hats are after the same goal. The differences between the two come down to the approach taken and how effective it is with the search engine spiders that 'crawl' web pages. A white hat optimizer will attempt to write web page content that reads well for humans as well as spiders, while using keywords and page links effectively to increase rankings. A black hat optimizer may be less concerned with how the page reads to humans and instead try to circumvent the intention of the search engine altogether. For example, until the practice was discovered and accounted for in the algorithms, link farms were used to sell links to web pages. Another favorite black hat technique was to use white-on-white text that could be read by the search engines but was invisible to humans. Again, as soon as this technique was identified by those who write the algorithms, it was blocked. Search engines sometimes take the drastic step of removing pages determined to be using black hat techniques from the results pages entirely.