
The Definition of Search Engine Optimization (SEO)

Mar 29, 2010 | SEO Guide

What is Search Engine Optimization?

Few internet users are fully aware of everything that goes into the results page they see after performing a search. The complex mechanisms that govern what they find when looking for information or products are the result of two activities. The first belongs to the search engines themselves: Google and its competitors analyze each website, extracting key elements used to assign it a rank. Rank can mean the difference between popularity and obscurity, since low-ranked sites do not appear on the first page of results and therefore attract fewer visitors and fewer customers. From these search engine activities, the second factor developed: SEO, or search engine optimization. This is the practice of designing a website so that it appears among the top returned search results, preferably on the first page, when Google or another search engine is used to find information. It is achieved through techniques such as the careful placement of words and key phrases, hyperlinks, meta tags, images, and HTML and other forms of coding.

Search Engines

Google and Yahoo are the two best-known search engines, now so much a part of popular culture that “to Google” a person or product has become a commonplace expression. Estimates of how many searches are performed every day vary, but with the rise of internet availability and use, the figures run well into the billions. When a search is performed, the engine takes the information the user provides, compares it with its own data, and returns results showing the sites most likely to contain useful information, matching some or all of the words entered in the search box. That data is collected using spiders, highly complex automated programs that read the information on a website. Through a process described as crawling, a spider downloads the site onto the search engine’s own servers, where the relevant information is cataloged by another program called an indexer. These programs work constantly, updating search results with new data and starting again as soon as one cycle of scanning is finished. Search engines then rate the information they gather using algorithms whose details are closely guarded secrets within the industry.

SEO Consultants

A huge industry of SEO advisors and providers has developed to help companies gain web presence, a term meaning that their sites are search engine friendly and are consistently found among the top results of returned searches. These consultants may be involved from the design stage of a website, or they may take an existing page and rewrite it from scratch, presenting the company’s content in a way that earns higher ratings. It is a huge market, fraught with scams and other questionable techniques (discussed below), serving clients who often do not understand the mechanisms involved. SEO webmasters submit the URL (web address) of the site they have optimized to the search engines so that a spider can be sent out to analyze it. SEO experts also study the patents that search engines file when registering their specially designed algorithms, looking for hints about how they operate, in a drive to give their own techniques, or companies, an advantage in a very competitive business.

A History of SEO

The technique of SEO first appeared more than ten years ago. Danny Sullivan, the editor-in-chief of Search Engine Land, traced its origins in his blog, which comments on news and information about search engines. By 1995, SEO was a known means of getting high placement in search engine results, and the race to produce the best results for the clients of SEO firms had begun.

The all-important programs feeding data to the search engines, the algorithms, have a history of their own. The first famous algorithm appeared in a program called BackRub, developed by two Stanford University graduate students, Larry Page and Sergey Brin. This program eventually became part of Google’s link analysis system, PageRank, named after Larry Page and used by the Google search engine. PageRank gives a numerical value to each web page it visits to determine its overall rank, weighing the value of the page’s links, because as well as looking at written content, search engines also take inbound links into consideration. These links from other sites can make or break a page’s rating. Google describes how PageRank works by saying, “Google interprets a link from page A to page B as a vote, by page A, for page B.”
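Page and Brin published the formula behind the original version of PageRank in their 1998 paper; as a point of reference, it reads:

PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

Here T1 through Tn are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a damping factor, usually set around 0.85. The values are computed iteratively until they settle, which is why a vote from a page that is itself highly ranked counts for more than a vote from an obscure one.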

Search engines had to evolve quickly, as an engine that failed to return relevant results did not last long, with users expressing irritation at often irrelevant and sometimes disturbing search results. Because the engines relied on webmasters to supply the correct keywords for a given site, unscrupulous webmasters found the system easy to abuse. They could, for example, stuff pages selling goods with keywords containing the names of popular celebrities or news figures.

In 1998, Page and Brin founded the company Google, which quickly became popular with internet users for its ease of use and reliability. The Google search engine was less likely to be fooled by dubious SEO techniques because it ranked pages using keyword frequency, meta tags, headings, links, PageRank, and hyperlink analysis, to name a few of its sources. Google now admits to using some 200 different measurements when ranking pages for search results.

Not in the least discouraged, SEO webmasters were soon finding ways to exploit the new engine, adapting techniques originally developed for gaming the Inktomi search engine, which was eventually acquired by Yahoo. Link farming, keyword stuffing, and other such processes all became popular ways to gain high rank by fooling the search engines.

Google, Yahoo, and Microsoft’s Bing became the three largest search engines as, by 2004, the smaller ones were left behind. Many were victims of bad SEO practices, which caused their popularity to wane by filling results with irrelevant content; porn sites, for example, would buy highly ranked pages and substitute their own content, giving offense to users.

The next milestone came in 2005, with Google’s use of search histories to provide targeted results based on predictions of what users would want. The public outcry against such invasive practices did not last long, as advertisers sought to exploit the new tool to their advantage. At the time, Bruce Clay commented that this meant the end of the ranking system, because a page’s value could now differ from person to person; time showed that it merely changed the game rather than halting it.

2007 and 2009 saw two big changes in how Google rated paid links, as it took on the “PageRank sculptors,” as SEOs who exploited PageRank were called, with a system called nofollow. This consists of HTML code telling search engines that certain links should be excluded from a page’s contribution to ranking. Very effective in cutting down spam in search results, nofollow improved the service Google offered its users, who responded by keeping it at the top of the search engine pile.
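In practice, nofollow is simply a value of the rel attribute on an ordinary HTML link; a minimal example (the URL and link text here are placeholders):

<a href="http://example.com/paid-directory" rel="nofollow">Sponsored link</a>

A link marked this way can still be clicked by visitors, but it tells the crawler not to pass any ranking credit to the destination page, which removes the incentive to buy or spam such links.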

Search Engines and SEO Companies: A Match Made in Heaven?

The relationship between these two often opposing sides of the same business is a complex one. In recent years, search engines have offered webmasters tools to check how their pages are performing, as incentives to work with them rather than against them. Yahoo offers a program which guarantees early search engine indexing for a fee, while others accept only manual submissions, which hampers the automated, fee-based SEO companies. While Google and others try to encourage good SEO practices by sending representatives to SEO conferences, the bad guys, or “Black Hatters” as they are known in the trade, still cause many problems. A forum for discussion between the two parties opened in 2005 with the establishment of AIRWeb (Adversarial Information Retrieval on the Web). This body has met every year to look for ways to control the black hatters, who, drawn by the potentially huge amounts of money to be made by offering high page ranks, were resorting to ever more destructive techniques and disrupting the activity of search engines.

Common SEO Techniques

These include both “White Hat” and “Black Hat” methods of gaining a high page rank. The first relies on a well-designed page with carefully chosen keywords; the second is a mix of techniques that can result in a page being banned from search results altogether. White Hat SEO practices are planned to gain views steadily over the long term through quality of content, while Black Hatters are mainly concerned with a fast spike in visitors, knowing that the search engines will eventually catch up with them. A third area, harder to define because its methods could be said to belong to either of the above categories, is the so-called “Gray Hat” SEO technique.

White Hat SEO Methods

The “good guys” of the SEO world keep within the guidelines issued by search engines and avoid deceptive practices by keeping the content of their pages relevant to the chosen keywords, meta tags, and titles. Since search engines ignore pages that are overstuffed with keywords, the sparing use and precise choice of them is an art form. A typical text contains the most important keyword in the main title, in the first paragraph, perhaps once or twice more in the body of the text, and again at the end or in the last paragraph. Recent research on how viewers look at web pages gives some very useful information to those engaged in good SEO work: users tend to scan a page of search results in a certain order, from top to bottom and from left to right. Since search engines introduced paid advertisements displayed alongside the returned search results, webmasters have combined their SEO skills with buying ad space on the pages showing their sites, increasing visitors further. Webmasters use the same research when placing content on the site itself, as part of SEO technique. They also ensure that websites provide useful information to human viewers, rather than just targeting the spiders sent by search engines, a practice which produces almost unreadable text and worse customer reviews.
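As an illustration of this kind of keyword placement, here is a minimal page skeleton for a hypothetical web design firm (the company name and keyword are made up), with the chosen phrase appearing once each in the title, the meta tags, the main heading, and the opening paragraph:

<html>
<head>
  <title>Las Vegas Web Design | MyCompany</title>
  <meta name="description" content="Professional Las Vegas web design for small businesses.">
  <meta name="keywords" content="las vegas web design, web designer">
</head>
<body>
  <h1>Las Vegas Web Design</h1>
  <p>MyCompany builds fast, search-friendly websites for Las Vegas businesses.</p>
</body>
</html>

The restraint is the point: the keyword appears where both readers and spiders expect it, and nowhere else.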

Black Hat SEO Methods

These include anything the search engines consider unfair game for improving page rankings. Quite creative in the variety and scope of their activities, Black Hat SEOs know their methods may lead to a client’s site being banned by search engines, and they may not disclose those methods to unsuspecting customers, who only learn the facts when page views drop from among the top sites to zero. Examples of their methods are found in pages with text that is invisible to the human eye, hidden in colored areas, but read by the tools search engines send out to gather information. This may inflate the overall keyword density of the page, or secretly embed currently trending topics in the site, producing totally irrelevant search results. Recent scandals, such as those involving the German websites of BMW and Ricoh, which Google banned, showed that not only small outfits but also large, well-respected companies can become involved with Black Hat SEO methods. In both instances the pages were corrected and Google reinstated them.
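A classic example of the hidden-text trick is white text on a white background, invisible to a human visitor but read as ordinary content by a spider (the keyword strings here are invented):

<p style="color: #ffffff; background-color: #ffffff">
  cheap tickets celebrity gossip free downloads las vegas web design
</p>

A human sees a blank area; the indexer sees a page suddenly “relevant” to whatever is stuffed inside, which is exactly why engines now penalize text styled to match its background.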

Gray Hat SEO Methods

This involves methods that are frowned upon by search engines, though not as strictly as the Black Hatters’ activities. The techniques include buying links, second-page poaching, and some creative use of content, and they escape notice as questionable by appearing authentic. One example is the inclusion of customer reviews that are actually cleverly worded adverts: scrutinized carefully, they contain the same keywords found in the main text. The trading of links for money is not appreciated by Google, Yahoo, and their fellow search engines. Bad SEO practice also includes creating “link farms,” sets of pages linking back to each other in a loop to create a false sense of web presence. Webmasters can likewise create a number of very similar sites, all with the same content and linking to each other, usually on separate domains, to try to fool search engines into accepting each one as an individual and independent entity. Yet another method takes records of domains whose ownership is about to expire, waits for the sale to be advertised, and then uses the domain’s already established popularity to host entirely different content of the webmaster’s choosing.
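A link farm can be as simple as a ring of otherwise thin pages pointing at one another; a toy sketch across three hypothetical domains (the .example addresses are placeholders):

<!-- page on site-a.example -->
<a href="http://site-b.example/">Great web design resource</a>

<!-- page on site-b.example -->
<a href="http://site-c.example/">Great web design resource</a>

<!-- page on site-c.example -->
<a href="http://site-a.example/">Great web design resource</a>

Each page appears to have an inbound link from an “independent” site, but the loop creates no real endorsement, which is why ranking algorithms learned to discount such closed patterns.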

SEO as a Tool

Companies advertising on the internet have a far smaller chance of reaching high page ranks now that so many businesses use SEO practices. At its best, SEO is a useful tool for presenting any website in the most appealing way to viewers. As a source of extra traffic it may be somewhat overrated, since experts in the business have observed that the majority of views come from links on other popular websites. Another factor to consider is the ability of search engines to change the configuration of their algorithms: a site that gained heavy traffic from searches one week may not have the right SEO to maintain that level after adjustments are made to how search terms are weighted.

Choosing an SEO Company

Since the choice can result either in a welcome boost in page rank or in being banned from a search engine’s results, the decision needs to be made carefully, after good research. A company’s reputation and web presence can be put at stake by shady SEO webmasters who use Black Hat methods, collect their fees, and vanish by the time the victims learn their trust was misplaced and the search engines have caught the offense. Good firms, on the other hand, can take a website from boring to booming, using tools such as keyword research to better target customers and content development to capitalize on the strengths of a product or service, and adding technical expertise in areas like coding to produce results that reflect well on the company as a whole. Questions that any reputable SEO company should be happy to answer include which guidelines they follow, since Google, Yahoo, and other search engines publish lists of methods they condone and methods they will ban sites for using. Asking to see websites the firm has worked on is also a good guide to the quality of its SEO abilities.

The Future of SEO

Google has won the race to be the top search engine, with over 75% of internet searches carried out through it. This may not last, as the next generation of search engines is already emerging: Wolfram Alpha is only the first of those that aim to provide a totally different way to search the internet, and Microsoft’s Bing has been re-configured to re-enter the market. All these changes will also affect the way SEO is carried out.

Article – Copyright 2009 Seo Las Vegas

About Dr. SEO – Steve Kim aka “Dr. SEO” is an Expert Internet Marketer who works in Las Vegas, NV. He specializes in Search Engine Optimization, as well as Video and Local Search Optimization. See more at www.drseo.org