
How to Get Targeted Search Engine Traffic

Jan 8, 2008
Just like you, search engines exist to deliver value to their customers. They do so by listing the sites most relevant to the words their customers search for in descending order of importance. For instance, if a person searches for "grilling burgers," the engine's task is to sift through the billions of pages that make up the Internet and list every page that mentions grilling burgers, grilling or burgers.

Listing those pages at random would only confuse the person searching for the phrase, so to help the customer discern which of those pages is most likely to give her information about grilling burgers (as opposed to just "grilling" or just "burgers"), the engine ranks the results.

And since studies show that people's eyes scan search results from top to bottom and left to right, the most important pages are listed at the top left of the first screen. But if the customer clicks the top two or three links and finds nothing about grilling burgers, she'll blame the search engine for not giving her what she wants and won't keep using it.

So if your site is the best match for a particular keyword, the search engines want to list you at the top because that makes them look good to their customers. To receive that top honor, though, you must optimize your site according to their rules.

Trying to discern the rules of the dozens of search engines that exist is a daunting task, one you don't need to stress about. You really only need to concern yourself with the big three: Yahoo, MSN and Google. To further simplify things, you should pay the most attention to Google's optimization standards since the majority of Internet surfers start their search with Google. Google is therefore the reigning King of Search Engine Traffic, so by doing what the King wants, you become part of his empire and make your site relevant for the other search engines as well.

Before you can understand how to market your site through search engines like Google, you need to understand the basic concept and technology behind them. A little over a decade ago, search engines were born to catalogue the growing number of web pages on the Internet. As a site owner who wanted to be included, all you had to do was register your URL with the search engines.

The engine would then send a "spider" to "crawl" the page and download it for storage in the search engine's server. Then an indexer would use a specific set of guidelines, rules or algorithms to analyze and extract such information as the words used, the placement of the words on the page as well as all inbound and outbound links.
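The crawl-and-index pass described above can be sketched with Python's standard-library `html.parser` — a toy indexer for illustration only, not how any real engine is built. The sample page and its URL are made up:

```python
from html.parser import HTMLParser

class PageIndexer(HTMLParser):
    """Toy 'spider' pass: collect the page's visible words and its links."""
    def __init__(self):
        super().__init__()
        self.words = []   # words an indexer would analyze
        self.links = []   # outbound links a crawler would follow next
        self._skip = 0    # depth inside <script>/<style>, whose text isn't content

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.words.extend(data.lower().split())

page = ('<html><head><title>Grilling Burgers</title></head>'
        '<body><p>Tips for grilling burgers.</p>'
        '<a href="http://example.com/recipes">More recipes</a></body></html>')
indexer = PageIndexer()
indexer.feed(page)
```

After feeding the page, `indexer.words` holds the text an indexer would weigh and `indexer.links` the URLs a crawler would queue up next.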

The early algorithms, however, relied heavily on keyword Meta Tags. These tags sit in a page's code as a guide to its content; spiders read them, but they never appear as visible text for readers. That's why webmasters soon began to abuse them, stuffing in irrelevant keywords to get their sites listed in more searches. This increased their page impressions and allowed them to charge more for advertising space.

For instance, if their site was about fitness, they would include a keyword like "finances" in their Meta Tags. Then their page would show up in the search results for both "fitness" and "finances" even though their site had zero relevance for a person searching for finances.
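To make the abuse concrete, here is a minimal sketch of how an early indexer might have read a keyword Meta Tag, using Python's `html.parser`. The stuffed tag mirrors the fitness/finances example above; the extra "mortgages" keyword is invented for illustration:

```python
from html.parser import HTMLParser

class MetaKeywords(HTMLParser):
    """Pull the comma-separated keyword Meta Tag the way early indexers did."""
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "keywords":
                self.keywords = [k.strip() for k in a.get("content", "").split(",")]

# A stuffed tag on a fitness page, as described above:
page = '<meta name="keywords" content="fitness, finances, mortgages">'
parser = MetaKeywords()
parser.feed(page)
```

Because the engine trusted this self-reported list, the fitness page surfaced for "finances" searches it had no business ranking for.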

Because these factors were entirely under the webmaster's control, early search engines were easily manipulated. Searchers weren't finding what they were looking for, so the engines had to adapt in order to deliver relevant results. The algorithms grew more complex, adding factors that made it harder for webmasters to game the rankings.

Enter Larry Page and Sergey Brin, graduate students at Stanford. Their search engine, "BackRub," relied on an algorithm, PageRank, that rated pages by the number and quality of inbound links. PageRank models the probability that a random web surfer, following links from one page to another, will land on a given page; Google reports it publicly on a logarithmic scale of 0-10. If your public PageRank is 9, it's highly likely that a surfer who keeps clicking links on sites related to your topic will find you.
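The random-surfer idea can be sketched as a short power iteration in Python. The 0.85 damping factor is the value suggested in Page and Brin's original paper; the three-page link graph is made up purely for illustration:

```python
def pagerank(links, damping=0.85, iters=50):
    """Toy PageRank. links maps each page to the pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}      # start: surfer equally likely anywhere
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}  # chance of jumping at random
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)  # rank flows along out-links
                for q in outs:
                    new[q] += share
            else:
                for q in pages:              # dangling page: spread rank everywhere
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# Page "a" is linked to by both "b" and "c", so it ends up ranked highest.
rank = pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]})
```

The ranks form a probability distribution (they sum to 1); Google's public 0-10 score is a compressed view of numbers like these, not the raw probability.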

BackRub soon expanded and became Google in 1998, a search engine that considers off-page factors like PageRank and hyperlink analysis in addition to on-page factors like keywords and content. Although this was more difficult to manipulate, webmasters soon adapted by buying, selling and exchanging links on a huge scale. Some of these schemes developed into link farms: thousands of sites that did nothing but link to other sites.

The purpose of this brief history lesson is to explain why SEO has become so complex today. On the one hand, marketers want to get their sites viewed by as many people as possible, so they try to beat the search engine systems. On the other hand, the search engines want to deliver relevant results to those searching for keywords. To make that possible, they now refuse to disclose all the factors they consider for their ranking algorithms. Google, for example, uses more than 200 signals to rank pages.

Fortunately, they don't keep all 200 factors a secret. All the search engines provide guidelines and information to help website owners optimize their sites. In addition, they all use crawlers to find pages. Where the words appear (title, lead paragraph, etc.) and how often they're used throughout the copy carry different weight with different spiders, which is why the same keyword returns different results in different search engines.
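That weighting idea can be illustrated with a toy relevance scorer. The 3x title weight here is an arbitrary assumption for the sketch — each real engine uses its own undisclosed weights:

```python
def on_page_score(keyword, title, body, title_weight=3.0):
    """Toy on-page relevance: count keyword hits, weighting title hits more.

    title_weight is a made-up number; real engines keep theirs secret.
    """
    k = keyword.lower()
    title_hits = title.lower().split().count(k)
    body_hits = body.lower().split().count(k)
    return title_weight * title_hits + float(body_hits)

score = on_page_score("burgers", "Grilling Burgers",
                      "Great burgers need a hot grill.")
```

Change `title_weight` and the same pages reorder — a one-line picture of why two engines rank identical content differently.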

And because it takes time to get your site indexed, you can't make a change and expect instant results. You have to change, then wait for the spiders to crawl over your site. If they like what they see and your competition isn't doing a better job, you are rewarded with a higher ranking.

Your site is "optimized" when you give the spiders what they are looking for based on the seven fundamental factors: Keywords, PageRank, Content, Sitemap, Navigation Links in HTML, Page Title and Meta Tags.
About the Author
Glen Hopkins is a Best-Selling Author, Information Marketer, Speaker and Consultant. Glen specializes in teaching struggling entrepreneurs how to turn their small online businesses into thriving money machines using specific systems that allow them to work less and earn more. Get his List Building Report and Web Traffic CD (valued at $97) for FREE at: http://GlenHopkins.name