
How To Keep Your Website In Google's Good Books

Aug 17, 2007
Unless you've recently returned from a protracted space furlough cruising around the solar system, you are probably aware that the current undisputed heavyweight champ of search is none other than Google, a/k/a The Big Dawg of Search! What you may not know, however, is just how dominant a champ of search they really are. Recent surveys and polls credit Google with a whopping 60-plus percent of the global search market.

In order to best understand how to keep your website in Google's good graces you first need to know what their ultimate goal is, or that of any search engine for that matter.


Search is a huge money-generating business and not merely a tool that enables you to conduct research while you slouch on your couch. So how big is the search industry really? Well, let's put it this way: in 2006, online advertising racked up revenue in excess of $16.7 billion and is expected to reach $45 billion within the next five years. Impressive, huh?

Anyway, these figures should give you some idea of the stakes involved in the search engine industry.

Further, more than 97% of Google's not inconsiderable fortune is derived from online advertising. Remember, as noted earlier, Google commands more than a 60% share of the global search market, which makes for one very good and big reason to ensure that your website is googlicious (Google extra-friendly).


We've already established that search engines are dedicated to more than simply returning pertinent results to the billions of queries conducted each day; they are dedicated revenue generators.

Google is by far the largest search engine and derives more than 97% of its revenue from online advertising. And let's not forget that Google is a publicly held company, meaning that it has an obligation to its investors to maximize yearly profits.

This then clues us in to what Google's main objectives are:

1. Maximize profits; by

2. Hosting the largest number of advertisers; by

3. Retaining the largest number of online searches conducted; by

4. Supplying the most accurate results per query; which

5. Ensures that more people will use Google as their search engine of choice.


The year 2007 launched with considerable anticipation as well as turbulence for the SEO community. The cause of the unrest was a Google algorithm change that SEO analysts still cannot unanimously agree to call either an update or a data refresh.

However, whether it was an update or a data refresh, what is not in contention is that it certainly shook up the SEO landscape and left quite a number of webmasters upset and disgruntled because their websites suddenly lost significant position for targeted keywords. The flip side of this particular coin had a good number of webmasters ecstatic over the realization that their websites were suddenly ranking very well for their most coveted and targeted keywords.

The ripples of this Google shakeup had authoritative sites (which had long been ranking well for particular keywords) suddenly drop several positions on the SERPs and, in the worst scenarios, disappear completely off the grid! (Most have since been reinstated to their former glory.)

This shakeup also meant that venerable old websites could no longer rely on the age factor alone to guarantee their index listings or page rank. In order to maintain their long-coveted positions, such sites now need to refresh their data or add new content regularly.

Disconcerting as these new rules may be, there actually is a lot of sense to this approach. Perhaps you recall that in 2006 Google experienced a server overload crisis: there simply were not enough servers to accommodate the burgeoning number of web pages on the ever-expanding internet.

So to alleviate the situation, Google adopted a number of modifications, perhaps the most notable of which restricted the Google spider to crawling only those web pages that were accessible via external links. The rationale was that, pending the acquisition of new servers, only web pages considered of the utmost relevance would make it into the Google index.
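The crawl restriction described above can be sketched as a simple filter over a link graph: keep only pages that have at least one inbound link from a different domain. This is an illustrative sketch only; the toy link graph and the helper name `externally_linked_pages` are assumptions for the example, not Google's actual implementation.

```python
# Illustrative sketch of the crawl restriction described above: keep only
# pages reachable via at least one inbound link from an *external* domain.
# The link graph and function name here are assumptions for illustration.

from urllib.parse import urlparse

# (source page -> linked page) pairs discovered during a crawl
links = [
    ("http://blog.example.com/post1", "http://site-a.com/page"),
    ("http://site-a.com/page",        "http://site-a.com/about"),   # internal link
    ("http://news.example.org/item",  "http://site-b.com/home"),
]

def externally_linked_pages(links):
    """Return the set of pages that have at least one cross-domain inbound link."""
    eligible = set()
    for source, target in links:
        # Compare hostnames: a link counts as external only if the domains differ.
        if urlparse(source).netloc != urlparse(target).netloc:
            eligible.add(target)
    return eligible

print(sorted(externally_linked_pages(links)))
```

Under this rule, a page linked only from within its own site (like the `/about` page above) never enters the index, while pages with external endorsements do.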

Thus it is not too much of a stretch to consider the 2007 Google shakeup a reinforced strategy to avoid a repeat of the 2006 server debacle and, in the process, sharpen the accuracy of their indexed data.


To maintain or elevate the page rank of your website or its listing on the SERPs, you need to add new content or at least refresh it on a regular basis. In the past it was possible for fairly old websites to remain static yet maintain their high page rank and SERP listing purely based on the age factor. Due to algorithm tweaks following the 2007 Google shakeup that is no longer possible.

Using inbound links as the premier measure for evaluating the importance of a website is not going to change any time soon. If you consider that an inbound link is an approximate thumbs-up vote for the linked-to site, then you can easily understand why this measure of evaluation is not about to disappear quite yet.

Certainly, not all links are created equal: links from topically relevant and related web pages are rated more highly than those from topically unrelated pages. It is not always possible to get such topically related links, but the bottom line is this: your website is not going to mature on the SERPs or graduate page rank-wise without external inbound links.

There's also a new twist in the way links are evaluated. Previously, a high page rank link was the most valuable kind of link, but now that search engines are more sophisticated there's a new system in play. The strength of a link is determined by the number of clicks that link garners. In other words, a very active PR3 link with a high click-through rate will score much higher from an SEO point of view than a PR8 link that gets next to no clicks! This is just one of the latest ways in which search engine algorithms are combating artificial link acquisition.

When all is said and done however--keep those links growing!

From the outset the ideal vision of the internet was one where a mass of data would connect together through a natural progression of topical relevance via links. Well as you're probably aware, ideal and paradise tend to exist more in the hypothetical realm than the real world.

Anyway, back on the theme of interconnectivity: for the longest time it was considered good SEO practice to hoard your page rank by having as few outbound links as possible. That has changed, and now having next to no outbound links could even result in a minor penalty. It would appear that as the algorithms have smartened up, the search engines are finally reaching for the sky--aiming for that elusive web paradise!

Just as you can tell from your web stats how long people are spending on your website, so can the search engines. The longer visitors spend on your site, the stickier your site is said to be. Suffice it to say that the stickier your site, the better as far as search engine optimization goes. A stickier site will improve your page rank and SERP listings. The best way to make your site sticky is to have interesting, valuable and stimulating content!
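One simple way to track the stickiness described above from your own web stats is average session duration. This is a minimal sketch; the session log format (visitor id, entry time, exit time) is an assumption for illustration, and real analytics tools derive this from raw page-view logs.

```python
# Minimal sketch of measuring "stickiness" from web stats: the average
# time visitors spend on the site per session. The session log format
# below is an assumed, simplified example.

from datetime import datetime

sessions = [
    ("visitor_a", "2007-08-17 10:00:00", "2007-08-17 10:07:30"),
    ("visitor_b", "2007-08-17 10:02:00", "2007-08-17 10:02:40"),
    ("visitor_c", "2007-08-17 11:15:00", "2007-08-17 11:24:00"),
]

def avg_session_seconds(sessions):
    """Mean dwell time across all sessions, in seconds."""
    fmt = "%Y-%m-%d %H:%M:%S"
    durations = [
        (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds()
        for _, start, end in sessions
    ]
    return sum(durations) / len(durations)

print(avg_session_seconds(sessions))  # ~343.3 seconds per visit
```

Watching this number rise after a content refresh is a rough but practical signal that the new material is holding visitors' attention.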
About the Author
Internet Business Mart: The website that provides marketing solutions and tips. Internet Marketing Online