
Solving The Problem Of Siloing A Website

Jul 9, 2008
For the last few weeks we have been sleeping, drinking and eating Silos and Latent Semantic Indexing (LSI). But before we can wrap up on his important technique of internet marketing it is prudent to consider it in a realistic context of Search Engine (SE) crawling. You could structure your website with the best of siloing CMS (Content Management System) and still not get the high SE Ranking Position anticipated. And this is not at all because the SEs do not find your website relevant to your optimized keywords. On the contrary, as discussed, siloing makes the website nearly super optimized.

Rather, it is because the SEs do not find your web pages, and as a result you are not indexed. And as you should know, if you are not indexed you are as good as nonexistent as far as the SEs are concerned. So the question that comes to mind is: why are your pages not indexed?

SEs have a program that scouts the internet for web pages, takes a copy of each page and sends it back to the SE's database for indexing. It is these indices that are used to sort out the rankings of different pages for a particular search query. The scouting program is often referred to as an SE crawler, bot or spider.
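To make the fetch-copy-follow cycle concrete, here is a minimal sketch of how a spider works. It is a toy, not any real SE's crawler: the `fetch` function and the two-page site are made up for illustration, and page HTML is passed in as strings instead of being downloaded over HTTP.

```python
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href" and v)

def crawl(fetch, start_url):
    """Minimal spider: fetch a page, keep a copy for indexing,
    extract its links, and queue any URLs not seen before.
    `fetch` returns the HTML for a URL (a real bot would use HTTP)."""
    index, queue = {}, deque([start_url])
    while queue:
        url = queue.popleft()
        if url in index:
            continue
        html = fetch(url)
        index[url] = html              # the copy sent back for indexing
        parser = LinkExtractor()
        parser.feed(html)
        queue.extend(parser.links)
    return index

# Hypothetical two-page site standing in for the live web
pages = {
    "/": '<a href="/about">About</a>',
    "/about": "<p>About us</p>",
}
index = crawl(pages.get, "/")
print(sorted(index))  # both pages were discovered and indexed
```

The queue is the key design point: the spider only reaches a page if some already-crawled page links to it, which is exactly why unreachable or deeply buried pages never get indexed.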

These spiders have been known to be very quick at indexing homepages. But they are also known to be rather shy about indexing the deeper-tier pages; reaching those is technically referred to as deep crawling. By its very nature, a CMS or other siloed website structure will tend to have many tiers. A tier is the structural level of a page as measured by the number of clicks it is from the homepage. For example, a first-tier page is only one click from the homepage, while a third-tier page is three clicks, or links, away from the homepage. Siloing can result in even more than five tiers.
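Since a tier is just the minimum number of clicks from the homepage, it can be computed with a breadth-first search over the site's link graph. The sketch below does exactly that; the silo-style site map it uses is a hypothetical example, not any particular website.

```python
from collections import deque

def tier_depths(links, homepage):
    """Compute the tier (click depth) of every page reachable from
    the homepage via breadth-first search. `links` maps each page to
    the pages it links to. The homepage is tier 0; a first-tier page
    is one click away, and so on."""
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:      # first (shortest) path wins
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical siloed site: homepage -> silo landing pages -> articles
site = {
    "/": ["/silo-a/", "/silo-b/"],
    "/silo-a/": ["/silo-a/article-1", "/silo-a/article-2"],
    "/silo-a/article-1": ["/silo-a/article-2"],
}
depths = tier_depths(site, "/")
print(depths)
```

Running this shows the silo landing pages at tier 1 and the articles at tier 2; add another silo level and the articles slide to tier 3 or deeper, which is where shy spiders stop.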

As deep crawling takes time and resources, SE bots reserve it for "important sites". So the reason your web pages are not indexed is that the bot bumps into your tiers but does not find your site deserving of a deep crawl. Basically, the tactic that was meant to increase your traffic has resulted in you not even being indexed, let alone getting traffic.

The good news is that there is a solution. You encourage the bots to deep crawl by increasing your web pages' importance or popularity. Specifically, this means you will need to get other websites to link to your site. Technically, it means you need to increase your pages' PageRank. And that is a whole different topic.
About the Author
Courtesy of Home Business Review the leading All-in-One Small Online Business solution. Find out more before you buy site build it here.