Content Architecture and internal linking in SEO

One of the most important tasks an SEO must perform is the correct distribution and organization of the content published on a domain. This is true for several reasons:

  • Usability: it optimizes navigation and the user experience.
  • Prioritization: pages can be prioritized according to their importance and/or thematic grouping.
  • Crawlability: it speeds up and facilitates crawling and indexing by search engine spiders.

It may seem simple at first, but in practice good examples of web architecture are hard to find.

This work involves different types of actions; in this article we will focus on the distribution of pages across depth levels and the internal linking between them.

Regarding depth levels, there is an old principle in SEO that says you should keep your content accessible within three clicks of the home page. This principle is somewhat outdated, because the spiders that crawl the web are now much more powerful; even so, it is still good practice to reduce depth levels to the minimum possible whenever you can.

The SEO explanation is that the algorithms that evaluate the flow of transmitted authority (commonly called "link juice") apply a damping coefficient on each page-to-page hop the crawler follows. Otherwise, the "PageRank" of every page would tend to infinity. The exact value of this coefficient is not public, but Google's own patent mentions an approximate value of 0.15, meaning that roughly 15% of the "link juice" or authority transmitted from one page to another is lost on each hop of the spider:

“The value of α is typically around 15%. Including this damping is important when many iterations are used to calculate the rank so that there is no artificial concentration of rank importance within loops of the web. Alternatively, one may set α=0 and only iterate a few times in the calculation.”
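The damping described in the patent can be illustrated with a minimal PageRank power iteration. This is a sketch for intuition only, not Google's actual algorithm: the site graph is invented, and the handling of dangling pages is one common simplification.

```python
# Minimal PageRank sketch illustrating the damping factor alpha = 0.15
# quoted above: on each hop, only (1 - alpha) = 85% of a page's rank
# flows through its links. Illustrative only, not Google's real system.

def pagerank(links, alpha=0.15, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: alpha / n for p in pages}  # the damped "teleport" share
        for page, outlinks in links.items():
            if outlinks:
                share = (1 - alpha) * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:  # dangling page: spread its rank evenly across the site
                for p in pages:
                    new_rank[p] += (1 - alpha) * rank[page] / n
        rank = new_rank
    return rank

# A tiny three-level site: home -> categories -> product pages -> back home.
site = {
    "home": ["cat1", "cat2"],
    "cat1": ["p1", "p2"],
    "cat2": ["p3"],
    "p1": ["home"], "p2": ["home"], "p3": ["home"],
}
ranks = pagerank(site)
```

Running this, the home page accumulates the most authority and the deepest pages the least, which is exactly the effect the depth-level analysis below measures.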

Having explained that, there are different types of possible architectures for your site. Before choosing one, there is an important decision to make. We summarize it here:

  1. Large / homogeneous site: here you want to distribute your link juice as evenly as possible across all your URLs. Choosing an architecture that is as "flat" as possible, with an equitable distribution of links between all URLs, is the best decision in most cases.
  2. Small / heterogeneous site: in this case you probably have a small number of URLs that concentrate most of your interest and benefit, so you may want to structure your pages with an architecture that boosts these pages over the rest of the domain.

Apply this logic to the remaining possible situations to make the best decision, bearing in mind that, even with these indications, an architecture with links distributed evenly among all your URLs is usually a good idea. At the very least, it guarantees that your whole domain will be crawled as easily and quickly as possible by search engine spiders (and saving Google time and resources is a good idea ;)).

For this reason, at Safecont we introduced an analysis of authority and URL volume by depth level that we call "Depth Levels". It is one of the features our users like the most.

It shows:

  • The depth levels of your domain.
  • The number of URLs at each level.
  • The "Level Strength", i.e. the authority or "PageRank" accumulated at each level.
  • A bar whose width depends on the number of URLs and whose color indicates whether the "Level Strength" is appropriate for that level (red means not enough "link juice" reaches the URLs at that level; yellow to green indicates a correct level).

Clicking on each level shows the detail of the URLs that make up that level, along with their respective metrics.
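The depth level of each URL can be computed with a breadth-first crawl from the home page, recording the minimum click depth at which each URL is first reached. The sketch below uses an invented link graph; a real tool would fetch and parse the pages, but the level grouping works the same way.

```python
# Hypothetical sketch of how depth levels can be computed: a breadth-first
# traversal from the home page, recording each URL's minimum click depth.
# The link graph is illustrative; a real crawler would build it from fetched pages.
from collections import Counter, deque

def depth_levels(links, home="home"):
    """Return {url: depth}, where depth is the minimum number of clicks from home."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        url = queue.popleft()
        for target in links.get(url, []):
            if target not in depths:  # first visit = shortest click path
                depths[target] = depths[url] + 1
                queue.append(target)
    return depths

site = {
    "home": ["cat1", "cat2"],
    "cat1": ["p1", "p2"],
    "cat2": ["p3", "home"],
    "p1": ["p2"],
}
levels = depth_levels(site)
per_level = Counter(levels.values())  # number of URLs at each depth level
```

Combining this per-level URL count with the accumulated authority of the URLs at each depth gives the kind of level-by-level view described above.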

We have been asked for examples of domains with good architectures to use as a reference since, as I said at the beginning of the article, good architectures are hard to find, and a perfect one is very difficult to achieve.


One of the pioneering Spanish domains, and one of the best at SEO optimization, with the structure of a software marketplace. An architecture worked out and cared for in detail (although it is one of the few that opted for a subdomain structure, which makes less sense today).

Another example:

This is the site of a well-known cosmetic surgery clinic: a corporate website plus news and content about the different interventions and treatments. A very good architecture, although its third level is a little weak because it does not recover links from the level below; these are probably the pages of least interest to them.

Another example:

This is a large fast-food marketplace in the United Kingdom. They opted for a simple four-level architecture optimized to boost their main categories, redistributing the rest of the link juice equitably among all the other URLs, since for them these are contents of "homogeneous" importance. However, they lack a good horizontal cross-linking system that would avoid the losses at the two deepest levels.

Another example:

One of the main directories of lawyers and legal services. It is a directory structured by category and geography, with a few other sections for offers, community, and so on. A correct architecture: the yellow levels do not indicate anything negative, only that they may lack some "power", although as a level grows in the number of URLs it becomes complicated to achieve a perfect distribution among them.

Are you curious whether your architecture is as good as you think? Surprises are common, so do not hesitate to use tools to optimize your architecture and internal linking structure.


