June 10, 2015
When people talk about SEO, the importance of building a site with a proper, SEO-friendly structure is often left out.
In fact, some experts might even argue that there is no such thing as an SEO-friendly site structure, claiming that search engines are quite capable of making sense of what’s on each individual web page.
However, a well-planned site structure can provide your website with huge SEO benefits, and many major websites such as Amazon, Wal-Mart, and CNN have taken advantage of the SEO-friendly techniques I’ll share with you below.
Siloing Your Website
Website siloing is an SEO technique I don’t hear discussed much among white-hat SEO experts. However, it’s a technique that works, and one Amazon has definitely mastered.
In a silo structure, related pages are grouped under topical category pages, and internal links are kept within each group. A website silo structure is best illustrated by the picture below.
Fix Broken Links
Broken links on your website hurt the user experience, and they also pass link juice to your 404 error pages, where it’s wasted. Using a tool like Screaming Frog, you can crawl your website for broken links and fix them quickly.
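If you’d rather script a quick check than run a desktop crawler, the idea can be sketched in Python. The page HTML, paths, and status lookup below are all made up for illustration; a real crawler would fetch each URL over HTTP.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_broken_links(html, fetch_status):
    """Return links whose HTTP status is 4xx/5xx.

    fetch_status is any callable that maps a URL to a status code;
    in a real crawler it would issue an HTTP request per link.
    """
    parser = LinkExtractor()
    parser.feed(html)
    return [url for url in parser.links if fetch_status(url) >= 400]

# Hypothetical page and a stubbed status lookup, standing in for real HTTP checks.
page = '<a href="/about">About</a> <a href="/old-page">Old</a>'
statuses = {"/about": 200, "/old-page": 404}
print(find_broken_links(page, statuses.get))  # → ['/old-page']
```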
Don’t Use Both www and non-www
This is something that I find overlooked quite often.
If both the www. and non-www versions of your website are active, you’ll end up with two versions of your website indexed in search engines. And when two versions of your site are live in a search engine, you end up splitting and diluting the power of the links between them.
So as a rule of thumb, it’s a good idea to set up a 301 redirect so that only one version of the site is actually served.
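The redirect itself is normally configured in your web server, but the logic is simple enough to sketch in Python. Picking non-www as the canonical host here is an arbitrary choice for the example; the point is just to pick one version and 301 the other to it.

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Map a www URL to its non-www equivalent, the target of a 301 redirect.

    Non-www is chosen as canonical purely for illustration; www works
    equally well, as long as you commit to one version.
    """
    parts = urlsplit(url)
    if parts.netloc.startswith("www."):
        parts = parts._replace(netloc=parts.netloc[len("www."):])
    return urlunsplit(parts)

print(canonical_url("http://www.example.com/page"))  # → http://example.com/page
print(canonical_url("http://example.com/page"))      # already canonical, unchanged
```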
Use Relevant URLs
By default, when you create a page through many content management systems, the URL will generally look messy and ugly, kind of like this URL:
However, you ideally want a URL more like:
The second example provides information about the kind of content on the page, and search engines take this into account when ranking your website.

And because it looks better, it’s much more user-friendly to anyone seeing the link, simply because the URL alone gives them an idea of the type of content on the page.
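Many CMSs let you rewrite URLs from the page title. A minimal slug function, with a made-up title purely for illustration, might look like this:

```python
import re

def slugify(title):
    """Turn a page title into a clean, keyword-bearing URL segment."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # runs of non-alphanumerics become one hyphen
    return slug.strip("-")                   # drop leading/trailing hyphens

print(slugify("10 SEO-Friendly Site Structure Tips!"))  # → 10-seo-friendly-site-structure-tips
```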
Use an XML Sitemap
An XML sitemap is simply a list of all the pages on your site. This makes it much easier for search engines to crawl and index your website.
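Most CMSs and plugins will generate a sitemap for you, but the format (the sitemaps.org protocol) is simple enough to build by hand. The URLs below are placeholders; a minimal sketch:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(["http://example.com/", "http://example.com/about"])
print(sitemap)
```

The result is the `<urlset>` document search engines expect; you’d save it as sitemap.xml at your site root and, ideally, reference it from robots.txt.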
Block Irrelevant Pages with Robots.txt
Pages that offer nothing to searchers, such as admin pages or internal search results, should be blocked off because they dilute the power of the links between the pages of your site.
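For example, a robots.txt like the one below keeps crawlers out of a hypothetical /admin/ section (the path is invented for illustration). Python’s standard library can verify how the rules behave:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt blocking a hypothetical admin section from all crawlers.
rules = [
    "User-agent: *",
    "Disallow: /admin/",
]

parser = RobotFileParser()
parser.parse(rules)
print(parser.can_fetch("*", "http://example.com/admin/settings"))  # → False
print(parser.can_fetch("*", "http://example.com/blog/post"))       # → True
```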
Keep 301 Redirects in Mind
Now that you’ve read the guide, it’s a good idea to keep 301s in mind if you plan to make changes to your site’s structure.
A 301 redirect notifies search engines that a specific page has been permanently moved to another URL. This is great because you won’t lose or dilute any of the link juice.
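If you restructure your URLs, a simple map from old paths to new ones is all a 301 rule needs, whether it lives in your server config or your application. The paths below are invented for illustration:

```python
# Hypothetical old-to-new URL map after a site restructure.
redirects = {
    "/old-category/widget": "/products/widget",
    "/old-about": "/about",
}

def redirect_for(path):
    """Return (301, new_path) if the path has moved, else None."""
    new_path = redirects.get(path)
    return (301, new_path) if new_path else None

print(redirect_for("/old-about"))  # → (301, '/about')
print(redirect_for("/about"))      # → None (no redirect needed)
```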