Technical SEO elements are a pain to master for the non-technical minded. However, they are absolutely essential to help your website rank higher and ensure you don’t fall foul of a Google penalty.
That’s why we’ve created a simple-to-understand guide, breaking down some of the most important technical SEO techniques into easily digestible explanations. Learn what they are, why they’re useful for SEO, what tools to use and how to carry out some technical SEO best practices.
An HTML sitemap is a page on your website containing links that help users understand your site. Put simply, it’s a list of easy-to-understand, keyword-rich links that guide people around your website. It takes the layout of your existing website and breaks it down into its simplest form.
What’s good for humans is good for SEO. An HTML sitemap helps enhance the usability, navigation and accessibility of your website. It also improves your website’s internal linking structure, which in turn can boost your SEO and site authority.
To create an HTML sitemap, you can simply add a new page to your website and include all relevant links in order of importance. If you use WordPress, however, it might be quicker and easier to use a handy plugin that generates the sitemap for you.
The best places to put your HTML sitemap are in the footer of your website and on your 404 page, where it can help users who may be lost. You should also ensure that your link text includes keywords that accurately describe what each page is about.
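As a rough sketch, an HTML sitemap is just an ordinary page of nested, crawlable links (the URLs and labels below are placeholders for your own pages):

```html
<!-- Hypothetical HTML sitemap page: a plain, keyword-rich list of links -->
<h1>Sitemap</h1>
<ul>
  <li><a href="/services/">Services</a>
    <ul>
      <li><a href="/services/technical-seo/">Technical SEO</a></li>
      <li><a href="/services/content-marketing/">Content Marketing</a></li>
    </ul>
  </li>
  <li><a href="/blog/">Blog</a></li>
  <li><a href="/contact/">Contact Us</a></li>
</ul>
```

Nesting the links mirrors your site’s hierarchy, which helps both users and search engines understand how pages relate to one another.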
An XML sitemap is a document listing all the page URLs on your website that you want Google’s search spiders to crawl and index. It’s for search engine spiders’ eyes only, so there’s no need to design a beautiful page for users (this would be pointless, as they won’t see it).
Creating your XML sitemap is a relatively intuitive process. Google recommends using this XML sitemap generator. You simply enter your website’s URL, fill out the optional fields and click ‘Start.’ The tool crawls your website and redirects you to the generated sitemap details page. Copy your sitemap URL and submit it in your Google Search Console (formerly Google Webmaster Tools) account.
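For reference, a minimal XML sitemap follows the standard Sitemaps protocol format; the URLs and dates below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the only required child element; the rest are optional hints -->
    <loc>https://www.example.com/</loc>
    <lastmod>2023-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2023-01-10</lastmod>
  </url>
</urlset>
```

The file is typically saved as sitemap.xml in your site’s root, so spiders can find it at a predictable URL.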
You can noindex a page to prevent it from being displayed in search results. You can also block a page with a robots.txt file, but this works differently: a robots.txt rule tells Google not to crawl the URL, yet still allows the page to be indexed (for example, if other sites link to it) and displayed in search results. The best practice is to apply a noindex robots meta tag to the relevant page and leave it crawlable. Google will then crawl the page, see the directive and keep it out of search results.
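To illustrate the difference, here is the standard noindex robots meta tag (the path in the comment is a placeholder):

```html
<!-- Placed in the page's <head>: the page stays crawlable,
     but Google is told not to index it or list it in results -->
<meta name="robots" content="noindex">
```

By contrast, a robots.txt rule such as `Disallow: /private/` only blocks crawling of that path, so the URL can still end up indexed and shown in search results.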
The canonical tag is an HTML tag that lets Google know that pages carrying the tag are copies of an original page. Canonical tags are used to help Google distinguish the original content source from its duplicate pages.
When you create duplicate content (one or more pages containing the same content as an existing page), search engine spiders get confused about which one to index. This causes a huge issue, as Google divides the page’s authority between the copies. Worst case scenario? Duplicate content without the canonical tag could actually result in a Google penalty for your website.
That’s why it’s important to apply a canonical tag, pointing to the original source, to any duplicate pages you don’t want Google to read as the original. Google will then crawl and index the original source and ignore your duplicate pages.
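As a sketch, the tag sits in the `<head>` of each duplicate page and points at the original version (example.com is a placeholder domain):

```html
<!-- On the duplicate page (e.g. a print version or a URL with
     tracking parameters), point Google at the original content -->
<link rel="canonical" href="https://www.example.com/original-page/">
```

The original page itself needs no special markup, though a self-referencing canonical tag on it does no harm.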
If you use WordPress and have the Yoast SEO plugin installed, you can quickly and easily add a canonical tag without touching any code. To do so, simply open the ‘Advanced’ section of the Yoast SEO meta box and paste the URL of your desired page into the Canonical URL field.
Finally, you will want to click the Meta Robots Index dropdown menu and select ‘noindex’ to ensure your page doesn’t get indexed. When you save your blog post or page, this information will be saved and applied accordingly.
It is absolutely vital to regularly crawl your website for any SEO errors you might not be aware of – consider this a health check-up for your SEO performance. It’s one of the best ways to ensure your website doesn’t get penalized following a Google algorithm update or new page creation.
SEO crawling tools are one of the best investments you can make. They will help you quickly and easily identify any SEO errors that could have a negative impact on your ranking. Some of the best-known crawling tools in the SEO business include:
Below are some of the most damaging errors your SEO crawling tool of choice should help you identify and resolve: