
The first step in getting your website to rank in search engines is to check its technical integrity from a search engine's point of view. This is accomplished with a technical SEO audit. The technical foundations of your website are the basics of SEO and should be the starting point of any campaign. Without fixing these foundational issues, your website will struggle to rank highly in search engines.

There are many free tools that will run a technical SEO audit on your website; however, these tools usually scan only one page rather than the entire site. They can help you get a sense of where your website may be lacking, but they fail to paint a complete picture.

The Technical SEO Audit Checklist

To maximize your rankings in search engines, you'll want to hire a company to run a full technical SEO audit on your website. Ignition Digital Marketing provides these services, and we'll also fix all the issues for you. Alternatively, you can try to do it yourself by following the steps below. These are the same items we look for when we perform our technical SEO audits.

4XX Errors

These errors normally occur because a page does not exist (404), requires authentication (401), or is forbidden to access (403). Make sure you deal with each type of code appropriately so the page can be crawled.

Having a lot of 404 errors tells a search engine that your website is broken and can result in a bad user experience. This is especially harmful to your ranking if a page that ranks in the search results returns a 404. To fix these issues, use 301 redirects to send the user to a working page.

A 301 redirect is a permanent redirect, telling search engines that the content has been permanently relocated to another page. A 301 redirect will also pass on the ranking signals of the old broken page.
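
As an illustration, on an Apache server you could add a rule like this to your .htaccess file (the paths and domain here are placeholders for your own URLs):

# Permanently redirect an old broken URL to a working page
Redirect 301 /old-broken-page/ https://www.example.com/new-page/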

5XX Errors

These are fatal errors that will prevent anyone, including search engines, from accessing your website. They are normally caused by a programming bug or a server misconfiguration.

The best way to fix these errors is to contact your web developer or your web hosting provider, such as Bluehost, as it may be a server-side issue. Otherwise, 5XX errors are usually caused by broken code on the website itself.

WordPress websites can start generating 5XX errors if a core update fails and files become corrupted during the update process.

Blocked from Being Indexed

If a page contains a noindex meta tag, it tells a search engine not to include that page in its index. This may have been done intentionally, but if not, the page will have no presence in search results. Simply remove the noindex meta tag to resolve this.

There are legitimate reasons to keep a page out of the index, such as specific landing pages that are tied to ad campaigns.
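
For reference, the noindex tag sits inside the page's <head> section and looks like this:

<meta name="robots" content="noindex">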

Broken External Images

An external image is one hosted on another website, and it is considered broken when it will not load. Generally, it is bad practice to reference external images since it limits your control. A simple solution is to download the image and host it on your own site.

Broken External Links

An external link is one that points to another website and is considered broken when that page cannot be accessed. Since the linked website is not under your control, your best option is to remove the link. Otherwise, it will diminish the reliability of your website in the eyes of search engines and visitors.

Broken Internal Images

An internal image is one hosted within your own website, and it is considered broken when it will not load. This can occur because the file does not exist, or because the image is too large and times out intermittently when loading.

Broken Internal Links

An internal link is one that points to another page on your own server and is considered broken when that page cannot be accessed, either because it does not exist or because there is an error connecting to it. Make sure the URL is entered correctly and clear up any issues with the target page. Excessive broken links will not only hurt your visitors' experience, they may also cause search engines to diminish the importance of your website.

Doctype not Declared

A doctype is the first thing that appears in your page's source code and instructs a web browser which version of HTML you are using. If it is not specified, your code could be interpreted incorrectly and become uncrawlable.
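
For modern HTML5 pages, the declaration is simply the following, placed on the very first line of the document:

<!DOCTYPE html>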

Duplicate Content

A page is considered to have duplicate content if it contains text that is very similar to another page. Duplicate content diminishes the quality of both pages since it is unclear which one is more relevant to a given topic. Because there is no reason for a search engine to index the same content twice, it may end up filtering both pages out of its results.

Duplicate Meta Descriptions

A meta description is a hidden tag that describes the purpose of a page. Search engines may use this description in the listing for your site and in determining the topic of the page. If the same description is used on multiple pages, it may be difficult to differentiate between them. Make sure meta descriptions are unique and use topical keywords to describe the content of each page.

Duplicate Titles

A title is considered a duplicate if it exactly matches the title of another page. Duplicate titles diminish the quality of a page since it is unclear which page is more relevant to a given topic. They also confuse users navigating your site.

Encoding not Declared

Specifying an encoding for a page ensures that each character is displayed properly. This is usually set to utf-8, but it can differ depending on the language of the page. Example: <meta charset="utf-8">

Flash Content Used

It is generally a bad idea to use Flash on your website. Search engines cannot interpret Flash content and may skip over your page when crawling it. It also creates a bad user experience: visitors have to wait for it to load, and it does not work at all on most mobile devices.

Frames Used

Using HTML frames is considered dated, and they should be avoided. They are difficult for a search engine to read and create a bad user experience. Try to remove any frames from your pages in favor of modern methods that accomplish the same thing.

HTTPS Redirect

Every website should be accessible securely over an HTTPS URL. To make sure visitors always use HTTPS, your website should redirect any HTTP request to its HTTPS equivalent.
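
As an example, on an Apache server with mod_rewrite enabled, rules like these in your .htaccess file will force every request over to HTTPS (your own server setup may differ):

# Force all traffic over HTTPS
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]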

Incorrect URLs in Sitemap.xml

A sitemap.xml lists all the public pages of your website so a crawler can easily find them. You should only include pages that you want a search engine to crawl. An error is flagged if any listed URL cannot be found.

Invalid Sitemap.xml Format

A sitemap.xml lists all the public pages of your website so a crawler can easily find them. You should only include pages that you wish a search engine to crawl. An error is triggered if the XML syntax is incorrect.
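
For reference, a minimal valid sitemap.xml looks like this, with example.com standing in for your own domain:

<?xml version="1.0" encoding="UTF-8"?>
<!-- example.com is a placeholder; list one <url> entry per public page -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
</urlset>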

Large Page Size

To keep page load time low, you should try to minimize the amount of content and HTML on each page. Generally, a page's file size should be less than 2 MB to avoid any search engine penalties.

Long Title

Any title with more than 70 characters is generally considered too long. Most search engines will truncate a title that long in their results. A long title can also hurt your site, especially if it is stuffed with keywords.

Long URLs

URLs longer than 100 characters are considered less than ideal for SEO. A long URL is difficult to read or share and can even cause problems with some browsers or applications.

Low Text to HTML Ratio

The amount of text compared to HTML tags represents your text-to-HTML ratio. This test fails if the ratio is less than 10%. A search engine can only look at your text to determine the page's relevance; if there is an abundance of HTML compared to actual content, it will have difficulty isolating that content. Too much HTML may also cause your page to load more slowly.

Low Word Count

This test fails if a page has fewer than 200 words. If a page does not have much content, it is hard for a search engine to assign a topic to it, and it may not bother indexing the page at all. Aim for substantive, relevant content that naturally incorporates your target keywords.

Matching H1 and Title Content

Using the same title as your H1 content is an ineffective way of defining the page topic. Use this opportunity to create two distinct phrases that illustrate the purpose of the page.

Missing Alt Attributes

An alt attribute is used to describe an image in text. Search engines may read the alt attribute to identify the purpose of the image, so this is a great way to increase your page's relevance to a topic.
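
For example, an image tag with a descriptive alt attribute might look like this (the file name and wording are purely illustrative):

<!-- Descriptive alt text helps search engines understand the image -->
<img src="red-running-shoes.jpg" alt="Red running shoes photographed on a white background">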

Missing Canonical Tag

Canonical tags help you avoid duplicate content when the same content is accessible via multiple URLs. Defining a correct canonical tag on every page keeps them clear of potential duplicate-content issues.
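
A canonical tag goes in the page's <head> and might look like this, with example.com standing in for your own domain:

<link rel="canonical" href="https://www.example.com/preferred-page/">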

Missing Canonical Tags in AMP Pages

AMP stands for Accelerated Mobile Pages, a format that strips down a page's HTML so it renders faster on mobile devices. Because the AMP version duplicates the content of the regular page, each AMP page should include a canonical tag pointing to the standard version so search engines know which URL to index.
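
For example, the two versions of a page reference each other like this (all URLs here are placeholders):

<!-- On the standard page, point to the AMP version -->
<link rel="amphtml" href="https://www.example.com/article/amp/">

<!-- On the AMP page, point back to the standard version -->
<link rel="canonical" href="https://www.example.com/article/">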

Missing H1

H1 tags serve as the main heading of a page and help define its topic. Creating a descriptive heading is an effective way to improve your search engine presence and make it easier for users to navigate your page.

Missing Meta Description

A meta description is a hidden tag that describes the purpose of a page. Search engines may use this description in the results listing and in determining the topic of the page. Make sure each of your pages has a meta description that is unique and topical.
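
For example, a meta description sits in the page's <head> (the wording here is illustrative only):

<meta name="description" content="Learn how to run a technical SEO audit on your website with this step-by-step checklist.">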

Missing Sitemap.xml Reference

If your site contains both a robots.txt and a sitemap.xml file, it is a good idea to reference the location of sitemap.xml within robots.txt. The robots.txt file is one of the first things a search engine reads when crawling your site, so it pays to make it easy for the crawler to find the links you want indexed.
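
For example, a simple robots.txt that points crawlers at your sitemap might read as follows (the domain is a placeholder):

# Allow all crawlers and tell them where the sitemap lives
User-agent: *
Allow: /
Sitemap: https://www.example.com/sitemap.xml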

Missing Title

A <title> tag is one of the most important components of a page. It is often used as the link to your page in search results and should describe the purpose of the page in a few words.
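
For example, a title tag sits in the page's <head> and might look like this (the wording is illustrative):

<title>Technical SEO Audit Checklist | Example Company</title>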

Missing Viewport Tag

This is a meta tag that allows you to control the scale at which the page appears on a mobile device, ensuring the page is neither too small nor too large and is easily legible on the user's screen.
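
The standard form of the tag, placed in the page's <head>, is:

<meta name="viewport" content="width=device-width, initial-scale=1">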

Multiple H1 Tags

Generally, it is best to have only one H1 tag on a page to specifically define its topic. Multiple H1 tags can confuse a search engine or a user in determining the focus of the page.
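
For example, keep a single H1 for the page's main topic and use H2s for subtopics (headings below are illustrative):

<h1>Technical SEO Audit Checklist</h1>
<h2>4XX Errors</h2>
<h2>5XX Errors</h2>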

NoFollow Attributes in External Links

If a link contains rel='nofollow', it instructs a search engine not to follow the link or pass ranking value through it. This may be intentional, but if you want the link to pass link juice, remove the nofollow attribute.
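
For reference, a nofollowed link looks like this; deleting the rel attribute turns it back into a normal, crawlable link (the URL is a placeholder):

<a href="https://www.example.com/" rel="nofollow">Example link</a>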

NoFollow Attributes in Internal Links

If an internal link contains rel='nofollow', it instructs a search engine not to follow it or pass ranking value through it. This may be done on purpose, but if you wish to pass link juice to your own pages, remove the attribute.

Overused Canonical Tags

Canonical tags are used to identify duplicate pages so that Google indexes only one URL. Make sure all of your pages are not pointing to the same canonical URL, or that single page may end up being the only one indexed.

Robots.txt Blocking Crawlers

A robots.txt file gives instructions to web crawlers, including search engines, about which pages of a website they may crawl. This lets you choose which pages you want indexed on Google, for example. Any errors in this file could cause a search engine to skip indexing your website entirely.
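
As a cautionary example, these two lines are all it takes to block every crawler from your entire site, so double-check that your file does not contain them unintentionally:

# WARNING: this blocks ALL crawlers from the ENTIRE site
User-agent: *
Disallow: /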

Robots.txt not Found

A robots.txt file gives instructions to web crawlers, including search engines, about which pages of a website they may crawl. This lets you choose which pages you want indexed on Google, for example. Missing this file may cause some of your pages to be ignored by Google.

Short Title

Generally, using short titles on web pages is recommended. However, keep in mind that titles of 10 characters or fewer do not provide enough information about what your page is about, and they limit your page's potential to show up in search results for different keywords.

Sitemap.xml not Found

A sitemap.xml is simply a list of the pages on your site that you would like a search engine to index. It makes it easier and faster for a search engine to crawl your site and notifies it of any new or updated pages.

Slow Page Load

A slow page is frustrating for users and lowers your relevance in the eyes of a search engine. Most users will not put up with a slow page and will go elsewhere; a search engine understands this and will do the same. This test fails if the page takes longer than 7 seconds to load.

Temporary Redirects

Temporary redirects are triggered when a page returns a 302 or 307 HTTP status code, meaning the page has moved temporarily to a new location. Although the page will still be indexed by a search engine, it will not pass any link juice to the redirect target.
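
If the move is actually permanent, swap the temporary redirect for a permanent one. On an Apache server, for example, that is the difference between these two rules (paths and domain are placeholders):

# Temporary: does not pass ranking signals
Redirect 302 /seasonal-sale/ https://www.example.com/current-sale/

# Permanent: passes ranking signals to the new page
Redirect 301 /seasonal-sale/ https://www.example.com/current-sale/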

Too Many On-Page Links

This test fails if a page has more than 500 links. Having too many links on a page can overwhelm the user and offer too many exit points. Search engines also limit the number of links they crawl on a page.

Too Many URL Parameters

Overusing parameters in a URL is not the proper way to segment a page. It often creates an ugly URL that is hard to read and makes it difficult to pick out any defining keywords. Generally, parameters should be transformed into a path-based structure (i.e., /param1/param2). This test fails if there are more than 2 parameters in the URL.
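
For instance, a parameter-heavy URL and its cleaner path-based equivalent might look like this (both URLs are illustrative):

https://www.example.com/products?category=shoes&color=red
https://www.example.com/products/shoes/red/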

Underscores in URL

Underscores are technically allowed in a URL, but they are bad practice in terms of SEO. Separating words in a URL is a good idea; however, you should use hyphens (technical-seo-audit) rather than underscores (technical_seo_audit) to accomplish this.

Hire The Pros To Fully Audit And Fix Your Website

Running a technical SEO audit on your own website can be a daunting task. Even though you now know everything you need to check, it's often best to leave it to the pros who have been doing it for over 10 years. Contact us today and we'll help you get started on your path to search engine dominance.