In this era of Google semantic search, on-page and off-page SEO don't work as effectively as they used to. They remain important parts of any SEO strategy, but if the technical foundation is broken, your other SEO efforts may not bring any results.
To get that foundation right, follow our technical SEO checklist.
It is possible that you have more than one version of your website, such as:

http://mysite.com
http://www.mysite.com
https://mysite.com
https://www.mysite.com
All versions should point to the preferred, correct version of your website. If users visit any of the others, they should be automatically redirected to it.
For instance, if your preferred site is https://www.mysite.com, all the others should 301-redirect to it. You can test this using the cognitiveSEO tool: go to Indexability > Preferred Domain. You should get an "All is OK" message.
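If redirects are missing, they are usually set up at the server level. Below is a minimal sketch for an Apache server using mod_rewrite, assuming https://www.mysite.com is the preferred version; adapt the host name for your own site, and note that nginx and other servers use different syntax:

    # .htaccess: send http:// and non-www traffic to https://www.mysite.com
    RewriteEngine On
    RewriteCond %{HTTPS} off [OR]
    RewriteCond %{HTTP_HOST} !^www\. [NC]
    RewriteCond %{HTTP_HOST} ^(?:www\.)?(.+)$ [NC]
    RewriteRule ^ https://www.%1%{REQUEST_URI} [L,R=301]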
The robots.txt file is a simple text file. It tells search engines which areas of your website are out of bounds and which are not.
To check whether you have such a file at all, visit yourdomain.com/robots.txt. If a plain text file is displayed, you are on track. If not, you need to create one. Search Google for "robots.txt generator."
You can also check the file using this tool:
Google Search Console > Crawl > robots.txt Tester
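For reference, here is a minimal sketch of a typical robots.txt file. The Disallow and Allow paths and the sitemap URL are hypothetical examples, not rules your site necessarily needs:

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php
    Sitemap: https://www.mysite.com/sitemap.xml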
The structure of a website is vital to keeping visitors engaged longer on the site. For search engines, site structure counts hugely toward understanding and indexing a website easily. Google takes your entire site structure into account when evaluating a particular page.
Google's own guidelines say as much: a well-defined structure helps webmasters point Google to their important content and earn higher rankings.
Breadcrumbs got Hansel and Gretel home. Breadcrumbs on websites serve the same purpose: to guide the user through the website. Visitors can see where they are on the site and find their way back up the hierarchy easily.
Breadcrumbs enrich the user experience and also render a clear picture of the site structure to search engines.
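Search engines pick breadcrumbs up most reliably when they are marked up with structured data. A minimal sketch using schema.org's BreadcrumbList in JSON-LD, with hypothetical page names and URLs:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "name": "Home",
          "item": "https://www.mysite.com/" },
        { "@type": "ListItem", "position": 2, "name": "Blog",
          "item": "https://www.mysite.com/blog/" },
        { "@type": "ListItem", "position": 3, "name": "Technical SEO Checklist" }
      ]
    }
    </script>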
Existing crawl errors can be identified quickly using Google Search Console. Examine the Coverage report and you can see both errors and excluded pages. Resolve the errors and figure out why URLs were excluded; common causes are 404 errors and wrongly canonicalized pages.
Crawl budget is the number of pages on your site that search engines are able to crawl in a given time frame. By itself, it is not a ranking factor, but it indicates how often your pages are being crawled, or whether they are being crawled at all.
Google Search Console's crawl stats show how often Google crawls your site; in the example we examined, the average number of times Google crawled the site was 7,906. This crawl rate can be adjusted. If Googlebot makes so many requests per second that it hampers your server, you can lower the crawl rate limit. If, on the other hand, the crawl rate is low, try submitting a sitemap or requesting indexing through the URL Inspection tool.
A few tips to make the most of your crawl budget:

Improve your page speed so that Googlebot can fetch more pages in each session.
Fix broken links and long redirect chains, which waste crawls.
Block unimportant URLs (such as internal search results) in robots.txt.
Keep your XML sitemap up to date.
Time is vital on the web, yet most sites worldwide are slow, taking 19 seconds on average to load. Studies have shown that users abandon a page if it does not load within 3 seconds.
Faster loading times mean higher conversions and lower bounce rates. Using Google's speed test, you can run a quick and easy analysis of your website's speed.
To understand how your website is faring, consider page load distributions.
There are two metrics involved here: First Contentful Paint (FCP) and DOMContentLoaded (DCL). These indicate which content loads quickly and which needs improvement.
Also, speed and optimization indicators reveal where the website stands.
Sometimes users can see your page and everything on it, but Google can’t. And if Google cannot access your page fully, it will not rank.
You can use Google Search Console's URL Inspection feature for this. Just enter the page URL in the bar at the top of GSC.
You will then view your page from Google’s perspective.
Mobile searches are rising dramatically, so having a mobile-friendly website is very important. You can check whether your site needs any updates with Google's Mobile-Friendly Test tool.
Google is very concerned with broken links because they hurt the user experience, and its algorithms penalize sites that deliver a poor one.
To discover broken links on your website, you can use Ahrefs' free Broken Link Checker.
Internal links interconnect your pages. You can build a powerful website by spreading link juice, or link equity as some refer to it, between them. Grouping and interlinking related pieces of content is known as siloing.
When auditing internal links, check the following:

Broken internal links
Orphan pages that no internal links point to
Click depth (important pages should be only a few clicks from the homepage)
Redirect chains and loops
All of these can be accomplished with CognitiveSEO Site Audit Tool.
If you have still not moved to HTTPS in 2020, you are heading for serious trouble. Ranking potential apart, HTTPS protects your visitors' privacy, even more so if there are contact forms on your website. If passwords or payment information are involved, HTTPS is an absolute must.
Duplicate content is confusing for visitors, no doubt, but more importantly from a technical SEO viewpoint, it confuses search engine algorithms. Search engines frown on it; both Google and Bing urge webmasters to fix duplication issues.
It commonly occurs on e-commerce sites because of faceted navigation. You can find duplicate content issues with Ahrefs' Site Audit.
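One standard remedy, once duplicates are found, is a canonical tag that points search engines at the version you want indexed. A minimal sketch with a hypothetical URL; it goes in the <head> of each duplicate page:

    <link rel="canonical" href="https://www.mysite.com/shoes/" />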
This is a file that helps search engines understand your website while crawling it. It is a roadmap telling search engines where each page is located.
How does one create it?
For WordPress users, there is Yoast.
Your sitemap is extremely important. It informs search engines about your site structure and surfaces any fresh content.
If you are missing a sitemap on your website, you really do need to put one up. You can do it easily using the Website Auditor.
When you check your sitemap, it should be:

Error-free
Up to date, so new content is picked up quickly
Concise (the sitemap protocol allows at most 50,000 URLs per file)
Registered in Google Search Console
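For reference, a minimal sketch of what a valid XML sitemap looks like, with hypothetical URLs and dates:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.mysite.com/</loc>
        <lastmod>2020-01-15</lastmod>
      </url>
      <url>
        <loc>https://www.mysite.com/blog/technical-seo-checklist/</loc>
        <lastmod>2020-01-10</lastmod>
      </url>
    </urlset>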
AMP is a project backed by Google. Its goal is to speed up content delivery on mobile devices using a special subset of HTML called AMP HTML. AMP versions of your webpages load super-fast on mobile devices, so these versions have greater odds of reaching users and being shared. Dwell time increases, as do the backlinks pointing to your content.
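The regular page and its AMP counterpart need to reference each other so that Google can connect the two. A minimal sketch with hypothetical URLs:

    <!-- In the <head> of the regular page -->
    <link rel="amphtml" href="https://www.mysite.com/blog/post/amp/">

    <!-- In the <head> of the AMP page -->
    <link rel="canonical" href="https://www.mysite.com/blog/post/">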
Pagination is used when a long page needs to be split into several shorter pages, and when paging is enabled on your category pages. To avoid duplicate content and consolidate links and PageRank onto the main page, use rel="next" and rel="prev" links, which indicate to search engines that the succeeding pages are a continuation of the main page.
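A minimal sketch of these tags as they would appear in the <head> of page 2 of a paginated series, with hypothetical URLs:

    <link rel="prev" href="https://www.mysite.com/blog/page/1/">
    <link rel="next" href="https://www.mysite.com/blog/page/3/">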
If you have content in more than one language on your site, use the hreflang attribute to inform Google about your site structure and the alternate language versions.
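A minimal sketch for a page available in English and German; the same set of tags goes in the <head> of every version, and the URLs are hypothetical:

    <link rel="alternate" hreflang="en" href="https://www.mysite.com/en/page/">
    <link rel="alternate" hreflang="de" href="https://www.mysite.com/de/page/">
    <link rel="alternate" hreflang="x-default" href="https://www.mysite.com/page/">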
Structured data markup is code added to your website that helps search engines make better sense of its content. Structured data helps search engines index your site more effectively and return more relevant results. It also enhances search results with 'rich snippets' such as star ratings on reviews, product prices, and more.
Rich snippets are visually appealing and draw searchers' attention. The click-through rate (CTR) improves, and more traffic is driven to your site. The reward of a higher CTR is preferential treatment by search engines, so it is well worth the effort to add structured data to your site.
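As an illustration, a minimal sketch of schema.org Product markup in JSON-LD, the kind of structured data that can yield star ratings and a price in the results; all names and numbers are hypothetical:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Widget",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.7",
        "reviewCount": "89"
      },
      "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD"
      }
    }
    </script>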
Both of these are free tools, from Google and Microsoft respectively. They let you submit your website to their search engines for indexing. When you are ready to launch your website, forward its XML sitemap to both Google Search Console and Bing Webmaster Tools so that they begin to crawl your new site and show its pages in the search results.
These services let you monitor the general performance of your website from a search engine’s perspective.
Other things that can be accomplished with these tools include:

Monitoring how many of your pages are indexed
Seeing which search queries bring visitors to your site
Spotting crawl errors as they appear
Receiving alerts about manual actions or security issues
In the process of checking up on your website’s health, you should also keep an eye on the competition.
For instance, are their pages loading fast? Are their sites mobile-friendly? And so on.
It’s an advantage to know where you stand compared to your competition. There are tools that help you glean information to outperform the competition and pinpoint weaknesses.
Nacho Analytics and SpyFu are two such tools that will provide an organic snapshot of your competitors.
The aforementioned site audits will surface issues on your site that need to be fixed. Once you have corrected them, you will need Google to re-crawl the affected pages to ensure the changes take effect.
Using Google Search Console, submit the updated URLs through the URL Inspection tool: enter the page URL you want re-crawled and hit Request Indexing.
SEO is an ongoing, dynamic process, and it is virtually impossible to cover everything important in one checklist. But by tackling the technical SEO checklist items above, you will get a head start toward higher rankings.