Technical SEO Audit

Technical search engine optimization is done to improve how your site is crawled and indexed. In some niches it is extremely important, while in others the effect is smaller. Either way, if you are building a new website or want to enhance your current one, make sure it is technically up to date.

You can download the checklist here (or on Google Drive); detailed instructions are below.

You can order a technical audit (prices start at 100 USD) or a full SEO audit (from 300 USD) using the order form on the page or by e-mailing seo@fajela.com.

Redirects

You will have to choose whether your site's URLs should include www or not. Whichever you pick, there should be one and only one version: if someone types www.fajela.com, they should be redirected to fajela.com (or vice versa), and this has to work for every page and post. The same goes for aliases (other domains pointing to the same folder as your main domain) and for the http/https versions.

Duplicates have to be merged as follows:

  • when the case differs, e.g. fajela.com/Blog and fajela.com/blog;
  • home, main, index.html and the like (make sure no internal links to those remain);
  • page variations caused by doubled slashes or hyphen variants, e.g. fajela.com/technical-seo-audit vs. technical–seo–audit;
  • duplicate php or html pages, e.g. where page = page.html;
  • non-Latin clones, %%%% variations.

Once again, there should be no variations left in your internal links.
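Duplicate hunting like this is easy to script. Below is a minimal sketch in Python (standard library only); the normalization rules and sample URLs are illustrative, not an exhaustive list, and the `canonicalize` helper is my own name, not a standard function:

```python
import re
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str) -> str:
    """Reduce common duplicate variants of a URL to one canonical form."""
    scheme, netloc, path, query, _ = urlsplit(url)
    scheme = "https"                                  # one protocol only
    netloc = re.sub(r"^www\.", "", netloc.lower())    # one host only
    path = path.lower()                               # /Blog and /blog are duplicates
    path = re.sub(r"/{2,}", "/", path)                # collapse repeated slashes
    path = re.sub(r"/(index|home|main)\.(html?|php)$", "/", path)  # index aliases
    return urlunsplit((scheme, netloc, path, query, ""))

urls = [
    "http://www.fajela.com/Blog//",
    "https://fajela.com/blog/",
    "https://fajela.com/blog/index.html",
]
# All three variants collapse to a single canonical URL:
print({canonicalize(u) for u in urls})
```

Running the canonicalizer over your full crawl and grouping by the result surfaces duplicate clusters that need a 301 redirect or a canonical tag.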

Robots.txt

Lots of people install plugins (especially in WordPress) to generate this file automatically. However, I strongly advise writing it manually and updating it from time to time.

Pages you should make unavailable for indexing include:

  • print versions;
  • admin pages;
  • personal workspace;
  • cart;
  • technical pages.

At the same time, make sure the pages you promote are not blocked from indexing. The classic example everybody knows is putting Disallow: / into robots.txt and then wondering why the site is not indexed :). A great way to double-check yourself (we'll get back to this a bit later) is to test your robots file for errors in Google Search Console. In Bing Webmaster Tools, robots.txt information is available in the Crawl Information report and Index Explorer.

If needed, specify additional restrictions for particular bots. The sitemap address has to be listed in the general instructions (and in the Bing section, if you address that bot separately). When writing robots.txt by hand, it is easy to create conflicts between robots.txt, meta robots tags, sitemaps, and sometimes canonicals: for instance, a canonical may point to a page disallowed in robots.txt, or a restricted page may be listed in the sitemap. Also watch for contradictory directives: if you put noindex in a page's meta robots tag, don't also disallow it in robots.txt (otherwise crawlers may never see the noindex), and exclude it from the sitemap.
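Conflicts between robots.txt and the sitemap can be caught mechanically. A minimal sketch using Python's standard urllib.robotparser (the robots rules and sitemap URLs below are made-up examples): parse the robots file, then flag any sitemap URL that its rules disallow.

```python
from urllib import robotparser

# Hypothetical robots.txt contents:
ROBOTS = """\
User-agent: *
Disallow: /cart/
Disallow: /search/
Sitemap: https://fajela.com/sitemap.xml
"""

# Hypothetical URLs listed in sitemap.xml:
SITEMAP_URLS = [
    "https://fajela.com/blog/technical-seo-audit",
    "https://fajela.com/cart/checkout",  # conflict: disallowed yet in the sitemap
]

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS.splitlines())

# Any URL the rules block should not be in the sitemap:
for url in SITEMAP_URLS:
    if not rp.can_fetch("*", url):
        print("conflict:", url)
```

In a real audit you would fetch the live robots.txt with `rp.set_url(...)` and `rp.read()` and feed in the real sitemap URLs instead of the constants above.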

Other pages you’d better exclude from indexing include:

  • pages without content;
  • site search results;
  • catalogue filters that generate duplicates.

Sitemap

First of all, your sitemap, specified in your robots file, has to be available 🙂 and UTF-8-encoded. You can check it in Google Search Console and Bing Webmaster Tools; there are also lots of third-party services that can validate it.

All the pages in the sitemap must return 200. If there are too many of them, split the sitemap into per-section sitemaps. Modern sites usually don't make a sitemap.html (or .php) page, though old-school websites still do. If you want such a page, make sure it is not overspammed and is excluded from indexing. Once again, utility, technical and disallowed pages should never appear in your sitemap!
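The 200-status check starts with extracting every URL from the sitemap. A minimal sketch with Python's standard library (the sitemap XML here is a made-up example): pull out each <loc> entry; in a real audit you would then fetch each one with an HTTP client and flag anything that doesn't return 200.

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap.xml contents:
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://fajela.com/</loc></url>
  <url><loc>https://fajela.com/blog/technical-seo-audit</loc></url>
</urlset>"""

# The sitemap protocol puts everything in this namespace:
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(SITEMAP_XML)
locs = [loc.text for loc in root.findall("sm:url/sm:loc", NS)]
print(locs)
```

Each URL in `locs` is what you would request and check for a 200 response (and cross-check against robots.txt, as noted above).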

404

Technically speaking, a non-existent page in any section of your website has to return 404 (or a 302 redirect to a 404 page, with a noticeable delay). In any case, don't redirect all 404 requests to your home page. When visitors land on your 404 page, they should immediately understand where they are, and it is better to offer them somewhere to go next: the most popular posts, a search form, or anything else relevant to your business. There should be no 404 responses caused by internal links or broken relative links.

Friendly URL and Structure

You may include a search query keyword in your domain name, but make sure you are not overspamming your page URLs. In any case, don't name a page /seo-seo-seo.html, and avoid extra folders in a landing page URL. For e-commerce sites with lots of filters and catalogues, a link like category/subcategory/features/product.html may be appropriate, but on a smaller website try to avoid it. Restructuring a site is painful, so think ahead about what you want it to look like in five years and build the structure accordingly. Pages with shallower nesting are indexed better.

A promoted URL should not exceed 65 characters and has to be informative: your visitors should understand where they are just by reading it.
The pages you promote should be no more than two clicks away from your home page. A page should carry no more than 100 internal links (50 is better, though unrealistic for e-commerce), and external links should be limited to about 3 and bear a nofollow attribute (that doesn't apply to the social media links in your footer). This is not a hard rule and plenty of exceptions are possible, but it gives you a reference point. Of course, Neil Patel includes a ton of links in his long reads, but that's different: he knows what he's doing.

Another important point: make at least one internal, non-site-wide link to each promoted page, and keep the relevant anchor text unique throughout your website.
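The internal/external link counts suggested above can be checked per page with a small parser. A minimal sketch using Python's standard html.parser (the host name and HTML snippet are made up): count anchors whose host differs from your own as external, everything else as internal.

```python
from html.parser import HTMLParser
from urllib.parse import urlsplit

class LinkCounter(HTMLParser):
    """Count internal vs. external links on a page."""

    def __init__(self, own_host: str):
        super().__init__()
        self.own_host = own_host
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlsplit(href).netloc
        # Relative links have no host, so they count as internal:
        if host and host != self.own_host:
            self.external += 1
        else:
            self.internal += 1

HTML = '<a href="/blog/">Blog</a> <a href="https://example.org/">Reference</a>'
counter = LinkCounter("fajela.com")
counter.feed(HTML)
print(counter.internal, counter.external)
```

Feeding a full page through the counter and comparing against the 100-internal-links guideline turns the rule of thumb into a quick automated check.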

Indexing

This is perhaps the most important part. Everyone writes about how important load speed is for rankings, and sometimes you can reach the top just by optimizing it. Speed also improves indexing: your pages become visible for a greater variety of keywords. Your site should load in no more than 2.5 to 3 seconds. However, check your competition: if their sites load in one second and yours takes two, chances are you won't be able to beat them.

If you have a very small budget and don’t know how you can spend your money to improve your site, Fajela recommends starting with load speed!

Another key aspect is page size: it has to stay below 50 KB (sometimes up to 100). The server has to return the same encoding that is specified in the page's code (in the header). Promoted content should not sit in frames (an embedded YouTube video is an exception). The markup should be neat and clean, without errors; there are many services to validate it, but also look through it with your own eyes, because sometimes really strange things happen. Check for the <head></head> and <body></body> tags as well: they have to be present on the page.

As already mentioned, pages must return a 200 response, and search engine crawls should not generate session variables in URLs.

Useful Duplicates Optimization

It's now common practice to specify canonical pages with the canonical attribute, and there should be no conflicts between them. This is also a way to handle pagination: the canonical points to the first page, and pagination is implemented with AJAX, scripts, the rel="next" attribute and previews. The same goes for filter (or sorting) pages: mark them as non-canonical and add meta name="robots" content="noindex, follow". Don't also block these pages in robots.txt, though, or crawlers won't see the noindex directive. And avoid nofollow on links to your own inner pages: it is your own content.
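These per-page tags are easy to audit. A minimal sketch with Python's standard html.parser (the head snippet below is made up): pull out the canonical URL and the robots meta directives, so conflicts between them, robots.txt, and the sitemap can be flagged.

```python
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    """Collect the canonical URL and robots meta directives from a page."""

    def __init__(self):
        super().__init__()
        self.canonical = None
        self.robots = ""

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots = a.get("content", "").lower()

# Hypothetical head of a filter page pointing its canonical at the catalogue:
HEAD = """<head>
<link rel="canonical" href="https://fajela.com/catalogue/">
<meta name="robots" content="noindex, follow">
</head>"""

audit = HeadAudit()
audit.feed(HEAD)
print(audit.canonical, "|", audit.robots)
```

Running this over every page lets you assert, for example, that no URL listed in the sitemap carries noindex, and that no canonical points at a blocked page.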

Webmaster and Other Tools

These are very basic things; nevertheless, make sure your website:

  • has an account in Google Search Console and Bing Webmaster Tools, set up and configured;
  • has Analytics account (preferably set up through Tag Manager);
  • has proper citations including Google Business and Bing Places for Business;
  • is available in Google and Bing Maps;
  • has a favicon;
  • is on a good hosting account and has nice neighbors there.

In both Google Search Console and Bing Webmaster Tools, check that your site has no crawl errors, HTML errors, 404 errors, etc. Google also has a wonderful tool for checking your pages' speed and quality issues; do use it, as it even suggests how to deal with the errors it finds. If your website has several language versions, make sure the hreflang attribute is working the right way.
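The hreflang setup can be spot-checked with a small parser as well. A sketch using Python's standard html.parser (the head snippet is hypothetical): collect the declared alternates and flag duplicate language codes; a fuller audit would also verify that every alternate page links back.

```python
from html.parser import HTMLParser

class HreflangCheck(HTMLParser):
    """Collect hreflang alternates and flag duplicate language codes."""

    def __init__(self):
        super().__init__()
        self.alternates = {}   # language code -> URL
        self.duplicates = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "alternate" and "hreflang" in a:
            lang = a["hreflang"]
            if lang in self.alternates:
                self.duplicates.append(lang)  # same code declared twice
            self.alternates[lang] = a.get("href")

# Hypothetical alternates block from a bilingual page:
HEAD = """<link rel="alternate" hreflang="en" href="https://fajela.com/">
<link rel="alternate" hreflang="de" href="https://fajela.com/de/">"""

check = HreflangCheck()
check.feed(HEAD)
print(sorted(check.alternates), check.duplicates)
```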

Even if most of your visitors come from a desktop, your mobile version should be there and working fine.

Structured data markup is becoming extremely important. Google offers a good service for checking it for errors, and you can rely on that tool no matter what you use to generate your markup.

Navigation

If you use breadcrumbs (and if you run an online store, you have to use breadcrumbs), they must be configured and work correctly: no overspamming, easy to navigate. Overall navigation has to be accessible to search bots, so avoid building it with scripts.

Links have to be non-circular. What does that mean?

  • The logo on your home page (which links to the home page) is not clickable;
  • The menu button for the section you are currently in is not clickable;
  • A breadcrumb pointing to the page you are on right now is not clickable either.

The menu should not disappear when images and scripts are disabled. The only exception is a bottom menu: it should differ from your top menu, but if it is similar, build it with scripts. Sometimes menus are made out of images without relevant titles; if the images fail to load or are disabled, the whole navigation becomes invisible.

Avoid tag clouds, but if you do use them, don't wrap them in heading tags (h1, h2, h3, etc.). Most themes I've seen have such tags built in, and you should get rid of them: it's very bad for your rankings. Again, site search has to be available to users but closed in robots.txt. In Analytics, add your search query parameter in the admin panel to see what people look for on your site.

Clicking the logo, home image or home button should lead to the root URL, not open a duplicate index page.

Inbound Links Validation

It might seem there's nothing here for a technical SEO audit, but a good specialist has to look at inbound links to see the whole picture. Besides, some links may trigger errors in Google Search Console, and you'll have to deal with them by asking Google for a review and adding others to your disavow file.

At Fajela, we usually use Search Console, SEMrush and Ahrefs to ensure that:

  • There are no link explosions or sudden spikes in links;
  • The anchor list is adequate and not overspammed;
  • Donor sites are indexed, are not spammers, don't sell links, and send you some traffic;
  • You have links from social media;
  • Your competitors are not far ahead of you in those links.

UX

User behavior on your pages is sometimes a ranking factor, and it is certainly a great conversion factor. Make sure it is easy for visitors to return to the home or starting page no matter where they are on your website. This is part of navigation, but check once more that they can move up and down within a section or jump to an adjacent one. Do they immediately understand where they are on your site? What do they see when they enter a wrong URL? Try walking in their shoes: are your pages convenient to use?

Content Optimization

Provided the keywords were chosen correctly and distributed the right way, pick a page you want to promote and check that:

  • There is a good, unique title (compared both to your other pages and to competitors) and description;
  • Headings comply with the layout and are accessible; no heading tags in menus, links, images, or purely for design;
  • The text is divided into paragraphs, well structured, easy to read, unique, has a low rewrite percentage and matches search intent;
  • If there are lists, they are not overspammed, not too long or too short;
  • Theme-related, unique and relevant pictures are present;
  • The text carries the right keywords and synonyms, looks natural, and its keyword density stays within the norms;
  • Relevant internal links with adequate anchors are present, clearly visible and distinguishable from the surrounding words.

Ordering Technical SEO Audit

If you want to hire Fajela to audit your site, simply fill in the form below and we'll get back to you as soon as possible. The technical audit will follow the checklist, and a report with matching recommendations on what to do and how to do it will be prepared.

If you incorporate all the recommended changes in one week, new technical SEO analysis is FREE OF CHARGE!

