Conducting an SEO audit of the site - part one

06.10.2014

Many believe that website promotion is a dense forest in which an inexperienced hunter for high traffic can easily get lost. In fact, that is exactly how it is :) Still, we strongly recommend that you "not be afraid of the wolves" and join us on a fascinating journey through the stages of analyzing a resource's technical characteristics from an SEO point of view.

Handy services for SEO analysis of a site

To begin with, it would be nice to get a general idea of the resource by analyzing internal and external links, code, images, content and other components of the site using several convenient services:

  1. Yandex.Webmaster;
  2. Google Webmaster Tools;
  3. Xenu's Link Sleuth;
  4. Screaming Frog SEO Spider.

Having gathered this much valuable information, you can press on and examine the components of the resource more closely.

Robots.txt file

Robots.txt, the file that implements the robots exclusion standard, sits in the root of the site and is used to control how the resource is indexed. The file lets you block search robots from accessing certain pages or the entire site. In addition, it supports a number of other useful settings:

  • specifying the main mirror of the site;
  • specifying the path to the sitemap (sitemap.xml) for proper indexing;
  • limiting how often the robot may load pages within a given period;
  • blocking access for certain search engines or for individual robots of the same search engine.

You can create many crawling rules depending on the goals and wishes of the site owner.

From all of the above, it is clear that robots.txt is an extremely important element of promotion. During a site analysis, make sure that this file is configured properly and that duplicate pages or sections with low-quality content are not being indexed. The familiar Yandex.Webmaster and Google Webmaster Tools will help with this, and a basic check can also be scripted, as sketched below.
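
As an illustration, here is a minimal sketch (the domain, paths, and robot names are placeholders, not taken from any real audit) that uses Python's standard urllib.robotparser module to see which URLs a live robots.txt blocks for particular crawlers:

    import urllib.robotparser

    # Load the site's robots.txt (example.com is a placeholder domain)
    parser = urllib.robotparser.RobotFileParser()
    parser.set_url("https://example.com/robots.txt")
    parser.read()

    # Check whether specific URLs are open to particular crawlers
    for url in ("https://example.com/", "https://example.com/admin/"):
        for agent in ("Yandex", "Googlebot"):
            allowed = parser.can_fetch(agent, url)
            print(agent, url, "allowed" if allowed else "blocked")

Note that this module only understands the basic User-agent, Allow, and Disallow rules; search-engine-specific directives such as the main mirror are better verified in Yandex.Webmaster itself.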

Sitemap

The XML sitemap is also located in the root directory; the correctness and speed of indexing of all the site's pages depend on it. Thanks to this file, search engines know which pages to crawl first and how often they are updated.

A sitemap is especially important for sites with a large number of pages. It is important to tell the robot the priority of certain sections relative to the rest, since it cannot always find all the pages and judge their significance on its own. During an SEO audit, check that the file's code is valid using the validators in the webmaster tools from Google and Yandex.

A sitemap file must not exceed 10 MB or contain more than 50,000 links. If these limits are exceeded, you need to create several sitemap files plus a sitemap index file that lists all of the maps.
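
These limits are easy to verify with a short script. A minimal sketch, assuming a local copy of the sitemap at a placeholder path:

    import os
    import xml.etree.ElementTree as ET

    SITEMAP_PATH = "sitemap.xml"  # hypothetical local copy of the sitemap
    NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

    size_mb = os.path.getsize(SITEMAP_PATH) / (1024 * 1024)
    urls = ET.parse(SITEMAP_PATH).getroot().findall(NS + "url")

    print("File size: %.2f MB (limit: 10 MB)" % size_mb)
    print("URL count: %d (limit: 50,000)" % len(urls))
    if size_mb > 10 or len(urls) > 50000:
        print("Split the sitemap and create a sitemap index file.")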

It may also turn out that there is no sitemap at all; in that case it has to be created manually or with the help of special services. Then add the path to the sitemap to the robots.txt file and submit the map to search engines using the webmaster tools already mentioned.
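
If you generate the file yourself, the structure is simple. A minimal sketch that writes a two-page sitemap (the URLs, change frequencies, and priorities are made-up examples):

    import xml.etree.ElementTree as ET

    # Placeholder pages with an update frequency and priority for each
    pages = [
        ("https://example.com/", "daily", "1.0"),
        ("https://example.com/blog/", "weekly", "0.8"),
    ]

    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, changefreq, priority in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "changefreq").text = changefreq
        ET.SubElement(url, "priority").text = priority

    ET.ElementTree(urlset).write("sitemap.xml",
                                 encoding="utf-8", xml_declaration=True)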

Website indexing quality

The "site:" operator will help you find out how many pages of the site a search engine has indexed: it limits the search to a single domain, so all crawled pages of the resource appear in the results. Enter "site:" immediately followed by the domain name, then compare the number of indexed pages with the actual number of pages on the site.
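
The "actual number of pages" side of this comparison can be taken from the sitemap. A minimal sketch, assuming the sitemap lists every page and the indexed count from the site: query is typed in by hand (both the URL and the count below are placeholders, and the 20% thresholds are only a rough rule of thumb):

    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder
    INDEXED_COUNT = 1200  # number reported by the site: query, entered by hand

    with urllib.request.urlopen(SITEMAP_URL) as response:
        root = ET.fromstring(response.read())

    NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
    actual = len(root.findall(NS + "url"))

    print("Pages in sitemap:", actual, "/ pages indexed:", INDEXED_COUNT)
    if INDEXED_COUNT > actual * 1.2:
        print("The index may contain duplicate pages.")
    elif INDEXED_COUNT < actual * 0.8:
        print("The site may be poorly indexed (see the reasons below).")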

The resource is well indexed if the two numbers are almost the same. If the search turned up many more links than actually exist, the site most likely has a lot of duplicate content. It may also turn out that far fewer pages appear in the SERP. Possible reasons for poor indexing:

  • many pages on the site are closed to crawling;
  • the resource is not optimized for search engines;
  • the site has been penalized by search engines.

Site pages

Be sure to check the HTTP status codes: pages should not return 404 (page not found), 500 (internal server error), 503 (service unavailable), or other errors. The handy HTTP Header Status Checker tool from Monitor Backlinks, as well as the Google and Yandex services for site owners mentioned so often in this article, will help with this; a basic check can also be scripted, as sketched below.
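
A basic version of this check is easy to write with Python's standard library. A minimal sketch (the list of URLs is a placeholder):

    import urllib.request
    import urllib.error

    # Placeholder list of URLs to check
    urls = [
        "https://example.com/",
        "https://example.com/missing-page/",
    ]

    for url in urls:
        try:
            with urllib.request.urlopen(url) as response:
                print(url, response.status)
        except urllib.error.HTTPError as err:
            print(url, err.code)  # e.g. 404, 500, 503
        except urllib.error.URLError as err:
            print(url, "unreachable:", err.reason)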

SEO requirements for site URLs

Ideally, all URLs on a site should be SEO friendly:

  • consist of easily readable words;
  • be no more than 100-120 characters long;
  • contain keywords relevant to the page;
  • separate words with hyphens;
  • avoid parameters (static links are preferable);
  • use directories rather than subdomains for sections of the site.

Checking these parameters and sticking to them is necessary both for better indexing of the site and for the convenience of visitors; a simple automated check is sketched below.
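
These rules can also be checked automatically. A minimal sketch that flags the most common problems from the list above (the sample URLs and the keyword are made up):

    from urllib.parse import urlparse

    def url_issues(url, keywords=()):
        """Return a list of possible SEO problems with a URL."""
        issues = []
        parsed = urlparse(url)
        if len(url) > 120:
            issues.append("longer than 120 characters")
        if parsed.query:
            issues.append("contains parameters (not a static link)")
        if "_" in parsed.path or " " in parsed.path:
            issues.append("words are not separated with hyphens")
        if keywords and not any(word in parsed.path.lower() for word in keywords):
            issues.append("no relevant keyword in the path")
        return issues

    print(url_issues("https://example.com/seo-audit/part-one/", ("seo",)))
    print(url_issues("https://example.com/index.php?id=42&cat=7"))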

Website loading speed

Loading speed is important not only for impatient users but also for search engines: the latter index fast resources more quickly and more thoroughly, since the time allotted for crawling is limited.

You can view page loading speed reports in the Yandex and Google web analytics tools. More detailed information is available from specialized services such as YSlow and Google PageSpeed Insights.
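
For a rough, scriptable measurement, you can also time how long a page takes to download. A minimal sketch (it measures only server response and download time for a placeholder URL, not full rendering in the browser):

    import time
    import urllib.request

    URL = "https://example.com/"  # placeholder

    start = time.perf_counter()
    with urllib.request.urlopen(URL) as response:
        body = response.read()
    elapsed = time.perf_counter() - start

    print("Downloaded %d bytes in %.2f s" % (len(body), elapsed))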

An SEO audit of a site is a difficult and time-consuming process, so naturally not all of its points fit into one article. In our next publication, we will talk about external SEO analysis of a resource and auditing the structure and content of the site.
