The sitemap file, or site map: how to use it correctly
Search promotion is a resource-intensive and costly process, and the cost is measured not so much in money as in time. There are tools that can improve a site's ranking yet remain poorly understood, "unrecognized," and surrounded by misconceptions. The sitemap file is one of them.
The biggest misconception is that merely having a sitemap will get your pages indexed better by Google. It is worth remembering that search algorithms do not obey the site owner's wishes; a page ranks well in two cases only: when it is accessible for indexing, and when it is filled with high-quality content.
For Google, the sitemap.xml file is a hint: a statement that you consider these pages your highest-quality, targeted content, that these URLs are worth showing to users, and that they matter for promotion. For this hint to work properly, follow the basic rules for using sitemap.xml.
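As a minimal sketch of what such a hint looks like (the domain and dates below are placeholders), a sitemap.xml file is simply a list of the URLs you consider important, in the standard sitemaps.org format:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <url> entry names one page you want Google to treat as important -->
  <url>
    <loc>https://www.example.com/catalog/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

The optional `<lastmod>` element tells crawlers when a page last changed, which is useful for the frequently updated sections discussed below.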
What to pay attention to when working with sitemap.xml
You can avoid the most common sitemap.xml errors by following a few simple rules:
- Be consistent: do not include a page in sitemap.xml if you are going to block it from indexing in the robots.txt file or mark it with the "noindex, follow" meta tag.
- Divide your pages into service pages, hidden from search, and landing pages, available to users and optimized for the key queries your site should be found by. All service pages are blocked and left out of the sitemap.
- Remember that Google evaluates the quality of the site as a whole when indexing. Still, it pays to indicate which of all your pages you consider genuinely excellent, "targeted" content and which have nothing to do with it. The crawler will then focus its analysis on the pages you marked as "good"; even if it finds flaws in them, it is more likely to rate the site as "attractive to users." If the robot evaluates every page instead, the service pages may be indexed as "low-quality content," and the overall attractiveness of the site will drop with them.
- Hide every unnecessary page. Listing all your strong pages in sitemap.xml does not guarantee that the unlisted ones will stay out of the index. Check carefully where crawlers pick up signals from, and block everything superfluous. The "site:" search operator shows all indexed pages; as a rule, the URLs at the bottom of that list are the ones search robots rate lowest in content quality.
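To illustrate the consistency rule above (the paths here are hypothetical), blocked directories in robots.txt and the URL list in sitemap.xml must not contradict each other:

```text
# robots.txt -- service areas blocked from crawling
User-agent: *
Disallow: /admin/
Disallow: /cart/

# Contradiction to avoid: a sitemap.xml entry such as
#   <loc>https://www.example.com/cart/checkout</loc>
# sends Google a mixed signal: declared important, yet blocked from crawling.
```

To audit what has actually been indexed, search Google for `site:example.com` (with your own domain) and compare the result against your sitemap.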
What to use to hide pages - noindex or robots.txt?
To prevent a page from being indexed, use either the robots meta tag or the robots.txt file. Keep the difference in mind: blocking a page in robots.txt effectively "nulls" it, while the "noindex, follow" robots meta tag keeps the page crawlable, so its links are still followed and still pass link weight. For example, tracking scripts can be blocked completely with nothing lost, but pages linked from the main menu should keep passing weight. If you have a great many service pages, block them through robots.txt; otherwise Googlebot will spend too much time crawling unnecessary URLs. This approach also lets you manage the crawl budget if it turns out to be limited.
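The meta tag variant looks like this (a minimal sketch; where exactly it goes depends on your templates). It sits in the `<head>` of the page itself, unlike a robots.txt rule:

```html
<!-- Keeps this page out of the index, but lets crawlers
     follow its outgoing links so they still pass weight -->
<meta name="robots" content="noindex, follow">
```

Note that a crawler can only see this tag if the page is not also blocked in robots.txt, which is another reason not to mix the two mechanisms on the same URL.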
Indeed, it is illogical to spend the crawl budget on indexing service pages that bring no foreseeable benefit in terms of attracting new customers through search traffic. It is far better to have Google regularly index frequently changing pages such as the blog, news, and catalog. Include all such pages in the XML sitemap to tell Google which pages you prefer among everything left open for indexing.
Don't forget that separate XML sitemap files can be created for different categories of pages on the site; this makes it easier to spot groups of pages with indexing problems. Moreover, the sitemap does not have to be static. It can change dynamically as the content changes: just set up the rules by which those updates happen.
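A dynamic sitemap is usually just a small script that regenerates the XML from the current page list. Here is a minimal sketch in Python (the page list and domain are hypothetical; a real site would pull its URLs from a CMS or database on a schedule):

```python
# Minimal sketch of dynamic sitemap generation.
from datetime import date
from xml.etree.ElementTree import Element, SubElement, tostring

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Build sitemap XML from (url, last_modified_date) pairs."""
    urlset = Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod in pages:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = loc
        SubElement(url, "lastmod").text = lastmod.isoformat()
    return tostring(urlset, encoding="unicode")

# Hypothetical frequently changing sections of the site.
pages = [
    ("https://www.example.com/blog/", date(2024, 1, 15)),
    ("https://www.example.com/catalog/", date(2024, 1, 10)),
]
xml = build_sitemap(pages)
print(xml)
```

Running such a script on every content update (or on a cron schedule) keeps the sitemap in sync with the site without manual edits.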
So: be consistent, use the XML sitemap as a hint for Google's crawlers, and remember that the sitemap can be generated dynamically.