How to Index Website Pages Faster In Google Search Console:
The faster your pages are indexed by a search engine, the faster they can appear in its results.
Indexing takes Google anywhere from a couple of minutes to seven days; Yandex needs from a week to a month. Nobody can predict the exact timing in advance – it all depends on your site. But there are several ways to speed up the process.
Quick indexing methods are especially useful for sites:
with a large number of pages;
that frequently update their content;
whose owners have made important changes and want to see the updated pages in the results right away.
Let’s get to the point – what a site owner can do to get the search engines to index their site as quickly as possible.
1. Submit pages for indexing in the Google Search Console and Yandex.Webmaster:
To get your pages onto the search engines’ radar as quickly as possible, use the search engines’ own services – Google Search Console and Yandex.Webmaster. They help site owners monitor indexing status, study crawl errors, and optimize their sites.
Google Search Console:
The first steps in working with Google Search Console are to add your site there and verify ownership; Google’s help documentation walks you through both. The first data about your site appears in the console within a couple of days.
To check the indexing status of a new page, select “URL Inspection” in the left pane or simply enter the URL in the search field at the top.
After submitting the URL for inspection, you will see whether the page is indexed, along with any errors related to AMP, structured data, or indexing. If the page is not indexed, you can request indexing (a visit from Googlebot) and then track the indexing status.
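For sites with many pages, checking URLs one by one in the interface gets tedious. Google also exposes this same check programmatically through the URL Inspection API. The sketch below only builds the request body; the page URL and property URL are hypothetical placeholders, and the actual call additionally needs an OAuth token for the property, which is not shown here.

```python
import json

# Real endpoint of the Search Console URL Inspection API.
INSPECT_ENDPOINT = (
    "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"
)

def build_inspection_request(page_url: str, property_url: str) -> dict:
    """Build the JSON body for a URL Inspection API call.

    page_url     – the page whose index status you want to check
    property_url – the Search Console property the page belongs to
    """
    return {"inspectionUrl": page_url, "siteUrl": property_url}

# Hypothetical example property and page:
body = build_inspection_request(
    "https://example.com/new-post/", "https://example.com/"
)
print(json.dumps(body))
# Send this as a POST to INSPECT_ENDPOINT with an
# "Authorization: Bearer <token>" header for your property.
```

The response reports the page’s index coverage state, so a small script looping over this call can audit a whole list of URLs.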
The URL Inspection tool is suitable both for new pages and for pages you have updated.
Yandex.Webmaster:
Getting started with Yandex.Webmaster is the same as with Google: add your site there and verify your rights to it. After that, the site automatically enters the indexing queue, which usually takes up to two weeks.
As soon as data on your site appears in Yandex.Webmaster, you can send the pages you need for reindexing. In the left menu, open the “Indexing” tab → the “Page Crawl” tool, paste the URLs into the field, and submit the list for recrawling.
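If you submit URLs regularly, the same recrawl request can be scripted through the Yandex.Webmaster API. This is a sketch assuming the v4 API’s endpoint shape; the user ID and host ID are values the API assigns to your account and site, and the ones implied below are hypothetical. Each URL is sent as its own POST with an OAuth token, which is omitted here.

```python
def build_recrawl_request(user_id: str, host_id: str, page_url: str):
    """Return the (endpoint, body) pair for one Yandex recrawl request.

    user_id  – numeric user ID returned by the Yandex.Webmaster API
    host_id  – site identifier assigned by the API
    page_url – the page to queue for recrawling
    """
    endpoint = (
        f"https://api.webmaster.yandex.net/v4/user/{user_id}"
        f"/hosts/{host_id}/recrawl/queue"
    )
    body = {"url": page_url}
    return endpoint, body

# Hypothetical IDs and page; POST `body` as JSON to `endpoint`
# with an "Authorization: OAuth <token>" header.
endpoint, body = build_recrawl_request(
    "12345", "https:example.com:443", "https://example.com/updated-page/"
)
```

Remember the daily URL limit mentioned below: a batch script should stop once the API starts rejecting submissions for the day.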
The daily limit on submitted URLs is individual for each site and is determined automatically – Yandex does not disclose how it is calculated.
2. Take advantage of the sitemap:
The XML sitemap serves as a navigator for search robots, and this can be used to speed up indexing.
The map helps search bots find and index pages on the site faster. To tell search engines which pages of your site are available for crawling, add them to the sitemap.xml file, and submit a link to that file in a dedicated section of the webmaster panel (instructions below).
An XML map is a recommendation for search engines, not a strict instruction. It does not guarantee that every added page will be indexed, but it can really help large sites get indexed faster and more fully.
A few tips for creating a map:
For a map to work well, it must be properly structured and marked up for search engines. Mark updated pages with the <lastmod> tag so each one carries the actual date of its last change, rather than the same date for all pages.
Prioritize pages for crawling with the <priority> tag (0 to 1) and declare how often a page changes with the <changefreq> tag, so search engines receive complete information about your resource. Keep in mind that the declared change frequency may not match how often search bots actually crawl the page. Google currently ignores the <priority> and <changefreq> attributes, so for Google it is not necessary to add them. Yandex, for its part, does not say that it ignores these tags – they are simply listed as optional.
Create multiple sitemaps if you have more than 50,000 pages or a map larger than 50 MB. The resulting maps must then be listed in a single sitemap index file that links to all of the XML files.
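The tips above – per-page <lastmod> dates, the 50,000-URL per-file limit, and a sitemap index linking the parts – can be sketched in a short script. This is a minimal example using only Python’s standard library, not a full sitemap generator; the URLs and dates are made up for illustration.

```python
from xml.etree.ElementTree import Element, SubElement, tostring

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50_000  # per-file limit from the sitemaps protocol

def build_sitemap(urls):
    """Build one <urlset> document from (loc, lastmod) pairs."""
    urlset = Element("urlset", xmlns=NS)
    for loc, lastmod in urls:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = loc
        # W3C date format, e.g. 2020-03-01 – one real date per page
        SubElement(url, "lastmod").text = lastmod
    return tostring(urlset, encoding="unicode")

def build_index(sitemap_locs):
    """Build a <sitemapindex> linking the chunked sitemap files."""
    index = Element("sitemapindex", xmlns=NS)
    for loc in sitemap_locs:
        entry = SubElement(index, "sitemap")
        SubElement(entry, "loc").text = loc
    return tostring(index, encoding="unicode")

def chunk(items, size=MAX_URLS):
    """Split a URL list into sitemap-sized pieces."""
    return [items[i:i + size] for i in range(0, len(items), size)]

# Hypothetical pages: one sitemap file per chunk, plus an index.
pages = [("https://example.com/", "2020-03-01"),
         ("https://example.com/blog/", "2020-02-15")]
sitemaps = [build_sitemap(part) for part in chunk(pages)]
index = build_index(
    [f"https://example.com/sitemap-{i}.xml" for i in range(len(sitemaps))]
)
```

For a two-page site this produces a single file, but the same code scales past 50,000 URLs: `chunk` simply yields more files, and the index ties them together.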
How to create a site map and add it to the webmaster panel:
Various web tools and programs can help you create an XML map. This option is also available in SE Ranking, in the “Site Analysis” section.
In this tool you can choose which types of pages to include in the map, specify page change frequency, and set crawl priority for pages at different levels.
The file with the map must be uploaded to the root of the site (https://yoursite/sitemap.xml) and then submitted to Google and Yandex.
To submit a map to the Google Search Console, provide a link to the map on your site. You can do this in the “Sitemaps” section.
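Sitemap submission can also be scripted via the Search Console API’s sitemaps.submit method, which is a PUT request to an endpoint built from the URL-encoded property and sitemap addresses. A sketch, assuming OAuth authorization is handled separately; the site and sitemap URLs below are hypothetical.

```python
from urllib.parse import quote

def build_submit_url(site_url: str, sitemap_url: str) -> str:
    """Build the PUT endpoint for Search Console sitemaps.submit.

    Both the property URL and the sitemap URL must be
    percent-encoded, since they appear inside the path.
    """
    return (
        "https://www.googleapis.com/webmasters/v3/sites/"
        f"{quote(site_url, safe='')}/sitemaps/{quote(sitemap_url, safe='')}"
    )

# Hypothetical property and sitemap; issue an authorized PUT
# (empty body) against this URL to submit the map.
endpoint = build_submit_url(
    "https://example.com/", "https://example.com/sitemap.xml"
)
```

An empty PUT to that endpoint registers the sitemap for the property, the same as pasting the link into the “Sitemaps” section by hand.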
As soon as the sitemap is crawled by robots, you will receive a report showing which pages of the site are in the search engine’s index and what errors were detected.
Everything is very similar in Yandex.Webmaster. Go to the “Indexing” section, open the “Sitemaps” tab, and specify a link to your XML sitemap file. Within two weeks, Yandex will process the submitted file and will be able to index the pages you recommend.
3. Gather quality backlinks:
The more high-quality backlinks your website has, the more reasons search engines have to pay attention to it. After all, if someone links to your pages, those pages must matter to someone.
For indexing, links from social networks and news portals work best. Such sites are popular and update their content constantly, which “stimulates” search robots to crawl them as often as possible. If your link appears on such a site, it is likely to be discovered quickly – even nofollow links have a chance.
By the way, Google changed the rules for nofollow links in 2020. Since March 1, 2020, the nofollow attribute has been treated as a hint rather than a directive, which means there is a chance the search engine will follow nofollow links and index the pages they point to.
How do you get high-quality links from other resources? Start by developing a backlink strategy. A few ideas: post a link to your resource on thematic forums, Q&A services, aggregator sites, and so on. Try every appropriate (and legal) method. If you have run out of ideas, borrow them from your competitors – collect sites for potential backlink placement from a competitor’s link profile. The SE Ranking backlink analysis tool will help you do this.
What do you have to do? It’s simple – register a free trial account, open “Backlink Analysis” and enter the competitor’s domain in the search field.
You will see a list of sites that link to your competitor, and you can evaluate the context in which each link appears, its anchor text, and the linking resource’s parameters. With all the necessary data in one table, selecting donor sites is much faster.
Social networks deserve a separate mention. Search robots actively “monitor” them, because new material appears there every second. To use this for your site, strengthen its presence on social platforms. Just adding a link to the site in the profile header is not enough: for a search robot to take interest in a company’s page, the page must be popular – with subscribers, likes, comments, and reposts.
You can also place links to the site and its pages in posts, comments, descriptions of multimedia files, notes, personal profiles (for example, as a place of work or partnership), and in groups and communities – both your own and others on your topic. The main thing is to make sure each link fits its context.
Twitter is in first place for link crawling speed – it is quickly indexed by both Google and Yandex. Search engines also index links from active Facebook pages. The same goes for VKontakte: the more subscribers who like and repost your posts, the sooner you will attract the attention of search robots.
For Yandex, posting in bundles of social networks (VK + Twitter) works well; you can also add links to new pages on LiveJournal, LiveInternet, and Diary.Ru.
The general rule for all social networks is the same: the more social activity your publications attract, the better your chances of “inviting” the social networks’ crawlers – and, following them, the search engines’ robots – to your pages.