The web is growing at an alarming rate. For its search engine to work properly, Google must crawl all these new pages while continuing to re-crawl existing sites to check for content updates. This suggests that Google needs more processing power every day just to keep operating. On top of that, most searches are answered in under a second, which requires ever more memory and processing power as the index of sites continues to grow.
Another factor to keep in mind is that Google offers a range of other services. One typical example is how Google can run Optical Character Recognition (OCR) on the content on your mobile device to search for related information. OCR alone is very processor-intensive, and one can appreciate that maintaining such services puts further pressure on Google's processing-power requirements.
When we look at the complexity of Google's ranking algorithm, we can see that Google needs to process all the crawled information in order to calculate domain authority and give every page a dynamic score. Even though Google has various sources for identifying new sites (most notably, its Chrome web browser), it also needs to keep track of outbound links to discover new pages.
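To make the outbound-link mechanism concrete, here is a minimal sketch of how a crawler can harvest links that point off-site and queue them as candidate new pages. It is an illustration only, using Python's standard library and placeholder URLs, not a description of Google's actual crawler:

```python
# Minimal sketch of outbound-link discovery; illustration only.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class OutboundLinkParser(HTMLParser):
    """Collects links on a page that point to a different host."""

    def __init__(self, page_url):
        super().__init__()
        self.page_url = page_url
        self.page_host = urlparse(page_url).netloc
        self.outbound = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        absolute = urljoin(self.page_url, href)  # resolve relative links
        if urlparse(absolute).netloc not in ("", self.page_host):
            self.outbound.add(absolute)  # candidate new page to crawl

# Usage with a placeholder page and a snippet of fetched HTML:
parser = OutboundLinkParser("https://example.com/")
parser.feed('<a href="https://other-site.example/page">a link</a>')
print(parser.outbound)  # {'https://other-site.example/page'}
```

Every URL discovered this way means more crawling, parsing and scoring work, which is exactly where the processing-power pressure comes from.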
Case studies
We are constantly running experiments in order to understand Google's algorithm better through reverse-engineering. I'm going to describe two scenarios that highlight how Google is taking shortcuts to cut down on its ever-increasing processing-power demands.
Changing the Title of a Page
About a month ago, we changed the title of a Twitter account in order to change how the account appears in search. One would think that a website as active as Twitter is crawled and indexed frequently; however, this change is still not reflected in the search results. This shows that Google is not updating its index of existing pages on a frequent basis.
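For context, the page-side change involved here is trivial: a single tag in the document's head (the title below is a hypothetical placeholder):

```html
<head>
  <!-- The text search engines typically show as the clickable headline -->
  <title>New Display Name (@handle) | Twitter</title>
</head>
```

Until Google recrawls and reprocesses the page, search results keep showing the old value, which is exactly the lag we observed.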
De-Indexing pages with robots.txt
Around 6 months ago, we wanted to de-index most of a casino site's pages from search results for Search Engine Optimization (SEO) purposes.
In the Search Console Help Pages, Google themselves state the following:
If you recently changed your site and now have some outdated URLs in the index, Google’s crawlers will see this as we recrawl your URLs, and those pages will naturally drop out of our search results. There’s no need to request an urgent update.
This suggests that we should not request removal through Search Console, but instead inform Google of our de-indexing request through meta tags and robots.txt. Following these guidelines, we applied both methods to the site without making any changes through Search Console.
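For reference, these are the two signals we used, shown here with a placeholder domain and placeholder paths. One caveat worth noting: Google can only read a page's meta tag if it is still allowed to crawl that page.

```
# robots.txt — asks crawlers not to fetch the listed sections (placeholder paths)
User-agent: *
Disallow: /games/
Disallow: /promotions/
```

```html
<!-- In the <head> of each page to be removed: asks engines not to index it.
     Note: a crawler blocked by robots.txt will never see this tag. -->
<meta name="robots" content="noindex">
```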
Six months ago, the site had 1,500 URLs indexed, and we left around 150 indexable pages. A few years back, we had tried the same process and got the desired result in just two weeks. This time, however, after six months there are still 430 pages indexed in Google Search results. Again, this shows that Google has been performing certain tasks less frequently.
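A quick, rough way to track a figure like this over time (it is an estimate, not an official count) is Google's site: search operator, shown here with a placeholder domain:

```
site:example-casino.com
```

Search Console gives a more precise picture for sites you control.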
Conclusions
The answer to the original question is simple: Google is far from running out of processing power. However, it is adapting its search algorithm to reduce its processing requirements. Used appropriately, this information can help improve your site's ranking.
Would you like to know more about how we can help your search presence? Get in Touch!