As an SEO practitioner, your work has most likely been affected by Google’s algorithm updates from time to time. And most of the adjustments you’ve made to your SEO strategies probably had something to do with Google’s algorithms. But have you ever taken the time to find out how these algorithms work? How does Google crawl through the millions of websites currently available and index them? More importantly, how do the algorithms rank these websites?
If you haven’t taken the time to find the answers to these questions, or never wondered about them at all, consider what those answers are worth. When you understand how Google’s algorithm works, you understand how to make your web pages rank as high as possible in search results.
Okay, then, let’s take a closer look at Google’s algorithms and how they crawl through the Internet. Are you ready? Here goes…
First things first. Only Google knows the specifics of their algorithm, so the discussion that follows is actually a combination of general principles on how search engine algorithms work and a bit of informed guesswork on how Google does their thing. Of course, we also have to consider the fact that algorithms are constantly changing, so we really cannot delve into exact specifics.
So, here’s what generally happens “inside” Google. Programs known as “spiders” or “bots” are designed to browse the Internet much the way people do. These spiders move from one link to another, from one web page to another, and from one website to another. Their job is to find and index every single web page on the Internet. Since real spiders crawl, what Google’s spiders do is also referred to as crawling.
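To make the idea concrete, here is a minimal, purely illustrative sketch in Python of how a crawler follows links. It is not Google’s actual implementation; the seed URL, the page limit, and the decision to store raw HTML as the “index” are all simplifying assumptions.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects href values from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    """Breadth-first crawl: fetch a page, 'index' it, queue its links."""
    queue, seen, index = deque([seed_url]), {seed_url}, {}
    while queue and len(index) < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except Exception:
            continue  # unreachable pages are simply skipped
        index[url] = html  # a real engine would tokenize and score this
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return index

if __name__ == "__main__":
    pages = crawl("https://example.com")  # hypothetical seed URL
    print(f"Indexed {len(pages)} pages")
```

A real search engine would also respect robots.txt, parse and rank the content it stores, and spread this work across many machines, but the follow-links-and-index loop above is the core of what “crawling” means.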
If you leave your website untouched, Google’s spiders are likely to crawl through it just once every six months. If you want your web pages to be crawled and indexed more often, then you should make it a point to regularly update your web content. Crawls in that case can happen several times each day.
Common Mistakes That Prevent Crawling/Indexing
To make sure your web pages are crawled and indexed by Google (and thereby increase your chances of ranking well in search results), do your best to avoid the following mistakes:
- DNS/Connectivity Issues – Such issues can make your servers unreachable. How, then, can Google index your web pages if it can’t even reach your servers? Be sure to check for these issues regularly and correct them immediately.
- Inaccurate URL Parameters – Part of your job as an SEO specialist is to tell Google, via Google Webmaster Tools, which links you do not want it to index. Be careful when you set your parameters, as a single mistake can cause pages on your website to be dropped from Google’s index.
- Poor Titles and Meta Tags – You should know by now that there is a right way of writing titles and meta tags. Poorly written titles and tags can cause your pages to be dropped from indexing as well. You may want to use SEO plug-ins to help you improve your metadata; a quick way to audit a page’s title and meta description is sketched after this list.
- Neglected PageRank – If you don’t take the time to implement SEO strategies that improve your PageRank, your web pages aren’t likely to be crawled as often as you’d like. Google’s Matt Cutts himself has said that the number of pages Google crawls on your site is roughly proportional to your PageRank.
- Absence of or Inaccuracy in .htaccess or robots.txt Files – It is very important to get these configuration files right, because they determine which of your web pages are accessible to Google’s spiders. A quick way to check what your robots.txt file allows is sketched below.
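As a sanity check on the robots.txt point, Python’s standard library can read a live robots.txt file and report whether a given page is open to Google’s crawler. This is a minimal sketch; the domain and paths are hypothetical placeholders.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical site used purely for illustration.
site = "https://www.example.com"

parser = RobotFileParser()
parser.set_url(site + "/robots.txt")
parser.read()  # fetches and parses the live robots.txt file

# "Googlebot" is the user agent Google's spiders identify themselves as.
for path in ["/", "/blog/latest-post", "/private/reports.html"]:
    allowed = parser.can_fetch("Googlebot", site + path)
    print(f"{path}: {'crawlable' if allowed else 'blocked'}")
```

If a page you want indexed shows up as blocked here, the fix belongs in robots.txt itself, not in your content.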
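Likewise, for the titles and meta tags item above, here is an illustrative Python sketch that pulls a page’s title and meta description and flags obvious problems. The URL and the length thresholds are assumptions based on commonly cited guidelines, not official Google limits.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class MetaAuditor(HTMLParser):
    """Extracts the <title> text and meta description from a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Hypothetical page used purely for illustration.
html = urlopen("https://www.example.com/").read().decode("utf-8", "ignore")
auditor = MetaAuditor()
auditor.feed(html)

# Rough, commonly cited length guidelines, not official Google limits.
if not auditor.title or len(auditor.title) > 60:
    print("Title is missing or longer than ~60 characters")
if not auditor.description or len(auditor.description) > 160:
    print("Meta description is missing or longer than ~160 characters")
```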
Preparing for Algorithm Updates
There is a running joke within the SEO and search marketing community. Whenever Google announces the launch of another algorithm update, people are told to prepare their sites. This joke is the result of horror stories told by search marketers whose websites suffered considerably following one or more of Google’s algorithm updates. Naturally, you wouldn’t want your website to suddenly be thrown out of search results after an update. This is why you need to be prepared for when another update strikes. Perhaps the best way to prepare for future updates is to review the impact of the most important updates in recent years.
Google Caffeine – Previewed in 2009 and fully rolled out in 2010, this update rebuilt Google’s indexing infrastructure so that new and updated pages could be indexed and surfaced much faster. Because it changed how Google indexes rather than how it ranks, there was little to no impact on site rankings when it was rolled out.
Google Panda – Released in 2011, this update was designed to penalize websites that publish low-quality content. It also hit sites with duplicate content, too many advertisements, and other black-hat SEO techniques. It was because of this update that content came to be known as the king of SEO. The impact of this update was huge and actually started all the horror stories. Fortunately, many of the affected websites have managed to recover.
Google Penguin – Released in 2012, this update was designed to regulate spammy and/or overly-optimized websites. Black-hat SEO practices such as keyword stuffing and link-buying were penalized. Websites that practiced white-hat SEO and offered real value were rewarded. Just like Panda, this update had a huge impact and sent search marketers scrambling to rethink their strategies.
Google Hummingbird – Released in 2013 in time for Google’s 15th anniversary, this update was designed for speed and precision, much like the bird after which it was named. The focus of this update is on search itself, rather than on websites. It refreshed the search engine such that the algorithm now interprets the meaning of a particular query, instead of the individual words used in the query. This update is said to have made Google smarter; it has also made high-quality content even more important.
Google Pigeon – Released in 2014, this update was designed to give more weight to local results. Its full impact is still unfolding, but most webmasters have already taken the necessary steps to make their websites more visible in local search.
Google’s algorithms are always changing, but the fundamentals have remained the same. As long as you keep publishing excellent content, taking care of on-site SEO, and earning high-quality organic links, you should continue to enjoy high rankings.