Google was granted a new patent on the 14th of August; it’s called the Ranking Documents patent and you can read the patent here if you like. However, I’ll do my best to summarise a rather complicated (and, some might say, dull) document.
Here is a description of the patent from Google themselves:
A system determines a first rank associated with a document and determines a second rank associated with the document, where the second rank is different from the first rank. The system also changes, during a transition period that occurs during a transition from the first rank to the second rank, a transition rank associated with the document based on a rank transition function that varies the transition rank over time without any change in ranking factors associated with the document.
After changes are made to a site, or links are built to it, positions may drop, or something else unexpected may happen, just so Google can see what action the website/webmaster takes in response. For example, someone who recently stuffed their content with keywords, or added a block of extremely small text just to try and boost positions, would immediately remove that content. Google would notice this and could penalise the site, or at least use it as an indication that the site is trying to manipulate its rankings using spammy methods that are against Google’s quality guidelines.
The response to the changes made on the site isn’t always going to be the same. There could be a time-delayed response, a negative response, a random response and/or an unexpected response. These responses are rather vague and leave Google open to respond in any way it likes. Results can be affected for an unknown period of time; this could simply be a random period, or it may take into consideration the changes that were made. This makes our job as SEOs even more difficult. We are used to seeing positive responses (or no response, which is also useful for us) when we make changes or do off-site work, and our clients are not going to be happy to see positions suddenly drop.
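The patent never publishes the actual function it uses, so we can only guess at its shape. Purely as an illustration of the idea, here is a hypothetical Python sketch of a "rank transition function" that shows a misleading interim rank (delayed, negative or random) before settling on the real new rank. Every name, parameter and behaviour here is an assumption of mine, not Google's implementation:

```python
import random

def transition_rank(old_rank, new_rank, t, period, mode="delayed", seed=None):
    """Hypothetical rank transition function (illustration only).

    Moves a document's *displayed* rank from old_rank to new_rank over a
    transition period of length `period`, but the interim value can be
    delayed, inverted or random, as the patent's language allows.
    Lower rank numbers are better (position 1 is the top result).
    """
    if t >= period:
        return new_rank  # transition over: show the real new rank
    progress = t / period
    if mode == "delayed":
        # hold the old rank for most of the period, then jump late
        return old_rank if progress < 0.8 else new_rank
    if mode == "negative":
        # initially move in the *opposite* direction of the real change,
        # easing back towards the old rank as the period runs out
        overshoot = (old_rank - new_rank) * (1 - progress)
        return round(old_rank + overshoot)
    if mode == "random":
        # bounce around unpredictably somewhere near the two ranks
        rng = random.Random(seed)
        return rng.randint(min(old_rank, new_rank), max(old_rank, new_rank) + 5)
    return old_rank
```

So a page whose "true" position improved from 10 to 3 might first be shown *worse* than 10 in `negative` mode, or sit at 10 for most of the period in `delayed` mode; a webmaster who panics and reverts their changes during that window is exactly the signal the patent describes.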
What if the changes also coincide with a bigger algorithm update such as a Penguin or Panda refresh? Should we immediately react and clean up the site to get things back on track and make the two animals happy again, or should we wait for an unknown period of time to see if positions come back before doing anything, because we don’t want the website to be flagged as spam? Luckily the patent is only designed to look at techniques such as:
- Keyword stuffing
- Invisible text (aka hidden text) or Tiny text
- Meta tags stuffing
- Link-based manipulation
“Link-based manipulation” is rather vague and open to interpretation; it could mean any type of link might be considered. All the other techniques are dated, or could even be considered “black hat”, so we do not use them and haven’t for quite some time now.
Here’s Google’s description within its patent:
When a spammer tries to positively influence a document’s rank through rank-modifying spamming, the spammer may be perplexed by the rank assigned by a rank transition function consistent with the principles of the invention, such as the ones described above. For example, the initial response to the spammer’s changes may cause the document’s rank to be negatively influenced rather than positively influenced. Unexpected results are bound to elicit a response from a spammer, particularly if their client is upset with the results. In response to negative results, the spammer may remove the changes and, thereby render the long-term impact on the document’s rank zero. Alternatively or additionally, it may take an unknown (possibly variable) amount of time to see positive (or expected) results in response to the spammer’s changes. In response to delayed results, the spammer may perform additional changes in an attempt to positively (or more positively) influence the document’s rank. In either event, these further spammer-initiated changes may assist in identifying signs of rank-modifying spamming.
Just because the patent exists does not mean it is being used. It may not apply to every website or search query, or Google may be rolling it out slowly; nobody apart from Google knows. However, we have seen similar things happen multiple times before the 14th of August. Here is one example that may or may not have been caused by the patent, but it may be very similar to ranking fluctuations we see in the future:
The page/keyword in question had been re-designed and the content re-written, but that should not have caused a massive drop like that. We were immediately aware of the drop but did not act upon it because we are familiar with things like this happening. What would have happened if we immediately put the old page back up or changed the content again? Would we have been flagged as spammers?
I think almost every webmaster monitors their positions within Google and always wants to improve them. Some will do it in better ways than others, but hopefully Google will stick to the techniques listed above and only target genuine spammers rather than innocent webmasters just trying to improve their website. Maybe it has to happen on multiple occasions rather than as a one-off? Three strikes and you’re out, perhaps.