A recent change in Google’s algorithms has the blogosphere in an uproar. I sometimes wonder just what Google is up to when they make these changes. I’m reasonably sure they do so many times just to remind us all of who’s in charge. You know, kind of like a God complex thing.
The change took place about ten days ago and now we’re starting to see the results from it. It’s not pretty. In fact, it kind of looks like the girl at the bar after your tenth drink. She’s pretty now, but ….
The algorithm change revolves around three criteria:
- Domain Age
- Backlinks
- PageRank and TrustRank
Domain Age – The Double-Edged Sword
Google has emphasized the age of domains for a while now, and the new update in its algorithm does so even more. I thought the previous emphasis was a bit too much. This one certainly is.
I believe that older sites likely have more authority on a subject than new sites. It’s easy to set up a spam site and just start targeting another site with backlinks. There are so many people gaming the system that Google’s second name could be Doormat. Considering the age of the domain is one way to discourage spam sites from rising too high in the SERPs, and we can all agree – well, all of us except the spammers – that this is a good thing, right?
But the problem is that the Domain Age algorithm also penalizes new sites that actually do provide good information. OK, I don’t really sympathize. I do believe you have to earn your keep. Life is hard. One of the rules that the Great Algorithm of the Universe placed upon nature in the very beginning is the Law of Prove Thy Worth. This law makes hard work a necessity in anything in life and ensures success goes to the persistent rather than the lucky. Why should it work any differently online than offline?
In short, the Domain Age algorithm means that new sites have to prove their value by gritting their teeth and plunging forward through the sludge and slosh of their spam brethren to prove themselves worthy of the rest of us. I’m cool with that. The real downside to this algorithm is that there are older web pages ranking highly that shouldn’t be there. Here’s an example:
I decided to search for “Google algorithm update” when I sat down to write this post. The top site in the SERP was an article on Digg that is a year and a half old. Why is it there? I’m looking for the latest information, not something that happened a year ago. Other pages on Page 1 of the SERP were written between December of last year and February of this year about the change popularly referred to as BigDaddy. Surely, there are people writing about the most recent update – the one affecting us right now.
Guess not. But even so, pick a keyword and you are likely to get results on the first page of your SERP that are outdated, which brings me to my Google Suggestion No. 1: how about an algorithm change that addresses the relevance of older information? When I am searching for the latest information on a given topic, I don’t want a SERP full of last year’s news and gossip. I still sometimes find SEO-related articles from 1999 when I search for information on SEO or Internet marketing. I can think of no body of knowledge that more desperately needs to be current, yet older information continues to rise to the top, which means a lot of bad, outdated advice keeps getting passed around the Internet regarding the most important concepts of search. Sorry, Google, but I think you’ve gone overboard on this one.
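To make Suggestion No. 1 concrete, here is one hypothetical way a freshness signal could work: decay a page’s score exponentially with its age, using a half-life tuned to the query type (short for “latest news” queries, long for evergreen topics). This is purely my own illustrative sketch, not anything Google has disclosed.

```python
# Hypothetical freshness weighting: decay a page's score by age.
# The half-life would be tuned per query type; these numbers are made up.

def freshness_weight(age_days, half_life_days):
    """Return a 0..1 multiplier that halves every half_life_days."""
    return 0.5 ** (age_days / half_life_days)

# For a "latest update" query with a 30-day half-life, a year-and-a-half-old
# page is effectively buried while last week's post keeps nearly full weight.
old = freshness_weight(540, 30)   # the 18-month-old Digg article
new = freshness_weight(7, 30)     # a post from last week
```

With a long half-life for evergreen topics the same formula would leave older authoritative pages alone, which is the balance a freshness signal would need to strike.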
Backlinks – What Ever Happened To Relevance?
Google has created its own problems with regard to backlinks. They let the cat out of the bag about the value of backlinks and the whole world went berserk trying to create them. Link building has become a huge business, with some SEO firms making their living solely off of cleverly disguised link building campaigns. A paid link has become the most valuable commodity on the Internet, making on-page SEO elements second-class citizens in certain cases.
Giving more value to links from sites that are relevant to yours ensured that link farms got their just deserts. Take away relevance as a factor and we’re back to the wild, wild west. It seems that Google’s recent algorithm change is designed to reward almost every link in existence. This will likely change again soon; I don’t see Google allowing that to stand. It could be that this change is a lead-in to another, bigger change addressing Google’s concerns over paid links, and to get there Google management felt they needed to take a step back. If that is the case, then another change is on the horizon, like any day now.
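To see why dropping relevance matters, consider a toy link-scoring model – my own illustration, not Google’s actual formula – where each backlink’s value is its source’s authority multiplied by its topical relevance:

```python
# Hypothetical toy model: a backlink's value = source authority x topical
# relevance, both on a 0..1 scale. Not Google's real scoring.

def link_score(backlinks):
    """backlinks: list of (authority, relevance) pairs; higher is better."""
    return sum(authority * relevance for authority, relevance in backlinks)

# Two earned, on-topic links versus ten links from an off-topic link farm:
editorial = [(0.9, 0.9), (0.7, 0.8)]   # relevant, editorially given links
link_farm = [(0.9, 0.05)] * 10         # decent "authority", near-zero relevance
```

With the relevance factor in place, the two editorial links outscore the farm; zero that factor out (set every relevance to 1) and the farm wins on sheer volume, which is the wild-west outcome the recent change risks.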
The PageRank Blues And Proper Trust Management
PageRank came about in part because Google felt the need to measure a website’s trustworthiness. That is a worthy goal. I’d like to know that a website is trustworthy, but the question is: who should judge its trustworthiness, and what factors should go into that measurement?
I’m not sure that Google is the best authority on trustworthiness, but I’m not sure that anyone else is either. Mostly, I think every searcher is better able to decide whether a site is trustworthy with regard to their own interests. That’s why I’m a big fan of personalization.
Personalization aside, however, PageRank has largely become irrelevant. People have learned how to manipulate the data to appear trustworthy when, in reality, they have just learned how to manipulate Google into giving them a thumbs up. But PageRank affects other things as well. If I can show that I deserve a high PR (whether I in fact do or not), then when I link to other websites those links are given more weight than someone else’s. PageRank thus has an indirect effect on other websites’ rankings through the backlinks algorithm. It’s interesting that this change is occurring at the same time as the backlinks change; I consider it more evidence that another change is forthcoming.
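The mechanics behind that link-weight effect are public knowledge from the original PageRank paper: each page divides its rank among its outbound links, so a link from a high-PR page carries more weight. A minimal power-iteration sketch (the tiny graph and the 0.85 damping factor are standard textbook values, nothing to do with Google’s production system):

```python
# Toy PageRank power iteration on a four-page web. Illustrative only.

DAMPING = 0.85  # standard damping factor from the original PageRank paper

def pagerank(links, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps the (1 - d)/n "random surfer" baseline...
        new_rank = {p: (1.0 - DAMPING) / n for p in pages}
        # ...plus an equal share of each linking page's damped rank.
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = DAMPING * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

web = {
    "hub": ["a", "b"],   # a well-linked, "high-PR" page
    "a": ["hub"],
    "b": ["hub"],
    "new": ["a"],        # a new page nobody links to yet
}
ranks = pagerank(web)
```

In this toy web the well-linked “hub” page accumulates the most rank, and the pages it links to inherit large shares of it, which is exactly how a high-PR page (deserved or manipulated) lends weight to its outbound links.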
While PageRank may be on its way out, it looks like TrustRank may be the new way forward for Google. Many SEOs consider TrustRank more reliable than PageRank because PageRank can be gamed, while TrustRank is determined largely by human editors who review links to and from sites to determine whether those links come from relevant sites with authority on a subject. Human editors can often find discrepancies and nuances in relationships that robots can’t. That’s why the TrustRank factor is much more important. And the fact that Google isn’t telling us what matters for TrustRank is even better: if we all operate in the dark, that levels the playing field. Otherwise, the Internet will simply belong to those with enough time and money to game the system, and when that happens, searchers lose.