Google’s Panda update rolled out in the US in early February and globally in April. There have been some major shifts in the search engine’s rankings, and virtually every website has been impacted by the new algorithm. However, based upon a bit of sleuthing, it seems that there might be more to this algorithm change than we’ve been told. Site construction and SEO may have also contributed to the outcome of the Panda update. Let’s talk about content first and site construction a bit later.
Article directories, syndicators and other content farms appear to be hardest hit by Panda – leading to the nickname of ‘Farmer Update.’ I think many of these sites were targeted for poor content and for encouraging the use of duplicate content.
Panda seems to have also hit eCommerce sites hard. By their very nature, they have shallow content. Worse yet, that content often comes from the manufacturer and appears on numerous sites. As a result, many micro-niche sites, often with very low quality content, are now outranking quality eCommerce sites.
While eCommerce sites should have expected a duplicate-content hit, they should not have been replaced by lower-quality sites simply on the basis of word counts. eCommerce site owners should re-evaluate their site structure and their product descriptions, and work on getting more unique words on each page. Sadly, many business owners are already falling victim to “SEO pros” with promises of restored rankings instead of investing in their site’s content.
Quality Content vs. SEO
In researching this article for Olaf, I thought that I would simply find examples of how well-written, original content was finally being rewarded. Surprisingly, Google appears to have also included site design and on-site SEO into this update – something many in the SEO world have appeared to miss.
A thread on the Google forums (Think you’re affected by the recent algorithm change? Post here.) invited webmasters to report sites that they felt were unfairly impacted by the update. I assumed that the sites would have poor content. Sadly, that appears not to be the case. Several of the sites were quite established, could easily be classified as authority sites and should live at the top of the SERPs. Looking a little deeper at the sites, I began to wonder if the code and on-page SEO of the sites was to blame.
Quite a few of the sites had extremely bloated code. One had over 8,100 characters in the HEAD section of their code, along with an oh-so-lovely pop-up. This entire article only has around 5,000 characters. Quite a few of the sites didn’t use style sheets and/or had several scripts within the HEAD section. The BODY section of the pages wasn’t much better.
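Checking HEAD bloat on your own pages is straightforward. Here is a rough sketch in Python, using only the standard library; the regex approach is a quick diagnostic, not a full HTML parser, and will miss malformed markup:

```python
import re

def head_size(html: str) -> int:
    """Return the number of characters between <head> and </head>.

    A quick-and-dirty diagnostic for HEAD-section bloat; it assumes
    a single, well-formed head element and is not a full parser.
    """
    match = re.search(r"<head[^>]*>(.*?)</head>", html,
                      re.IGNORECASE | re.DOTALL)
    return len(match.group(1)) if match else 0

# Example: a tiny page with a 17-character HEAD section
page = "<html><head><title>Hi</title></head><body></body></html>"
print(head_size(page))  # 17
```

Pages carrying thousands of characters of inline scripts and styles in the HEAD can usually move most of that weight into external files.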
OK, I know, many say that Google ignores the keywords meta tag. But Google does use the meta description for the site descriptions within the SERPs. A number of the negatively affected sites had no meta tags at all, or had overly stuffed keyword tags.
I took three of the affected sites and counted the words on their homepages, including flat text and navigation. Site one had 480 total words, 87 of which were used in links – a little over 18%, though visually it seemed like much more. The second had 449 words in links – 33.6% of the text on the page. The third site had 642 words, 361 of which were used in links – a whopping 56.2%.
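If you want to run the same tally on your own pages, a minimal sketch with Python’s standard-library `HTMLParser` looks like the following. This is just an illustration of the word count I did by hand; it is not how Google measures anything, and it treats every whitespace-separated token in visible text as a word:

```python
from html.parser import HTMLParser

class LinkTextCounter(HTMLParser):
    """Tallies words inside <a> elements versus all text on the page."""

    def __init__(self):
        super().__init__()
        self.link_depth = 0   # how many <a> tags we are currently inside
        self.total_words = 0
        self.link_words = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.link_depth += 1

    def handle_endtag(self, tag):
        if tag == "a" and self.link_depth:
            self.link_depth -= 1

    def handle_data(self, data):
        words = len(data.split())
        self.total_words += words
        if self.link_depth:
            self.link_words += words

def link_word_ratio(html: str):
    """Return (link words, total words, ratio of link words to total)."""
    parser = LinkTextCounter()
    parser.feed(html)
    ratio = parser.link_words / parser.total_words if parser.total_words else 0.0
    return parser.link_words, parser.total_words, ratio

# Example: 5 of the 9 words on this snippet sit inside a link
snippet = '<p>Welcome to our store</p><a href="/shop">Shop our full catalog now</a>'
print(link_word_ratio(snippet))  # (5, 9, 0.555...)
```

Run against a real homepage, the ratio gives you the same link-to-content percentage discussed above.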
I don’t think there’s some set formula which dictates content vs links. I mention this only because on the sites I looked at the links overwhelmed the content and visually appeared quite excessive.
Some of the webmasters participating in the Google thread seem not to have proactively protected their content. They complained about being outranked by sites that stole their content, yet they didn’t file DMCA complaints, contact webhosts or request that the offending sites be delisted. (I’ve lost count of how many hours I’ve spent on DMCA complaints and protecting my content.)
Sadly, it would seem that Google has made it clear that site owners are going to have to get proactive about protecting their content. Sites that did so appear to have maintained their rankings. Sites that did not are now truly competing against the thieves, and too often losing. Perhaps the algorithm identified them as content farms if the number of stolen pieces was statistically significant.
As far as I can see, webmasters who saw positive results have a number of things in common. Their sites are well-designed, well-coded and well-written. They used unique, unsyndicated content. eCommerce sites with original descriptions and a fair number of product reviews also fared well; those with little on-page content fell in the SERPs.
Where I think Panda goes wrong is that many heavily SEO’ed sites have been rewarded for their programming prowess rather than the quality of their content. Quality sites are falling prey to sites with what often amounts to gibberish on keyword-specific domains. Google maintains that the Panda update focused on weak or shallow content. Unfortunately, it may have also rewarded content thieves and low-quality, made-for-AdSense sites with streamlined code and superior SEO strategies.
Published in: Search Engine Optimization