Google Panda Update Targets More than Bad Content

Google’s Panda update rolled out in the US in early February and globally in April. There have been some major shifts in the search engine’s rankings, and virtually every website has been impacted by the new algorithm. However, based on a bit of sleuthing, it seems there might be more to this algorithm change than we’ve been told. Site construction and SEO may have also contributed to the outcome of the Panda update. Let’s talk about content first and site construction a bit later.

Syndicated Content

Article directories, syndicators and other content farms appear to be hardest hit by Panda – leading to the nickname ‘Farmer Update.’ I think many of these sites were targeted for poor content and for promoting the use of duplicate content.

eCommerce Sites

Panda seems to have also hit eCommerce sites hard. By their very nature, they have shallow content. Worse yet, that content often comes from the manufacturer and appears on numerous sites. As a result, many micro-niche sites, often with very low quality content, are now outranking quality eCommerce sites.

While eCommerce sites should have expected a duplicate-content hit, they should not have been replaced by lower-quality sites simply on the basis of word counts. eCommerce site owners should re-evaluate their site structure and product descriptions, and work on getting more unique words on each page. Sadly, many business owners are already falling victim to “SEO pros” promising restored rankings instead of investing in their site’s content.

Quality Content vs. SEO

In researching this article for Olaf, I thought I would simply find examples of how well-written, original content was finally being rewarded. Surprisingly, Google appears to have also factored site design and on-site SEO into this update – something many in the SEO world seem to have missed.

A thread on the Google forums (Think you’re affected by the recent algorithm change? Post here.) invited webmasters to report sites that they felt were unfairly impacted by the update. I assumed the sites would have poor content. Sadly, that appears not to be the case. Several of the sites were well established, could easily be classified as authority sites and should live at the top of the SERPs. Looking a little deeper, I began to wonder if the code and on-page SEO of the sites were to blame.

Bloated Code

Quite a few of the sites had extremely bloated code. One had over 8,100 characters in the HEAD section of its code, along with an oh-so-lovely pop-up. This entire article runs only around 5,000 characters. Several of the sites didn’t use style sheets and/or had multiple scripts within the HEAD section. The BODY section of the pages wasn’t much better.
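To illustrate, here is a minimal Python sketch of the kind of check described above: measuring the raw size of a page’s HEAD section and counting inline scripts. The function name and the simple string-matching approach are my own; Google publishes no such threshold, and a production audit should use a tolerant HTML parser.

```python
def audit_head(html: str) -> dict:
    """Measure the raw size of the HEAD section and count script tags.

    A rough sketch: finds the first <head>...</head> span by plain
    string matching, which assumes reasonably normal markup.
    """
    lower = html.lower()
    start = lower.find("<head")
    end = lower.find("</head>")
    if start == -1 or end == -1:
        return {"head_chars": 0, "scripts": 0}
    head = lower[start:end]
    return {
        "head_chars": len(head),          # compare against page totals
        "scripts": head.count("<script"), # inline + external scripts
    }
```

Run against a homepage’s source, a result like 8,100+ characters with several scripts would flag exactly the bloat described above.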

Meta Tags

OK, I know, many say that Google ignores the keyword tag. But Google does use the meta description for the site descriptions within the SERPs. A number of the negatively affected sites had no meta tags at all, or overly stuffed keyword tags.
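A quick sanity check for these meta issues can be scripted. This regex-based Python sketch assumes typical markup (double-quoted attributes with `name` before `content`); a real audit should use a proper HTML parser, and the “stuffed” threshold here is purely illustrative, not a Google rule.

```python
import re

def audit_meta(html: str) -> dict:
    """Check for a meta description and a stuffed keywords tag.

    Regex-based sketch; assumes name="..." precedes content="..."
    and double-quoted attribute values.
    """
    desc = re.search(
        r'<meta\s+name="description"\s+content="([^"]*)"', html, re.I)
    keys = re.search(
        r'<meta\s+name="keywords"\s+content="([^"]*)"', html, re.I)
    n_keywords = len(keys.group(1).split(",")) if keys else 0
    return {
        "has_description": bool(desc),
        "keyword_count": n_keywords,
        # 10 is an arbitrary illustrative cutoff, not a documented limit
        "keywords_stuffed": n_keywords > 10,
    }
```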

Excessive Linking

I took three of the affected sites and counted the words on their homepages, including flat text and navigation. Site one had 480 total words, 87 of which were used in links – a little over 18%, though it looked like far more. The second had 449 words in links – 33.6% of the text on the page. The third site had 642 words, 361 of which were used in links – a whopping 56.2%.

I don’t think there’s a set formula dictating the ratio of content to links. I mention this only because, on the sites I looked at, the links overwhelmed the content and visually appeared quite excessive.
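The hand count above can be sketched in code. This rough Python illustration uses the standard-library HTML parser to tally words inside links against all visible words; the metric is my own approximation of the tally described, not a documented ranking signal.

```python
from html.parser import HTMLParser

class LinkRatio(HTMLParser):
    """Count words inside <a> tags vs. all visible words on a page."""

    def __init__(self):
        super().__init__()
        self.in_link = 0      # depth of open <a> tags
        self.skip = 0         # inside <script>/<style>, not visible text
        self.total_words = 0
        self.link_words = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link += 1
        elif tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag == "a" and self.in_link:
            self.in_link -= 1
        elif tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if self.skip:
            return
        n = len(data.split())
        self.total_words += n
        if self.in_link:
            self.link_words += n

def link_word_ratio(html: str) -> float:
    """Fraction of visible words that appear inside links."""
    p = LinkRatio()
    p.feed(html)
    return p.link_words / p.total_words if p.total_words else 0.0
```

Fed the third site’s homepage, a function like this would report roughly 0.56 – the 56.2% figure counted by hand above.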

Unprotected Content

Some of the webmasters participating in the Google thread seem not to have proactively protected their content. They complained about being outranked by sites that stole their content, yet they didn’t file DMCA complaints, contact web hosts or request that the offending sites be delisted. (I’ve lost count of how many hours I’ve spent on DMCA complaints and protecting my content.)

Sadly, it would seem that Google has made it clear that site owners are going to have to get proactive about protecting their content. Sites that did so appear to have maintained their rankings. Sites that did not are now truly competing, and too often losing, against the thieves. Perhaps the algorithm identified them as content farms if the number of stolen pieces was statistically significant.

Conclusions

As far as I can see, webmasters who saw positive results have a number of things in common. Their sites are well-designed, well-coded and well-written. They used unique, unsyndicated content. eCommerce sites with original descriptions and a fair number of product reviews also fared well; those with little on-page content fell in the SERPs.

Where I think Panda goes wrong is that many heavily SEO’d sites have been rewarded for their programming prowess rather than the quality of their content. Quality sites are falling prey to sites with what often amounts to gibberish on keyword-specific domains. Google maintains that the Panda update focused on weak or shallow content. Unfortunately, it may have also rewarded content thieves and low-quality, made-for-AdSense sites with streamlined code and superior SEO strategies.

Published in: Search Engine Optimization

3 Comments

  1. Hi Michele,
    first of all welcome and thank you so much for this great article!

    I like the part about “protect your content” – THIS article was first recognized by Google on some third-party website (because I was not active enough over the last two months).
    I sent an e-mail to the owner of that site and he removed all my content from his site immediately. Several hours later, Google found this article on this site and removed the copy from the SERPs.

    I hope this “real world” example (blunder) will help other blog owners to protect their content. If you publish a new article, always check in Google whether your article is at the top of the SERPs (for example, by searching for the page title).

    I’m not very surprised that eCommerce sites are having a hard time because of all the duplicates. Some portals are very simple: they put some feeds together and call themselves an eCommerce portal. I know one bigger portal in Holland which has big problems these days even without the (local) Panda update. Why? Because it provides only a small amount of content. Hard times for the 35 people working for that company…

  2. … It’s very funny: I tried to find some recent blogs about the “Google Panda Update” using Google blog search. Most of the results I found are republished blog posts (duplicates), so where is the filter update? :D

  3. Thank you for the warm welcome. Sadly, we timed this post a bit badly. The same day we posted was the day Demand Studios released their quarterly earnings report and discussed their opinion of the Panda Update. This post got a bit lost in the shuffle.

    I wonder if there will be a round 2 of Panda, and whether or not it will address sites that are Wikipedia clones, Gutenberg.org republishers and other similar sites.
