This guest post is by Ethan of OneProjectCloser.com
When Google rolled out the first Panda update on 23 February 2011, we saw our site traffic plummet by 40%. I learned about this four hours after quitting my day job to become a full-time blogger. I don’t regret the decision for a second, but it presented some unique challenges for the days ahead.
Since then, we’ve employed several different strategies to reclaim our former glory. Research and site analysis led us to remove potentially low-quality content. We’ve experimented with modifying and removing ads, all the while trying to improve the user experience. It’s important to note that we haven’t seen a recovery … yet. None of what I’m about to share has produced a significant improvement, but hopefully this article will provide insight for other publishers.
“This update is designed to reduce rankings for low-quality sites”—Amit Singhal, Google Fellow
Google has mentioned time and again that the new Panda document classifier impacts the entire site. Before, you could have a handful of really good posts and the onus was on Google to find them. Now, webmasters shoulder the responsibility to carefully curate every shred of content.
Since the term “low-quality” is open to interpretation, we began our site analysis by identifying the high-quality content. The goal was to improve our link profile and eliminate everything but our best content. Using data from Google Analytics, Webmaster Tools, and backlink analysis tools, we rated every single post. Specifically, we looked at top landing pages, content by number of links, content by number of linking domains, and domain authority. Many of these factors correlate with AdSense earnings, so we also took that into account.
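To make the rating process concrete, here is a minimal sketch of the kind of scoring we did across those signals. The metric names, normalization caps, and weights below are illustrative assumptions, not our exact formula:

```python
# Hypothetical post-rating sketch: combine traffic, link, and earnings
# signals into one score so posts can be ranked for keep/remove decisions.

def rate_post(pageviews, linking_domains, domain_authority, adsense_earnings,
              weights=(0.4, 0.3, 0.2, 0.1)):
    """Combine normalized signals into a single 0-100 quality score."""
    # Normalize each raw signal to 0-1 against an assumed site-wide maximum.
    signals = (
        min(pageviews / 10_000, 1.0),      # top landing pages
        min(linking_domains / 50, 1.0),    # breadth of the link profile
        min(domain_authority / 100, 1.0),  # strength of linking domains
        min(adsense_earnings / 100, 1.0),  # revenue as a tiebreaker
    )
    return round(100 * sum(w * s for w, s in zip(weights, signals)), 1)

# A well-linked tutorial scores far above a thin archive page:
strong = rate_post(8_000, 40, 55, 30)
thin = rate_post(120, 1, 55, 0)
```

Once every post has a score, picking the cutoff for removal becomes a sorting problem rather than a gut call.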
Removing low-quality content
“…is blocked from crawling and indexing, so that search engines can focus on what makes your site unique and valuable…” – John Mu, Google Employee
We decided which articles needed to go and which would stay. It was painful to think about deleting about 75% of our archives, so it was a relief to find alternative ways of “removing” content. By blocking crawling, we would be able to keep informative posts that didn’t make the cut, and preserve link juice.
In another forum post, John Mu stated that you should return a 404 or 410 status code for pages that are not worth salvaging, 301 redirect items that can be merged, and apply a “noindex” meta tag to content that you plan to rewrite. Matt Cutts did a live webcast on May 25 in which he confirmed that noindexing is a good solution for removing low-quality content. Note the difference: blocking content in robots.txt prevents Googlebot from crawling the page at all (though the URL can still appear in search results), whereas a noindex tag lets Googlebot crawl the page and follow its links while keeping it out of the index.
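To illustrate the three options side by side (the paths and URLs below are hypothetical, not our actual site structure):

```
# robots.txt — blocks crawling entirely; Googlebot never fetches these pages
User-agent: *
Disallow: /old-archives/

# .htaccess (Apache) — 301 redirect for a post being merged into a better one
Redirect 301 /short-tip/ http://www.oneprojectcloser.com/full-guide/

# In the <head> of a page slated for a rewrite — crawled and links followed,
# but kept out of the index
<meta name="robots" content="noindex, follow">
```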
Ads and affiliate links
“While it’s exciting to maximize your ad performance with AdSense, it’s also important to consider the user experience…” – Best Practices Guidelines, Google AdSense
It seemed very telling that the AdSense team released new guidelines for ad placement about two months after Panda hit. A lot of publishers felt slighted because AdSense optimization specialists have always pushed for more ad blocks and more aggressive placements. Now it seemed there was a threshold for ads that pushed content below the fold. This isn’t a stretch, as Google already renders each page for the preview they provide alongside search results. They know where the ad blocks fall.
I’ll admit we were being aggressive with our ad placement. We took the plunge and removed AdSense entirely for over a month, through the Panda 2.2 update, but saw no improvement. Since then, we’ve restored AdSense on only a handful of articles.
We suspect that Google views affiliate links much like ads, especially as they may bias the publisher toward a specific product. Eliminating the majority of our affiliate links was easy, as only a few ever converted. But needless to say, overall these changes have hit us where it hurts.
Scrapers and duplicate content
“The Panda Technology appears to have helped some scraper sites” – Michael Martinez
Michael shares that he had a hard time finding examples of scrapers outranking the original authors, but he hits the nail on the head in the last line of the section. If Panda isn’t demoting your site, you’ll still outrank the scrapers. Our site doesn’t.
I’ve submitted a lot of takedown notices since Panda hit, but that isn’t the only duplicate content we’ve been reviewing. A lot of our articles overlap because of similar (but distinct) topics. We began working to make sure each article could stand on its own merit with unique ideas and fresh perspective. This was no easy task, and is still a work in progress.
The end-user experience
“The +1 button is shorthand for ‘this is pretty cool’ or ‘you should check this out.’ Click +1 to publicly give something your stamp of approval.” – Google +1
Bloggers have long known that social engagement (a good proxy for user experience) is an important part of your online identity and a great way to build readership. With moves like the +1 button, Google shifts some of the power from site owners to the everyday web surfer. Before, we would build relationships and lobby webmasters for links, but that system was easily gamed. Now, the end-user experience and how visitors interact with your site matter more than ever.
We’ve made a lot of improvements, and in some ways I’m glad Panda has had such a dramatic impact; nothing else would have spurred on many of the changes we’ve made. Our site is being refined by fire, and the end result will be much better than before. Sometimes webmasters are simply too close to their own products.
If you have ideas about overcoming the Panda demotion, or suggestions for how we can improve, I’d love to hear them.
Ethan is 28 years old, and loves construction and home improvement. He co-founded OneProjectCloser.com in 2008 where he shares how-to projects, tool reviews and more. To stay connected, follow One Project Closer on Twitter and their new Facebook page.