
The Relentless Pounding of Google Panda: Why SEO Band-Aids Won't Work


Getting hit by Panda is a horrible experience. I've helped a number of companies deal with Panda's wrath since February 2011, and business owners typically exhibit some combination of fear, frustration, and anger when they contact me. It's easy to understand why.

Most businesses hit by Panda are hemorrhaging revenue, as organic search traffic plummets by a large percentage. Some businesses have seen traffic drop by 60 to 70 percent overnight. That's a tough pill to swallow, especially when Google is a core source of revenue.

Understanding the catastrophic impact to businesses is why I'm extremely aggressive with my approach to Panda recovery. I'd rather have a client recover and build upon a clean foundation rather than sit in the gray area of Panda without knowing how close they are to recovery. And sitting in the gray area can drive a business owner mad.

The Relentless Pounding of Panda

After performing a thorough Panda audit, the resulting presentation can run 40 to 60 PowerPoint slides covering the various changes that need to be made. The changes are based on my experience with Panda (knowing the various triggers that could cause a hit). Needless to say, you can sometimes hear a pin drop during those presentations, as webmasters begin to understand the amount of work involved to send Panda packing.

At that point, it's common to hear the following question: "Can we start by fixing one or two items, wait to see if we recover, and then move on from there if we don't?"

Although the thought process is totally understandable, it's the wrong approach. My recommendations are based on analyzing SEO problems across many websites hit by Panda and Phantom. Companies shouldn't cherry-pick changes.

The problems need to be rectified from both a usability and SEO standpoint, and sooner rather than later. If a significant number of changes aren't made, companies open themselves up to future Panda attacks. And if there's anything more frustrating than getting hit by Panda once, it's getting hit by Panda multiple times.

Remember, Panda targets low-quality content (and poor user engagement as a result). If you only fix one problem out of seven, then you can end up getting stuck in the gray area of Panda, or worse, you open yourself up to the relentless pounding of Panda over time. And the yo-yo effect can break down a business owner's will even more than never recovering.

An Example of Panda Pounding

Unfortunately, several companies have approached me over the past few years for help after having experienced the relentless pounding of Panda. It's always amazing to analyze their Google organic trending when I begin helping them, as you can easily see the yo-yo effect in action.

For example, I've provided the trending below for a client who was hit three separate times during Panda updates, and once by Phantom (which also targeted low quality content). This client also recovered from Panda once, only to get hit again during the Phantom update of May 8.

[Chart: Google organic trending showing multiple Panda hits, with a Phantom hit following a Panda recovery]

Band-Aids Can Be Temporary

Temporary fixes – or making just enough changes to pass the Panda threshold and recover – are what cause the yo-yo effect. As time goes on, and other problems remain (e.g., user engagement problems), Panda might come back and hit the site again.

Then more changes are made, and again, they are just enough to recover. The pounding subsides, but only until a Panda update hits once again.

When you add up all of the time involved with getting hit multiple times, putting band-aids on the site, recovering, and then getting hit again, a company could have (and should have) made all of the changes that were recommended in the first place. Implementing all of the changes mapped out during an audit can help Panda-proof the site.

For example, let's say I analyzed a website hit hard by Panda and came up with a number of core changes that need to be implemented. Maybe they include removing doorway pages, cutting down on duplicate content, enhancing or removing key landing pages, and revamping the navigation (based on a heavy cross-linking problem).

If the company only removes doorway pages, is that enough to recover from Panda in the long-term? What if they end up recovering, only to get hit again due to thin content landing pages (or other problems)?

Will Targeting Two Problems Be Enough?

When you break it down, the website has numerous problems that could be impacting it from a Panda standpoint. Addressing one or two problems, while leaving several in place, isn't a long-term solution for keeping Panda at bay.

So, based on my experience helping business owners with Panda, here are some recommendations for long-term success:
  • Have a thorough SEO audit completed to identify problematic areas from a Panda standpoint. Thorough audits are the most powerful deliverable in all of SEO.
  • Look beyond just SEO. One important thing I have learned during Panda work is that the right changes truly make the site better. There are no tricks involved… It comes down to building a better website. Engagement matters, user experience matters, great content matters, and organization matters.
  • Prioritize all recommended changes with the help of your SEO provider (whether that's an internal SEO, consultant, or agency). SEO professionals who have worked on several Panda projects can bring valuable insight about past experiences and recoveries.
  • Once you have your plan (which should include most, if not all, of the changes recommended), execute at a high level and at a fast pace. Panda is supposed to roll out monthly and can take 10 days to fully roll out. The bad news is that Google isn't confirming Panda updates anymore, but the good news is that you technically have a chance to recover once per month.
  • Once you make all of the changes, don't sit back and wait. Keep driving forward by building killer content, using social media to get the word out, analyzing your reporting to identify areas of opportunity, etc. As I documented in a Panda recovery case study, "act like you're not being impacted by Panda. Just keep going." This can help you send the right signals to Google, even when you are being hampered by the Panda filter.

Summary – Be Thorough, Objective, and Aggressive to Win the Panda Battle

If you leave this post with just one key piece of information, it's that band-aid fixes might not be effective over the long-term. Panda can absolutely strike again, and when it returns, you need to make sure it rolls past your camp versus stomping all over it.

Understand the major problems impacting your site, and develop a thorough plan for fixing those problems. That's how you can Panda-proof your website. Good luck.


Original Article Post by Glenn Gabe @ Search Engine Watch

Matt Cutts Talks Google Panda Basics: Make Sure You've Got Quality Content


What should you do if you think your site might be affected by Google's Panda algorithm? And what types of content get impacted negatively by Panda? That is the topic of a recent video featuring Google's Distinguished Engineer Matt Cutts.

Cutts first gives a bit of a primer on how Panda rolled out previously and how it is currently rolling into the search algorithm.

"So Panda is a change that we rolled out, at this point a couple years ago, targeted towards lower quality content. And it used to be that roughly every month or so, we would have a new update. And we would say, OK, there's something new, there's a launch, we've got new data, let's refresh the data.

"And it had gotten to the point where with Panda, the changes were getting smaller, they were more incremental. We had pretty good signals. We'd pretty much gotten the low hanging wins. So there weren't a lot of really big changes going on with the latest Panda changes."

So as Google got better at finding low quality content, they adjusted how and when new Panda updates would impact the search results.

"And we said, let's go ahead, and rather than have it be a discrete data push – that is, something that happens every month or so, at its own time, when we refresh the data – let's just go ahead and integrate it into indexing. So at this point, we think that Panda is affecting a small enough number of webmasters on the edge that we said, let's go ahead and integrate it into our main process for indexing."

But what if you did get hit with Panda? First off, it likely means that your content is either poor quality, or it's of the cut-and-paste variety that can be found on many free article sites.

You should also check in your Google Webmaster Tools to see if there's any kind of alerts for you in your account that can help you determine if it is Panda or something else that is negatively affecting your Google search rankings.

"And so, if you think you might be affected by Panda, the overriding kind of goal is to try to make sure that you've got high-quality content, the sort of content that people really enjoy, that's compelling, the sort of thing that they'll love to read that you might see in a magazine or in a book, and that people would refer back to, or send friends to, those sorts of things," Cutts said. "So that would be the overriding goal."

So what if you think it might be the quality of your content that is affecting your rankings? Panda was pretty tough on many types of content that Google deemed to be of poor quality.

"So if you are not ranking as highly as you were in the past, overall, it's always a good idea to think about, OK, can I look at the quality of the content on my site? Is there stuff that's derivative, or scraped, or duplicate, and just not as useful?"

Not surprisingly, Cutts said this is a type of content that doesn't rank well, and it's the quality content that will be higher up in the Google search rankings.


Original Article Post by Jennifer Slegg @ Search Engine Watch

Recovering from Penalties, Penguin, and Panda

We've seen tremendous upheaval in the search landscape over the last two years. Google has sent web publishers into a tailspin with Pandas, Penguins and penalties.

An in-depth session at SES San Francisco 2013, Recovering from Penalties, Penguin, and Panda, looked at the types of Google penalties and algorithm updates that can impact your site, why your site may have been affected, and how to recover.

The session featured Stone Temple Consulting President Eric Enge (@stonetemple) along with two of his colleagues – COO John Biundo and Senior Marketing Consultant Kathy Brown – who talked about recovering from Google's algorithm updates named after black and white zoo creatures.

Google's Panda

Pandas may be cute, but they can be mean. Enge started the session by talking about the primary causes of being hit by Google's Panda algorithm.

When you're creating content, think about the idea of "sameness," he said. You really can't write anything new about certain topics like "making French toast," for example.

So think about how you're adding value by asking if what's being written on the topic is something unique from what's in the results for that query. If not, it's bad for Google's search product and this is the primary reason Panda exists.

So that's poor-quality content. What about this concept of doorway pages?

Doorway pages exist just to capture search traffic and convert the customer immediately, Enge said. They don't offer depth on the topic beyond just the one page. These are frowned upon.

One exception to some of the Panda rules is if you're a big brand with a specific type of site, Enge said. Big ecommerce brands like Amazon don't necessarily need in-depth content.

So how do you know if your site has poor quality? Take off the blinders to your website and look at the content with a critical eye.

Ways to deal with bad-quality pages on your site include 301 redirects, using noindex on pages or rewriting the pages and adding more supporting pages to that main topic. One client saw a 700 percent recovery by rewriting and adding content to their site, Enge said.
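To make that triage concrete, here's a minimal sketch of the decision in code. The word-count thresholds, URLs, and the rule itself are made-up assumptions for illustration, not Google guidance:

```python
# Sketch: triage crawled pages into cleanup actions by content depth.
# Thresholds and sample data are illustrative assumptions only.

def triage_page(word_count, has_redirect_target):
    """Suggest an action for a page based on its content depth."""
    if word_count >= 300:
        return "keep"       # substantial enough to stand on its own
    if has_redirect_target:
        return "301"        # consolidate into a stronger, related page
    if word_count >= 100:
        return "rewrite"    # worth expanding with supporting content
    return "noindex"        # too thin; keep it out of the index

crawl = [
    ("/guide/french-toast", 850, False),
    ("/tag/toast", 40, True),
    ("/stub/old-promo", 60, False),
    ("/recipes/quick", 150, False),
]

for url, words, has_target in crawl:
    print(url, "->", triage_page(words, has_target))
```

In practice the right action depends on whether a stronger page exists to consolidate into, not just raw word count, so treat a rule like this as a first pass for a human review.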

Then, be patient. It isn't necessarily a fast recovery from Panda, Enge said.

Google Manual Action Penalties

Biundo took the podium next to talk about on-site manual penalty actions by Google. Manual actions can be applied to the site as a whole or to a limited portion of the site.

The most common triggers for manual actions include cloaking and sneaky redirects. Most people know if they've done this, but sometimes they don't if, for example, they've inherited a site. You can go into Google Webmaster Tools, Fetch as Googlebot, and examine the code to find out.

Hidden text and keyword stuffing are other reasons for manual penalties. Again Biundo said that most people know what they have done here. The key is to just get rid of it, then go back to Google to tell them you've fixed it.

Thin content is another reason for manual actions. These are cases where there's no "value add" in the content. Things like auto-generated content, scraped content, and so on are all examples of this.

User generated spam like un-moderated comments could also trigger a manual action. Clean up the spam and close the comment spam loophole by doing things like putting nofollow on links to discourage it and using CAPTCHA for comments.
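As a rough illustration of closing that loophole, here's a naive sketch that adds rel="nofollow" to anchor tags in comment HTML. It's a simple regex pass for illustration only; a real pipeline would use a proper HTML sanitizer:

```python
import re

# Sketch: add rel="nofollow" to links in user-submitted comment HTML so
# spam links pass no PageRank. Naive regex approach, for illustration.

def nofollow_comment_links(html):
    def patch(match):
        tag = match.group(0)
        if "rel=" in tag:
            return tag  # leave tags that already declare a rel attribute
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\b[^>]*>", patch, html)

comment = 'Nice post! <a href="http://spam.example">cheap pills</a>'
print(nofollow_comment_links(comment))
```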

The next no-no is unnatural links, such as when you buy or sell links with the intent of trading PageRank. If you're selling links for traffic, make sure you nofollow them.

Links: The Good, the Bad and the Ugly

The Golden Rule for links (and this is directly from Google) is:
Any links intended to manipulate PageRank or a site's ranking in Google search results may be considered part of a link scheme and a violation of Google's Webmaster Guidelines. This includes any behavior that manipulates links to your site or outgoing links from your site.
One exception is some directories, Biundo said. Select directories are OK for a site to be listed in, like Yahoo, Best of the Web, and DMOZ. These are examples of directories where the editorial review standard for links is high, and the links are therefore OK in Google's eyes.

So when looking to be a part of directories, don't pursue a vast quantity of directories, Biundo said. Be exclusive.

Guest posting when done well and in high quality sites is a valid technique, even though there's a lot of controversy around it. Guest posts at sites that aren't directly related to the industry or subject matter of your business can be a red flag to Google.

Also, when posting, don't overuse links back to your website within your guest post. Once is OK if it's appropriate.

Infographics are another controversial topic with Google. They've been vastly abused, so the key is to make sure they are high quality, factual, and offer value if you're going to use them as part of your digital marketing strategy.

Press releases are still OK, but just don't load them up with keyword-rich links back to the site. If you're going to embed links, stick to branded links for your site or business name in the footer of the press release.

There are countless ways to build manipulative links – many that have not been touched upon in this session. Biundo said it's as simple as knowing a bad link when you see one. Stick to Google's Golden Rule for links, and you should be OK.

Link Penalties: How to Recover

Kathy Brown took the stage next to talk about how to recover from penalties or algorithm updates affecting a site. Penguin is an algorithmic detection, so you likely won't have any messages in Webmaster Tools or manual actions listed. So the traffic drop will be the biggest indicator, cross-referenced against algorithm update dates. You can find the history of algorithm changes at Moz.

The next step is to remove any problem links and then be patient, as it's not an instantaneous recovery. Requesting reconsideration won't help in these cases. The algorithm has to detect that the links you removed are gone.

For manual link actions, you have site-wide and partial penalties. Look at the announcements in Webmaster Tools to see what action was taken and what the reason is for the penalty.

Identifying links can be daunting. So be sure you look at every domain in your link profile, but don't worry about examining each and every link if you have thousands coming from the same domain. Instead, choose a sample and determine if the links coming from the domain are good or bad overall.

Categorize the links for efficiency after identifying them, Brown said. You can group them together in categories such as "blogs," "links from the same 'C' block," "links from comments," etc. You can divvy up the work between team members this way.
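To illustrate the grouping and sampling idea, here's a small sketch that buckets backlinks by linking domain and pulls a handful per domain for review. The domains and sample size are invented for the example:

```python
import random
from collections import defaultdict
from urllib.parse import urlparse

# Sketch: bucket backlinks by linking domain, then pull a small sample
# per domain instead of reviewing every single URL. Illustrative only.

def sample_by_domain(backlinks, sample_size=3, seed=42):
    by_domain = defaultdict(list)
    for url in backlinks:
        by_domain[urlparse(url).netloc].append(url)
    rng = random.Random(seed)  # fixed seed keeps reviews reproducible
    return {
        domain: rng.sample(urls, min(sample_size, len(urls)))
        for domain, urls in by_domain.items()
    }

backlinks = [f"http://blog-network.example/post-{i}" for i in range(50)]
backlinks += ["http://trusted-news.example/story"]
samples = sample_by_domain(backlinks)
for domain, urls in samples.items():
    print(domain, "-> review", len(urls), "link(s)")
```

The judgment call of good versus bad link still belongs to a person; this only shrinks the pile each reviewer has to look at.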

Here's a five-step way to clean up links that Brown said they use at Stone Temple Consulting:
  1. Build a list of links using tools like Open Site Explorer, Google Webmaster Tools, Bing Webmaster Tools and so on.
  2. Categorize the links as previously mentioned.
  3. Examine the links and determine if they are good or bad.
  4. Get contact info through social, email, or snail mail.
  5. Be polite when you contact the webmaster. Don't blame them, but be persistent.
After you've done all you can and if you've had no luck, you can try the Google Disavow Tool to disavow the links. If you have had luck in cleaning up your links, you can submit a reconsideration request at that time.
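If you do end up disavowing, the file Google's disavow tool accepts is plain text: one URL or "domain:" entry per line, with "#" comment lines. Here's a sketch that assembles one; the domains and URLs are placeholders:

```python
# Sketch: build a disavow file in the plain-text format the disavow tool
# expects (URL or "domain:" entries, "#" comments). Placeholder data only.

def build_disavow(bad_domains, bad_urls, note="Links we could not get removed"):
    lines = [f"# {note}"]
    lines += [f"domain:{d}" for d in sorted(bad_domains)]
    lines += sorted(bad_urls)
    return "\n".join(lines) + "\n"

text = build_disavow(
    bad_domains={"spammy-directory.example", "blog-network.example"},
    bad_urls={"http://forum.example/thread-123"},
)
print(text)
```

Keeping the comment lines meaningful helps, since the same documentation you hand Google in a reconsideration request can explain why each group of domains was disavowed.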

When submitting a reconsideration request, you need thorough documentation to show the reviewer at Google that you've done all you can to clean up your site. Stick to the facts and tell the story. Explain how it happened, how you cleaned it up, and that it won't happen again.

At the same time you are cleaning up links, you should be building great links through quality content. This is how you can ultimately pull through those penalties.

Audience Questions

Q. What about curated articles?
A. Content curation without adding expert commentary isn't good for SEO, generally speaking.
Q. What about link exchanges?
A. If it's with legitimate business partners, it's OK. 
Original Article Post by Jessica Lee @ Search Engine Watch

Google Panda, Penguin & Phantom: 3 Recovery Examples

The latest Panda Update began rolling out the week of July 15 and was confirmed by Google. I was happy to see Google confirm the update, since that wasn't supposed to happen anymore.

During the July update, many webmasters reported recoveries of varying levels, which seemed to match up with what Google's Distinguished Engineer Matt Cutts explained about the “softening” of Panda. Also, this was the first rollout of the new and improved Panda, which can take 10 days to fully roll out.

But remember, there were other significant updates in May, with both Penguin 2.0 and Phantom. Since many sites were hit by those updates as well, I was eager to see how the new Panda update impacted Penguin and Phantom victims as well.

And we can't leave out "Phanteguin" victims (companies hit by both Phantom and Penguin 2.0). Could they recover during the Panda update too?

Needless to say, I was eager to analyze client websites hit by Panda, Phantom, Penguin, or the killer combo of Phanteguin. What I found was fascinating, and confirms some prior beliefs I had about the relationship between algorithm updates.

For example, I've witnessed Penguin recoveries during Panda updates before, which led me to believe there was some connection between the two. Well, the latest Panda update provides even more data regarding the relationship between Panda, Penguin, and even Phantom.

Pandas Helping Penguins, and Exorcising Phantoms


Past case studies about Penguin recoveries during Panda updates led me to believe that Panda and Penguin were connected somehow, although I'm still not sure exactly how. It almost seems like one bubbles up to the other, and if all is OK on both fronts, a site can recover.

Well, I saw this again during the latest Panda update, but now we have a ghostly new friend named Phantom who has joined the party.

During the latest Panda update, I saw several companies recover at varying levels from Panda (which makes sense during a Panda update), but I also saw websites hit by Penguin and Phantom recover. And to make matters even more complex, I saw a website hit by Phanteguin (both Phantom and Penguin) recover.

[Screenshot: Phanteguin recovery visible in Google Webmaster Tools]

Today, I'm going to walk you through three examples of recoveries, so you can see that Pandas do indeed like Penguins, and Phantoms can be exorcised. My hope is that by providing a few examples of sites that recovered during the latest Panda update, you can start to better understand the relationship between algorithm updates, and how Panda is evolving.

Case 1: Panda Recovery – Fixing Technical Problems + The Softer Side of Panda

Several of my clients recovered from Panda hits during the July 15 update, and I'll cover one particularly interesting example now. The website in question had seen a significant loss in traffic, and based on a technical SEO audit I conducted, the company performed a massive amount of work on the site over the past several months.

The site had a number of serious technical problems, and many of those problems have been rectified. It's a large site with over 10 million pages indexed, and inherently has thinner content than any SEO professional would like to see. But, that's an industry-wide issue for this client, so it was never fair that it was held against them (in my opinion).

The site also had technical issues causing duplicate content, massive XML sitemap issues (causing dirty sitemaps), and many soft 404s. A number of these technical problems led to poor quality signals being presented to Google, which eventually caused problems Panda-wise.
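As a rough illustration of catching one of those signals, here's a simple heuristic for flagging soft 404s in a crawl. The error phrases and the 50-word floor are assumptions for the example; real detection should be tuned per site:

```python
# Sketch: a soft-404 heuristic. Pages that return HTTP 200 but look like
# an error page (or have almost no body text) send poor quality signals.
# Phrases and the 50-word floor are illustrative assumptions.

ERROR_PHRASES = ("page not found", "no results found", "no longer available")

def looks_like_soft_404(status_code, body_text):
    if status_code != 200:
        return False  # a real 404/410 is fine; only 200s can be "soft"
    text = body_text.lower()
    if any(phrase in text for phrase in ERROR_PHRASES):
        return True
    return len(text.split()) < 50  # nearly empty template shell

print(looks_like_soft_404(200, "Sorry, page not found."))  # flagged
print(looks_like_soft_404(404, "Sorry, page not found."))  # real 404
```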

[Screenshot: thin content identified during a test crawl of 50K pages]

But on July 15, the site experienced a nice bump in Google organic traffic. Impressions more than doubled and Google organic traffic increased by 60 percent.

[Chart: Panda recovery after fixing technical issues]

The site in question fixed the technical issues causing duplicate content, soft 404s, etc. To me, that was the real problem, as users still benefited from the content, even if the content was "thin."

I also think this is a great example of the softening of Panda. The correction was the right one, and I'm glad to see that Google made the change to its algorithm.

To sum up this example, technical changes cut down on duplicate content and soft 404s, but the original content remained (and it was technically thin). The new Panda seemed OK with the content, and rewarded the site for fixing the issues sending dangerous quality signals to Google.

Case 2: Panda Recovery While Still Being Impacted by Penguin

One site I analyzed after the latest Panda update saw a nice increase in Google organic traffic, even though it had gotten hammered by Penguin (and had not recovered from that algorithm update yet).

From a Panda perspective, the site got hit in November of 2012, losing approximately 42 percent of its already anemic Google organic search traffic. Then the site remained at that level until the latest Panda update in July 2013. Unfortunately, the website got hit by Pandeguin, or both Panda and Penguin.

Now, I've seen Penguin recoveries during Panda updates before, but I wasn't sure if you could see a Panda recovery on a site being severely impacted by Penguin. This case confirms that you can recover from Panda even when you're still being severely hampered by Penguin. The site in question got hammered during the first Penguin update (April 24, 2012), and lost ~65 percent of its Google organic traffic overnight.

[Chart: Penguin hit on April 24, 2012]

The business owner worked hard to remove as many unnatural links as possible, and used the disavow tool, but didn't recover during subsequent Penguin updates (even Penguin 2.0). Actually, the site saw another dip during Penguin 2.0.

July 15 arrived and the site recovered from Panda to its pre-Panda levels. Specifically, the website doubled its organic search traffic from Google, but it's still below where it was pre-Penguin 1.0.

This makes complete sense: the site shouldn't return to its pre-Penguin levels, because from my perspective there are still more links to remove. So returning to pre-Panda levels makes sense, but the site still needs more work Penguin-wise in order to recover.

To quickly recap, the website was impacted by Penguin 1.0 and 2.0, and hit by Panda in November 2012 (or Pandeguin). The webmaster worked hard to remove unnatural links, and used the disavow tool as well, but hasn't removed as many links as needed to recover from Penguin.

The site didn't recover during Penguin updates, but ended up recovering during the latest Panda update where it returned to its pre-Panda levels. It's a great example of how a website could recover from Panda even when it's still being impacted by Penguin.

[Chart: Panda recovery while still impacted by Penguin]

Case 3: Phanteguin Recovery – Penguin and Phantom Recovery During a Panda Update

This might be my favorite case from the latest Panda update.

A client I'm helping was hit by Penguin 1.0 in 2012 and never recovered. Then they were impacted by Phantom on May 8, and hit even harder by Penguin 2.0 on May 22.

When you've been hit by Phanteguin, it means you have both a content and unnatural links problem. So you have to fight a battle on two fronts.

[Chart: traffic drop after Phanteguin]

This case, in particular, shows how hard work and determination can pay off when you are hit by an algorithm update (or multiple algo updates).

My client moved really fast to rectify many of the problems riddling the website. They attacked their unnatural links problem aggressively and began removing links quickly. It's worth noting that the site's link profile was in grave condition (and that's saying something, considering I've analyzed more than 220 websites hit by Penguin).

[Chart: percentage of unnatural links in the site's link profile]

From a content perspective (which is what Phantom attacked), they had a pretty serious duplicate content problem. Their content wasn't thin, but the same content could be found across a number of other pages on the site (and even across domains).

In addition, there was a heavy cross-linking problem with company-owned domains (using exact match anchor text). That's another common problem I've seen with both Phantom and Panda victims.

A large percentage of the problematic content was dealt with quickly, which again is a testament to how determined the owner of the company was to fix the Phanteguin situation.

On July 15, the site began its recovery (and has been increasing ever since). The site is now close to its pre-Phanteguin levels. And once again, the reason it's not fully back is because it shouldn't have been there in the first place. Remember, there was an unnatural link situation artificially boosting its power, including unnatural links from company-owned domains.

[Chart: Phanteguin recovery]

So, this case study shows that Penguin recoveries can occur outside of official Penguin updates, and it also shows that recovery from Phantom-based content issues is possible during Panda updates.

It's important to understand that this is going on. Nobody outside Google knows exactly how the algorithm updates are connected, but I've now seen them connect several times (and now Phantom is involved).

What Didn't Work

Now we've examined what worked. But what didn't work for companies trying to recover?

July 15 wasn't a great day for every website impacted by Panda, Phantom, or Penguin. Here's why:

1. Lack of Execution

If you've been hit by an algorithm update, and don't make significant changes, then there's little chance of recovering (at least to the extent that you desire).

I know a number of companies that were reluctant to implement major changes, including gutting content, refining their structure, changing their business model, and tackling unnatural links. They didn't recover, and it makes complete sense to me that they didn't.

There are tough decisions to make when you've been hit by Panda, Phantom, or Penguin. If you don't have the intestinal fortitude to take action, then you probably won't recover. I saw several cases of this during the July 15 update.

2. Rolling Back Changes and Spinning Wheels

I know a few companies that actually implemented the right changes, but ended up rolling them back after having second thoughts. This is one of the worst things you can do.

If you roll out changes, but don't wait for another update, then you will have no idea if those changes worked. Worse, you are sending really bad signals to the engines as you change your structure, content, internal linking, etc., only to roll it back a few weeks later.

You must make hard decisions and stick with them in order to gain traction. Do the right things SEO-wise and solidify your foundation. That's the right way to go.

3. Caught in the Gray Area

I've written about the gray area of Panda before, and it's a tough spot to be in. If you don't change enough of your content, or remove enough links, then you can sit in algorithm update limbo for what seems like an eternity.

You should fully understand the algorithm update(s) that hit you, understand the risks and threats with your own site, and then execute changes quickly and accurately. This is why I'm very aggressive when dealing with Panda, Phantom, and Penguin. I'd rather have a client recover as quickly as possible and then build on that foundation (versus getting caught in the gray area).

Key Takeaways

If you're still dealing with the impact of an algorithm update (whether Panda, Phantom, or Penguin, or some combination of the three), hopefully these takeaways will help you get on the right path.
  • Have a clear understanding of what you need to tackle. Don't be afraid to make hard decisions.
  • Move fast. Speed, while maintaining focus, is key to successfully recovering from an update.
  • Clean up everything you can, including both content and link issues. Don't just focus on one element of your site, when you know there are problems elsewhere. Remember, one algorithm update may bubble up to another, so don't get caught in a silo.
  • Stick with your changes. Don't roll them back too soon. Have the intestinal fortitude to stay the course.
  • Build upon your clean (and stronger) foundation with the right SEO strategy. Don't be tempted to test the algorithm again. It's not worth it.

Summary: Analyze, Plan, and Execute

Like many other things in this world, success often comes down to execution. I've had the opportunity to speak with many business owners who have been impacted by Panda, Phantom, and Penguin. I've learned that it's one thing to say you're ready to make serious changes, and another to execute those changes.

To me, it's critically important to have a solid plan, based on thorough analysis, and then be able to execute with speed and accuracy. Working toward recovery is hard work, but that hard work can pay off.


Article Post @ Search Engine Watch
 
Copyright © 2013. Google Updates - All Rights Reserved