
Thursday, July 18, 2013

Google Panda Update: Possibly The "Softer" Panda Algorithm

There is renewed chatter in the WebmasterWorld forums about another shuffle taking place in Google. The consensus is that this update is likely Panda related.
We know Google has slowed the Panda rollout to happen over several days, that Google will no longer confirm Panda updates, and that Google wanted to soften the Panda algorithm a bit.


The chatter in the forum seems to back up the softening: many folks, though not all, are saying it looks like a Panda recovery has been pushed out.


Here are some quotes from the thread:


seen very big changes - clearly Panda recovery. Dramatic change on the 12th and sustained since. Average rank position, # of unique search terms driving to site both improved as well.


Major improvements here starting on the 12th and leveling off today. Traffic is now about 5% higher than 2012 after being 25% lower all year long.


Panda Cub update


Despite Google telling me they won't confirm these updates anymore, I am going to try to reach out and see if this is indeed the "softer" Panda algorithm being released. If I hear back, I will let you know.


Do you think this update is Panda related?

Wednesday, May 8, 2013

Google Update Brewing? May 2013


There are some very early signs of a possible Google update brewing as of early this morning. A WebmasterWorld thread has some renewed chatter around an update.

Note, most of the WebmasterWorld thread is about April 15th changes, which people say have to do with the Boston bombings and seasonal traffic changes. But late last night and early this morning, two webmasters came in and said they saw major shifts in rankings and traffic.

A preferred WebmasterWorld member said he saw a 77% drop just yesterday. Others said "something big is underway" after noting drops in their keyword rankings.

SERPs.com reports pretty significant changes in the Google results on Monday. SERPMetrics.com shows very little change in the search results. MozCast has not yet updated with results from Monday, but they showed changes on Sunday, which seem off.

It is very early and nothing is confirmed - but there may be signs of a possible Google update. What exactly it is remains unknown and unconfirmed.
Forum discussion at WebmasterWorld.

Friday, February 22, 2013

Google Rarely Updates The Penguin Algorithm



It has been almost five months since the last Penguin refresh and no updates to the Penguin algorithm are in sight.

In fact, I reported yesterday at Search Engine Land that No, Google Hasn’t Released Unannounced Penguin Updates. Why did I have to report that? Couple reasons:
(1) There was some speculation based on a video hangout with John Mueller that Penguin refreshed regularly. It does not, it never did, and the truth is, it refreshes very rarely.
(2) It has been almost five months since an official Penguin update and I wanted to make sure we didn't miss any updates.
Google has told us that Penguin is rarely refreshed, unlike Panda, and we haven't missed any Penguin refreshes since.
What was John talking about? He was talking about how normal link analysis is refreshed and rerun continuously.
I posted this on my Google+ page and then someone brought up the Zebra update. There is no such thing, stop asking me about Zebras. There was not a Zebra update.
Forum discussion at Google+.

Wednesday, February 20, 2013

Google Panda #25 Coming Today? Not Sure.



An ongoing WebmasterWorld thread has some chatter around an increase in GoogleBot crawl activity as well as some early ranking fluctuations.

That, plus the fact that we are just about at the 30-day mark from the previous Google Panda update, Panda #24, has us suspecting a Panda update is about to hit today or tomorrow.
Normally, days before a Panda update is announced by Google, we see this type of chatter and GoogleBot activity. The issue is, it has been almost 5 months since the last confirmed Penguin update, so webmasters are unsure what is going on with that.
That being said, MozCast showed some activity the other day, as did SERPs.com; SERPMetrics, however, doesn't show much, and now DigitalPoint shows changes as well (see "Search Engine Rank Changes"), but nothing crazy.
Is a Panda refresh about to hit us? I suspect so but only Google can confirm that.
Forum discussion at WebmasterWorld.

A Google Penalty Removal Leads To Less Google Traffic?



A WebmasterWorld thread has an interesting case I've never seen before.
A webmaster, who is a "senior member" at WebmasterWorld, claims he received a notification that Google revoked a manual penalty it had on his site for a while. Three days after the penalty revocation, the traffic from Google "completely" stopped coming in.

It is as if he was better off with the manual penalty than without it.
We know Google does manually revoke partial penalties but this doesn't seem like a partial revocation.
One theory given in the thread is that now that the manual penalty was revoked, an automated algorithmic penalty kicked in and made things worse. I am not sure I believe that 100%. I don't think it works that way, but who knows - I don't have that type of knowledge. It just doesn't seem right based on the knowledge I do have.
Netmeg explained his theory in the thread: Is it possible that once the manual action was revoked, one of the algorithmic changes finally kicked in?
Or maybe it is just a temporary glitch and things will fix themselves in a few more days?
Forum discussion at WebmasterWorld.

Wednesday, February 6, 2013

4 Steps to Panda-Proof Your Website (Before It’s Too Late!)


It may be a new year, but that hasn’t stopped Google from rolling out yet another Panda refresh.

Last year Google unleashed the most aggressive campaign of major algo updates ever in its crusade to battle rank spam. This year looks to be more of the same.
Since Panda first hit the scene two years ago, thousands of sites have been mauled. SEO forums are littered with site owners who have seen six figure revenue websites and their entire livelihoods evaporate overnight, largely because they didn’t take Panda seriously.
If your site is guilty of transgressions that might provoke the Panda and you haven’t been hit yet, consider yourself lucky. But understand that it’s only a matter of time before you do get mauled. No doubt about it: Panda is coming for you.
Over the past year, we’ve helped a number of site owners recover from Panda. We’ve also worked with existing clients to Panda-proof their websites and (knock on wood) haven’t had a single site fall victim to Panda.
Based on what we've learned saving and securing sites, I've pulled together a list of steps and actions to help site owners Panda-proof websites that may be at risk.

Step 1: Purge Duplicate Content

Duplicate content issues have always plagued websites and SEOs. But with Panda, Google has taken a dramatically different approach to how they view and treat sites with high degrees of duplicate content. Where dupe content issues pre-Panda might hurt a particular piece of content, now duplicate content will sink an entire website.
So with that shift in attitude, site owners need to take duplicate content seriously. You must be hawkish about cleaning up duplicate content issues to Panda-proof your site.
Screaming Frog is a good choice when you want to identify duplicate pages. This article by Ben Goodsell offers a great tutorial on locating duplicate content issues.
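If you want a rough sense of what a duplicate-page audit does under the hood, here is a minimal Python sketch that fetches a handful of URLs and groups pages whose visible text is identical. The URL list is a made-up example, and a dedicated crawler like Screaming Frog is far more thorough in practice; this is just a sketch of the idea.

```python
# Minimal sketch: flag pages whose visible text is identical.
# The URL list is a hypothetical example; a real audit would cover a full crawl.
import hashlib
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

urls = [
    "http://www.example.com/page-a",
    "http://www.example.com/page-b",
    "http://www.example.com/page-a?session=123",  # likely duplicate of page-a
]

pages_by_hash = defaultdict(list)

for url in urls:
    html = requests.get(url, timeout=10).text
    # Strip markup and normalize whitespace so only visible text is compared.
    text = " ".join(BeautifulSoup(html, "html.parser").get_text().split())
    digest = hashlib.md5(text.encode("utf-8")).hexdigest()
    pages_by_hash[digest].append(url)

for digest, group in pages_by_hash.items():
    if len(group) > 1:
        print("Possible duplicates:", group)
```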
Some suggestions for fixing dupe content issues include:
  • Meta directives (e.g. noindex, follow).
  • Canonical tags (rel=“canonical”).
  • 301 redirects.
  • Block pages via Robots.txt file.
  • Remove URLs via Webmaster Tools.
  • Choose your preferred domain in Webmaster Tools.
Now, cleaning up existing duplicate content issues is critical. But it's just as important to take preventative measures as well. This means addressing the root cause of your duplicate content issues before they end up in the index. Yoast offers some great suggestions on how to avoid duplicate content issues altogether.

Step 2: Eradicate Low Quality, Low Value Content

Google’s objective with Panda is to help users find "high-quality" sites by diminishing the visibility (ranking power) of low-quality content, all of which is accomplished at scale, algorithmically. So weeding out low value content should be mission critical for site owners.
But the million dollar question we hear all the time is “what constitutes ‘low quality’ content?”
Google offered guidance on how to assess page-level quality, which is useful to help guide your editorial roadmap. But what about sites that host hundreds or thousands of pages, where evaluating every page by hand isn't even remotely practical or cost-effective?
A much more realistic approach for larger sites is to look at user engagement signals that Google is potentially using to identify low-quality content. These would include key behavioral metrics such as:
  • Low to no visits.
  • Anemic unique page views.
  • Short time on page.
  • High bounce rates.
Of course, these metrics can be somewhat noisy and susceptible to external factors, but they're the most efficient way to sniff out low-value content at scale.
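As a rough illustration of how you might flag these pages at scale, here is a minimal Python sketch that filters a hypothetical analytics CSV export against the thresholds above. The file name, column names, and cutoff values are all assumptions; adjust them to whatever your analytics package actually exports.

```python
# Sketch: flag low-engagement pages from a hypothetical analytics CSV export.
# Column names and thresholds below are illustrative assumptions, not a standard.
import csv

LOW_PAGEVIEWS = 10        # fewer than this many pageviews in the period
SHORT_TIME_ON_PAGE = 15   # seconds
HIGH_BOUNCE_RATE = 0.90   # 90%+

low_value_pages = []

with open("analytics_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        pageviews = int(row["pageviews"])
        time_on_page = float(row["avg_time_on_page"])
        bounce_rate = float(row["bounce_rate"])

        if (pageviews < LOW_PAGEVIEWS
                or time_on_page < SHORT_TIME_ON_PAGE
                or bounce_rate > HIGH_BOUNCE_RATE):
            low_value_pages.append(row["page"])

# These URLs are candidates for deletion, consolidation, or better internal linking.
for page in low_value_pages:
    print(page)
```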
Some ways you can deal with these low value and poor performing pages include:
  • Deleting any content with low to no user engagement signals.
  • Consolidating the content of thin or shallow pages into thicker, more useful documents (i.e., “purge and merge”).
  • Adding additional internal links to improve visitor engagement (and deeper indexation). Tip: make sure these internal links point to high-quality content on your site.
One additional type of low quality content that often gets overlooked is pagination. Proper pagination is highly effective at distributing link equity throughout your site. But high ratios of paginated archives, comments and tag pages can also dilute your site’s crawl budget, cause indexation cap issues and negatively tip the high- to low-value content ratio on your site.
Tips for Panda-proofing pagination include:
  • Apply “noindex, follow” to paginated pages.
  • Tag paginated content with rel=“prev” and rel=“next” to indicate documents in a sequence.

Step 3: Thicken-Up Thin Content

Google hates thin content. And this disdain isn’t reserved for spammy scraper sites or thin affiliates only. It’s also directed at sites with little or no original content (i.e., another form of “low value” content).
One of the riskiest content types we frequently see on client sites is thin directory-style pages. These are aggregate feed pages you’d find on ecommerce product pages (both page level and category level); sites with city, state and ZIP code directory type pages (think hotel and travel sites); and event location listings (think ticket brokers). Many sites host thousands of these page types, which, other than a big list of hyperlinks, have little to no content.
Unlike other low-value content traps, these directory pages are often instrumental in site usability and helping users navigate to deeper content. So deleting them or merging them isn’t an option.
Instead, the best strategy here is to thicken up these thin directory pages with original content. Some recommendations include:
  • Drop a thousand words of original, value-add content on the page in an effort to treat each page as a comprehensive guide on a specific topic.
  • Pipe in API data and content mash-ups (excellent when you need to thicken hundreds or thousands of pages at scale); a rough sketch of this approach appears below.
  • Encourage user reviews.
  • Add images and videos.
  • Move thin pages off to subdomains, which Google hints at. Though we use this as more of a “stop gap” approach for sites that have been mauled by Panda and are trying to rebound quickly, rather than a long-term, sustainable strategy.
It’s worth noting that these recommendations can be applied to most types of thin content pages. I’m just using directory style pages as an example because we see them so often.
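To make the “pipe in API data” idea a bit more concrete, here is a minimal Python sketch that pulls structured data from a hypothetical listings API and merges it into a page template. The endpoint, response fields, and template are all made up for illustration; the point is simply that unique, data-driven copy can be generated programmatically for hundreds or thousands of directory pages.

```python
# Sketch: thicken a thin city directory page with data from a hypothetical API.
# The endpoint, response fields, and template are illustrative assumptions.
import requests

TEMPLATE = """
<h2>Hotels in {city}</h2>
<p>{city} has {hotel_count} hotels listed, with an average nightly rate of
${avg_rate:.0f}. The highest-rated property is {top_hotel}.</p>
"""

def build_city_section(city):
    # Hypothetical listings API; swap in whatever data source you actually license.
    resp = requests.get(
        "https://api.example.com/v1/hotels",
        params={"city": city},
        timeout=10,
    )
    hotels = resp.json()["results"]

    return TEMPLATE.format(
        city=city,
        hotel_count=len(hotels),
        avg_rate=sum(h["rate"] for h in hotels) / len(hotels),
        top_hotel=max(hotels, key=lambda h: h["rating"])["name"],
    )

if __name__ == "__main__":
    print(build_city_section("Boston"))
```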
When it comes to discovering thin content issues at scale, take a look at word count. If you’re running WordPress, there are a couple of plugins you can use to assess word count for every document on your site:
  • WP Word Count
  • Admin Word Count Column
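If you’re not on WordPress, a quick word-count audit is easy to script yourself. The sketch below assumes a standard XML sitemap and an arbitrary 300-word cutoff; it walks the sitemap and prints any page that comes in under the threshold.

```python
# Sketch: flag thin pages by word count, using URLs from an XML sitemap.
# The sitemap URL and the 300-word threshold are illustrative assumptions.
import requests
from bs4 import BeautifulSoup

SITEMAP_URL = "http://www.example.com/sitemap.xml"
THIN_THRESHOLD = 300  # words

# The "xml" parser requires the lxml package to be installed.
sitemap = BeautifulSoup(requests.get(SITEMAP_URL, timeout=10).text, "xml")
urls = [loc.text for loc in sitemap.find_all("loc")]

for url in urls:
    html = requests.get(url, timeout=10).text
    words = BeautifulSoup(html, "html.parser").get_text().split()
    if len(words) < THIN_THRESHOLD:
        print(f"{url}: {len(words)} words")
```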
As well, here are some all-purpose plugin recommendations to help in the war against Panda.
All in all, we’re seeing documents that have been thickened up get a nice boost in rankings and SERP visibility. And this boost isn’t a temporary QDF bump. In the majority of cases, when thickening up thin pages, we’re seeing permanent ranking improvements over competitor pages.

Step 4: Develop High-Quality Content

On the flipside of fixing low or no-value content issues, you must adopt an approach of only publishing the highest quality content on your site. For many sites, this is a total shift in mindset, but nonetheless raising your content publishing standards is essential to Panda-proofing your site.
Google describes “quality content” as “content that you can send to your child to learn something.” That’s a little vague, but to me it says two distinct things:
  • Your content should be highly informative.
  • Your content should be easy to understand (easy enough that a child can comprehend it).
For a really in-depth look at “What Google Considers Quality Content,” check out Brian Ussery’s excellent analysis.
When publishing content on our own sites, we ask ourselves a few simple quality control questions:
  • Does this content offer value?
  • Is this content you would share with others?
  • Would you link to this content as an informative resource?
If a piece of content doesn’t meet these basic criteria, we work to improve it until it does.
Now, when it comes to publishing quality content, many site owners don’t have the good fortune of having industry experts in house and internal writing resources at their disposal. In those cases, you should consider outsourcing your content generation to the pros.
Some of the most effective ways we use to find professional, authoritative authors include:
  • Placing an ad on Craigslist and conducting a “competition.” Despite what the critics say, this method works really well, and you can find some excellent, cost-effective talent. “How to Find Quality Freelance Authors on Craigslist” will walk you through the process.
  • Reaching out to influential writers in your niche with columns on high profile pubs. Most of these folks do freelance work and are eager to take on new projects. You can find these folks with search operators like [intitle:“your product niche” intext:“meet our bloggers”] or [intitle:“your product niche” intext:“meet our authors”] since many blogs publish an author’s profile page.
  • Targeting published authors on Amazon.com is a fantastic way to find influential authors who have experience writing on topics in your niche.
Apart from addressing writing resource deficiencies, the advantages of hiring topic experts or published authors include:
  • Authoritative authors raise the perceived value of your content.
  • You can leverage authorship credentials on Google+.
  • Author profiles display in the SERP snippets, and can improve CTR and help users find great content.
  • AuthorRank! It may not be a ranking signal just yet, but it will be.
  • Authorship engagement and satisfaction, which may contribute to AuthorRank.
  • Higher engagement levels lead to longer clicks vs. short clicks. And I have to assume “time on page/site” is a signal Google pays attention to.
Finally, I wanted to address the issue of frequency and publishing quality content. Ask yourself this: are you publishing content every day on your blog, sometimes twice a day? If so, ask yourself “why?”
Is it because you read on a popular marketing blog that cranking out blog posts each and every day is a good way to target trending topics and popular terms, and flood the index with content that will rank in hundreds of relevant mid-tail verticals?
If this is your approach, you might want to rethink it. In fact, I’d argue that 90 percent of sites that use this strategy should slow down and publish better, longer, meatier content less frequently.
In a race to “publish every day!!!” you’re potentially polluting the SERPs with quick, thin, low value posts and dragging down the overall quality score of your entire site. So if you fall into this camp, definitely stop and think about your approach. Test the efficacy of fewer, thicker posts vs short-form “keyword chasing” articles.

Panda-Proofing Wrap Up

Bottom line: get your site in shape before it’s too late. Why risk being susceptible to every Panda update when Armageddon is entirely avoidable?
The SEO and affiliate forums are littered with site owners who continue to practice the same low value tactics in spite of the clear dangers because they were cheap and they worked. But look at those sites now. Don’t make the same mistake. More details here: http://searchenginewatch.com/article/2241400/4-Steps-to-Panda-Proof-Your-Website-Before-Its-Too-Late