Monday, August 10, 2015

From SEO To SXO: Search Experience Optimization

When it comes to search engine optimization, columnist Mark Munroe suggests you need to think beyond your website. 

How does one win at SEO in 2015 and beyond?
Some of the directives we have been hearing for years are truer than ever:
“Just create great content!”
“Content is king!”
“Build a quality site!”
But what is “great?” How do you measure “quality?”
You can’t evaluate content quality without considering the expectations of the user. It doesn’t matter how well the content is written if it’s out of sync with the user’s expectations. Great content, in the context of search, means you have moved beyond SEO to Search Experience Optimization (SXO).

The Search Experience Starts On Google And Ends On Google

Typically, user experience (UX) optimization focuses on optimizing success metrics. Perhaps those metrics are based upon converting users into buyers, gathering email addresses, generating page views or getting users to click on ads.
These are your metrics, not Google’s — they are your customers, not Google’s. With SXO, you need to focus on Google’s customer. This experience starts and ends on Google.
We have access to all sorts of metrics about our own site, some of them very clear (like a purchase) and some vague (like bounce rate). However, we have little access to how Google may be measuring results on their own site. It’s a deep black hole. To optimize the search experience, we must shed some light on that darkness!

You Can’t Always Get What You Want

We want users who are ready to buy, book, subscribe or otherwise take an action that’s good for our business. The following chart shows a hypothetical breakdown of how search visitors might interact with a website:
[Chart: hypothetical breakdown of search visitor actions]

In this case, 60% of the users never take a single action. Did they get what they wanted? Who are these people? Why are they landing on your site?
A mere 10% took an action that can reasonably be viewed as a successful visit (a sign-up or a purchase). What about everyone else? Even those who didn’t bounce? Did they walk away from the site frustrated or happy?
We don’t always get the visitors we want. Search Experience Optimization means optimizing the user experience for the users we get, as well as the ones we want! Not only will that align with what Google wants, but a better understanding of all our users will help our business objectives, as well.

What Google Wants: Provide An Answer!

Except for navigational searches, almost all searches are questions, even when they are not phrased as such. Very simply, Google wants to provide answers — as evidenced by the increasing number of direct answers appearing within search results.
Consider the following searches:
[Image: example searches and the intent behind them]
Google is successful when it provides an answer — but how does Google know if it has done so successfully, especially when the query is not obviously a question?

How Does Google Evaluate?

Obviously, Google has its own internal metrics to measure the quality of its search results. Just like our own sites, Google must have metrics based on what users click on — in fact, Google recently confirmed this.
It makes sense that Google analyzes click behavior. Likely and oft-discussed metrics include the following (a toy sketch of how such signals might be computed appears after the list):
  • Short click. A “short click” is a quick return from a website to Google. Clearly, a very quick return is not a good signal.
  • Long click. This refers to a long delay before the user returns to Google. Longer is better.
  • Pogosticking. This is when a searcher bounces back and forth between several search results.
  • Click-through rate. How often users click on a given result compared with how often it is displayed (expressed as a percentage).
  • Next click. What a user clicks on after “pogosticking” back to Google (either they click on an existing search listing or perform a new search).
  • Next search. When a user moves on to a new search.
  • Click rate on second search. How often users click a result on a repeat search when a previously visited page has been elevated due to personalization and/or a previous click.
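
None of these metrics are visible to us, but it helps to make the definitions concrete. Below is a toy Python sketch of how signals like these could be computed from hypothetical search-session logs; the event structure and the 30-second/120-second cutoffs are illustrative assumptions, not anything Google has disclosed.

```python
# A toy sketch of how click-behavior signals could be derived from
# hypothetical search-session logs. Google's real definitions and
# thresholds are unknown; the 30s/120s cutoffs here are illustrative.
from dataclasses import dataclass

@dataclass
class Click:
    result_url: str          # the result the user clicked
    dwell_seconds: float     # time spent before returning to the results page
    returned: bool           # did the user come back to the SERP?

def classify_click(click: Click,
                   short_cutoff: float = 30.0,
                   long_cutoff: float = 120.0) -> str:
    """Label a click as short, long, or neutral (cutoffs are assumptions)."""
    if click.returned and click.dwell_seconds < short_cutoff:
        return "short"    # quick bounce back to the SERP: a bad signal
    if not click.returned or click.dwell_seconds >= long_cutoff:
        return "long"     # the user stayed (or never came back): a good signal
    return "neutral"

def is_pogosticking(session: list[Click]) -> bool:
    """Pogosticking: several distinct results tried within one query session."""
    tried = {c.result_url for c in session if c.returned}
    return len(tried) >= 2

# Example session: the user bounced off two results, then settled on a third.
session = [
    Click("site-a.com/page", dwell_seconds=8, returned=True),
    Click("site-b.com/page", dwell_seconds=15, returned=True),
    Click("site-c.com/page", dwell_seconds=300, returned=False),
]
print([classify_click(c) for c in session])   # ['short', 'short', 'long']
print(is_pogosticking(session))               # True
```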

The Next Click

The most telling signal to Google may very well be the “next click.” If Google wants to provide the answer to a query, the next user click tells them what they need to know. Did the user find someplace to buy their New Balance running shoes? Or a review for the B&B in Napa?
If a user returns and clicks a different search result from the same query — or, upon a subsequent visit to Google, repeats the same query — that could be a signal that the initial search was not satisfied. If a user comes back and does a completely new search, that could mean the user was satisfied with the result.
Google has yet to confirm that click behavior directly influences rankings. It’s hard for me to imagine that it doesn’t. But even if it doesn’t affect rankings, Google likely uses it to influence and evaluate other changes to their algorithm. Either way, if the appearance of your site in Google’s SERP improves their metrics, that can only be fantastic for your organic search.

Kill The Search

Therefore, to optimize the search user experience, you must end the user quest and kill the search.
The user must have no reason to go back to Google and continue their quest for an answer. We will never know what that next click is. However, we can influence that next click by understanding our users.
To “kill the search,” we need to understand why users are landing on our page. What was the question or need that drove them to Google in the first place?
Consider a hotel site when a user searches for a specific hotel:
  • Are they price shopping?
  • Looking for reviews?
  • In need of driving directions?
  • Researching amenities?
Of course, we can make educated guesses. But we can do better.

Keyword Data

Keyword data is a good place to start. Begin by examining keyword data from Webmaster Tools (now rebranded as the Search Console) and looking for modifiers that reveal intent. Look at keywords for specific page types and high-traffic individual pages.
Many keywords will be vague and not reveal intent. If you are a travel site, for example, you might see “Hyatt Regency” 100 times with no modifiers and only 20 times with modifiers (such as “reviews,” “directions” or “location”). The frequency of those modifiers can give you a good idea of the broad questions users have when they land on your site.
This is only a starting point. There might be many user queries about which you have no data, simply because you do not rank for those queries. That’s when you need to go to keyword tools like SEMrush or the Google Keyword Planner. I also like to use UberSuggest to get a good overview of what the user mindset is. (Although it does not have query volume, it catches many variations you don’t see in the other tools.)
[Screenshot: UberSuggest keyword suggestions]
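
To make the modifier analysis concrete, here is a minimal Python sketch that buckets a Search Console query export by intent modifier. The CSV file name, column names and modifier lists are assumptions to adapt to your own data.

```python
# A minimal sketch of bucketing Search Console queries by intent modifier.
# Assumes a CSV export with "query" and "clicks" columns; the file name,
# column names, and modifier lists are all assumptions to adapt.
import csv
from collections import Counter

INTENT_MODIFIERS = {
    "reviews": ["review", "reviews", "rating"],
    "directions": ["directions", "how to get to", "address"],
    "price": ["price", "cheap", "deal", "cost"],
    "amenities": ["pool", "parking", "wifi", "breakfast"],
}

def bucket_queries(path: str) -> Counter:
    """Count clicks per intent bucket; unmatched queries land in 'no modifier'."""
    buckets: Counter = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            query, clicks = row["query"].lower(), int(row["clicks"])
            matched = False
            for intent, words in INTENT_MODIFIERS.items():
                if any(w in query for w in words):
                    buckets[intent] += clicks
                    matched = True
                    break
            if not matched:
                buckets["no modifier"] += clicks
    return buckets

for intent, clicks in bucket_queries("search_console_queries.csv").most_common():
    print(f"{intent:12s} {clicks}")
```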
Keyword data is a good start toward getting into our users’ heads. But that’s only a start. Let’s take it further.

SEO Surveys

Surveys are fantastic tools to help you understand why people landed on your site. I’ve been doing SEO surveys for many years, using tools like SurveyMonkey and Qualaroo. Of course, surveys themselves are disruptive to the user experience, so I only keep them running long enough to reach statistical significance. I usually find 100 responses is sufficient. Things to keep in mind:
  1. You want to segment users based on search. This is an SEO survey, so it is only triggered for search visitors. (Of course, it’s useful to extend the survey to other segments, too.)
  2. The purpose of this survey is to understand why the user landed on your site. What was the question or problem that drove them to search?
  3. You need to trigger the survey very quickly. If you wait too long, you will have lost the opportunity to include the people who bounced very quickly (those are precisely the people you want to catch!). Generally, I launch after 10 or 15 seconds.
  4. The surveys should be segmented by page type. For example, people landing on a hotel property page on a travel site have very different motives from those of people landing on a city/hotel page. For high-traffic content pieces, you want to survey those pages individually.
  5. Your survey segments should represent a significant portion of your SEO traffic.
Ask the users, “Why did you visit this site today?” and list options covering the likely reasons. Make sure you include an “other” option to capture reasons you might not have thought of. For instance, on a real estate home sales site, I have asked whether users were looking for:
  • A home to buy
  • A home to rent
  • Home prices
  • School information
  • A house estimate
  • Open houses
  • Maps
Based on your survey data, you can create a prioritized list of user needs. Often, you will find surprises, which can also turn into opportunities. For example, suppose you survey users who land on your real estate site on a “home for sale” page, and you discover that 20% would also consider renting. That could be a great cross-marketing opportunity.
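
As a concrete illustration, here is a small Python sketch that turns raw survey responses into a prioritized list of user needs. The response counts are invented to mirror the real estate example above.

```python
# A small sketch that turns raw survey responses into a prioritized list
# of user needs. The response strings mirror the real-estate options
# above; the counts themselves are made up for illustration.
from collections import Counter

responses = (
    ["a home to buy"] * 45 + ["home prices"] * 20 +
    ["a home to rent"] * 20 + ["school information"] * 8 +
    ["open houses"] * 5 + ["other"] * 2
)

total = len(responses)
for need, count in Counter(responses).most_common():
    print(f"{need:20s} {count / total:5.1%}")
# The 20% "a home to rent" share is the kind of surprise that can become
# a cross-marketing opportunity on a for-sale landing page.
```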

Statistical Significance

You want to satisfy your users, but you can’t please everyone. You will need to prioritize your improvements so that they meet the needs of as large a percentage of your visitors as possible.
For example, if 25% of visitors to a restaurant page want to make a reservation, and they can’t (either because the functionality isn’t there or due to usability problems), you have an issue. If only 1% want driving directions, that is a much smaller issue.
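
To sanity-check whether roughly 100 responses is enough, put a confidence interval around each observed share. Here is a short Python sketch using the standard Wilson score interval; with 25 of 100 respondents wanting reservations, the true share plausibly lies anywhere between roughly 18% and 34%, which is still precise enough to rank needs against one another.

```python
# A quick check on what ~100 survey responses buys you: the Wilson score
# interval for an observed proportion. With n=100 and 25% observed, the
# true share is plausibly anywhere from about 18% to 34%.
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    margin = (z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))) / denom
    return center - margin, center + margin

low, high = wilson_interval(25, 100)
print(f"25/100 responders: true share likely between {low:.1%} and {high:.1%}")
```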

10 Seconds Is All You Get

UX expert Jakob Nielsen analyzed a Microsoft Research study a couple of years ago and found that the largest visitor drop-off comes in the first 10 seconds. If you get past 10 seconds, users will give your site a chance. This means you have a maximum of 10 seconds to convince visitors that you:
  • Have the answer to their question
  • Have an answer that they can trust
  • Will make it easy to get their answer
That’s a tall order, and your page design needs to balance many competing priorities. To design an effective landing page, you need to know what visitors’ questions are.

SEO Usability Testing

Usability testing is a great tool to help determine how successful users are at meeting all their goals. In 2015, it definitely should be considered part of an SEO’s role. If that task falls to the UX or product team, work with them to make sure your tests are covered. If not, then take the lead and feed the results back to those organizations.
For SEOs who don’t have experience with usability, I suggest Rocket Surgery Made Easy. Additionally, there are online services which provide valuable, lightweight and rapid test results. I’ve used both UserTesting.com (for more extensive tests) and FiveSecondtest.com for quick reactions from users. Here are some tips specific to SEO usability testing:
  • Create a good SEO user scenario. Set the context and the objective. Start them on a search result page so you can observe the transition from search result page to your landing page.
  • Focus on the landing page templates that get the most traffic.
  • Focus on the dominant problems and use cases that you have identified through keyword analysis and surveys.

Consistent Titles & Meta Descriptions

If every search is a question, every result in the search results is a promise of an answer. Please make sure your titles are representative of what your site provides.
If you have reviews on only 50% of your products, mention reviews in the titles and meta descriptions only of products that actually have them. Otherwise, you will be getting bad clicks and unhappy users. Another example is the use of the word “free.” If you say “free,” you’d better have “free!”
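
One practical way to enforce that promise is to generate titles and descriptions from the data the page actually has. Here is a minimal Python sketch; the product fields are assumptions for illustration.

```python
# A minimal title template that only promises what the page delivers:
# "Reviews" appears only when reviews exist, "Free Shipping" only when true.
# The product fields are assumptions for illustration.
def build_title(name: str, review_count: int, free_shipping: bool) -> str:
    parts = [name]
    if review_count > 0:
        parts.append(f"Reviews ({review_count})")
    if free_shipping:
        parts.append("Free Shipping")
    return " - ".join(parts)

print(build_title("Trail Runner 2000", review_count=38, free_shipping=True))
# Trail Runner 2000 - Reviews (38) - Free Shipping
print(build_title("Trail Runner 2000", review_count=0, free_shipping=False))
# Trail Runner 2000
```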

An Improved Web Product

The role of a successful SEO has broadened, and it demands that we understand and solve our visitors’ problems. It’s a huge challenge for SEOs, as we need to broaden our skill set and inject ourselves into different organizations. However, this is much more interesting and rewarding than link building.
Ultimately, the role of SEO will become even more critical within an organization, as the learnings and improvements will broadly benefit a website beyond just improving SEO. The great thing about the transition from Search Engine Optimization to Search Experience Optimization is that the end result is not just more traffic, it’s a better Web product.

Thursday, July 23, 2015

Google Phantom Update Last Week?

Last week, I reported that Google said there was no Panda refresh. This was based on several emails I received from webmasters asking if there was a Panda update.
The forums were pretty quiet but my inbox was not. Either way, Google said it wasn't a Panda update and they moved on.
Glenn Gabe thinks this was a Phantom tremor. What is Phantom? It is what Glenn named the June 16th Google update that Google originally denied and then later confirmed as a core search update.
Glenn says his data suggests there was a tweak to that algorithm in core search, which is why Google didn't confirm it and which is why not many people noticed.
Here are some charts from his client data:
[Charts: Glenn’s client data showing the July 14th fluctuations]
I can tell you again, the forums were very quiet around this update, but my inbox was not, and neither was Glenn’s. He did an impressive job documenting the July 14th Google tremors and suspects they are related to what happened on June 16th.
I am not sure, nor can I get this confirmed by Google. But the data is interesting.
Forum discussion at Google+.

Saturday, December 7, 2013

Google Toolbar PageRank Lives On With The First Update In Over 10 Months

Google has updated the Toolbar PageRank values this morning, despite Google’s Matt Cutts implying the update would not happen again in 2013.
The SEO community, discussion forums and social media outlets are lighting up with the news that Google has actually updated the Toolbar PageRank values. Why?
(1) Because the SEO industry always lights up when the most visible and easiest-to-see metric of Google’s linkage data changes.
(2) Because no one expected a Toolbar PageRank update this year.
The last Toolbar PageRank update was over 10 months ago, and I predicted, wrongly, that Toolbar PageRank was dead. Google proved me wrong by updating it today, 10+ months later and well before anyone thought there would be an update.
Google has implied over the years that Toolbar PageRank would go away and has slowly dropped support for the feature in several browsers. That said, Google did tell us Toolbar PageRank wouldn’t fully go away, at least not on older browsers that support it.
For more on the importance, or lack thereof, of Toolbar PageRank, see our guide on PageRank.
Update: Matt Cutts confirmed the update on Twitter saying the “team was fixing a different backend service and did a PR update along the way.”

Content published on: http://searchengineland.com/google-toolbar-pagerank-lives-on-with-the-first-update-in-over-10-months-179238

Saturday, November 30, 2013

Google Tells Webmaster: You'll Have To Earn Our Trust Again

A Google Webmaster Help thread has the story of a webmaster trying to disavow and remove all the bad links pointing to his site.


The interesting question is: even when he does, will it help his site’s rankings?
The goal of this webmaster is simply to remove the manual action, but Google's John Mueller tells him he also has algorithmic trust issues.


John said:

“looking at your site's history, it looks like you've done quite a bit for quite some time, and it looks like our algorithms have picked up on that too. So while resolving the manual action is a good way to start, you need to keep in mind that it can possibly take quite some time for our algorithms to regain trust in your site even after that.”
I see this happening a lot: webmasters aim to remove the manual action, and do, but then the rankings don’t improve. The reason is likely due to algorithmic actions taken on the site.

That being said, it is interesting to see how Google words it.

The algorithms seem to have lost trust over time. The manual action is a "good way to start" but the algorithms need to "regain trust" in the site for there to be an improvement - which may take some time.



By Barry Schwartz: http://www.seroundtable.com/google-trust-time-17743.html

Wednesday, October 9, 2013

Hummingbird: Google’s Latest Algorithm Change & What it Means for Your Site









By Katie Thomas, Digital Marketing Technician
On Thursday, September 26, Google confirmed that a new update to their search algorithm went live about a month prior to the announcement. Google shared the news of the latest update, dubbed Hummingbird, during their 15th birthday event at the Google Garage, the birthplace of the search engine giant. This update is said to have affected a staggering 90% of search results worldwide.


Hummingbird


Hummingbird is said to be one of the biggest changes to the algorithm since 2010’s Caffeine update, and perhaps the biggest since the changes made back in 2001. More recent updates, such as Panda and Penguin, were changes to the old algorithm, and they are still factored into Hummingbird.

So, what’s new about this algorithm?


The long version? The new algorithm seeks to serve the search demands and technologies of today, especially mobile devices. Hummingbird largely takes conversational search into account, as demonstrated back in May on Chrome, where users have the option to speak their query. It also accounts for complex searches and semantic search. This algorithm doesn’t just process certain keywords in a query but works to understand the meaning behind every word, becoming more intelligent and more predictive with time.
The short version? More understanding of the complete meaning of queries, less keywords-only focus.

How will Hummingbird affect SEO?

Google says that its guidance for SEO efforts remains the same. Continue to create high-quality and original content on your sites. Signals of quality that have been important in the past are still important, such as fresh and engaging content, hierarchical site architecture, authority and reputation.

Is this going to affect website traffic?

Any site that was hit by Penguin 2.0 back in May can probably pinpoint immediate drops in traffic following the May 22 release. With Hummingbird, it has been a month since its release, so if your traffic trends have not shown any effects, it is likely you will not be affected by the update.
Google has stated that this algorithm is meant to have a query-by-query effect on searches, particularly complex or long-tail searches. If your site has seen a downturn, Google still claims that it could be from any of the other little tweaks or changes that occur on a regular basis to the overall search algorithm.

What can I do in response to Hummingbird?

As Google has advised, keep creating high-quality content. Set higher standards for yourself by rethinking your approach to your content marketing strategy. Start to think in terms of what your customers or users are saying, look for internal company feedback about what’s not being answered about your services or products on your site. Create content with even more user intent in mind.
To learn more about user intent, check out this video done by my colleague Mitch Holt.








Saturday, October 5, 2013

Penguin 5, With The Penguin 2.1 Spam-Filtering Algorithm, Is Now Live

Google announces the major updates right before the weekend, while I’m driving! (Seen through Glass.)

The fifth confirmed release of Google’s “Penguin” spam fighting algorithm is live. That makes it Penguin 5 by our count. But since this Penguin update is using a slightly improved version of Google’s “Penguin 2” second-generation technology, Google itself is calling it “Penguin 2.1.” Don’t worry. We’ll explain the numbering nonsense below, as well as what this all means for publishers.

New Version Of Penguin Live Today

The head of Google’s web spam team, Matt Cutts, shared the news on Twitter, saying the latest release would impact about 1 percent of all searches:


The link that Cutts points at, by the way, explains what Penguin was when it was first launched. It doesn’t cover anything new or changed with the latest release.

Previous Updates

Here are all the confirmed releases of Penguin to date:
  • Penguin 1 on April 24, 2012 (impacting around 3.1% of queries)
  • Penguin 2 on May 26, 2012 (impacting less than 0.1%)
  • Penguin 3 on October 5, 2012 (impacting around 0.3% of queries)
  • Penguin 4 (AKA Penguin 2.0) on May 22, 2013 (impacting 2.3% of queries)
  • Penguin 5 (AKA Penguin 2.1) on Oct. 4, 2013 (impacting around 1% of queries)

Why Penguin 2.1 AND Penguin 5?

If us talking about Penguin 5 in reference to something Google is calling Penguin 2.1 hurts your head, believe us, it hurts ours, too. But you can pin that blame back on Google. Here’s why.

When Google started releasing its “Panda” algorithm designed to fight low-quality content, it called the first one simply “Panda.” So when the second came out, people referred to that as “Panda 2.” When the third came out, people called that Panda 3 — causing Google to say that the third release, because it was relatively minor, really only should be called Panda 2.1 — the “point” being used to indicate how minor a change it was.

Google eventually — and belatedly — indicated that a Panda 3 release happened, causing the numbering to move into Panda 3.0, Panda 3.1 and so on, until there had been so many “minor” updates that we had to resort to going further out in decimal places, to things like Panda 3.92.
That caused us here at Search Engine Land to decide it would be easier all around if we just numbered any confirmed update sequentially, in order of when they came. No matter how “big” or “small” an update might be, we’d just give it the next number on the list: Penguin 1, Penguin 2, Penguin 3 and so on.

Thanks For The Headache, Google

That worked out fine until Penguin 4, because Google typically didn’t give these updates numbers itself. It just said there was an update, and left it to us or others to attach a number to it.

But when Penguin 4 arrived, Google really wanted to stress that it was using what it deemed to be a major, next-generation change in how Penguin works. So, Google called it Penguin 2, despite all the references to a Penguin 2 already being out there, and despite the fact that it hadn’t really numbered many of these various updates before.

Today’s update, as can be seen above, has been dubbed Penguin 2.1 — so supposedly, it’s a relatively minor change to the previous Penguin filter that was being used. However, if it’s impacting around 1 percent of queries as Google says, that means it is more significant than what Google might have considered to be similar “minor” updates of Penguin 1.1 and Penguin 1.2.

What Is Penguin Again? And How Do I Deal With It?

For those new to the whole “Penguin” concept, Penguin is a part of Google’s overall search algorithm that periodically looks for sites that are deemed to be spamming Google’s search results but somehow still ranking well. In particular, it goes after sites that may have purchased paid links.

If you were hit by Penguin, you’ll likely know if you see a marked drop in traffic that begins today or tomorrow. To recover, you’ll need to do things like disavow bad links or manually have those removed. Filing a reconsideration request doesn’t help, because Penguin is an automated process. Until it sees that what it considers to be bad has been removed, you don’t recover.
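
The disavow file that Google’s tool accepts is a plain-text list: full URLs disavow individual links, lines starting with domain: disavow every link from that site, and lines starting with # are comments. A minimal example (the domains are made up):

```
# Links we could not get removed despite repeated outreach
domain:spammy-link-network.example
http://low-quality-directory.example/our-listing.html
```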

If you were previously hit by Penguin and have taken actions hopefully meant to fix that, today and tomorrow are the days to watch. If you see an improvement in traffic, that’s a sign that you’ve escaped Penguin.
Here are previous articles with more on Penguin recovery and how it and other filters work as part of the ranking system.

What About Hummingbird?

If you’re wondering about how Penguin fits into that new Google Hummingbird algorithm you may have heard about, think of Penguin as a part of Hummingbird, not as a replacement for it.

Hummingbird is like Google’s entire ranking engine, whereas Penguin is like a small part of that engine, a filter that is removed and periodically replaced with what Google considers to be a better filter to help keep out bad stuff.

To understand more about that relationship and Hummingbird in general, see our earlier post on the subject.

About The Author: Danny Sullivan is a Founding Editor of Search Engine Land. He’s a widely cited authority on search engines and search marketing issues who has covered the space since 1996. Danny also serves as Chief Content Officer for Third Door Media, which publishes Search Engine Land and produces the SMX: Search Marketing Expo conference series. He has a personal blog called Daggle (and keeps his disclosures page there). He can be found on Facebook, Google+ and microblogs on Twitter as @dannysullivan.


Google Webmaster Central Blog - Official news on crawling and indexing sites for the Google index

Another step to reward high-quality sites

Tuesday, April 24, 2012 at 2:45 PM

Webmaster level: All

Google has said before that search engine optimization, or SEO, can be positive and constructive—and we're not the only ones. Effective search engine optimization can make a site more crawlable and make individual pages more accessible and easier to find. Search engine optimization includes things as simple as keyword research to ensure that the right words are on the page, not just industry jargon that normal people will never type.

“White hat” search engine optimizers often improve the usability of a site, help create great content, or make sites faster, which is good for both users and search engines. Good search engine optimization can also mean good marketing: thinking about creative ways to make a site more compelling, which can help with search engines as well as social media. The net result of making a great site is often greater awareness of that site on the web, which can translate into more people linking to or visiting a site.

The opposite of “white hat” SEO is something called “black hat webspam” (we say “webspam” to distinguish it from email spam). In the pursuit of higher rankings or traffic, a few sites use techniques that don’t benefit users, where the intent is to look for shortcuts or loopholes that would rank pages higher than they deserve to be ranked. We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings.

The goal of many of our ranking changes is to help searchers find sites that provide a great user experience and fulfill their information needs. We also want the “good guys” making great sites for users, not just algorithms, to see their effort rewarded. To that end we’ve launched Panda changes that successfully returned higher-quality sites in search results. And earlier this year we launched a page layout algorithm that reduces rankings for sites that don’t make much content available “above the fold.”

In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines. We’ve always targeted webspam in our rankings, and this algorithm represents another improvement in our efforts to reduce webspam and promote high quality content. While we can't divulge specific signals because we don't want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics.

Here’s an example of a webspam tactic like keyword stuffing taken from a site that will be affected by this change:

[Screenshot: keyword-stuffed page text]
Of course, most sites affected by this change aren’t so blatant. Here’s an example of a site with unusual linking patterns that is also affected by this change. Notice that if you try to read the text aloud you’ll discover that the outgoing links are completely unrelated to the actual content, and in fact the page text has been “spun” beyond recognition:

[Screenshot: “spun” page text with unrelated outgoing links]
Sites affected by this change might not be easily recognizable as spamming without deep analysis or expertise, but the common thread is that these sites are doing much more than white hat SEO; we believe they are engaging in webspam tactics to manipulate search engine rankings.

The change will go live for all languages at the same time. For context, the initial Panda change affected about 12% of queries to a significant degree; this algorithm affects about 3.1% of queries in English to a degree that a regular user might notice. The change affects roughly 3% of queries in languages such as German, Chinese, and Arabic, but the impact is higher in more heavily-spammed languages. For example, 5% of Polish queries change to a degree that a regular user might notice.

We want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites. As always, we’ll keep our ears open for feedback on ways to iterate and improve our ranking algorithms toward that goal.