
Thursday, July 23, 2015

Google Phantom Update Last Week?

Last week I reported that Google said there was no Panda refresh. That was based on several emails I received from webmasters asking if there was a Panda update.
The forums were pretty quiet but my inbox was not. Either way, Google said it wasn't a Panda update and moved on.
Glenn Gabe thinks this was a Phantom tremor. What is Phantom? It is the name Glenn gave the June 16th Google update that Google originally denied and then later confirmed as a core search update.
Glenn says his data suggests there was a tweak to that algorithm in core search, which is why Google didn't confirm it and which is why not many people noticed.
Here are some charts from his client data:
I can tell you again, the forums were very quiet around this update, but my inbox was not, and neither was Glenn's. He did an amazing job documenting the July 14th Google tremors and suspects they are related to what happened on June 14th.
I am not sure, nor can I get this confirmed by Google. But the data is interesting.
Forum discussion at Google+.

Saturday, November 30, 2013

Google Tells Webmaster: You'll Have To Earn Our Trust Again

A Google Webmaster Help thread has the story of a webmaster who is trying to disavow and remove all the bad links pointing to his site.


The interesting question is: even when he does, will it help his site's rankings?
The goal of this webmaster is simply to remove the manual action, but Google's John Mueller tells him he also has algorithmic trust issues.


John said:



looking at your site's history, it looks like you've done quite a bit for quite some time, and it looks like our algorithms have picked up on that too. So while resolving the manual action is a good way to start, you need to keep in mind that it can possibly take quite some time for our algorithms to regain trust in your site even after that.


I see this happening a lot: webmasters aim to remove the manual action and succeed, but then the rankings don't improve. The reason is likely due to algorithmic actions taken on the site.

That being said, it is interesting to see how Google words it.

The algorithms seem to have lost trust over time. The manual action is a "good way to start" but the algorithms need to "regain trust" in the site for there to be an improvement - which may take some time.



By Barry Schwartz: http://www.seroundtable.com/google-trust-time-17743.html

Saturday, October 5, 2013

Penguin 5, With The Penguin 2.1 Spam-Filtering Algorithm, Is Now Live

Google announces the major updates while I'm driving right before the weekend! Seen through Glass...

The fifth confirmed release of Google’s “Penguin” spam fighting algorithm is live. That makes it Penguin 5 by our count. But since this Penguin update is using a slightly improved version of Google’s “Penguin 2” second-generation technology, Google itself is calling it “Penguin 2.1.” Don’t worry. We’ll explain the numbering nonsense below, as well as what this all means for publishers.

New Version Of Penguin Live Today

The head of Google’s web spam team, Matt Cutts, shared the news on Twitter, saying the latest release would impact about 1 percent of all searches:


The link that Cutts points at, by the way, explains what Penguin was when it was first launched. It doesn’t cover anything new or changed with the latest release.

Previous Updates

Here are all the confirmed releases of Penguin to date:
  • Penguin 1 on April 24, 2012 (impacting around 3.1% of queries)
  • Penguin 2 on May 26, 2012 (impacting less than 0.1%)
  • Penguin 3 on October 5, 2012 (impacting around 0.3% of queries)
  • Penguin 4 (AKA Penguin 2.0) on May 22, 2013 (impacting 2.3% of queries)
  • Penguin 5 (AKA Penguin 2.1) on Oct. 4, 2013 (impacting around 1% of queries)

Why Penguin 2.1 AND Penguin 5?

If our talking about Penguin 5 in reference to something Google is calling Penguin 2.1 hurts your head, believe us, it hurts ours, too. But you can pin that blame back on Google. Here’s why.

When Google started releasing its “Panda” algorithm designed to fight low-quality content, it called the first one simply “Panda.” So when the second came out, people referred to that as “Panda 2.” When the third came out, people called that Panda 3 — causing Google to say that the third release, because it was relatively minor, really only should be called Panda 2.1 — the “point” being used to indicate how minor a change it was.

Google eventually — and belatedly — indicated that a Panda 3 release happened, causing the numbering to move into Panda 3.0, Panda 3.1 and so on, until there had been so many “minor” updates that we had to resort to going further out in decimal places, to things like Panda 3.92.
That caused us here at Search Engine Land to decide it would be easier all around if we just numbered any confirmed update sequentially, in order of when they came. No matter how “big” or “small” an update might be, we’d just give it the next number on the list: Penguin 1, Penguin 2, Penguin 3 and so on.

Thanks For The Headache, Google

That worked out fine until Penguin 4, because Google typically didn’t give these updates numbers itself. It just said there was an update, and left it to us or others to attach a number to it.

But when Penguin 4 arrived, Google really wanted to stress that it was using what it deemed to be a major, next-generation change in how Penguin works. So, Google called it Penguin 2, despite all the references to a Penguin 2 already being out there, despite the fact it hadn’t really numbered many of these various updates before.

Today’s update, as can be seen above, has been dubbed Penguin 2.1 — so supposedly, it’s a relatively minor change to the previous Penguin filter that was being used. However, if it’s impacting around 1 percent of queries as Google says, that means it is more significant than what Google might have considered to be similar “minor” updates of Penguin 1.1 and Penguin 1.2.

What Is Penguin Again? And How Do I Deal With It?

For those new to the whole “Penguin” concept, Penguin is a part of Google’s overall search algorithm that periodically looks for sites that are deemed to be spamming Google’s search results but somehow still ranking well. In particular, it goes after sites that may have purchased paid links.

If you were hit by Penguin, you’ll likely know if you see a marked drop in traffic that begins today or tomorrow. To recover, you’ll need to do things like disavow bad links or manually have those removed. Filing a reconsideration request doesn’t help, because Penguin is an automated process. Until it sees that what it considers to be bad has been removed, you don’t recover.
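For reference, the disavow route works through a plain-text file uploaded via Google's Disavow Links tool. In the format Google documents, lines starting with # are comments, lines starting with domain: disavow every link from that domain, and bare URLs disavow individual pages. The domains below are placeholders for illustration, not real examples:

```text
# Links we asked site owners to remove but got no response
domain:spammy-directory.example.com
domain:paid-links.example.net
# A single URL can also be disavowed on its own
http://blog.example.org/2013/03/paid-links-post.html
```

Remember, though, that per the paragraph above, the disavow file only helps Penguin recovery once the algorithm reprocesses the site; a reconsideration request does nothing here.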

If you were previously hit by Penguin and have taken actions hopefully meant to fix that, today and tomorrow are the days to watch. If you see an improvement in traffic, that’s a sign that you’ve escaped Penguin.
Here are previous articles with more on Penguin recovery and how it and other filters work as part of the ranking system.

What About Hummingbird?

If you’re wondering how Penguin fits into the new Google Hummingbird algorithm you may have heard about, think of Penguin as a part of Hummingbird, not as a replacement for it.

Hummingbird is like Google’s entire ranking engine, whereas Penguin is like a small part of that engine, a filter that is removed and periodically replaced with what Google considers to be a better filter to help keep out bad stuff.

To understand more about that relationship and Hummingbird in general, see our post below:

About The Author: Danny Sullivan is a Founding Editor of Search Engine Land. He’s a widely cited authority on search engines and search marketing issues who has covered the space since 1996. Danny also serves as Chief Content Officer for Third Door Media, which publishes Search Engine Land and produces the SMX: Search Marketing Expo conference series. He has a personal blog called Daggle (and keeps his disclosures page there). He can be found on Facebook, Google+ and microblogs on Twitter as @dannysullivan.


Google Webmaster Central Blog - Official news on crawling and indexing sites for the Google index

Another step to reward high-quality sites

Tuesday, April 24, 2012 at 2:45 PM

Webmaster level: All

Google has said before that search engine optimization, or SEO, can be positive and constructive—and we're not the only ones. Effective search engine optimization can make a site more crawlable and make individual pages more accessible and easier to find. Search engine optimization includes things as simple as keyword research to ensure that the right words are on the page, not just industry jargon that normal people will never type.

“White hat” search engine optimizers often improve the usability of a site, help create great content, or make sites faster, which is good for both users and search engines. Good search engine optimization can also mean good marketing: thinking about creative ways to make a site more compelling, which can help with search engines as well as social media. The net result of making a great site is often greater awareness of that site on the web, which can translate into more people linking to or visiting a site.

The opposite of “white hat” SEO is something called “black hat webspam” (we say “webspam” to distinguish it from email spam). In the pursuit of higher rankings or traffic, a few sites use techniques that don’t benefit users, where the intent is to look for shortcuts or loopholes that would rank pages higher than they deserve to be ranked. We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings.

The goal of many of our ranking changes is to help searchers find sites that provide a great user experience and fulfill their information needs. We also want the “good guys” making great sites for users, not just algorithms, to see their effort rewarded. To that end we’ve launched Panda changes that successfully returned higher-quality sites in search results. And earlier this year we launched a page layout algorithm that reduces rankings for sites that don’t make much content available “above the fold.”

In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines. We’ve always targeted webspam in our rankings, and this algorithm represents another improvement in our efforts to reduce webspam and promote high quality content. While we can't divulge specific signals because we don't want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics.

Here’s an example of a webspam tactic like keyword stuffing taken from a site that will be affected by this change: 


Of course, most sites affected by this change aren’t so blatant. Here’s an example of a site with unusual linking patterns that is also affected by this change. Notice that if you try to read the text aloud you’ll discover that the outgoing links are completely unrelated to the actual content, and in fact the page text has been “spun” beyond recognition: 


Sites affected by this change might not be easily recognizable as spamming without deep analysis or expertise, but the common thread is that these sites are doing much more than white hat SEO; we believe they are engaging in webspam tactics to manipulate search engine rankings.

The change will go live for all languages at the same time. For context, the initial Panda change affected about 12% of queries to a significant degree; this algorithm affects about 3.1% of queries in English to a degree that a regular user might notice. The change affects roughly 3% of queries in languages such as German, Chinese, and Arabic, but the impact is higher in more heavily-spammed languages. For example, 5% of Polish queries change to a degree that a regular user might notice.

We want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites. As always, we’ll keep our ears open for feedback on ways to iterate and improve our ranking algorithms toward that goal.


Saturday, September 28, 2013

Google officially announced the latest, and biggest, algorithm update, Hummingbird ...


Hummingbird

Google has a new search algorithm, the system it uses to sort through all the information it has when you search and come back with answers. It’s called “Hummingbird,” and below is what we know about it so far.

What’s a “search algorithm?”
That’s a technical term for what you can think of as a recipe that Google uses to sort through the billions of web pages and other information it has, in order to return what it believes are the best answers.
What’s “Hummingbird?”
It’s the name of the new search algorithm that Google is using, one that Google says should return better results.
So that “PageRank” algorithm is dead?
No. PageRank is one of over 200 major “ingredients” that go into the Hummingbird recipe. Hummingbird looks at PageRank — how important links to a page are deemed to be — along with other factors like whether Google believes a page is of good quality, the words used on it and many other things (see our Periodic Table Of SEO Success Factors for a better sense of some of these).
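The PageRank idea itself is public (it dates back to Brin and Page's original paper), and its core iteration can be sketched in a few lines of Python. This is a toy illustration on a made-up three-page graph, not Google's production implementation:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank scores for a link graph.

    links maps each page to the list of pages it links to.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform rank
    for _ in range(iterations):
        # Base rank every page gets regardless of links
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # Each outgoing link passes on an equal share of this page's rank
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:
                # Dangling page: spread its rank evenly across all pages
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank

# Hypothetical three-page web: both a and b link to c, so c ranks highest.
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

The point of the sketch is the one made above: a link is treated as a vote of importance, and PageRank is just one ingredient Hummingbird weighs alongside many others.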
Why is it called Hummingbird?
Google told us the name comes from being “precise and fast.”
When did Hummingbird start? Today?
Google started using Hummingbird about a month ago, it said. Google only announced the change today.
What does it mean that Hummingbird is now being used?
Think of a car built in the 1950s. It might have a great engine, but it might also be an engine that lacks things like fuel injection or be unable to use unleaded fuel. When Google switched to Hummingbird, it’s as if it dropped the old engine out of a car and put in a new one. It also did this so quickly that no one really noticed the switch.
When’s the last time Google replaced its algorithm this way?
Google struggled to recall when any type of major change like this last happened. In 2010, the “Caffeine Update” was a huge change. But that was also a change mostly meant to help Google better gather information (indexing) rather than sorting through the information. Google search chief Amit Singhal told me that perhaps 2001, when he first joined the company, was the last time the algorithm was so dramatically rewritten.
What about all these Penguin, Panda and other “updates” — haven’t those been changes to the algorithm?
Panda, Penguin and other updates were changes to parts of the old algorithm, but not an entire replacement of the whole. Think of it again like an engine. Those things were as if the engine received a new oil filter or had an improved pump put in. Hummingbird is a brand-new engine, though it continues to use some of the same parts of the old, like Penguin and Panda.
The new engine is using old parts?
Yes. And no. Some of the parts are perfectly good, so there was no reason to toss them out. Other parts are constantly being replaced. In general, Hummingbird — Google says — is a new engine built on both existing and new parts, organized in a way to especially serve the search demands of today, rather than one created for the needs of ten years ago, with the technologies back then.
What type of “new” search activity does Hummingbird help?
“Conversational search” is one of the biggest examples Google gave. People, when speaking searches, may find it more useful to have a conversation.
“What’s the closest place to buy the iPhone 5s to my home?” A traditional search engine might focus on finding matches for words — finding a page that says “buy” and “iPhone 5s,” for example.
Hummingbird should better focus on the meaning behind the words. It may better understand the actual location of your home, if you’ve shared that with Google. It might understand that “place” means you want a brick-and-mortar store. It might get that “iPhone 5s” is a particular type of electronic device carried by certain stores. Knowing all these meanings may help Google go beyond just finding pages with matching words.
In particular, Google said that Hummingbird is paying more attention to each word in a query, ensuring that the whole query — the whole sentence or conversation or meaning — is taken into account, rather than particular words. The goal is that pages matching the meaning do better, rather than pages matching just a few words.
I thought Google did this conversational search stuff already!
It does (see Google’s Impressive “Conversational Search” Goes Live On Chrome), but it had only been doing it really within its Knowledge Graph answers. Hummingbird is designed to apply the meaning technology to billions of pages from across the web, in addition to Knowledge Graph facts, which may bring back better results.
Does it really work? Any before-and-afters?
We don’t know. There’s no way to do a “before-and-after” ourselves now. Pretty much, we only have Google’s word that Hummingbird is improving things. However, Google did offer some before-and-after examples of its own that it says show Hummingbird improvements.
A search for “acid reflux prescription” used to list a lot of drugs (such as this, Google said), which might not necessarily be the best way to treat the disease. Now, Google says results have information about treatment in general, including whether you even need drugs, such as this as one of the listings.
A search for “pay your bills through citizens bank and trust bank” used to bring up the homepage for Citizens Bank but now should return the specific page about paying bills.
A search for “pizza hut calories per slice” used to list an answer like this, Google said, but not one from Pizza Hut. Now, it lists this answer directly from Pizza Hut itself, Google says.
Could it be making Google worse?
Almost certainly not. While we can’t say that Google’s gotten better, we do know that Hummingbird — if it has indeed been used for the past month — hasn’t sparked any wave of consumers complaining that Google’s results suddenly got bad. People complain when things get worse; they generally don’t notice when things improve.
Does this mean SEO is dead?
No, SEO is not yet again dead. In fact, Google’s saying there’s nothing new or different SEOs or publishers need to worry about. Guidance remains the same, it says: have original, high-quality content. Signals that have been important in the past remain important; Hummingbird just allows Google to process them in new and hopefully better ways.
Does this mean I’m going to lose traffic from Google?
If you haven’t in the past month, well, you came through Hummingbird unscathed. After all, it went live about a month ago. If you were going to have problems with it, you would have known by now.
By and large, there’s been no major outcry among publishers that they’ve lost rankings. This seems to support Google saying this is very much a query-by-query effect, one that may improve particular searches — particularly complex ones — rather than something that hits “head” terms that can, in turn, cause major traffic shifts.
But I did lose traffic!
Perhaps it was due to Hummingbird, but Google stressed that it could also be due to some of the other parts of its algorithm, which are always being changed, tweaked or improved. There’s no way to know.
How do you know all this stuff?
Google shared some of it at its press event today, and then I talked with two of Google’s top search execs, Amit Singhal and Ben Gomes, after the event for more details. I also hope to do a more formal look at the changes from those conversations in the near future. But for now, hopefully you’ve found this quick FAQ based on those conversations to be helpful.
By the way, another term for the “meaning” connections that Hummingbird does is “entity search,” and we have an entire panel on that at our SMX East search marketing show in New York City, next week. The Coming “Entity Search” Revolution session is part of an entire “Semantic Search” track that also gets into ways search engines are discovering meanings behind words. Learn more about the track and the entire show on the agenda page.




Friday, September 27, 2013

Google Now Auto-Merging Google+ Pages Into Google Places Dashboard Listings

First spotted on Linda Buquet’s forum, where Google announced that it has started auto-merging G+ social functionality into basic (upgraded) dashboard listings. Here is the Google announcement (bold is mine):

Starting today, some pages managed in the new Google Places for Business dashboard will be automatically upgraded to have social features. We will send out emails to users whose pages are automatically upgraded letting them know. Users who have upgraded pages will see a link to Visit your Google+ page in their dashboards. A personal Google+ account is not necessary in order to utilize social features on local Google+ pages that are automatically upgraded.



If the listing for your business is not automatically upgraded and you are interested in social features, you may be able to use the Google+ widget to upgrade the page manually. (You can read more about the Google+ widget in the update from April 11 on this post — scroll up.)

Please first make sure you follow these criteria:


1 - You must have verified your business in your Places account.
2 - Your Places for Business email address should also have a Google+ profile.
3 - Your page must be in a category that is eligible for Google+.


If these apply to you, you will see a Google+ widget in your dashboard inviting you to upgrade. Simply click Get your Google+ page to upgrade. This will create a local Google+ page in Google+ that is tied to your Google+ account. You will be able to update this page from both Google Places for Business and Google+.


If you do not see the Google+ widget yet, or don’t have the upgrade link in your widget, sit tight while we work on getting a smooth upgrade process in place for you.


To clarify Google’s somewhat imprecise communication: Google is saying that if you wait, and just have a generic Google email or corporate email but not a G+ account, your dashboard will be upgraded automatically to be able to have a social presence and video capabilities. My understanding is that if you don’t post any social content to your stream, then your listing will continue to not show the posts tab, and likewise with videos.


If you want to have a social presence for your business before that new capability hits your account, you can initiate the upgrade from within the new dashboard if your login email for the dashboard is already a Google+ account.


The bottom line is that if you sit and wait your new Places for Business Dashboard will bring all of the social and video features of Plus to your business without the need for an individual to have a Plus persona. You can continue to use a generic or corporate email address to manage the listings.


This is obviously a second, continuing step in creating an integrated system where all listing management can occur from within the Places Dashboard and where a business will have the ability to manage the whole system as a branded entity rather than as an individual, an obvious necessity for large businesses as well as small.


While the listing management picture is clearing up, there are still some questions around how the bulk upload feature set will be integrated into this picture, and how a single brand with many locations will be accommodated so as not to need to produce social streams per location. Hopefully the wait will not be interminable, but this change dramatically simplifies management of listings for both agencies and a range of businesses that struggle with arbitrarily putting one individual face forward as a claimant of the brand.


Thursday, July 18, 2013

Google's Cutts: Be Careful Linking Many Sites Together

Yesterday, Google's Matt Cutts posted a video answer to the question "If I have 20 domains, should I link them all together?"
The short answer is, most likely no - you should not link them all together.
As I explained at Search Engine Land in my article titled Google's Matt Cutts: Linking 20 Domains Together Likely A "Cross Linking Scheme", it may be considered by Google as a cross-linking scheme - at least those are the words used by Matt Cutts.
Here is the video:
As you can see, the overall theme and feeling you get from Matt is that it is typically a bad idea.
So the next question I will hear is: what about linking 18 sites, or 15, or 10, or 5 sites together? I love those questions. Those asking them are linking the sites for one purpose: ranking.

Thursday, May 9, 2013

A Google Update Is Happening (Google: Nothing To Announce Now)


An ongoing WebmasterWorld thread shows a huge uptick in chatter around major ranking and search result fluctuation overnight. It seems from this, and from all the complaints in the Google Webmaster Help forums, that there is indeed some sort of update going on.
Is it Penguin, Panda, EMD, page layout or something else - or is it a widespread manual action or Google going after and devaluing a major link network? I do not know. But it does seem something has happened, causing tons of webmasters and SEOs to take to the forums to complain.
This typically happens days after I see an update brewing, as I reported on Tuesday. It does seem like something is indeed rolling out, and hopefully you guys benefited from it.
SERPs.com, SERPmetrics and MozCast have all shown higher-than-normal Google fluctuation activity over the past few days as well.
Here are some comments from the WebmasterWorld thread over night:
Sure fire sign of a major update...
Seeing GIGANTIC drops this morning, woke up to 200 visitors over night, should be around 1200 by now. Server is fine. Europe appears to be asleep
Plus, as I said, there is a huge number of complaints from individual webmasters in the Google Webmaster Help forums.
So there seems to be a Google update happening. I will ping Google and see if I can get anything on the record. Stay tuned...
Forum discussion at WebmasterWorld.
Update: A Google spokesperson gave me a generic non-statement that reads:
We have nothing to announce at this time. We make over 500 changes to our algorithms a year, so there will always be fluctuations in our rankings in addition to normal crawling and indexing.

Wednesday, May 8, 2013

Google Update Brewing? May 2013


There are some very early signs of a possible Google update brewing as of early this morning. A WebmasterWorld thread has some renewed chatter around an update.

Note, most of the WebmasterWorld thread is about April 15th changes, which people say have to do with the Boston bombings and seasonal traffic changes. But last night, early this morning, two webmasters came in and said they saw major shifts in rankings and traffic.

A preferred WebmasterWorld member said he saw a 77% drop just yesterday. Another said "something big is underway" after noting drops in his keyword rankings.

SERPs.com reports pretty significant changes in the Google results on Monday. SERPMetrics.com shows very little change in the search results. MozCast has not yet updated with results from Monday, but it showed changes on Sunday, which seem off.

It is very early and nothing is confirmed - but there may be signs of a possible Google update. What exactly it is remains unknown and unconfirmed.
Forum discussion at WebmasterWorld.

Tuesday, March 12, 2013

Google’s Matt Cutts On Upcoming Penguin, Panda & Link Networks Updates


Google’s head of search spam, Matt Cutts, announced new updates to Google’s Penguin and Panda algorithms, and new link network targets, in 2013. Matt announced this during the SMX West panel, The Search Police.


Significant Penguin Update

Matt said that there will be a large Penguin update in 2013 that he thinks will be one of the more talked about Google algorithm updates this year. Google’s search quality team is working on a major update to the Penguin algorithm, which Cutts called very significant.

The last Penguin update we have on record was Penguin 3 in October 2012. Before that, we had Penguin 2 in May 2012 and the initial release in April.

So, expect a major Penguin release that may send ripples through the SEO industry this year.


A Panda Update Coming This Friday Or Monday

Matt also announced there will be a Panda algorithm update this coming Friday (March 15th) or Monday (March 18th). The last Panda update was version 24 on January 22nd, which is one of the longer spans of time between Panda refreshes we’ve seen in a long time.


Another Link Network Targeted

Matt Cutts confirmed that Google targeted a link network a couple weeks ago, and said Google will go after more in 2013. In fact, Matt said that they will release another update in the next week or two that specifically targets another large link network.


About The Author: Barry Schwartz is Search Engine Land's News Editor and owns RustyBrick, a NY based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on very advanced SEM topics. Barry's personal blog is named Cartoon Barry and he can be followed on Twitter here. For more background information on Barry, see his full bio over here.

Google's Cutts: Next Generation Penguin Update Will Be Big



Last night, Google's head of search spam, Matt Cutts, announced at SMX West (which I live blogged and reported at Search Engine Land) that a "next generation" of Penguin is coming in 2013.

This one should be big. I specifically asked: what will SEOs be talking about in 2013? What will be the next big Google algorithmic change, the talk of 2013 amongst SEOs and webmasters?
Matt did not mention the merchant quality algorithm, but he did specifically say it might be the next generation of the Penguin update.
Matt said his team is currently working on it and that this will be a big change to how Penguin works. So when it is released sometime in 2013 - I assume sooner rather than later - it will send some ripples through the SEO space.
This would technically be named Penguin 4. The last official Penguin release was Penguin 3 on October 5, 2012, over 5 months ago. In fact, we have had only two updates to Penguin since its original release on April 24, 2012.
When will Penguin 4 happen? Again, Matt did not say, but if I had to guess, sometime in Q2 2013.
Forum discussion at Google+ and WebmasterWorld.