Thursday 19 December 2013

Google's Matt Cutts: A Little Duplicate Content Won't Hurt Your Rankings

Duplicate content is always a concern for webmasters. Whether it's one website stealing content from another, or a site that hasn't taken an active role in ensuring its content is unique and high quality, getting filtered out of the Google index over duplication is a problem.
In the latest webmaster help video from Google's Matt Cutts, he addresses how Google handles duplicate content, and when it can negatively impact your search rankings.
Cutts started by explaining what duplicate content is and why duplicate content isn't always a problem, especially when it comes to quoting parts of other web pages.
It's important to realize that if you look at content on the web, something like 25 or 30 percent of all of the web's content is duplicate content. … People will quote a paragraph of a blog and then link to the blog, that sort of thing. So it's not the case that every single time there's duplicate content it's spam, and if we made that assumption the changes that happened as a result would end up probably hurting our search quality rather than helping our search quality.
For several years, Google's stance has been that it tries to find the originating source and give that result top billing, so to speak. After all, Google doesn't want to serve up masses of identical pages to a searcher: it makes for a poor user experience if someone clicks on one page, doesn't find what they're looking for, and then goes back and clicks the next result only to discover an identical page, merely on a different site.
Google looks for duplicate content and where we can find it, we often try to group it all together and treat it as if it's just one piece of content. So most of the time, suppose we're starting to return a set of search results and we've got two pages that are actually kind of identical. Typically we would say, "OK, rather than show both of those pages since they're duplicates, let's just show one of those pages and we'll crowd the other result out," and then if you get to the bottom of the search results and you really want to do an exhaustive search, you can change the filtering so that you can say, "OK, I want to see every single page" and then you'd see that other page. But for the most part, duplicate content isn't really treated as spam. It's just treated as something we need to cluster appropriately and we need to make sure that it ranks correctly, but duplicate content does happen.
Next, Cutts tackled the issue of when duplicate content is spam, such as websites that have scraped content off the original sites, or sites that republish masses of "free articles" which appear on countless other websites. These types of sites have the biggest problem with duplicate content, because they merely copy content created on other websites.
It's certainly the case that if you do nothing but duplicate content, and you are doing it in an abusive, deceptive, malicious, or a manipulative way, we do reserve the right to take action on spam. So someone on Twitter was asking a question about "how can I do an RSS auto blog to a blog site and not have that be viewed as spam," and the problem is that if you are automatically generating stuff that is coming from nothing but an RSS feed, you're not adding a lot of value, so that duplicate content might be a little bit more likely to be viewed as spam.
There are also cases where businesses might legitimately end up with duplicate content that won't necessarily be viewed as spam. In some cases, websites end up with duplicate content for usability reasons rather than SEO. For the most part, those websites shouldn't worry either.
But if you're just making a regular website and you're worried about whether you'd have something on the .com and the .co.uk, or you might have two versions of your terms and conditions, an older version and a newer version, that sort of duplicate content happens all the time on the web and I really wouldn't get stressed out about the notion that you might have a little bit of duplicate content.
Cutts does caution against local directory types of websites that list masses of cities but serve up empty listings with no true content about what the user might be looking for, as well as sites that create individual pages for every neighborhood they serve, even though the content is the same as what's on the main city page.
As long as you're not trying to massively copy for every city in every state in the entire United States, show the same boilerplate text which is, "no dentists found in this city either," for the most part you should be in very good shape and not have anything to worry about.
Bottom line: as long as your duplicate content is there for legitimate reasons (e.g., you're quoting another website or you have things like two versions of terms and conditions), you really shouldn't be concerned about duplicate content. However, Google certainly can and will take action against sites utilizing duplicate content in a spammy fashion, because they aren't adding value to the search results.

Wednesday 18 December 2013

Google Squashes Backlinks.com, Another Link Network Outed By Google’s Matt Cutts

Matt Cutts, Google’s head of search spam, announced on Twitter that Google has gone after another link network, this one named Backlinks.com.
As with Anglo Rank, the link network Google outed the week prior, Cutts took a line from the network’s marketing material and then said “Au contraire!” This is the way Cutts tells the SEO industry that Google is actively going after link networks, and that SEOs should stop participating in them.
Here is Matt’s tweet about Backlinks.com:
"Our installation code/software used to publish the sold links is not detectable by the search engine bots." Au contraire!
Here is the tweet the week before on Anglo Rank:
"There are absolutely NO footprints linking the websites together" Oh, Anglo Rank.
Will there be a tweet late on Friday this week about another link network?

Anglo Rank Rebuilding?

Search Engine Roundtable reports that the owner of Anglo Rank, the link network Google penalized the week before, has decided to rebuild the network and start again. This is said to be how black hats operate: they build sites that they expect will get penalized, then do it all over again after the penalty. Is it recommended? No. You can’t build a long-term business around this cat-and-mouse game. But some like to live life on the edge.
Over the past year or so, Google has been going after link networks with greater speed. Here are some of the reports we have on those stories:

Nelson Mandela Ranks #1 On Google’s Top Trending Searches For 2013

Google has published its top ten global trending searches of 2013, with the recently deceased South African leader Nelson Mandela ranking No. 1 among this year’s top global searches:
It’s perhaps unsurprising that the #1 trending search of 2013 was an international symbol of strength and peace: Nelson Mandela. Global search interest in the former President of South Africa was already high this year, and after his passing, people from around the world turned to Google to learn more about Madiba and his legacy.
Unlike Yahoo’s celebrity-focused list of top searches, or Bing’s collection of top searches that surprisingly lacked any mention of Nelson Mandela, Google’s top global searches included a variety of topics, from celebrities to tech gadgets and world affairs.

Google’s Top 10 Trending Global Searches of 2013:

  1. Nelson Mandela
  2. Paul Walker
  3. iPhone 5s
  4. Cory Monteith
  5. Harlem Shake
  6. Boston Marathon
  7. Royal Baby
  8. Samsung Galaxy s4
  9. PlayStation 4
  10. North Korea
Along with its top global trending searches, Google also published its annual “Year-End Zeitgeist” page, listing more than 1,000 top ten search lists. Google claims this year represented the most “global Zeitgeist” to date with top searches from 72 countries.
Top searches of 2013 included:
  • Most searched celebrity pregnancies – Kim Kardashian
  • Most searched deaths – Paul Walker
  • Most searched Fortune 500 – Google
  • Most searched movies – Man of Steel
  • Most searched MLB Player – Alex Rodriguez
  • Most searched NBA Player – LeBron James
  • Most searched TV Show – Breaking Bad
According to Google, the most often searched “What is..?” question asked by users was “What is twerking?”, which aligned perfectly with the site’s most searched person of the year, Miley Cyrus.
Google wrapped up its year with a video spotlighting the most popular people, places and events of 2013:

Tuesday 17 December 2013

7 Breaking SEO News Updates That Will Take Your Business To New Heights

Nothing ever stays the same for long in the world of SEO. The experts at Moz report that Google typically updates its algorithm between 500 and 600 times each year! While most of these tweaks go unnoticed, we’ve recently weathered an abundance of rewrites and changes that could affect how your company approaches its content marketing strategy. To help you stay informed about the ever-changing inbound marketing landscape, here are 7 pieces of critically important SEO news:

1. Google is Tough on Repeat Offenders

One of the most fascinating and little-known areas of SEO news is Google penalties. When websites break Google’s webmaster guidelines with outdated tactics like buying links, they’re typically caught. The search engine responds with an official spam warning, and may eventually ban some websites from appearing in search results. Recovering from these issues can take months of hard work.
Google’s head spam fighter, Matt Cutts, recently revealed in a Q&A session that it’s much harder to come back and rank well after a second or third penalty. In fact, his recommendation for websites trying to improve their SEO after past use of purchased links is to use the disavow tool to wipe their backlinks completely. If you’re unaffected, take this as evidence that it’s crucial to pay attention to SEO news and avoid breaking any rules.
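For context, the disavow tool accepts a plain text file listing the links you want Google to ignore. A minimal sketch of that file format, using made-up domains:

    # disavow.txt: uploaded through the disavow tool in Google Webmaster Tools.
    # Lines starting with "#" are comments and are ignored by Google.

    # Disavow one specific page that links to your site:
    http://spammy-directory.example.com/paid-links.html

    # Disavow every link from an entire domain:
    domain:link-network.example.com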

2. Rich Snippets Could Be Rolled Out Soon

The author photos that appear next to search results once you’ve earned Google authorship are a form of rich snippet. However, it appears the world’s biggest search engine is considering leveling the playing field. SEO news reporter Matt Southern shared that Google is currently testing the idea of including embedded images in search results, visible with certain searches:


[Screenshot: Google testing image rich snippets in search results]



If you haven’t started enhancing each of your website pages with images, this SEO news indicates it’s probably wise to make that your next project.

3. In-Depth Results aren’t Going Anywhere

Back when Google’s Hummingbird algorithm rewrite launched, the search engine announced it would be sharing in-depth search results for around 10% of queries that may require more complex answers. Recent SEO news indicates that the in-depth articles initiative continues, and that every webmaster has an opportunity to have their content featured among these top results:


[Screenshot: Google in-depth articles in search results]



While many of the websites featured in in-depth results have extraordinarily high site authority, the Google webmaster blog announced that you can improve your chances of being featured with the following tactics (see the sketch after this list):
  • Use schema.org “article” markup,
  • Apply for Google authorship and include the markup,
  • Use rel=next and rel=prev for paginated articles (also watch out for common rel=canonical mistakes),
  • Provide information about your organization’s logo,
  • Create compelling in-depth content.
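To make those recommendations concrete, here is a minimal sketch of what several of them might look like on page 2 of a hypothetical three-page article; the site, URLs, and author profile are placeholders, not a template Google itself publishes:

    <head>
      <title>The Complete Guide to Widgets, Part 2</title>
      <!-- Authorship: points Google at the author's Google+ profile -->
      <link rel="author" href="https://plus.google.com/+ExampleAuthor">
      <!-- Pagination: this is page 2, so prev points at page 1 and next at page 3 -->
      <link rel="prev" href="http://example.com/widget-guide?page=1">
      <link rel="next" href="http://example.com/widget-guide?page=3">
    </head>
    <body>
      <!-- schema.org "Article" markup, expressed as microdata -->
      <article itemscope itemtype="http://schema.org/Article">
        <h1 itemprop="headline">The Complete Guide to Widgets</h1>
        <div itemprop="articleBody">...the in-depth content itself...</div>
      </article>
      <!-- Organization logo information, also via schema.org microdata -->
      <div itemscope itemtype="http://schema.org/Organization">
        <a itemprop="url" href="http://example.com/">Example Publisher</a>
        <img itemprop="logo" src="http://example.com/logo.png" alt="Example Publisher logo">
      </div>
    </body>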

4. Google to Index App Content

If you haven’t yet built an app for your business, there may be even more incentive to get started now. Some of Google’s latest SEO news is that content from Android apps will soon be indexed like regular web pages. This concept is illustrated below using real estate website Trulia:


[Screenshot: Trulia app content indexed in Google search results]



It isn’t easy to set this up: it requires advanced SEO capabilities to edit your sitemap file, plus a working knowledge of Google Webmaster Tools. However, once it’s up and running, you’ll be able to provide a seamless mobile web experience to your customers. This is undoubtedly mobile marketing at its best!
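For a sense of what that setup involves, here is a minimal sketch of the sitemap annotation format Google described for app indexing, connecting a web page to the matching screen in an Android app; the package name and URLs here are hypothetical:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <url>
        <loc>http://example.com/listings/123-main-st</loc>
        <!-- Deep link into the app; "com.example.android" is a made-up package name -->
        <xhtml:link rel="alternate"
                    href="android-app://com.example.android/http/example.com/listings/123-main-st" />
      </url>
    </urlset>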

5. Google PageRank is Now Up-to-Date

In perhaps the most shocking piece of SEO news in months, Google PageRank was quietly updated on December 6, 2013. Prior to 2012, the toolbar was updated on a quarterly basis, giving SEOs and content marketers regular access to their site’s authority in the eyes of Google. However, this year it wasn’t updated once, and Cutts stated it probably wouldn’t be, causing many to speculate that PageRank would become secret data, like keyword data.
The updates are now live, and you can scope out how your site’s authority has changed in the past year. Regardless of when the next update occurs, you can rest assured that your site’s authority is updated in the eyes of Google on an hourly basis — even if you can’t see it.

6. Semantic Search is on the Rise

Anyone who’s been using web technologies for more than a decade remembers Boolean search operators. Early search engines weren’t smart enough to pick up on things like plural words, and you had to join your queries together with specific operators like “and”, “or”, and “not”. However, the deployment of Google’s Hummingbird, Bing’s Satori, and Facebook’s Graph Search is clear evidence that search engines are getting much smarter at picking up on the variation behind phrasing choices, a concept known as semantic search.
SEO news expert Colin Jeavons writes that search engines’ increasing ability to use natural language processing has already transformed SEO from a highly rigid practice into a science that blends quality content, social signals, and optimization efforts. If your primary SEO tactic isn’t quality content over keywords, it’s time to make the switch.

7. Google Penalizes Excessive Linking

Cutts recently addressed the long-standing belief that you should never exceed 100 links per page on your website. Turns out, it’s still a wise best practice. In the early days of search, major engines had trouble indexing content with more than 100 links. While the capability is now there, excessive linking can be a pretty serious red flag that someone’s being spammy. Cutts recommends that content marketers stick to a “reasonable number” of links. If you hit 102 or 110 links on a particularly long and research-heavy piece of content, you’re probably in the clear. If your links begin to interfere with the readability of your content, it may be time to take a step back.

Monday 9 December 2013

Google Busts Yet Another Link Network: Anglo Rank

Google’s head of search spam, Matt Cutts, just confirmed on Twitter that Google has targeted another “private link network” – this one is named Anglo Rank.

Matt’s tweet was pretty direct. He wrote:
“There are absolutely NO footprints linking the websites together” Oh, Anglo Rank.
That is a quote directly from Anglo Rank’s marketing material, and a dig from Cutts suggesting that indeed, Google was able to spot sites in the network.
In response, Search Engine Land’s editor-in-chief Matt McGee suggested on Twitter that those in the network were likely to find that it was “torched.” Cutts responded by saying “messages can take a few days to show up in [Google Webmaster Tools], so timing of when to post can be tricky to predict.”
In other words — yes, Cutts confirmed that Anglo Rank was penalized, and that those involved with it were getting penalty notifications, and since those were finally starting to appear in Google Webmaster Tools, Cutts felt it was OK to finally go more public with a tweet.
Over the past year or so, Google has been going after link networks with greater speed. Here are some of the reports we have on those stories:
Cutts did say that Anglo Rank is not the only link network targeted in this effort. He responded to me that Google has “been rolling up a few” link networks in this specific action.
So if you get a message in Google Webmaster Tools about paid links in the next day or so, it may be related to this update.

Friday 6 December 2013

The Test Begins: Do Google Shopping & Other Shopping Search Engines Give You The Best Deals?

It sounds pretty damning. Two recent surveys suggest that Google Shopping isn’t leading searchers to the best prices on products. But the surveys weren’t well documented, nor did they include competitors like Bing, Shopzilla, PriceGrabber and Nextag, which have similar issues. So, Search Engine Land will be running its own fully documented tests. Every few days, we’ll search for an item and show what we found, starting with today’s test, a search for a toaster.

How Much Is That Toaster In The Shopping Search Window?

This is a long story. It’s designed to be that way, so that people can understand exactly what was tested and the reasoning for selecting the prices to compare, both of which are lacking in the two other surveys. This type of detail is important, especially when lobbing accusations about potentially misleading consumers to the US Federal Trade Commission, as Consumer Watchdog has done.
Take the time to read the full story if you want to understand just how very complicated it is to conduct a survey of this nature. But for those who want a quick summary now, the first test shows that Google actually ranked best in featuring a price that matched the lowest price that could be found from a major merchant:
[Table: first-test results comparing the shopping search engines]
The survey also found that all the shopping search engines have issues in terms of disclosing to consumers why they rank results in the way they do. Several have some serious issues with pricing inaccuracies.
Remember, a single test isn’t enough to draw a conclusion across the entire industry. I’m not sure that running six tests, as the Financial Times did, is enough, nor the 14 that Consumer Watchdog ran. But far more important, how you test and what you count can produce skew that can mess up even a large sample.


Why look for a toaster? That’s one of several items where Consumer Watchdog said its survey found a $20 price gap between what was listed on Google and what was listed as the lowest price for the same item at Shopzilla, PriceGrabber or Nextag, when it looked for “Cuisinart Classic 2P Slice Toaster.”
Bing wasn’t surveyed, even though it has shopping search ads exactly like Google’s. Nor did Consumer Watchdog explain what exact query was used, or list a model number for the toaster. Nothing was documented. We were only given the end result, and when it comes to comparing shopping search engines, as you’ll see, it’s essential to show your work.
We can’t search for the exact same toaster as Consumer Watchdog did, because we don’t know the model. Plus, even if we did know it, looking for it might cause some to assume Google’s now “fixed” the toaster price problem for that particular model. So, for our test, we’re seeking a Hamilton Beach 2-Slice Metal Toaster, model 22504. Why? It’s one of the best selling toasters at Amazon, so it seems a popular model worth testing.

Which Price Do You Compare To?

At Google, the search began by entering “Hamilton Beach 2-Slice Metal Toaster” into the main search box. That brought back a special area of the search page with shopping search results, as the arrow points to below:
The results in the box are drawn from Google Shopping, which controversially shifted to an all-ad model last year. Only people who advertise are included in Google Shopping listings. The same is true of all the other shopping search engines in today’s test, by the way. Google isn’t some odd exception.
Already, you can see the challenge in deciding how to survey if the move to all-ads is causing lower prices not to be found. Which price in the results do you count? The $26.99 one from Sears, because it’s first? The $24.28 one from Walmart, because it’s lowest? The $34.99 one from JCPenney, because it’s highest? And are these even all the same model? One’s a different color than the others.
Consumer Watchdog’s survey method was to pick one result out of this box and compare it to the lowest price for a particular item that it could find on another shopping search engine, where it may have “drilled down” into the results in order to find that better price. Or perhaps it didn’t. The survey doesn’t make this clear.
The Financial Times, which did its own survey, instead compared prices listed within these results to what searchers might find at Google itself, if they know how to drill-down to get more listings. Again, which price from the results box was used isn’t clear, which matters when you can’t tell easily if you’re comparing the exact same product.

Four Searches To Verify The Lowest Price At Google

Our survey will use what we think is the cheapest price shown out of the results box for the same exact product, as best we can tell. We’ll also compare to what we’ll call the featured merchant price and finally to the best price from any well-known merchant listed.
To do all this, it’s time to “drill-down” into Google’s shopping results. We already did one initial search, and now we’ll do more in the hunt for the best price. The second search happens by clicking on that “Shop for Hamilton Beach….” link the arrow points to in the screenshot above. Doing that brings up this:
That click brought up a list of various items Google thinks match the search, drawn from Google Shopping; the results are still all ads. You just get more of them.
The first listing is for the model we’re after, and clicking on it made the result get larger, with Walmart as the featured merchant offering it at $24.28, before tax and shipping.
Why’s Walmart featured? No idea — nor does the typical consumer know why. It might be that Walmart is paying more. It might be that Google’s complicated system of showing ads based on advertiser “quality score” gives Walmart a bump. It’s not because Walmart has the cheapest price of the four merchants shown in that box; in other searches, I’ve seen the higher-priced merchant get featured anyway.
You can drill down further by clicking on the “Compare prices from 50+ stores” link in the box. That constitutes a third search, and doing so brings up this list:
On that list, Walmart still has the lowest base price. But we’re still not being shown all the listings, only the first ten, ranked by whatever mystery criteria Google is using. If you want the lowest price, you have to effectively do a fourth search by clicking on the “Base Price” column heading, which re-sorts the listings like this:
If you do that, you’ll discover that a merchant called “Unopened Savings” is offering the low, low price of $12.27 for the toaster. But as this is an unrated merchant, a merchant with no recognizable brand, we’re not going to count that for this test. Instead, Walmart still hangs in there with the lowest price.
That’s also a lesson in why it makes sense that Google doesn’t just rank merchants by lowest price or even highest customer ratings. Using only those criteria can easily allow extremes to surface that don’t correspond to a good purchase experience. An unknown retailer offering a too-good-to-be-true price might indeed be too good to be true. A merchant with only a few reviews could easily earn the highest customer rating, based on just a tiny sample. It would be good if Google explained more about how relevancy is determined, in an easy-to-find location for consumers who care, but the omission is not abnormal in the shopping search space, either.

Google: Bottom Line, Lowest Price Is Shown

So what does the survey show?
  • Lowest price in ad box: $24.28
  • Lowest featured price in drill down: $24.28
  • Lowest major brand price in drill down: $24.28
For this test, against its own listings, Google performed perfectly. But are there cheaper prices out there that competing search engines can find?

Bing: No Major Retailers Listed; No Lowest Price

Google’s biggest competitor is Bing. Here’s what you get for the same search there:
Just like Google, Bing only shows shopping results for those who pay, a change it made earlier this year, but without the controversy Google encountered. These ads appear in a box with a similar format to Google’s.
Unlike at Google, the results Bing displayed were terrible. None were for the model we were after. Some weren’t even for the right brand or the specified two-slice capacity.
Changing to a search for “Hamilton Beach 22504” to add the model number didn’t improve things much. The right toaster finally appeared, but apparently the only place it’s available is through eBay or a small merchant called Hayneedle, with the lowest price being $36.99:
Unlike with Google, there’s no way to “drill down” into Bing Shopping to search for a better price. That’s because Bing completely killed Bing Shopping earlier this year in favor of an all-ad format. What appears in that shopping ad box is all you get.
Of course, Bing would argue that it does provide “free” shopping results that allow for more inclusivity than Google’s model, through a system called “Rich Captions.” See that second arrow above, pointing to a Walmart listing? That’s Walmart appearing in Bing’s results — and for free — but using a system that displays the price of the toaster, $24.31 and in stock.
So, should that price be counted as Bing’s “low price” from its shopping results? Not if you’re trying to run a survey comparing Google’s shopping listings to Bing’s, as both Consumer Watchdog and the Financial Times did. That’s because they didn’t seem to use any pricing displayed through Google’s similar — and totally free — Rich Snippets system. Here’s an example of those from Google, for a search on “Hamilton Beach 22504,” below:
[Screenshot: Google search results for “Hamilton Beach 22504” with Rich Snippets pricing]
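Both systems read structured markup that merchants place on their own product pages. As a rough illustration (not Walmart’s actual code, and with made-up values), schema.org Product markup with an offer looks something like this:

    <!-- Hypothetical product page snippet using schema.org microdata -->
    <div itemscope itemtype="http://schema.org/Product">
      <span itemprop="name">Hamilton Beach 2-Slice Metal Toaster (22504)</span>
      <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
        <meta itemprop="priceCurrency" content="USD">
        $<span itemprop="price">24.31</span>
        <link itemprop="availability" href="http://schema.org/InStock"> In stock
      </div>
    </div>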
See how complicated trying to compare shopping search engines can be? With Bing, if you did our original search, you didn’t get pricing either through the ad block or via Rich Captions, unlike with Google. You certainly didn’t get what Bing promised when it killed its shopping search engine earlier this year, a better experience:
You no longer need to waste time navigating to a dedicated “shopping” experience to find what you’re looking for. Based on your intent, we’ll serve the best results.
You absolutely don’t get the ability to see all the shopping results, to determine if Bing is showing the best price from all those it knows about — which is crucial for measuring Bing against the same accusations that Google faces.
For the purposes of our test, we’re going with the Bing pricing shown in the ad block and tagging that as N/A, as there’s no major direct retailer showing it.

PriceGrabber: Low Prices That Don’t Exist

Over to PriceGrabber, the toaster is listed multiple times, with various prices. As with Google, the challenge is knowing which is the right model for the price comparison:
The first of the listings, promising a price “as low as $17.99,” seems good, so let’s drill down by clicking on it:
Where’s that $17.99 price? Oddly, PriceGrabber shoves it way down at the bottom of the page, choosing to push consumers toward Kmart’s $28.99 listing first. Why? As with Google, who knows. A consumer certainly doesn’t, just as they also probably don’t know that PriceGrabber is almost certainly using the same all-ads model as Google. I can’t say for certain, because PriceGrabber seems to lack any definitive disclosure page for consumers.
Which price to use for our survey? We could go with the Target price of $17.99, but there’s one problem with that. It doesn’t really exist:
Clicking on the offer opens PriceGrabber up to potential accusations of bait-and-switch, since the price Target is really selling the toaster for is $24.49. Amazon, which has the next-lowest price of $19.99, turns out to really be selling it for $24.28. That’s cheaper than the third-lowest price from PriceGrabber, Walmart at $24.31. But since PriceGrabber isn’t actually listing the correct Amazon price, for purposes of our survey we’ll use the lowest correct price it actually shows: Walmart’s, at the $24.28 actually shown when you go to Walmart’s site. As for the “featured price,” that would be the first price on the original list, Kmart at $28.99.

Nextag: Pushing Higher-Priced Amazon Listing Over Lower-Priced One

Searching for “Hamilton Beach 2-Slice Metal Toaster” at Nextag proved to be difficult. Well, impossible. No matter what I did, that search just generated no results. In the end, I resorted to a search for “Hamilton Beach 22504,” which came back with this:
As with Google — and PriceGrabber — the lowest price isn’t shown first. Instead, sorting is done by “Best Match” order, whatever that is. Nothing on the page explains it. The help page, if consumers go to it, has no help about it — though at least it discloses that Nextag gets paid by merchants to list them, just as Google does.
Changing the sort to lowest price brings this up:
Now we discover that the lowest price from a major retailer, according to Nextag, is from Amazon at $23.77, not from Amazon at $29.99, as it first presented. Except, as it turns out, that $23.77 price doesn’t exist. Going to the offer finds the price is really $24.28, which matches another Amazon listing that Nextag also shows.
For the survey, the low price will be $24.28, while the featured price will be $29.99 — both prices coming from Amazon, which, yes, lists the identical product at two different price points.

Shopzilla: Inaccurate Prices Shown

At Shopzilla, “Hamilton Beach 2-Slice Metal Toaster” didn’t find the model we were after, so I did a search for “Hamilton Beach 22504,” which returned this list:
The model appears to be listed three times. If you were to click on any of these listings using the “go to store” link, rather than the small “Compare price at other stores” link, you’d pay anywhere between $31.90 and $43.33 for the item. Walmart is listed first at $43.33.
Using Consumer Watchdog’s survey methodology, the assumption is that this is what most consumers would do: click on those links without trying to find better pricing. But let’s try the drill-down. Selecting the first of the listings brought up a page with prices from a variety of merchants:
[Screenshot: Shopzilla price comparison for the Hamilton Beach 22504 two-slice toaster]
As with Google, PriceGrabber and Nextag, prices aren’t shown in order of lowest-to-highest. Instead, the now familiar “Best Match” sort order is used. There’s no help page that explains what this means, just as there’s no help page providing any disclosure that all these listings are almost certainly ads.
The “Best Match” sort order means Target, with the lowest price of any major retailer at $17.99, is buried way down on the list. Instead, consumers are pointed first at a Sears offer to buy the toaster for $25.64.
Except that Target low price? It’s really $24.99. And that Sears offer? It’s actually $26.99:
As it turns out, the lowest price from a major retailer is from Walmart, at $24.32 – which actually turns out to be $24.28, when you go to the page.
Wait — wasn’t Walmart the first listing way back on Shopzilla, offering the toaster for $43.33? Yes, it was. Walmart has one page where it oddly sells the same item for two different prices: cheaper if you buy from Walmart directly, and more expensive if you buy through a partner:
[Screenshot: Walmart.com listing for the Hamilton Beach 22504 toaster]
For the purposes of our survey, we’re considering the featured price on Shopzilla to be the first price you get when you drill down into the listings: the Sears offer at $26.99, which is accurately shown on the landing page. The low price is Walmart’s, at $24.28. Counting the Walmart listing featured before the drill-down is difficult, because it isn’t accurately showing the actual price.

The Big Wrap-Up: They All Kind Of Suck

As I said at the outset, we’ll run a few more tests like this. I have no doubt that Google, which came out the best in this one, may not do so well in another. But the bigger issue to me is that the entire state of shopping search feels like a big mess.
It’s pretty clear that there’s not a lot of consumer disclosure going on by anyone. That’s despite the fact that I called the FTC’s attention to this issue last year:
  • A Letter To The FTC Regarding Search Engine Disclosure Compliance
The FTC’s response was what I’ve come to expect from an agency that seems ill-prepared and ill-equipped to understand the complicated search engine space. It did nothing more than encourage search engines to pretty-please disclose ads more:
  • FTC Updates Search Engine Ad Disclosure Guidelines After “Decline In Compliance”
Ironically, Google is coming under fresh fire for not disclosing enough when it seems to do far more than some of the shopping search engines that it’s competing with.
The mess of shopping search also, in my view, weakens the argument that Google would somehow be better if it pointed people to competing shopping search engines even more. Given that several of them clearly have issues showing fresh, valid prices, that would just put consumers through even more hoops and frustration.
Rather, it would be good to see them all step up efforts to improve relevancy in a space that feels largely forgotten. I’m an expert on search engines, and even for me, trying to ferret out what these shopping search engines are showing, and how to use them to double-check for the best price, is exhausting. I feel sorry for the consumer relying on them for that purpose.

Thursday 5 December 2013

Hummingbird's Unsung Impact on Local Search

There seems to have been a significant qualitative shift in local results since Google's release of Hummingbird, one that I haven't seen reported on search engine blogs and media outlets. The columns I have seen have generally offered advice on taking advantage of what Hummingbird was designed to do, rather than looking at the outcome of the update.
From where I sit, the outcome has been a slightly lower overall quality in Google's local results, possibly due in part to a "purer" ranking algorithm in local packs. While these kinds of egregious results reported soon after Hummingbird's release have mostly disappeared, it's the secondary Hummingbird flutter, which may have coincided with the November 14th "update," that seems to have caused the most noticeable changes.
I performed manual searches for five keywords, both geo-modified and generic, in five diverse markets around the country. I selected these keywords based on terms that I knew Google considered to have "local intent," across as broad a range of industries as I could think of. After performing the searches, I took note of the top position and number of occurrences of the types of sites in the taxonomy below, as well as the position and number of results in each "pack."
  • Keywords: personal injury lawyer, assisted living facility, wedding photographer, electrician, pet store
  • Markets: Chicago, Portland, Tampa, Burlington, Flagstaff
  • Result type taxonomy: national directory (e.g., Yelp), regional directory (e.g., ArizonaGolf.com), local business website (e.g., AcmeElectric.com), barnacle webpage (e.g., facebook.com/acmeelectric), national brand (e.g., Petsmart.com)

Again, a very simple analysis that is by no means intended to be a statistically significant study.
I'll share with you some interim takeaways that I found interesting.

1. Search results in search results have made a comeback in a big way

If anything, Hummingbird or the November 14th update seem to have accelerated the trend that started with the Venice update: more and more localized organic results for generic (un-geo-modified) keywords.
But the winners of this update haven't necessarily been small businesses. Google is now returning specific metro-level pages from national directories like Yelp, TripAdvisor, Findlaw, and others for these generic keywords.
This trend is even more pronounced for keywords that do include geo-modifiers, as the example below for "pet store portland" demonstrates.
Results like the one above call into question Google's longstanding practice of minimizing the frequency with which these pages occur in Google search results. While the Yelp example above is one of the more blatant instances that I came across, plenty of directories (including WeddingWire, below) are benefitting from similar algorithmic behavior. In many cases the pages that are ranking are content-thin directory pages—the kind of content whose visibility Panda, and to some extent Penguin, were supposed to minimize.
Overall, national directories were the most frequently-occurring type of organic result for the phrases I looked at—a performance amplified when considering geo-modified keywords alone.
The national brand result type is underrepresented due to the 'personal injury lawyer,' 'electrician,' and 'wedding photographer' keyword choices. For the keywords where there are relevant national brands ('assisted living facility' and 'pet store'), they performed quite well.

2. Well-optimized regional-vertical directories accompanied by content still perform well

While a number of thriving directories were wiped out by the initial Panda update, here's an area where the Penguin and Hummingbird updates have been effective. There are plenty of examples of high-quality regionally focused content rewarded with a first-page position—in some cases above the fold. I don't remember seeing as many of these kinds of sites over the last 18 months as I do now.
Especially if the keywords these sites are targeting return carousels instead of packs, there's still plenty of opportunity to rank: in my limited sample, an average of 2.3 first-page results below carousels were regional directory-style sites.

3. There's little-to-no blending going on in local search anymore

While Mike Blumenthal and Darren Shaw have theorized that the organic algorithm still carries weight in terms of ranking Place results, visually, authorship has been separated from place in post-Hummingbird SERPs.
Numerous "lucky" small businesses (read: well-optimized small businesses) earned both organic and map results across all industries and geographies I looked at.

4. When it comes to packs, position 4 is the new 1

The overwhelming majority of packs seem to be displaying in position 4 these days, especially for "generic" local intent searches. Geo-modified searches seem slightly more likely to show packs in position #1, which makes sense since the local intent is explicitly stronger for those searches.
Together with point #3 in this post, this is yet another factor that is helping national and regional directories compete in local results where they couldn't before—additional spots appear to have opened up above the fold, with authorship-enabled small business sites typically shown below rather than above or inside the pack. 82% of the searches in my little mini-experiment returned a national directory in the top three organic results.

5. The number of pack results seems now more dependent on industry than geography

This is REALLY hypothetical, but prior to this summer, the number of Place-related results on a page (whether blended or in packs) seemed to depend largely on the quality of Google's structured local business data in a given geographic area. The more Place-related signals Google had about businesses in a given region, and the more confidence Google had in those signals, the more local results they'd show on a page. In smaller metro areas for example, it was commonplace to find 2- and 3-packs across a wide range of industries.
At least from this admittedly small sample size, Google increasingly seems to show a consistent number of pack results by industry, regardless of the size of the market.
  Keyword                     # in Pack   Reason for Variance
  assisted living facility    6.9         6-pack in Burlington
  electrician                 6.9         6-pack in Portland
  personal injury lawyer      6.4         Authoritative OneBox / bug in Chicago
  pet store                   3.0
  wedding photographer        7.0
This change may have more to do with the advent of the carousel than with Hummingbird, however. Since the ranking of carousel results doesn't reliably differ from that of (former) packs, it stands to reason that the visual display of all local results might now be controlled by a single back-end mechanism.

6. Small businesses are still missing a big opportunity with basic geographic keyword optimization

This is more of an observational bullet point than the others. While there were plenty of localized organic results featuring small business websites, these tended to rank lower than well-optimized national directories (like Yelp, Angie's List, Yellowpages.com, and others) for small-market geo-modified phrases (such as "electrician burlington").
For non-competitive phrases like this, even a simple website with no incoming links of note can rank on the first page (#7) just by including "Burlington, VT" in its homepage Title Tag. With just a little TLC—maybe a link to a contact page that says "contact our Burlington electricians"—sites like this one might be able to displace those national directories in positions 1-2-3.
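To make that concrete, here is a minimal sketch of that kind of basic geo-optimization for a hypothetical Burlington electrician (the business name and URLs are made up):

    <head>
      <!-- The geo-modified title tag doing most of the ranking work -->
      <title>Acme Electric | Electrician in Burlington, VT</title>
    </head>
    <body>
      <h1>Acme Electric: Electricians Serving Burlington, VT</h1>
      <!-- A little extra geo-relevance via internal anchor text -->
      <p><a href="/contact">Contact our Burlington electricians</a></p>
    </body>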

7. The Barnacle SEO strategy is underutilized in a lot of industries

Look at the number of times Facebook and Yelp show up in last year's citation study. Clearly these are major "fixed objects" to which small businesses should be attaching their exoskeletons.
Yet 74% of searches I conducted as part of this experiment returned no Barnacle results.
This result for "pet store chicago" is one of the few barnacles that I came across—and it's a darn good result! Not only is Liz (unintenionally?) leveraging the power of the Yelp domain, but she gets five schema'd stars right on the main Google SERP—which has to increase her clickthrough rate relative to her neighbors.
Interestingly, the club industry is one outlier where small businesses are making the most of power profiles. This might have been my favorite result—the surprisingly competitive "dance club flagstaff" where Jax is absolutely crushing it on Facebook despite no presence in the carousel.

What does all this mean?

I have to admit, I don't really know the answer to this question yet. Why would Google downgrade the visibility of its Place-related results just as the quality of its Places backend has finally come up to par in the last year? Why favor search-results-in-local-search-results, something Google has actively and successfully fought to keep out of other types of searches for ages? Why minimize the impact of authorship profiles just as they are starting to gain widespread adoption by small business owners and webmasters?
One possible reason might be in preparation for more card-style layouts on mobile phones and wearable technology. But why force these (I believe slightly inferior) results on users of desktop computers, and so far in advance of when cards will be the norm?
At any rate, here are a few takeaways from my qualitative review of local results in the last couple of months.
  1. Reports of directories' demise have been greatly exaggerated. For whatever reason (?), Google seems to be giving directories a renewed lease on life. With packs overwhelmingly in the fourth position, they can now compete for above-the-fold visibility in positions 1-2-3, especially in smaller and mid-size metro areas.
  2. Less-successful horizontal directories (non-Yelps and TripAdvisors, e.g.) should consider the economics of their situation. Their ship has largely sailed in larger metro areas like Chicago and Portland. But they still have the opportunity to dominate smaller markets. I realize you probably can't charge a personal injury lawyer in Burlington what you charge his colleague in downtown Chicago. But, in terms of the lifetime value of who will actually get business from your advertising packages, the happy Burlington attorney probably exceeds the furious one from Chicago (if she is even able to stay in business through the end of her contract with you).
  3. The Barnacle opportunity is huge, for independent and national businesses alike. With Google's new weighting toward directories in organic results and the unblending of packs, barnacle listings present an opportunity for savvy businesses to earn three first-page positions for the same keyword: one pack listing, one web listing, and one (or more) barnacle listings.
Well, that's my take on what's happening in local search these days. Do you think the quality of local results has improved or declined since Hummingbird? Have you perceived a shift since November 14th?