Thursday, October 13, 2016

Why Didn't You Recover from Penguin?

Posted by Dr-Pete

After a nearly two-year wait, the latest Penguin update rolled out in late September and into early October. This roll-out is unusual in many ways, and it only now seems to be settling down. In the past couple of weeks, we've seen many reports of recoveries from previous Penguin demotions, but this post is about those who were left behind. What if you didn't recover from Penguin?

I'm going to work my way from unlikely, borderline conspiracy theories to difficult truths. Theories #1 and #2 might make you feel better, but, unfortunately, the truth is more likely in #4 or #5.


1. There is no Penguin

Then you'll see that it is not the spoon that bends, it is only yourself. OK, this is the closest I'll get to full-on conspiracy theory. What if this new Penguin is a ruse, and Google did nothing or rolled out something else? We can't know anything 100% without peering into the source code, but I'm 99% confident this isn't the case. Interpreting Google often means reading between the lines, but I don't know of any recent confirmed announcement that ended up being patently false.

Google representatives are confirming details about the new Penguin both publicly and privately, and algorithm flux matches the general timeline. Perhaps more importantly, we're seeing many anecdotal reports of Penguin recoveries, such as:

Given the severity of Penguin demotions and the known and infrequent update timelines, these reports are unlikely to be coincidences. Some of these reports are also coming from reliable sources, like Marie Haynes (above) and Glenn Gabe (below), who closely track sites hit by Penguin.


2. Penguin is still rolling out

This Penguin update has been unusual in many ways. It's probably best not to even call it "Penguin 4.0" (yes, I realize I keep calling it that). The new, "real-time" Penguin is not simply an update to Penguins 1–3. It replaces them and works very differently.

Because real-time Penguin is so different, the roll-out was broken up into a couple of phases. I believe the new code went live roughly in line with Google's announced date of September 23rd. It might have happened a day or two before that, but probably not weeks before. This new code, though, was the kinder, gentler Penguin, which devalues bad links.

For this new code to fully take effect, the entire link graph had to be refreshed, and this takes time, especially for deeper links. So, the impact of the initial roll-out may have taken a few days to fully kick in. In terms of algorithm flux, the brunt of the initial release hit MozCast around September 27th. Now that the new Penguin is real-time, we'll be feeling its impact continuously, although that impact will be unnoticeable for the vast majority of sites on the vast majority of days.

In addition, Google has rolled back previous Penguin demotions. This happened after the new code launched, but we don't have an exact timeline. This process also took days, possibly a week or more. We saw additional algorithm spikes around October 2nd and 6th, although the entire period showed sustained flux.

On October 7th, Gary Illyes from Google said that the Penguin roll-out was in the "final stage" (presumably, the removal of demotions) and would take a "few more days". As of this writing, it's been five more days.

My best guess is that 95%+ of previous Penguin demotions have been removed at this point. There's a chance you're in the lucky 5% remaining, but I wouldn't hold my breath.


3. You didn't cut nearly deep enough

During the few previous Penguin updates, it was assumed that sites didn't recover because they simply hadn't cut deep enough. In other words, site owners and SEOs had tried to surgically remove or disavow a limited number of bad links, but those links were either not the suspect links or were just the tip of the iceberg.

I think it's true that many people were probably trying to keep as many links as possible, and were hesitant to make the deep cuts Penguin required. However, this entire argument is misleading and possibly self-destructive, because this isn't how the new Penguin works.

Theoretically, the new Penguin should only devalue bad links, and its impact will be felt on a more "granular" (in Google's own words) level. In other words, your entire site won't be demoted because of a few or even a lot of bad links, at least not by Penguin. Should you continue to clean up your link profile? Possibly. Will cutting deeper help you recover from Penguin down the road? Probably not.


4. Without bad links, you'd have no links at all

Here's the more likely problem, and it's a cousin of #3. Your link profile is so bad that there is practically no difference between "demotion" and "devaluation." It's quite possible that your past Penguin demotion was lifted, but your links were so heavily devalued that you saw no ranking recovery. There was simply no link equity left to provide SEO benefit.

In this case, continuing to prune those bad links isn't going to help you. You need to build new quality signals and authoritative links. The good news is that you shouldn't have to wait months or years now to see the positive impact of new links. The bad news is that building high-quality links is a long, difficult road. If it were easy, you probably wouldn't have taken shortcuts in the first place.


5. Your problem was never Penguin

This is the explanation no one wants to hear, but I think it's more common than most of us think. We're obsessed with the confirmed update animals, especially Penguin and Panda, but these are only a few of the hundreds of animals in the Google Zoo.

There were algorithmic link demotions before Penguin, and there are still parts of the algorithm that look for and act on bad links. Given the power that links still hold over ranking, this should come as no surprise. The new Penguin isn't a free pass on all past link-building sins.

In addition, there are still manual actions. These should (hopefully) show up in Google Search Console, but Google will act on bad links manually where it's warranted.

It's also possible that you have a very different algorithmic problem in play or any of a number of technical SEO issues. That diagnostic is well beyond the scope of this blog post, but I'll offer this advice — dig deeper. If you haven't recovered from Penguin, maybe you've got different or bigger problems.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Wednesday, October 12, 2016

3 New Upgrades Make the Web's Best Keyword Research Tool Even Better

Posted by randfish

If you know me, you know I'm hyper-critical of the software, data, and products Moz releases. My usual response to someone asking about our tools vs. others used to be to give a rundown of the things I like about the competition and why they're great, then look down at the ground, shuffle my feet in embarrassment, and say "and Moz also has a good tool for that."

But Keyword Explorer (and the progress Moz Pro & Local have made this year) brings out a different behavior in me. I'm still a little embarrassed to admit it, but admit it I must. KW Explorer is the best keyword research tool on the market, period*.

But we are never satisfied, so today, it's getting even better with the addition of some killer new functionality.

#1: Rank checking inside KW Explorer lists

First on the list is the ability to easily see whether a given domain (or URL) already ranks on page 1 for any of the keywords on a list. Just enter a domain or page, hit "check rankings," and the Rank column will fill in with your data.

Why is this crucial?

Because many of us who do keyword research need to know whether to add a list of keywords to our "already visible/getting traffic" set, or to the "in need of content creation or optimization" set. This feature makes it simple to build up a multi-hundred keyword list for targeting, and quickly include or exclude the keywords for which we're already ranking page 1 (or above/below any given position). This column now appears in the CSV export, too, so you can mash up and filter the data however you'd like.
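If you do mash up the CSV export yourself, a sketch like the following shows one way to split keywords into those two sets. (This is my own illustration, not Moz code; the column names "Keyword" and "Rank" are assumptions, so check the header row of your actual export.)

```python
import csv

# Illustrative sketch only: split a Keyword Explorer CSV export into an
# "already ranking page 1" bucket and a "needs content/optimization" bucket.
# Column names ("Keyword", "Rank") are assumptions -- verify against your export.

def split_by_rank(csv_path, page1_cutoff=10):
    ranking, needs_work = [], []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            rank = row.get("Rank", "").strip()
            # An empty Rank cell means the domain wasn't found on page 1.
            if rank.isdigit() and int(rank) <= page1_cutoff:
                ranking.append(row["Keyword"])
            else:
                needs_work.append(row["Keyword"])
    return ranking, needs_work
```

Adjust `page1_cutoff` to include or exclude keywords above or below any given position, per the feature described above.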

Quick aside: If you have a keyword list with expired SERPs (after 14 days, KW Explorer assumes that Google's results may have changed substantially enough to invalidate the prior Difficulty & Opportunity scores), you'll get this experience when checking rankings. Just refresh the keywords on the list to fetch the latest SERPs and you'll be good to go.

But, of course, there's also the need to get more ranking data — the ranking positions beyond page 1, tracking over time, comparison to competitors, etc. And that's why we've also added...

#2: Send keywords directly from a list to Pro Campaigns for rank tracking

Undoubtedly, our most-requested feature of the summer was the ability to import a list (or selected keywords from a list) over to a campaign to track. The previous export/import system worked, but it was an unnecessary hassle. Today, you can simply use the "I want to" menu, choose "Add XYZ to Campaign," and then select which campaign you want (or create a new one).

The keywords will auto-magically copy themselves into your campaign, using whatever default settings you've got for rank tracking (US-English, Google.com is most common, but you can rank track in any country or language).

Why is this crucial?

Because once you know the keywords you're targeting, you need to know how you're performing over time, how your competition's doing on those terms/phrases, and how the rankings are changing to include or exclude various SERP features (yup, as of August, we also track all the SERP features in Pro Campaigns).

The challenge, of course, is that you've got to know which keywords are worth targeting in the first place, and how relatively important they are, which is why we've worked like mad to deliver...

#3: Better, more accurate keyword volume and coverage than ever

(that's way, way frickin' better than whatever Google AdWords is doing with their "low spending" accounts)

Russ Jones and the Keyword Explorer team have been going full-force on a new, more powerful solution for replacing Google AdWords' weird, imprecise, always-30-days-or-more-behind keyword data with better information. We started working with clickstream data (searches and click patterns gathered from browser extensions, anonymized, and sold to us by various folks) early this year; Russ wrote a detailed account of the process here.

But now our volume numbers are even better, with the addition of dramatically more data via a partnership with the awesome crew at Jumpshot. Their clickstream-based search behavior, plus what we get from other sources, combined with our modeling against AdWords' impression counts on real campaigns, gives us higher accuracy, more coverage, and faster recognition of volume trends than ever before.

Why is this crucial?

When you enter a term or phrase into Keyword Explorer, you can now expect that we're providing the best, most accurate volume ranges available*. Marketers need to be able to trust the numbers in their keyword tools, or else risk prioritizing the wrong search terms, the wrong content, and the wrong investments. We have confidence, thanks to our test comparisons, that the ranges you see in Keyword Explorer will match real volume for the prior 30 days 95%+ of the time.

In the months ahead, Russ will have more to share comparing Moz's keyword volume data to AdWords' and, hopefully, an external API for search volume, too (especially after all the resounding requests on Twitter).

If that wasn't enough, we've also added volume numbers to Pro Campaigns, so you can see this high-quality information in the context of the keywords you're tracking.

Not too shabby, eh?


Let's get real. Moz had a number of years where getting one change to one product, even a small one, felt like pulling teeth. It took forever. I think you could rightly point at our software and say "What's going on over there?" But those days are long gone. Just look at all the useful, quality updates in 2016. This team is firing. on. every. cylinder. If you work on Moz's software, you should be proud. If you use our software, you can feel like you're getting your money's worth and more. And if, like me, you tie far too much of your self-worth to the quality of your company's products, well, even you can start holding your head high.

Rock on, fellow Mozzers and Moz subscribers. Rock on.


* In the English-language market, that is; outside of the United States, Canada, the United Kingdom, and Australia (where we get Jumpshot and other clickstream data), the suggestions aren't as comprehensive and the volume numbers are often missing. Sadly, it'll probably be this way for a while as we're focusing on English markets for the time being, and will need to find and make deals with clickstream providers in each country/language in order to match up.



Tuesday, October 11, 2016

We Fought the Comment Spam (and the Comment Spam Didn't Win)

Posted by FeliciaCrawford

All across the Internet, comments sections are disappearing.

From your high-profile news sites to those that share the online marketing space, more and more sites are banishing that unassuming little text box at the bottom of a post. And frankly, it’s not hard to understand why.

First, you have your good ol’-fashioned spam comments. These are the commenters that hold dear the idea that those nofollowed comment links are valuable:

commentlink5.png

The usual.

commentlink2.png

Spicing it up a bit with some solid industry advice.

commentlink4.png

Really going for the gold!

Then you have your thin comments. Often left with (we assume) good intentions, they don’t add much value to the discussion:

thincomment1.png

thincomment2.png

thincomment3.png

These poor souls usually end up with a lot of downvotes, and if they receive upvotes, it’s often a clear sign that there’s a nefarious MozPoint scheme afoot.

Sometimes even the best of us are lured by the glamour of spamming:

Finally, lest we forget, you have your inflammatory comments. Those comments that, although perhaps on-topic, are derailing or just downright unkind. We don’t get too much of that here on the Moz blog, thank goodness (or thank TAGFEE), but I’m sure we’ve all read enough of those to last us several lifetimes.

And comment moderation is a thankless, wearying task. Though we fight the good fight, comment spammers are constantly finding ways around our barriers, poking their links into the chinks in our armor. It takes valuable time out of a Mozzer’s busy workday to moderate those comments.

So why are we battling to keep them?

In the beginning, there was the blog.

Before the Moz Pro toolset was even a twinkle in Roger’s optical sensors, Moz was a blog. A community of brave folks banding together to tackle the mysteries and challenges of SEO. If you look back across the years and rifle through the many, many comments, you’ll begin to notice a few things:

  • People learned from one another.
  • People leaned on one another.
  • People networked and cultivated relationships that otherwise may not have blossomed.

Google says they're good for SEO, and I'm not gonna fight with Google.

Now, I don’t want to cheapen the sentiment here, but it has to be said: the smart folks over at Google have made it clear that a healthy, vibrant online community is one signal of a site’s quality. Comments can be considered part and parcel of what constitutes good (nay, even great) content, and have even been spotted in a featured snippet or two.

I don’t know about you, but I’m not one to argue with the Big G.

But there's always been comment spam. Why do you care now?

Comment spam isn’t a new or novel phenomenon. It’s been plaguing blogs almost since the very first public bloggers put fingers to keyboard. Most blog posts on Moz show traces of its corrupt spamalicious influence in the comments section. So what was the catalyst that steeled our resolve?

It just got annoying.

Authors pour heart and soul into crafting their posts. They take valuable time out of their regular work day to engage in the comments section, answering questions and driving thoughtful discussion. They deserve better than a slew of spammers aiming to place a link.

Readers devote hours of their ever-so-precious lives to reading the blog. Some folks even read for the comment conversations alone. They deserve to benefit from those invested hours, to be inspired to join the conversation.

We knew we had to do something. What that was seemed unclear, though.

We began to notice something. When we promoted a YouMoz post to the main blog, it tended to garner more of what we’d call quality comments. Comments with depth, that ask pertinent questions, that respectfully challenge the article in question. These posts came prepackaged with their own discussions already in full swing from their time on YouMoz; often, the first few comments were engaging ones, and they were just as often upvoted to remain on top (the blog auto-sorts comments by popularity).

Conversely, when the first several comments on a brand-new post were thin, spammy, or otherwise low-quality, it seemed to grind any potential discussion to a screeching halt. Internally, our Mozzer authors like Dr. Pete and Rand began to take notice. I received some concerned questions from other frequent contributors. At first, I wasn’t sure how to tackle the problem. After all, we already seemed to be doing so much.

Comment moderation? Check. Certain triggers catch comments in a queue, which we clear out daily.

Subject every comment to approval by an editor? No, that would stymie the natural discussions that make our blog comments section special in the first place. No one should have to wait for my morning meetings to finish before they can engage in intellectual banter with their peers.

Close the comments section? No way. This was never on the table. It simply didn’t make sense; we’re fortunate in that a good majority of comments on the blog are high quality.

It boiled down to the fact that there was the potential for our comments section to nurture not only good content, not only great content, but fantastic content. 10X, if you prefer that term. The most royal darn content outside of Buckingham Palace.

Okay, that might be going a little far. But something incredibly special happens here on the blog. You can ask questions about a Whiteboard Friday and Rand will do his best to answer, thoughtfully refute, or discuss your point. You can get to know your peers in an industry largely cooped up behind a screen half a world away. You can joke with them, disagree with them, metaphorically high-five them. And it’s not limited to a relatively low character count, nor is there pressure to approve the friend request of anyone you’ve just hotly debated.

We had to preserve that.

And that’s when we devised our grand experiment.

We began to seed discussion questions as the first comment.

Inspired by sites like the New York Times with their “NYT Pick” featured comment option, we decided there was a better way.

nytpick.png

Marvel at that nifty gold badge!

For one week in August (8/1 through 8/5), I asked authors to contribute a discussion question, something to spark a decent conversation in the comments early on, before you could even say “thanks for the nice post.”

This question would appear at the top of our comments section, the first thing a reader would see after consuming the post and potentially feeling inspired enough to share their thoughts.

Rand kicked it off a little early, in fact, with this zinger on July 29th:

randsfirstcomment.png

Those upvotes looked mighty promising to a despairing blog editor.

Keep in mind that, for the most part, posting these discussion questions is a very manual process. We don’t currently have the framework built to display a “featured question.” We tend to publish around 12am Seattle time; to get these little puppies in place early enough to make a difference, I would...

  • Stay up until midnight
  • Assume the identity of the author (with permission, of course) using magical Moz admin abilities
  • Publish the comment
  • Sneak back to my main account and — yes, here’s the shady bit — thumb it up to ensure it stayed “on top” for a few hours

I do struggle with the guilt of these small betrayals (that is, gaming the thumb system), but ‘twas for the greater good, I swear it! As you can see from the screenshot above, that high visibility — combined with a ready-to-go thought-provoking question — earned more upvotes as the day wore on. Almost without fail, each seeded discussion question remained the top-voted comment on every post that week. And it seemed to be working — more and more comments seemed to be good quality. Great quality. Sometimes even fantastic quality. (I just shivered.)

What's spam to me might be a sandwich to you.

Now, quality is a very subjective thing. I can’t vouch for the absolute science of this experiment, because it was very squarely rooted in a subjective analysis of the comments. But when we compared the results from our experiment week (8/1 through 8/5) to two separate weeks in which we didn’t make any special effort in the comments (7/18 through 7/22 and 6/27 through 7/1), the results were quite telling.

Cut to the chase — what happened?!

Manually going through the comments section of each post, I tallied how many comments I considered high-quality or useful that were not given by the author, and how many comments I considered so thin or spammy as to be detrimental to the section as a whole.

For the control week of June 27th through July 1st, 26% of total comments were high-quality and 26% were spammy.

For the control week of July 18th through July 22nd, 23% were high-quality and 29% were spammy.

For the week of our discussion questions, August 1st through August 5th, 35% of total comments were high-quality and 11% were spammy.

My subjective, unscientific experiment had great results. Since then, I’ve asked our authors to contribute discussion questions to kick off a good conversation in the comments. Every time, I can anecdotally say that the commentary was more vibrant, more overtly helpful, and more alive than when we don’t meddle.

You like it, you really like it!

Seeded discussion questions far and away have more upvotes than your regularly scheduled top comments. Often they top the double digits, and this very apt discussion question by Gianluca (a long-time supporter and champion of the Moz community) earned a whopping 27 thumbs pointing toward the heavens:

gianluca.png

In addition, people are answering those questions. They're answering each other answering those questions. The questions are helping to get the gears turning, adding another layer of thoughtfulness to a piece that you otherwise might be content to skim and then bounce off to another magical corner of the Internet.

The greatest and most humbling triumph, of course, would be to help transform the spammers into supporters, to inspire everyone to think critically and communicate boldly. If even one person hesitates before dropping in a promotional link and instead asks the community's advice, my spirit shall rest easy forevermore.

There's a light at the end of the tunnel.

Sure, there are still comment spammers. There have always been comment spammers. And, though it pains me to say it, there will always be comment spammers. It’s just a part of life we must accept, like the mud that comes along with a beautifully rainy Seattle afternoon, or the grounds muddling your last sip of delicious coffee.

But I want to give you hope, O ye commenters and readers and editors of the world. You need not sacrifice the intrinsic goodness of a community-led comments section to the ravages of spam. There is another way. And though the night is dark and full of spammers, we’re strong enough and smart enough to never yield, to hold firm to our values, and to nourish what goodness and helpfulness we can in our humble territory of the Internet.


Monday, October 10, 2016

The Complete Guide to Creating On-Site Reviews+Testimonials Pages

Posted by MiriamEllis

“Show your site’s credibility by using original research, citations, links, reviews and testimonials. An author biography or testimonials from real customers can help boost your site’s trustworthiness and reputation.” Google Search Console Course

2017 may well be the year of testimonials and reviews in local SEO. As our industry continues to grow, we have studied surveys indicating that some 92% of consumers now read online reviews and that 68% of these cite positive reviews as a significant trust factor. We’ve gone through a meaningful overhaul of Google’s schema review/testimonial guidelines while finding that major players like Yelp will publicly shame guideline-breakers. We’ve seen a major publication post a controversial piece suggesting that website testimonials pages are useless, drawing thoughtful industry rebuttals illustrating why well-crafted testimonials pages are, in fact, vitally useful in a variety of ways.

Reviews can impact your local pack rankings, testimonials can win you in-SERP stars, and if that isn’t convincing enough, the above quote states unequivocally that both reviews and testimonials on your website can boost Google’s perception of a local business’ trustworthiness and reputation. That sounds awfully good! Yet, seldom a day goes by that I don’t encounter websites that are neither encouraging reviews nor showcasing testimonials.

If you are marketing local enterprises that play to win, chances are you’ve been studying third-party review management for some years now. Not much has been written about on-site consumer feedback, though. What belongs on a company’s own testimonials/reviews page? How should you structure one? What are the benefits you might expect from the effort? Today, we’re going to get serious about the central role of consumer sentiment and learn to maximize its potential to influence and convert customers.

Up next to help you in the work ahead: technical specifics, expert tips, and a consumer feedback page mockup.

Definitions and differentiations

Traditional reviews: Direct from customers on third-party sites

In the local SEO industry, when you hear someone talking about "reviews," they typically mean sentiment left directly by customers on third-party platforms, like this review on TripAdvisor:

rt1.jpg

Traditional testimonials: Moderated by owners on company site

By contrast, testimonials have traditionally meant user sentiment gathered by a business and posted on the company website on behalf of customers, like this snippet from a bed-and-breakfast site:

rt2.jpg

Review content has historically been outside of owners’ control, while testimonial content has been subject to the editorial control of the business owner. Reviews have historically featured ratings, user profiles, images, owner responses, and other features while testimonials might just be a snippet of text with little verifiable information identifying the author. Reviews have typically been cited as more trustworthy because they are supposedly unmoderated, while testimonials have sometimes been criticized as creating a positive-only picture of the business managing them.

Hybrid sentiment: Review+testimonial functionality on company site

Things are changing! More sophisticated local businesses are now employing technologies that blur the lines between reviews and testimonials. Website-based applications can enable users to leave reviews directly on-site; these can contain star ratings, avatars, and even owner responses, like this:

In other words, you have many options when it comes to managing user sentiment, but to make sure the effort you put in yields maximum benefits, you’ve got to:

  1. Know the guidelines and technology
  2. Have a clear goal and a clear plan for achieving it
  3. Commit to making a sustained effort

There is a ton of great content out there about managing your reviews on third-party platforms like Yelp, Google, Facebook, etc., but today we’re focusing specifically on your on-site reviews/testimonials page. What belongs on that page? How should you populate and organize its content? What benefits might you expect from the investment? To answer those questions, let’s create a goal-driven plan, with help from some world-class Local SEOs.

Guidelines & technology

There are two types of guidelines you need to know in the consumer sentiment space:

1) Platform policies

Because your website’s consumer feedback page may feature a combination of unique reviews and testimonials you directly source, widgets featuring third-party review streams, and links or badges either showcasing third-party reviews or asking for them, you need to know the policies of each platform you plan to feature.

Why does this matter? Since different platforms have policies that range from lax to strict, you want to be sure you’re making the most of each one’s permissions without raising any red flags. Google, for example, has historically been fine with companies asking consumers for reviews, while Yelp’s policy is more stringent and complex.

Here are some quick links to the policies of a few of the major review platforms, to which you’ll want to add your own research for sites that are specific to your industry and/or geography:

2) Google’s review schema guidelines

Google has been a dominant player in local for so long that their policies often tend to set general industry standards. In addition to the Google review policy I’ve linked to above, Google has a completely separate set of review schema guidelines, which recently underwent a significant update. The update included clarifications about critic reviews and review snippets, but most germane to today’s topic, Google offered the following guidelines surrounding testimonial/review content you may wish to publish and mark up with schema on your website:

Google may display information from aggregate ratings markup in the Google Knowledge Cards. The following guidelines apply to review snippets in knowledge cards for local businesses:

- Ratings must be sourced directly from users.
- Don't rely on human editors to create, curate, or compile ratings information for local businesses. These types of reviews are critic reviews.
- Sites must collect ratings information directly from users and not from other sites.

In sum, if you want to mark up consumer feedback with schema on your website, it should be unique to your website — not drawn from any other source. But to enjoy the rewards of winning eye-catching in-SERP star ratings or of becoming a "reviews from the web" source in Google’s knowledge panels, you’ve got to know how to implement schema correctly. Let’s do this right and call on a schema expert to steer our course.

Get friendly with review schema technology.

rtdavid.jpg

The local SEO industry has come to know David Deering and his company TouchPoint Digital Marketing as go-to resources for the implementation of complex schema and JSON-LD markup. I’m very grateful to him for his willingness to share some of the basics with us.

Here on the Moz blog, I always strive to highlight high quality, free resources, but in this case, free may not get the job done. I asked David if he could recommend any really good free review schema plugins, and learned a lot from his answer:

Boy, that's a tough one because I don't use any plugins or tools to do the markup work. I find that none of them do a good job at adding markup to a page. Some come close, but the plugin files still need to be edited in order for everything to be correct and properly nested. So I tend to hard-code the templates that would control the insertion of reviews onto a page. But I can tell you that GetFiveStars does a pretty good job at marking up reviews and ratings and adding them to a site. There might be others, too, but I just don't have any personal experience using them, unfortunately.

It sounds like, at present, your best bets are to go with a paid service or to roll up your sleeves and dig into schema hard-coding. If anyone in our community has discovered a plugin or widget that meets the standards David has cited, please share it in the comments. In the meantime, let’s take a look at the example David kindly provided of perfect markup. He notes,

“The following example is rather simple and straightforward but it contains everything that a review markup should. (The example also assumes that the review markup is nested within the markup of the business that's being reviewed):”
"review": {
    "@type": "Review",
    "author": {
        "@type": "Person",
        "name": "Reviewer's Name",
        "sameAs": "<a href="http://link-to-persons-profile-page.com">http://link-to-persons-profile-page.com</a>"
    }
    "datePublished": "2016-09-23",
    "reviewBody": "Reviewer's comments here...",
    "reviewRating": {
        "@type": "Rating"
        "worstRating": "1",
        "bestRating": "5",
        "ratingValue": "5"
    }
},

This is a good day to check to see if your schema is as clean and thorough as David’s, and also to consider the benefits of JSON-LD markup, which he describes this way:

“JSON-LD is simply another syntax or method that can be used to insert structured data markup onto a page. Once the markup is created, you can simply insert it into the head section of the page. So it's easy to use in that sense. And Google has stated their preference for JSON-LD, so it's a good idea to make the switch from microdata if a person hasn't already.”
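To make that concrete, here is a minimal sketch, in Python purely for illustration, of how review markup along these lines could be serialized as JSON-LD and wrapped in the script tag that goes into the page's head. The business name and review values are placeholders, not a real implementation:

```python
import json

# Hypothetical review data; field names follow schema.org's Review type.
review = {
    "@type": "Review",
    "author": {"@type": "Person", "name": "Reviewer's Name"},
    "datePublished": "2016-09-23",
    "reviewBody": "Reviewer's comments here...",
    "reviewRating": {
        "@type": "Rating",
        "worstRating": "1",
        "bestRating": "5",
        "ratingValue": "5",
    },
}

# Nest the review inside the markup of the business being reviewed,
# then serialize everything for a <script type="application/ld+json">
# tag in the page's <head>.
business = {
    "@context": "http://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Business",  # placeholder
    "review": review,
}

script_tag = (
    '<script type="application/ld+json">\n'
    + json.dumps(business, indent=2)
    + "\n</script>"
)
print(script_tag)
```

However you generate the markup, running the final page through Google's Structured Data Testing Tool is a sensible sanity check before relying on it for rich snippets.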

There are some do’s and don’ts when it comes to schema + reviews

I asked David if he could share some expert review-oriented tips and he replied,

“Well, in typical fashion, Google has been fickle with their rich snippet guidelines. They didn't allow the marking up of third-party reviews, then they did, now they don't again. So, I think it would be a good idea for businesses to begin collecting reviews directly from their customers through their site or through email. Of course, I would not suggest neglecting the other online review sources because those are important, too. But when it comes to Google and rich snippets, don't put all of your eggs (and hopes) in one basket.

As a rule, the reviews should be directly about the main entity on the page. So keep reviews about the business, products, services, etc. separate — don't combine them because that goes against Google's rich snippet guidelines.”

And any warnings about things we should never do with schema? David says,

“Never mark up anything that is not visible on the page, including reviews, ratings and aggregate ratings. Only use review markup for the entities that Google allows it to be used for. For example, the review and rating markup should not be used for articles or on-page content. That goes against Google's guidelines. And as of this writing, it's also against their guidelines to mark up third-party reviews and ratings such as those found on Google+ or Yelp.”

Ready to dig deeper into the engrossing world of schema markup with David Deering? I highly recommend this recent LocalU video. If the work involved makes you dizzy, hiring an expert or purchasing a paid service are likely to be worthwhile investments. Now that we’ve considered our technical options, let’s consider what we’d like to achieve.

Define your consumer feedback page goals.

rtmike.jpg

If I could pick just one consultant to get advice from concerning the potential benefits of local consumer feedback, it would be GetFiveStars’ co-founder and renowned local SEO, Mike Blumenthal.

Before we dive in with Mike, I want to offer one important clarification:

If you’re marketing a single-location business, you’ll typically be creating just one consumer feedback page on your website to represent it, but if yours is a multi-location business, you’ll want to take the advice in this article and apply it to each city landing page on your website, including unique user sentiment for each location. For more on this concept, see Joy Hawkins’ article How to Solve Duplicate Content Local SEO Issues for Multi-Location Businesses.

Now let’s set some goals for what a consumer feedback page can achieve. Mike breaks this down into two sections:

1. Customer-focused

  • Create an effective page that ranks highly for your brand so that it becomes a doorway page from Google.
  • Make sure that the page is easily accessible from your selling pages with appropriately embedded reviews and links so that it can help sell sitewide.

2. Google-focused

  • Get the page ranking well on brand and brand+review searches
  • Ideally, get designated with review stars
  • Optimally, have it show in the knowledge panel as a source for reviews from the web

This screenshot illustrates these last three points perfectly:

rt4.jpg

Time on page may make you a believer!

Getting excited about consumer feedback pages, yet? There’s more! Check out this screenshot from one of Mike’s showcase clients, the lovely Barbara Oliver Jewelry in Williamsville, NY, and pay special attention to the average time spent on http://barbaraoliverandco.com/reviews-testimonials/:

rt5.jpg

When customers are spending 3+ minutes on any page of a local business website, you can feel quite confident that they are really engaging with the business. Mike says,

“For Barbara, this is an incredibly important page. It reflects almost 9% of her overall page visits and represents almost 5% of the landing pages from the search engines. Time on the page for new visitors is 4 minutes with an average of over 3 minutes. This page had review snippets until she recently updated her site — hopefully they will return. It’s an incredibly important page for her.”

Transparency helps much more than it hurts.

The jewelry store utilizes GetFiveStars technology, and represents a perfect chance to ask Mike about a few of the finer details of what belongs on consumer feedback pages. I had noticed that GetFiveStars gives editorial control to owners over which reviews go live, and wanted to get Mike’s personal take on transparency and authenticity. He says,

“I strongly encourage business owners to show all feedback. I think transparency in reviews is critical for customer trust and we find that showing all legitimate feedback results in less than a half-point decline in star ratings on average.

That being said, I also recommend that 1) the negative feedback be held back for 7 to 10 days to allow for complaint resolution before publishing and 2) that the content meet basic terms of service and appropriateness that should be defined by each business. Obviously you don’t want your own review site to become a mosh pit, so some standards are appropriate.

I am more concerned about users than bots. I think that a clear statement of your terms of service and your standards for handling these comments should be visible to all visitors. Trust is the critical factor. Barbara Oliver doesn’t yet have that but only because she has recently updated her site. It’s something that will be added shortly.”

Respond to on-page reviews just as you would on third-party platforms.

I’d also noticed something that struck me as uncommon on Barbara Oliver Jewelry’s consumer feedback page: she responds to her on-page reviews, just as she would on third-party review platforms. Mike explains:

“In the ‘old’ days of reviews, I always thought that owner responses to positive reviews were a sort of glad handing ... I mean how many times can you say ‘thank you’? But as I researched the issue it became clear that a very large minority of users (40%) noted that if they took the time to leave feedback or a review, then the owner should acknowledge it. That research convinced me to push for the feature in GetFiveStars. With GetFiveStars, the owner is actually prompted to provide either a private or public response. The reviewer receives an email with the response as well. This works great for both happy and unhappy outcomes and serves double-duty as a basis for complaint management on the unhappy side.

You can see the evolution of my thinking in these two articles:

What I used to think: Should A Business Respond to Every Positive Review?
What I think after asking consumers their thoughts: Should A Business Respond to Every Positive Review? Here’s The Consumer View."

Reviews on your mind, all the time

So, basically, consumers have taught Mike (and now all of us!) that reasonable goals for reviews/testimonials pages include earning stars, becoming a knowledge panel review source, and winning a great average time on page, in addition to the fact that transparency and responsiveness are rewarded. Before he zooms off to his next local SEO rescue, I wanted to ask Mike if anything new is exciting him in this area of marketing. Waving goodbye, he shouts:

“Sheesh ... I spend all day, every day thinking about these sorts of things. I mean my motto used to be ‘All Local, All the Time’… now it’s just ‘All Reviews, All the Time.'

I think that this content that is generated by the business owner, from known clients, has incredible import in all aspects of their marketing. It is great for social proof, great user-generated content, customer relations, and much more. We are currently 'plotting' new and valuable ways for businesses to use this content effectively and easily.

I’m experimenting right now with another client, Kaplan Insurance, to see exactly what it takes to get rich snippets these days.”

I know I’ll be on the lookout for a new case study from Mike on that topic!

Plan out the components of your consumer feedback page

rtphil.jpg

Phil Rozek of Local Visibility System is one of the most sophisticated, generous bloggers I know in the local SEO industry. You’ll become an instant fan of his, too, once you’ve saved yourself oodles of time using his Ultimate List of Review Widgets and Badges for Your Local Business Website. And speaking of ‘ultimate,’ here is the list Phil and I brainstormed together, each adding our recommended components, for the elements we’d want to see on a consumer feedback page:

  • Full integration into the site (navigation, internal linking, etc.); not an island page.
  • Welcoming text intro with a link to review content policy/TOS
  • Unique sentiment with schema markup (not drawn from third parties)
  • Specification of the reviewers’ names and cities
  • Owner responses
  • Paginated reviews if page length starts getting out of hand
  • An at-a-glance average star rating for easy scanning
  • Badges/widgets that take users to the best place to leave a traditional third-party review. Make sure these links open in a new browser tab!
  • Video reviews
  • Scanned hand-written testimonial images
  • Links to critic-type reviews (professional reviews at Zagat, Michelin, etc.)
  • A link to a SERP showing more of the users’ reviews, signalling authenticity rather than editorial control
  • Tasteful final call-to-action
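One note on the at-a-glance average star rating in the list above: when the reviews on the page are your own first-party ones, that average can also be expressed as schema.org AggregateRating markup. Below is a hedged sketch with made-up ratings showing how the value might be computed and marked up; per the guidelines quoted earlier, the same number must also be rendered visibly on the page, never markup-only:

```python
import json

# Hypothetical on-page ratings, collected directly from customers
# (per Google's guideline that ratings be sourced from users,
# not from other sites).
ratings = [5, 4, 5, 3, 5]

# Compute the visible average, then express it as AggregateRating markup.
aggregate = {
    "@context": "http://schema.org",
    "@type": "AggregateRating",
    "ratingValue": str(round(sum(ratings) / len(ratings), 1)),
    "reviewCount": str(len(ratings)),
    "bestRating": "5",
    "worstRating": "1",
}

print(json.dumps(aggregate, indent=2))
```

In practice the AggregateRating object would be nested inside your LocalBusiness markup rather than emitted on its own.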

And what might such a page look like in real life (or at least, on the Internet)? Here is my mockup for a fictitious restaurant in Denver, Colorado, followed by a key:

Click to open a bigger version in a new tab!

Key to the mockup:

  1. Page is an integral part of the top level navigation
  2. Welcoming text with nod to honesty and appreciation
  3. Link to review content policy
  4. Paginated on-page reviews
  5. Call-to-action button to leave a review
  6. Easy-to-read average star rating
  7. Schema marked-up on-page reviews
  8. Sample owner response
  9. Links and badges to third party reviews
  10. Link to SERP URL featuring all available review sources
  11. Links to professional reviews
  12. Handwritten and video testimonials
  13. Tasteful final call-to-action to leave a review

Your live consumer feedback page will be more beautifully and thoughtfully planned than my example, but hopefully the mockup has given you some ideas for a refresh or overhaul of what you’re currently publishing.

Scanning the wild for a little sentiment management inspiration

I asked Phil if he’d seen local businesses recently making a good effort at promoting consumer feedback. He pointed to these, with the proviso that none of them are 100% perfect but that they should offer some good inspiration. Don’t you just totally love real-world examples?

Lightning round advice for adept feedback acquisition

Before we let Phil get back to his work as "the last local SEO guy you’ll ever need," I wanted to take a minute to ask him for some tips on encouraging meaningful customer feedback.

“Don’t ask just once. In-person plus an email follow-up (or two) is usually best. Give customers choices and always provide instructions. Ask in a personal, conversational way. Rotate the sites you ask for reviews on. Try snail-mail or the phone. Have different people in your organization ask so that you can find ‘The Champ’,” says Phil. “Encourage detail, on-site and off-site. Saying things like ‘It will only take you 60 seconds’ may be great for getting big numbers of on-site testimonials, but the testimonials will be unhelpfully short or, worse, appear forced or fake. Dashed-off feedback helps no one. By the way, this can help you even if a given customer had a bad experience; if you’re encouraging specifics, at least he/she is a little more likely to leave the kind of in-depth feedback that can help you improve.”

Sustain your effort & facilitate your story

Every time Google sharpens focus on a particular element of search, as they are clearly doing right now with consumer and professional sentiment, it’s like a gift. It’s a clanging bell, an intercom announcement, a handwritten letter letting all of us know that we should consider shifting new effort toward a particular facet of marketing and see where it gets us with Google.

In this specific case, we can draw extra inspiration for sustaining ourselves in the work ahead from the fact that Google’s interest in reviews and testimonials intersects with the desires of consumers who make transactional decisions based, in part, on what Internet sentiment indicates about a local business. In other words, the effort you put into acquiring and amplifying this form of UGC makes Google, consumers, and your company happy, all in one fell swoop.

If you took all of the sentiment customers express about any given vibrant business and put it into a book, it would end up reading something like War and Peace. The good news about this is that you don’t have to write it — you have thousands of potential volunteer Tolstoys out there to do the job for you, because reviewing businesses has become a phenomenal modern hobby.

Your job is simply to provide a service experience (hopefully a good one) that moves customers to start typing, back that up with a variety of ongoing feedback requests, and facilitate the publication of sentiment in the clearest, most user-friendly way.

Some more good news? You don’t have to do all of this tomorrow. I recently saw a Google review profile on which a business had "earned" over 100 reviews in a week — a glaring authenticity fail, for sure. A better approach is simply to keep the sentiment conversation going at a human pace, engaging with your customers in a human way, and ensuring that your consumer feedback page is as good as you can possibly make it. This is manageable — you can do this!

Are you experimenting with any page elements or techniques that have resulted in improved user feedback? Please inspire our community by sharing your tips!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Friday, October 7, 2016

Penguin 4.0: How the Real-Time Penguin-in-the-Core-Alg Model Changes SEO - Whiteboard Friday

Posted by randfish

The dust is finally beginning to settle after the long-awaited rollout of Penguin 4.0. Now that our aquatic avian friend is a real-time part of the core Google algorithm, we've got some changes to get used to. In today's Whiteboard Friday, Rand explains Penguin's past, present, and future, offers his analysis of the rollout so far, and gives advice for going forward (hint: never link spam).

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week, it is all about Google Penguin. So Google Penguin is an algorithm that's been with us for a few years now, designed to combat link spam specifically. After many, many years of saying this was coming, Penguin 4.0 rolled out on Friday, September 23rd. It is now real-time in Google's algorithm, Google's core algorithm, which means that it's constantly updating.

So there are a bunch of changes. What we're going to talk about today is what Penguin 1.0 to 3.x looked like and how that's changed as we've moved to the Penguin 4.0 model. Then we'll cover a little bit of what the rollout has looked like and how it's affecting folks' sites and specifically some recommendations. Thankfully, we don't have a ton.

Penguin 1.0-3x

But it's important to understand, when people ask you about Penguin and about the penalties that used to come from Penguin, that back in the day...

  • Penguin 1.0 to 3.x, it used to run intermittently. So every few months, Google would collect a bunch of information, they'd run the algorithm, and then they'd release it out in the wild. It would now be in the search results. When that rollout happened, that was the only time, pretty much the only time that penalties from Penguin specifically would be given to websites or removed.

    This meant that a lot of the time, you had this slow process, where if you got penalized by Penguin because you did something bad, some sketchy link building, you went through all the processes of getting that penalty lifted, and Google said, "Fine, you're in good shape. The next time Penguin comes out, your penalty is lifted." You could wait months. You could wait six months or more before that penalty got lifted. So a lot of fear here and a lot of slowness on Google's side.

  • Penguin also penalized sitewide. Much like Panda, it would look at a portion of the site: maybe these pages are the only ones on the whole domain that got bad links pointing to them. But old Penguin did not care; it would hit the entire website.

    It would basically say, "No, you're spamming to those pages, I'm burying your whole domain. Every page on your site is penalized and will not be able to rank well." Those sorts of penalties are very, very tough for a lot of websites. That, in fact, might be changing a little bit with the new Penguin algorithm.
  • Old Penguin also required a reconsideration request process, often in conjunction with disavowing old links, proving to Google that you had gone through the process of trying to get those links removed.

    It wasn't often enough to just say, "I've disavowed them." You had to tell Google, "Hey, I tried to contact the site where I bought the links or I tried to contact the private blog network, but I couldn't get them to take it down or I did get them to take it down or they blackmailed me and forced me to pay them to take it down." Sometimes people did pay and Google said that was bad, but then sometimes would lift the penalties and sometimes they told them, "Okay, you don't have to pay the extortionist and we'll lift the penalty anyway." Very manual process here.

  • Penguin 1.0 to 3.x was really designed to remove the impact of link spam on search results, but doing it in a somewhat weird way. They were doing it basically through penalties that affected entire websites that had tried to manipulate the results and by creating this fear that if I got bad links, I would be potentially subject to Penguin for a long period.

I have a theory here. It's a personal theory. I don't want you to hold me to it. I believe that Google specifically went through this process in order to collect a tremendous amount of information on sketchy links and bad links through the disavow file process. Once they had a ginormous database of what sketchy and spammy bad links looked like, that they knew webmasters had manually reviewed and had submitted through the disavowal file and thought could harm their sites and were paid for or just links that were not editorially acquired, they could then machine learn against that giant database. Once they've acquired enough disavowals, great. Everything else is gravy. But they needed to get that huge sample set. They needed it not to just be things that they, Google, could identify but things that all of us distributed across the hundreds of millions of websites on the planet could identify. Using those disavowal files, Google can now make Penguin more real-time.

Penguin 4.0+

So challenges here, this is too slow. It hurt too much to have that long process. So in the new Penguin 4.0 and going forward, this runs as part of the core algorithm, meaning...

  • As soon as Google crawls and indexes a site and is able to update that in their databases, that site's penalty is either lifted or incurred. So this means that if you get sketchy links, you don't have to wait for Penguin to come out. You could get hurt tomorrow.
  • Penguin does not necessarily any longer penalize an entire domain. It still might. It could be the case that if lots of pages on a domain are getting sketchy links or some substantive portion or Google thinks you're just too sketchy, they could penalize you.
    Remember, Penguin is not the only algorithm that can penalize websites for getting bad links. There are manual spam penalties, and there are other forms of spam penalties too. Penguin is not alone here. But it may be simply taking the pages that earn those bad links and discounting those links or using different signals, weighting different signals to rank those pages or search results that have lots of pages with sketchy links in them.
  • It is also the case — and this is not 100% confirmed yet — but some early discussion between Google's representatives and folks in the webmaster and SEO community has revealed to us that it may not be the case that Penguin 4.0 and moving forward still requires the full disavow and whole reconsideration request process.

That's not to say that if you incur a penalty, you should not go through this. But it may not be the case that's the only way to get a penalty lifted, especially in two cases — no fault cases, meaning you did not get those links, they just happened to come to you, or specifically negative SEO cases.

I want to bring up Marie Haynes, who does phenomenally good work around spam penalties, along with folks like Sha Menz and Alan Bleiweiss; all three of them have been concentrating on Google penalties along with many, many other SEOs and webmasters. Marie wrote an excellent blog post detailing a number of case studies, including a negative SEO case study where the link penalty had been lifted on the domain. She's got some nice visual graphs showing the keyword rankings changing after Penguin's rollout. I urge you to give it a read, and we'll make sure to link to it in the transcript of this video.

  • Penguin 4.0 is a little bit different from Penguin 1.0 to 3 in that it's still designed to remove the impact of spam links on search results, but it's doing it by not counting those links in the core algo and/or by less strongly weighting links in search results where many folks are earning spammy links.

So, for example, your PPC, your porn, your pills, your casino searches, those types of queries may be places where Google says, "You know what? We don't want to interpret, because all these folks have nasty links pointing to them, we are going to weight links less. We're going to weight other signals higher." Maybe it's engagement and content and query interpretation models and non-link signals that are offsite, all those kinds of things, clickstream data, whatever they've got. "We're going to push down the value of either these specific links or all links in the algo as we weight them on these types of results."

Penguin 4.0 rollout

So this is what we know so far. We definitely will keep learning more about Penguin as we have more experience with it. We also have some information on the rollout.

  • Started on Friday, September 23rd; few people noticed any changes.

In fact, the first few days were pretty slow, which makes sense. It fits with what Google said about the rollout being real-time and them needing time to crawl and index and then refresh all this data. So until it rolls out across the full web and Google's crawled and indexed all the pages, gone through processing, we're not going to get there. So little effect that same day, but...

  • More SERP flux started three to five days after, that next Monday, Tuesday, Wednesday. We saw very hot temperatures starting that next week in MozCast, and Dr. Pete has been detailing those on Twitter.
  • As far as SEOs noticing, yes, a little bit.

So I asked the same poll on Twitter twice, once on September 27th and once on October 3rd, so about a week apart. Here is the data we got: the "Nope, nothing yet" option went from 76% to 72%, so a little more than a quarter of SEOs have noticed some changes.

A lot of folks noticing rankings went up. Moz itself, in fact, benefitted from this. Why is that the case? Well, any time a penalty rolls out to a lot of other websites, bad stuff gets pushed down and those of us who have not been spamming move up in the rankings. Of course, in the SEO world, which is where Moz operates, there are plenty of folks getting sketchy links and trying things out. So they were higher in the rankings, they moved down, and Moz moved up. We saw a very nice traffic boost. Thank you, Google, for rolling out Penguin. That makes our Audience Development team's metrics look real good.

Four percent and then six percent said they saw a site or page get penalized in their control, and two percent and then one percent said they saw a penalty lifted. So a penalty lifted is still pretty light, but there are some penalties coming in. There are a few of those. Then there's the nice benefit of if you don't link spam, you do not get penalized. Every time Google improves on the Penguin algorithm, every time they improve on any link spam algorithm, those of us who don't spam benefit.

It's an awesome thing, right? Instead of cheering against Google, which you do if you're a link spammer and you're very nervous, you get to cheer for Google. Certainly Penguin 4.0 is a good time to cheer for Google. It's brought a lot of traffic to a lot of good websites and pushed a lot of sketchy links down. We will see what happens as far as disavows and reconsideration requests in the future.

All right, everyone, thanks for joining. Look forward to hearing about your experiences with Penguin. We'll see you next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Wednesday, October 5, 2016

How Your Brand Can Create an Enviable Customer Experience for Mobile Web Searchers

Posted by ronell-smith


Not very edible corned beef hash

Here I am, seated in a Manhattan, New York restaurant, staring at corned beef hash that looks and tastes like what I imagine dog food to look and taste like.

I'm pissed for two reasons:

  • (a) It cost nearly $25 and was entirely inedible
  • (b) I should have known better, given the visuals depicted after doing a Google image search to find the dish, which was offered at a nearby restaurant

In retrospect, I should have checked (a) and (b) on my phone before ordering the $25 plate of Alpo. And though I didn't do that, other would-be customers will, which means the business owner or SEO had better follow the steps below if they wish to stay in business.

The bad news is I no longer relish the thought of eating at high-end NY restaurants; the good news is this experience totally reshaped the way I view mobile, opening my eyes to simple but very effective tactics businesses of all types can immediately put to use for their brands.

My mobile education

We've all heard how mobile is transforming the web experience, reshaping the landscape for marketers, brands and consumers.


As marketers, we now have to account for how our content will be accessed and consumed on mobile devices, whether that's a phone, tablet or phablet. As brands, we realize our efforts will be judged not only on how well or high we show up in the SERPs, but also on how much we can delight the on-the-go prospect who needs information that's (a) fast, (b) accurate and (c) available from any device.

As prospects and consumers, we've come to know and value customer experience in large part because brands that use mobile to deliver what we need, when we need it, in a way that's easily consumed have earned our attention — and maybe even our dollars.

But that's where the similarities seemingly end. Marketers and brands seem to get so wrapped up in the technology (responsive design, anyone?) they forget that, at the end of the day, prospects want what they want right now — in the easiest-to-access way possible.

I've come to believe that, while marketers appreciate the overall value of mobile, they have yet to realize how, for customers, it's all about what it allows them to accomplish.

At the customer/end-user level it's not about mobile-friendly or responsive design; it's about creating an enviable customer experience, one web searchers will reward you for with traffic, brand mentions and conversions.

I was alerted to the prominence of mobile phone use by noticing how many people sit staring at their phones while out at dinner, even as family members and friends are seated all around them. "How rude," I thought. Then I realized it wasn't only the people at restaurants; it's people everywhere: walking down the street, driving (sadly and dangerously), sitting in movie theaters, at work, even texting while they talk on the phone.

One of my favorite comments with regard to mobile's dominance comes from the Wizard of Moz himself, when he shared this tweet and accompanying image last year:

But my "aha!" moment happened last year, in Manhattan, during the corned beef hash episode.

After working until brunch, I...

  1. Opened iPhone to Google
  2. Typed "Best corned beef hash near me"
  3. Scanned the list of restaurants by distance and reviews
  4. Selected the closest restaurant with > 4-star review ratings
  5. Ended up disappointed

That's when it hit me that I'd made errors of omission at every step, in large part by leaving one very important element out of the process, but also by not thinking like a smart web user.

Normally, when I wish to enjoy a specific meal while traveling, my process is as follows:

  1. Open iPhone to Google Search box
  2. Type "Best _________ near me"
  3. Scan list of restaurants by distance and reviews
  4. Select the restaurant with a > 4-star overall rating that also has excellent reviews (> 4.5) of the dish I want, plus great images of the dish online
  5. Delight ensues

That's when three things occurred to me like a brickbat to the noggin':

  • This is a process I use quite often, and one that has proved nearly foolproof
  • It's undoubtedly a process many other would-be customers are using to identify desirable products and services
  • Marketers can reverse-engineer the process to bring the customers they're hoping for to their doors or websites

(Eds. note: This post was created with small business owners (single or multiple location), or those doing Local SEO for SMBs, in mind, as I hope to inform them of how many individuals think about and use mobile, and how marketers can get in front of them with relevant content. Also, I'd like to thank Cindy Krum of Mobile Moxie for encouraging me to write this post, and Local SEO savant Phil Rozek of Local Visibility System for making sure I colored within the lines.)

Five ways to create an enviable customer experience on mobile

#1 — Optimize your images

Image optimization is the quintessential low-hanging fruit of online marketing: easy to accomplish but typically overlooked.

For our purposes, we aren't so much making them "mobile-friendly" as we are making them search-friendly, increasing the likelihood that Google's crawlers can better decipher what they contain and deliver them for the optimal search query.

First and foremost, do not use a stock image if your goal is for searchers to find, read and enjoy your content. Just don't. Also, given how much of a factor page speed is, compress your images to ensure they don't hamper load times.

But the three main areas I want us to focus on are file name, alt text and title text, and captions. My standard for each is summed up very well in a blog post from Ian Lurie, who proposes an ingenious idea:

The Blank Sheet of Paper Test: If you wrote this text on a piece of paper and showed it to a stranger, would they understand the meaning? Is this text fully descriptive?

With this thinking in mind, image optimization becomes far simpler:

  • File name: We're all adults here — don't be thickheaded and choose something like "DSC9671.png" when "cornedbeefhash.jpg" clearly works better.
  • Alt text and title text: Given that, in Google's eyes, these two are the priorities, you must make certain they're as descriptive as possible. Clearly list what the image is and/or contains without weighing it down with unneeded text. Using the corned beef hash from above as an example, "corned beef hash with minced meat" would be great, but "corned beef hash with minced meat and diced potatoes" would work better, alerting me that the dish isn't what I'm looking for. (I prefer shredded beef and shredded potatoes.)
  • Caption: Yes, I know these aren't necessary for every post, but why leave your visitors hanging, especially if an optimal customer experience is the goal? Were I to caption the corned beef, it'd be something along the lines of "Corned beef hash with minced meat and diced potatoes is one of the most popular dishes at XX." It says just enough without trying to say everything, which is the goal, says Lurie.

"'Fully descriptive' means 'describes the thing to which it's attached,' not 'describe the entire universe,'" he adds.
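Putting the three elements together, a fully optimized image might look something like this. (A sketch only — the restaurant name and file path are hypothetical, not taken from a real site.)

```html
<!-- Descriptive file name, alt text, title text, and a caption that
     passes the "blank sheet of paper" test -->
<figure>
  <img src="/images/corned-beef-hash-minced-meat.jpg"
       alt="Corned beef hash with minced meat and diced potatoes"
       title="Corned beef hash with minced meat and diced potatoes">
  <figcaption>
    Corned beef hash with minced meat and diced potatoes is one of the
    most popular dishes at Joe's Diner.
  </figcaption>
</figure>
```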

Also, invite customers to take and share pictures online (e.g., websites, Instagram, Yelp, Google) and include as much rich detail as possible.

What's more, it might behoove you to have a Google Business View photo shoot, says Rozek. "Those show up most prominently (in the Knowledge Panel) for brand-name mobile searches in Google."

#2 — Make reviews a priority

Many prospects and customers treat reviews as a make-or-break factor when making purchases. Brands, realizing this, have taken note, making it their charge to get positive reviews.

But not all reviews are created equal.

Instead of making certain your brand gets positive reviews across all of its products and services, redouble your efforts at getting positive reviews on your bread-and-butter offerings.

In many instances, what people have to say about your individual services and/or products matters more than your brand's overall review ratings.

I learned this from talking to several uber-picky foodie friends, who shared that the main thing they look for is a brand with an overall rating (e.g., on Yelp, Google, Angie's List, Amazon) higher than 3.5 that also has customer comments glorifying the specific product they're hoping to enjoy.

"These days, everyone is gaming the system, doing what they can to get their customers to leave favorable reviews," said one friend, who lives in Dallas. "But discerning [prospects] are only looking at the overall rating as a beginning point. From there, they're digging into the comments, looking to see what people have to say about the very specific thing they want. [Smart brands] would focus more on getting people to leave comments about the particular service they used, how happy they were with the result and how it compares to other [such services they've used]. We may be on our phones, but we're still willing to dig into those comments."

To take advantage of this behavior:

  • In addition to asking for a favorable review, ask customers to comment on the specific services they used, providing as much detail as possible
  • Redouble your efforts at over-delivering on quality service when it comes to your core offerings
  • Ask a few of your regulars, who have left comments on review sites, what they think meets the minimum expectation for provoking folks to leave a review (e.g., optimizing for the desired behavior)
  • Encourage reviewers to upload photos with their reviews (or even just photos, if they don't want to review you). They're great "local content," they're useful as social-proof elements, and your customers may take better pictures than you do, in which case you can showcase them on your site.


#3 — Shorten your content

I serve as a horrible spokesperson for content brevity, but it matters a great deal to mobile searchers. What works fine on desktop is a clutter-fest on mobile, even for sites using responsive design.

As a general rule, simplicity wins.

For example, Whataburger's mobile experience is uncluttered and appealing to the eye, and it makes clear what they want me to do: learn about their specials or make a purchase:

[Image: Whataburger's mobile homepage]

On the other hand, McDonald's isn't so sure what I'm looking for, apparently:

[Image: McDonald's mobile homepage]

Are they trying to sell me potatoes, convince me of their commitment to freshness, or learn as much as they can about me? Or all of the above?

Web searchers have specific needs and are typically short on time and patience, so you have to get in front of them with the right message to have a chance.

When it comes to the content you deliver, think tight (shorter), punchy (attention-grabbing) and valuable (on-message for the query).

#4 — Optimize for local content

Like all of you, I've been using "near me" searches for years, especially when I travel. But over the last year, these searches have gotten more thorough and more accurate, in large part as a result of Google's Mobile Update and because the search giant is making customer intent a priority.

In 2015, Google reported that "near me" searches had increased 34-fold since 2011.

And though most of these "near me" searches are for durable goods/appliances and their associated retailers, queries for services ("surgeons near me," "plumbers near me," "jobs near me," etc.) and other purchases that typically sit in a high consideration set are growing considerably, according to Google's own thinkwithgoogle.com.

A recent case study of 82 websites (41 in the control group; 41 in the test group) shows just how dramatic the impact of optimizing a site for local intent can be. By tweaking the titles, descriptions and H1s of hours and directions pages to use the phrases "franchise dealer near me" and "nearest franchise dealer," the test group saw mobile "near me" impressions more than double, to 8,833 impressions and 46 clicks. (The control group's "near me" impression share rose only 11%.)

[Image: chart of "near me" mobile impressions for the test vs. control group]

Image courtesy of CDK Global
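As a sketch of the kind of on-page tweak the study describes — with a hypothetical dealer name and copy, not CDK's actual markup — an hours and directions page tuned for "near me" queries might look like:

```html
<!-- Title, meta description, and H1 written around "near me" phrasing -->
<head>
  <title>Hours &amp; Directions | Nearest Acme Franchise Dealer</title>
  <meta name="description"
        content="Searching for an Acme franchise dealer near me? Get hours,
                 directions, and contact info for your nearest Acme dealer.">
</head>
<body>
  <h1>Your Nearest Acme Franchise Dealer: Hours &amp; Directions</h1>
</body>
```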

Additional steps for optimizing your site for “near me” searches

  • Prominently display your business name, address and phone number (aka, NAP) on your site
  • Use schema markup in your NAP
  • In addition to proper setup and optimization of your Google My Business listing, provide each location with its own listing and, just as important, ensure that the business name, address and phone number of each location matches what's listed on the site
  • Consider embedding a Google Map prominently on your website. "It's good for user experience," says Rozek. "But it may also influence rankings."
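To illustrate the schema point above, a minimal JSON-LD block marking up a NAP might look like this. (A sketch using schema.org's LocalBusiness vocabulary; the business details are placeholders.)

```html
<!-- NAP marked up with schema.org structured data, placed in the page head -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "Joe's Diner",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Dallas",
    "addressRegion": "TX",
    "postalCode": "75201"
  },
  "telephone": "+1-214-555-0123"
}
</script>
```

Whatever values you use here should match the NAP displayed on the page and in your Google My Business listing, per the consistency point above.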

#5 — Use Google App Deep Linking

We've all heard the statistics: The vast majority — in some circles the figure is 95% — of apps downloaded to mobile devices are never used. Don't be deceived, however, into believing apps are irrelevant.

Nearly half of all time spent on the web is in apps.

This means that the mobile searchers looking for products or services in your area are likely using an app or, at the very least, being prompted to use one.

For example, when I type "thai restaurant near me," the first organic result is TripAdvisor.

[Image: mobile search results for "thai restaurant near me," with TripAdvisor as the first organic result]

Upon entering the site, the first (and preferred) action the brand would like me to take is downloading the TripAdvisor app:

[Image: TripAdvisor's mobile site prompting an app download]

Many times, a "near me" search will take us to content within an app, and we won't even realize it until we see the "continue in XX app or visit the mobile site" banner.

And if a searcher doesn't have the app installed, "Google can show an app install button. So, enabling your app for Google indexing could actually increase the installed base of the app," writes Eric Enge of Stone Temple Consulting.

For brands, App Deep Linking (ADL), which he defines as "the ability for Google to index content from within an app and then display it as mobile search results," has huge implications if utilized properly.

"Think about it," he writes. "If your app is not one of the fortunate few that get most of the attention, but your app content ranks high in searches, then you could end up with a lot more users in your app than you might have had otherwise."

(To access details on how to set up Google App Deep Linking, read Enge's Search Engine Land article: SMX Advanced recap: Advanced Google App Deep Linking)
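One piece of the setup Enge's article walks through is telling Google which in-app screen corresponds to a given web page. A minimal sketch of that link, for an Android app (the package name, domain and path here are placeholders):

```html
<!-- On the web page: point Google at the matching screen inside the app.
     Format: android-app://{package_name}/{scheme}/{host_and_path} -->
<link rel="alternate"
      href="android-app://com.example.restaurantfinder/https/example.com/restaurants/thai" />
```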

If your brand has an app, this is information you shouldn't sleep on.

Typically, when I conduct a "near me" search, I click on/look through the images until I find one that fits what I'm looking for. Nine times out of ten (depending upon what I'm looking for), I'm either taken to content within an app or taken to a mobile site and prompted to download the app.

Seems to me that ADL would be a no-brainer.

Optimizing for mobile is simply putting web searchers first

For all the gnashing of teeth Google's many actions/inactions provoke, the search giant deserves credit for making the needs of web searchers a priority.

Too often, we, as marketers, think first and foremost in this fashion:

  1. What do we have to sell?
  2. Who needs it?
  3. What's the cheapest, easiest way to deliver the product or service?

I think Google is saying to us that the reverse needs to occur:

  1. Make it as fast and as easy as possible for people to find what they want
  2. Better understand who is likely to be looking for it by better understanding our customers and their intent
  3. Begin the sales process by asking, "What specific needs do web searchers have that my brand is uniquely qualified to fulfill?"

In this way, we're placing the needs of web searchers ahead of the needs of the brand, which will be the winning combination for successful companies in the days ahead.

Brands will either follow suit or fall by the wayside.

