How to speak ‘Search Engine’


The challenge of how to ‘speak’ search engine and tell it how to surface our content is what Search Engine Optimisation is all about. But are we doing it as well as we could?

Christian J. Ward, partnerships lead at Yext, gave a webinar in partnership with Brighton SEO on ‘How to Speak Search Engine’, in which he looked at the current state of search and the problems inherent in how we produce the content that we expect search engines to find.

Search has changed dramatically since Google first began indexing the web in 1998, both in scale and in nature. Google alone executes more than two trillion searches every year – a scale that we can barely comprehend. Search, said Ward, is not just a process for a brand; it’s becoming the number one way that we interact with information generally.

But the way that we search has changed, too. At a recent CMA Digital Breakfast, digital journalist Adam Tinworth remarked that Google is becoming “much more of an answer engine” than a search engine – searches are increasingly phrased in the form of a question, and innovations like the Knowledge Graph and Featured Snippets aim to answer searchers’ questions without them needing to leave Google.

We all want Google’s ‘answer engine’ to surface our content in response to searcher queries. One way to help ensure this happens is to write content that will satisfy questions that users might have when coming to our websites.

But even once we have written it, how can we direct Google and other search engines to the content that will provide the best answer?

Feeding baby Google

To illustrate a problem inherent with the way that we approach content online, Ward used an image which has to be the best depiction of ‘peak content’ that I’ve seen so far.

These days, brands and websites are churning out more content than ever before in an effort to keep up with each other: blogs, ad copy, sponsored content, product write-ups, ordinary webpages and lots more.

“We’re trying to feed Google – the baby – great content information that, to some degree, it doesn’t want,” said Ward.

At least, not in a form that it can’t easily interpret.

“We pump out so much content that it is very difficult for Google to analyse it and to know what we’re talking about. And it’s partially because it’s unstructured content.”

As an example of how confusing this can be in practice, Ward looked at the search term “tombstone”, which has a whole array of possible meanings: Tombstone is the name of a popular 90s Western; it’s the name of a town in Arizona (for which the film was also named); a word meaning ‘headstone’ or ‘gravestone’; a brand of pizza; a Marvel comic book, and more. Which of these is going to be most relevant to the searcher?

[Image: a Google search results page for the keyword "tombstone". Suggested searches include the film, the city in Arizona, the cast and the pizza brand; most of the results relate to the film.]

Of course, part of the game here is trying to guess what the searcher intends when they search for the word “tombstone”. But in our content, as well, we have to make it clear which “tombstone” we’re referring to, so that Google can more easily home in on the right content and serve it to the user.

If you have a webpage about tombstones, and Google can’t tell whether it’s about headstones or pizzas, it won’t be able to show it to a user who is searching for one or the other.

Search engines want to provide their users with more rich data in search results: useful information like event dates, reviews, menus and other details that can answer their query at a glance, or at least help them decide which result will be the most relevant.

Ward quoted Sundar Pichai, the CEO of Google, who in his keynote speech at Google I/O, said,

“It’s not just enough to give [users] links. We really need to help them get things done in the real world.”

Ward believes that Google is working towards an eventual state in which users will never have to open an app or website.

While this sounds like a very distant future (after all, there are bound to be some circumstances in which users are searching in order to find a website or app, not just an answer from Google), there’s no denying that Google has taken a huge step in this direction in recent years.

Putting definition around the cow

So what can content creators do to move with this trend, and set their websites apart from everything else in the vast sea of online content?

Ward showed a black-and-white photograph, which has been used by Ellen Langer in her work on mindfulness, and asked webinar attendees to volunteer what they thought it was a picture of.

[Image: the cow illusion, without guiding lines]

Suggestions came back: a turtle, a skull, the Hindenburg. But when a few guiding lines were added to the image, the subject became clear: it is in fact a picture of a cow.

[Image: the cow illusion, with guiding lines added]

“Now that you see it, it’s impossible to unsee it,” said Ward. “There’s a lot of relationship around that, where just a little bit of definition can burn a pathway. And search works a lot like that.”

In other words, content creators need to put that bit of ‘definition’ around their content in a way that tells search engines what it represents, and what type of content it is. There’s a way to do this using search engine ‘language’, and it’s called structured data.

Structured data has been around for a few years now, and is known as a way to help search engines assess and understand content in order to better place it on the SERP. Yet in spite of this, a shockingly low proportion of website owners actually make use of it.

[Image: the Schema.org logo]

Take Schema.org, a markup language that is the result of a collaboration between Google, Bing, Yahoo! and Yandex to create a structured data vocabulary that can be understood by all search engines.
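To give a flavour of what this shared vocabulary looks like, here is a minimal sketch of Schema markup in JSON-LD, one of the formats search engines accept. The business name, address and URL are invented for illustration:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Restaurant",
      "name": "Tombstone Pizza Kitchen",
      "servesCuisine": "Pizza",
      "address": {
        "@type": "PostalAddress",
        "addressLocality": "Tombstone",
        "addressRegion": "AZ"
      },
      "url": "http://www.example.com"
    }
    </script>

A block like this in a page’s HTML tells a crawler unambiguously that this particular ‘tombstone’ page is about a pizzeria in Arizona – not a headstone, and not the film.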

A study by Searchmetrics in 2014 found that 36.6% of Google search results incorporated Schema rich snippets, yet only 0.3% of websites actually made use of Schema markup at all.

The study also found that pages which used Schema ranked on average 4 places higher in search than pages which didn’t, although Searchmetrics was keen to emphasise that this might not be entirely down to structured data.

But search results which use Schema are widely agreed to produce a higher click-through rate, as they include more useful, relevant and attractive information such as pictures, reviews, opening hours and pricing.

So since this study was conducted two years ago, has the number of pages marked up with Schema increased significantly?

Ward did some quick calculations. The Schema.org website proudly proclaims that “Over 10 million sites use Schema.org to markup their pages and email messages.”

Five most interesting search marketing news stories of the week

[Image: Google symptom search]

Welcome to our weekly round-up of all the latest news and research from around the world of search marketing and beyond.

This week we have as many updates from Google as you can stand and some hilarious/depressing social shenanigans.

Let Google take care of you, there there, this won’t hurt a bit

Dr Google is on the ward and ready to roll out a few new procedures. According to a blog post this week, Google will provide searchers with immediate medical advice in its Knowledge Graph answer box when they type in particular symptom queries – which apparently account for 1% of all search terms.

Now I was fully prepared to suggest this was a terrible idea, but as anyone who suffers from health anxiety can attest, the internet is a terrifying place when it comes to self-diagnosis.

Every search for even the most minor of ailments tends to lead to a cancer diagnosis. However, Google recognises this fact and is working with trusted doctors and high-quality medical information to bring you these results.

Rather than this guy…

[Image: Dr Nick billboard]

Google Analytics will now warn you about Hackers

According to its Webmaster Central Blog and following on from the launch of Safe Browsing, a service that warns users of malware or phishing attacks, Google will expand its set of alerts in Google Analytics by adding notifications about sites hacked for spam in violation of its Webmaster Guidelines.

“In the unlikely event of your site being compromised by a third party, the alert will flag the affected domain right within the Google Analytics UI and will point you to resources to help you resolve the issue.”

[Image: ‘compromised sites suspected’ notification]

Google also revealed that it has seen a 180% increase in sites hacked for spam compared to the previous year; however, direct contact with website owners increases the likelihood of a fix by 75%.

Google won’t, however, warn you about watching the movie Hackers; there are plenty of other online resources for that, most of which will tell you it’s dated horribly.

And it really has.

Search Console adds new ‘rich results’ filter

Following on from the addition of a rich cards section to its Search Console service, Google has now also added a ‘rich result’ filter to its Search Analytics.

Just navigate to Search Traffic > Search Analytics, then click on the Search Appearance filter to select the ‘rich results’ option.

[Image: the rich results filter in Google Search Console]

This will tell you how well your rich snippets and cards are doing in terms of impressions, clicks, CTR and position.

Thanks to Danny Sullivan over at SEJ for the info and screen grab.

Facebook is testing a new way to make damn sure your friends see your posts

As I reported earlier in the week, Facebook has been trialling a new way of notifying your friends directly about your status updates.

[Image: Facebook’s ‘notify friends’ option]

Although only rolled out to a handful of people in the UK, Canada and Spain, this is an interesting experiment that lets you nudge up to 10 of your friends and say, “Hey look at me, why don’t you pay attention to me anymore? Is it because of spammy behaviour such as this?”

We’ll see how long it lasts.

Just because they’re sharing, it doesn’t mean they’re reading

And finally that hilarious/depressing research I warned you about. As reported by us this week, research has revealed that only 41% of people actually read the links they share on social.

Basically, six out of 10 people just click retweet upon seeing a headline (and no more than 140 characters of context) and nothing more.

Despicable behaviour I’m sure you’ll agree.

Thankfully we here at SEW have incredibly attentive readers who will conscientiously read every single word we write before sharing it, and who are also incredibly attractive, brave, generous and I think I can wrap this up here, nobody will notice if I just fill the rest of my word count with some ‘hipster ipsum.’

Messenger bag vice whatever biodiesel affogato pug 3 wolf moon beard bushwick celiac art party flannel. Flexitarian jean shorts offal, celiac tofu chicharrones retro chia fingerstache gastropub asymmetrical dreamcatcher yr pop-up kogi. Craft beer seitan salvia, typewriter organic photo booth.

How to keep the ‘person’ in ‘personalization’ without being a creep

[Image: a businessman consulting a crystal ball to foretell the future]

Some brands don’t target well enough, while others go way too far, creeping people out. With personalized marketing, striking a delicate balance is the key.

Personalization is an important skill for any marketer to master, but it’s also quite a difficult one. There are just so many different ways it can go wrong.

If your ads don’t have any personalized components, people perceive them as not being relevant enough, making them that much more likely to use an ad blocker (provided they’ve heard of ad blockers).

But on the flip side, it is possible to go overboard with personalization. If people perceive your brand to be like Big Brother, they’ll be just as turned off.

As with many things, the answer is somewhere in the middle. If you’re looking to personalize your ads, the most important thing to remember is that the root of that word is “person.”

The pitfalls of getting too computerized

Jan Jensen, chief marketing officer (CMO) of Cxense, points out that as marketing gets more complicated, the sophistication of automation technology makes it easier for advertisers to get away from the person.

“In this day and age, where we have more moving pieces, it’s very complex. Being able to know the challenges, pain points, grievances and profiles of your audience is 20 times more important than it was five years ago,” says Jensen. “Depending on what people do, the data we have, their interest and intent around the content they consume and the profile we have on them, we can predict what they want next.”

As a result, things can get a bit too automated. Marketers often create a journey for customers, automate it, and move on to the next thing. But in the process, they’re assuming too much. There should be a balance.

For instance, artificial intelligence (AI) lacks the human emotions to realize it’s doing something insensitive. A robot would serve someone tons of ads for diet products, whereas a person could see how doing that constantly could hurt the customer’s feelings.

Jensen believes this extends to all the brands who have been using chatbots as customer service tools.

“I think chatbots have their place somewhere extremely straightforward, but human beings aren’t very straightforward,” says Jensen. “I think they need to be much more aware of who you are; it needs to know more than that I’m a male and my name is Jan.”

… as well as too creepy

On the flip side, tech companies like Amazon and Facebook know far more about us than our names and gender. They may even know us better than some of our loved ones; does your mom know what you Google? (Ew, she does? Gross.)

But you have to be careful about showing your cards. If you let people know just how much you know about them, they won’t just be freaked out – they can hurt your ROI. Richard Sharp, chief technology officer at Yieldify, points to a concept called psychological reactance that will be familiar to anyone who’s raised – or been – a teenager.

“If people feel their behavioral freedom is being restricted or manipulated, they will explicitly react against that in order to restore their freedom,” explains Sharp. “A lot of research shows that when people perceive creepiness online, it results in this feeling of, ‘You’re trying to manipulate me and force me to take this course of action,’ which causes people to react against the brand, which decreases purchase intent by 5 percent.”

The way around that, according to Sharp, is to be very upfront about the value proposition. If an email from Bloomingdale’s is highly personalized, people may be a bit taken aback. But if the email has information about a sale at the nearest Bloomingdale’s location, they may perceive it differently.

“If you trigger a really well-designed campaign at the point when someone is about to leave the site or stop browsing, it catches people’s eyes and draws them back to the site,” adds Sharp. “It converts really well without interrupting the customer journey or annoying people.”

In order to keep the ‘person’ in ‘personalization’, Sharp recommends user-centric research.

“A lot of marketers like looking at numbers and click-through rates and conversions and cost-per-acquisitions, which is a view that takes the human out,” he says. “Get some qualitative data to back up this quantitative data so the human side doesn’t get lost behind a wall of numbers.”

Google enters the Artificial Intelligence race with Magenta

[Image: artificial intelligence statistic from IDC]

The words Artificial Intelligence can bring to mind far-fetched, sci-fi ideas and a society where robots have replaced humans. Well, this idea may not be too far off given Google’s recent innovations.

Google recently released Magenta, a computer-based system that can create pieces of music.

Even though its first melody sounds like a generic song pre-programmed into a keyboard, the project is considered a success because the system taught itself, composing the 60-second melody with little human intervention.

Google engineers only provided Magenta with four notes to begin the process. They also added drums to add a bit of flair to the song.

Magenta is a project from the Google Brain team that questions the traditional view of computers. In the past, computers were generally seen only as electronic devices used for storing and processing data. Now, Magenta questions all of that.

With Magenta, the Google Brain team is asking: can a computer learn to create compelling music all by itself?

According to a blog post from Google, Magenta comes with two goals. First, it stands as a research project to advance the state of the art in machine intelligence for music and art generation.

Second, the Google engineers hope that with Magenta they will create a community of artists, musicians, developers and machine learning researchers.

While Google is the first to admit the program is in its infancy, the company is hopeful of its future potential. The company writes:

“We don’t know what artists and musicians will do with these new tools, but we’re excited to find out. Look at the history of creative tools. Daguerre and later Eastman didn’t imagine what Annie Leibovitz or Richard Avedon would accomplish in photography. Surely Rickenbacker and Gibson didn’t have Jimi Hendrix or St. Vincent in mind. We believe that the models that have worked so well in speech recognition, translation and image annotation will seed an exciting new crop of tools for art and music creation.”

This isn’t Google’s first rodeo with artificial intelligence, but the company sees Magenta as a stepping stone into the world of natural language processing. This move comes after Google noticed that more and more searches are being done by voice. Consequently, users expect their machines to understand the context of their commands.

Sundar Pichai, CEO of Alphabet’s core Google division, explains “We think of this as building each user their own individual Google. Google does a lot of things, but if you peel away everything and you distill it, this is the heart of what we do. It’s what we are about.”

But Google isn’t the only tech company in the AI game. Earlier this year, Microsoft’s machine learning technology drew a Rembrandt painting, using a 3D scanning device that gathered information from more than 300 of the artist’s paintings. The result was an original, unique portrait in the style of the Dutch master.

IBM has been working since 2005 to develop its Watson supercomputer, and Google’s Android has created an ‘open ecosystem’ that lets users incorporate different technologies in one place.

Even Space Exploration Technologies Corporation (SpaceX) believes that these developments in artificial intelligence will create computers so sophisticated that humans will eventually need to implant “neural laces” in their brains to keep up.

But the true fight to be the best in artificial intelligence is between Google and Facebook.

Recently, Google revealed to the technology world that its AlphaGo program was able to master the ancient game of Go, which has long been considered the most challenging game for any artificial intelligence to learn.

Within days, Facebook mentioned that it was close to achieving the same success, demonstrating its seriousness about joining the AI race.

The social network also recently introduced its DeepText understanding technology. The software can understand the textual content of thousands of posts per second with almost human accuracy.

DeepText works across more than 20 languages, and Facebook plans to use it to improve the user experience in different ways: helping to identify the best quality comments on a public post, improving transaction performance, and making its Messenger app more user-friendly.

Both companies have completely different methods to achieve success in the AI world. While Facebook is generally concerned with improving its users’ experience, Google is hoping to integrate AI into all aspects of their services.

So where does this leave us?

[Image: AI statistic from IDC]

According to the International Data Corporation (IDC), by 2020, the market for machine learning applications will reach a whopping $40 billion and about 60% of those programs will run on the platforms of four different companies: Amazon, Google, IBM, and Microsoft.

Both Facebook and Google are choosing to attract advertisers to their platforms by offering compelling techniques that will give marketers a better return on their ad spending. This action will undoubtedly make technology more accessible and improve its adaptability.

Google, for its part, hopes to change the artificial intelligence game. The company knows it won’t happen immediately, but its goal is to make its machine intelligence applications widely accessible.

The IDC estimates that, by 2018, at least 50% of developers will include AI features, so it is clear that Magenta is just the springboard for bringing artificial intelligence mainstream.

As to how this news affects the SEO world – well, Penguin still hasn’t reared its head. Could it be that the update’s AI isn’t done learning yet? If so, how smart will it be?

How to optimise your m-commerce site for search

[Image: a girl with short, dark hair, glowing white headphones and a metallic body suit, flying through the air and punching ahead of her with one fist]

In 2015, Google announced that for the first time ever, it was seeing more searches taking place worldwide on mobile devices than on desktop.

The steady and unerring shift towards mobile over the past few years has meant that online retailers now need to think about ‘m-commerce’ as well as ecommerce.

M-commerce is the name given to retail transactions which take place on mobile, and retailers are increasingly aware of the importance of having a dedicated, optimised mobile presence in order to attract these transactions to their site.

But how can you make sure that your m-commerce site is as visible as possible to searching shoppers? Drawing on insights from our recent ClickZ Intelligence report, ‘The DNA of a Great M-Commerce Site’, here are some practical steps that you can take to make sure your m-commerce site is optimised for search.

Make sure your site is fully mobile-responsive

Having a mobile-responsive site is essential even when you aren’t looking at SEO. If your site is properly optimised for mobile, your customers will have a better experience, and will be more likely to convert and complete a transaction instead of getting frustrated and abandoning the process.

As Salvador Carrillo, CEO of Mobile Dreams Factory, writes in the report:

“Be clear that consumer behaviour is different in mobile, so everything is about UX and design – fewer clicks, prioritised search, short description, click to buy etc. The more frictionless the experience (payments, confirmations, refill, and so on), the more conversions.”

[Image source: Google Resources for Webmasters]

And on top of the advantages for customer experience and conversion, it’s a well-known fact that Google ranks sites that are fully mobile optimised higher up in search results, identifying them with a ‘mobile-friendly’ label to alert users to the sites where they will have a better experience. Bing also accounts for mobile optimisation when crawling and ranking sites.

Clearly, if you want to improve the search ranking of your m-commerce site, proper mobile optimisation is step one. So how can you make sure that your site is mobile-responsive as per Google’s standards?
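At its most basic, that means declaring a viewport and letting your layout reflow to the width of the screen rather than serving a fixed-width design. A minimal sketch – the class name and breakpoint are illustrative placeholders, not Google requirements:

    <meta name="viewport" content="width=device-width, initial-scale=1">

    <style>
      /* Single-column layout by default, i.e. on small screens */
      .product-grid { display: block; }

      /* Switch to a side-by-side layout once the screen is wide enough */
      @media (min-width: 768px) {
        .product-grid { display: flex; }
      }
    </style>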

To help you out, we’ve put together a comprehensive checklist of ways that you can make sure your m-commerce site passes all of Google’s tests to get that ‘mobile-friendly’ certification in search. And Andy Favell, the author of the ClickZ Intelligence report on m-commerce, has laid out how you can comprehensively test the mobile usability of your site when you’re done.

Implement app indexing and deep-linking

Now, you don’t necessarily need to have an app in order to have the best possible mobile presence for your retail business.

At our last #ClickZChat on what makes a great m-commerce experience, our intrepid tweeters discussed whether businesses should invest in a mobile site, or a dedicated app, for commerce, and concluded that it can depend on their needs.

@ClickZ A1 Both can be useful. It depends on their goals, their audience, their budget, etc #clickzchat

— Tereza Litsa (@terezalitsa) June 8, 2016

@ClickZ depends what the product is. App if you have an account and make regular purchases. Low frequency use should be onsite #ClickZChat

— Stephanie Bellis (@steffi_j_b) June 8, 2016

But if you do have an m-commerce app or think that building one would be most suitable for your business, app indexing and deep-linking will give you a huge competitive advantage in search.

App indexing is when Google’s search ‘spiders’ crawl an app in the same way that they do a website, and present content from the app directly within search results. Tapping on that link will launch the app, if the user has it installed, and take them directly to the content.
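In practice, a web page can announce its in-app equivalent using rel="alternate" links in its HTML head. Here is a sketch – the package name, iTunes ID, scheme and URLs are all invented placeholders:

    <!-- On https://example.com/products/red-shoes -->
    <link rel="alternate"
          href="android-app://com.example.shopapp/https/example.com/products/red-shoes">
    <link rel="alternate"
          href="ios-app://123456789/shopapp/example.com/products/red-shoes">

The same mappings can also be declared in the site’s sitemap, and the app itself has to be configured to handle the corresponding deep links.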

App indexing is a ranking factor in search as well as being a great way to promote your app, yet only a minority of brands are making use of it. A study by Searchmetrics, conducted on the 100 most visible websites in Google US searches, found that only 30% of those with an Android app and 19% of those with an iOS app had implemented app indexing.

So in case you needed any more reasons to set this up, it will almost certainly allow you to get ahead of your competitors who aren’t yet using it. To get started, Dan Cristo has written a step-by-step guide on how to set up app indexing as a developer.

Speed up your site

Aside from a poorly adapted user interface, one of the biggest issues that can kill a mobile audience’s interest is site speed (or the lack of it). Google recognises this, which is why it has confirmed that the next mobile-friendly update will include page speed as a ranking factor.

According to Kissmetrics’ statistics on how website performance affects shopping behaviour, 40% of web users will abandon a page if it takes longer than three seconds to load, while a one-second delay (or three seconds of waiting) tends to decrease customer satisfaction by about 16%.

So the loading speed of your m-commerce site can have a real impact on your bottom line, as well as on your SEO.
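If you want to put a rough number on your own load times before turning to dedicated tools, most browsers expose them through the Navigation Timing API. A quick sketch:

    <script>
      // Log how long the page took to load, in milliseconds
      window.addEventListener('load', function () {
        var t = performance.timing;
        console.log('Page loaded in ' + (t.loadEventStart - t.navigationStart) + ' ms');
      });
    </script>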

[Image by Alan9187, CC0 public domain]

As Andy Favell writes in the ‘DNA of a Great M-Commerce Site’ report,

“With mobile, less is more. Less clutter. Fewer clicks. Fewer, smaller pictures. Easy navigation. If your responsive site is sending everything from your PC site to the mobile device, it will slow down load times and chomp through the customer’s data allowance.”

This is sound advice for improving both user experience and site load time: the less clutter, the better. Nor does it just apply to your mobile site (or app): site speed has long been a search ranking factor on desktop, so decreasing load times for all iterations of your site will improve your SEO across the board – as well as your customer satisfaction.

Kristi Hines’ piece for Search Engine Watch on why page speed should be your next focus has some actionable steps you can take to improve your site speed, including looking at your web host, website technology and content. And don’t miss Matt Owen’s detailed guide on how to optimise your page images to increase site speed.
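On the ‘fewer, smaller pictures’ point, one widely supported technique is the srcset attribute, which lets the browser download an appropriately sized image for the screen instead of the full desktop version. A sketch, with placeholder filenames:

    <img src="dress-480.jpg"
         srcset="dress-480.jpg 480w, dress-800.jpg 800w, dress-1200.jpg 1200w"
         sizes="(max-width: 600px) 480px, 800px"
         alt="Summer dress, front view">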

There is one other option which can greatly improve the speed of your mobile presence: Google’s Accelerated Mobile Pages, or AMP.

Launched just four months ago, Accelerated Mobile Pages are specifically designed to load lightning-fast by stripping out much of the clutter that normally slows down page load times, like third-party scripts, trackers and in-line styling.

[Image: Google.com mobile results for “EU referendum”, showing AMP-ified BBC News stories in the “top stories” carousel at the top of the search results]

Google boasts that sites created with AMP can load anywhere from 15 to 85% faster than non-AMP mobile sites, which is bound to be a huge advantage for SEO; to say nothing of the fact that Google, as the creator of AMP, has a vested interest in promoting AMP websites in its search results.

The drawbacks are a lot of extra work for developers, as using AMP means creating an entirely separate version of a mobile site using Google’s new AMP-HTML web language.
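To give a sense of what that separate version involves, here is a trimmed skeleton of an AMP page. This is a sketch rather than a complete, valid document – the mandatory amp-boilerplate style block is omitted and the URLs are placeholders:

    <!doctype html>
    <html amp lang="en">
      <head>
        <meta charset="utf-8">
        <title>Red Shoes | Example Store</title>
        <link rel="canonical" href="https://example.com/products/red-shoes">
        <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
        <!-- The required <style amp-boilerplate> block goes here (omitted for brevity) -->
        <script async src="https://cdn.ampproject.org/v0.js"></script>
      </head>
      <body>
        <h1>Red shoes</h1>
        <!-- AMP replaces <img> with its own amp-img component -->
        <amp-img src="red-shoes-800.jpg" width="800" height="600" layout="responsive"></amp-img>
      </body>
    </html>

The standard version of the page then points at its AMP counterpart with a link rel="amphtml" tag so that Google can discover it.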

SEOs are currently divided about whether or not to go all-in on AMP, and there is also the danger of spreading your web presence over too many platforms, which requires extra work and investment to maintain.

As with the mobile website versus app debate, ultimately it comes down to where you want to allocate your resources, and what you think would best suit your business.

Learn to ‘think mobile’

One of the biggest mistakes you can make when designing and optimising for mobile is to assume that mobile users have the same wants, needs and behaviour as desktop users.

Think about the way that you yourself browse, search and shop on mobile: you’re likely to be doing so in fundamentally different circumstances, and for different reasons, than if you were at a desktop computer.

[Image: a stick person holding a mobile phone above the words “I want to…”, next to a list of 16 things the mobile user might do, such as “Send a text message”, “Watch a video”, “Check the weather”, “Call Mom” and “Listen to a song”. Credit: Google Search Quality Guidelines]

Andy Favell recommends reading Google’s search quality evaluator guidelines (PDF), specifically the section on Understanding Mobile User Needs (page 56), to get a feel for how mobile search queries can differ from desktop queries, and why.

Google breaks down mobile search queries into four main categories – ‘know’, ‘do’, ‘website’ and ‘visit in person’ queries – and explains how they may have different user intent than the same queries on a desktop computer.

Understanding the types of queries that mobile users have, and what they are looking for, will help you to better cater to them with your site.

It is also important to remember voice search, which is increasingly how users interface with their mobile devices. Voice search queries use natural language, and it’s well worth looking into what this is and how you can adapt your site to better satisfy natural language and voice queries.

[Image: a long-haired woman, shown from behind, using a Samsung mobile phone. Caption: the best way to get a sense of what works on mobile is to use mobile yourself]

The best way to get a sense of how mobile users navigate the web, of course, is to do so yourself. As Favell writes in the report:

“There are tools that can help, but for m-commerce sites there really is no substitute for getting your mobile device(s) out and conducting web searches.”

Try out different search terms, using both keywords and natural language, and see how well your site ranks for each.

Google has confirmed that it is developing a separate mobile index to better answer queries on mobile devices where the user intent may differ. So the better you optimise for mobile now, the more of an advantage you’ll have with visibility when the new index, as well as subsequent updates to Google’s mobile algorithm, are launched.

For much more insight into the DNA of a Great M-Commerce Site, as well as other detailed reports and best practice guides on achieving digital dominance, head over to ClickZ Intelligence or browse our Reports Library.

Why are we so bad at social media customer service?

[Image: infographic from http://www.newvoicemedia.com/blog/the-multibillion-dollar-cost-of-poor-customer-service-infographic/]

While social media marketing campaigns have always grabbed the lion’s share of the headlines, customer service is the area where the real battles for market dominance are being waged.

Providing good customer service is not just about differentiation, it is business-critical.

So… why is everyone so awful at it?

There are a lot of reasons customer service isn’t up to scratch. It’s a new discipline. In many cases it’s grown organically. A majority of businesses still file social under the marketing banner, rather than as a service department, which means that there are conflicting interests vying for channel space.

This means that the market is under-serviced in many cases. According to 2015 data, the majority of businesses using social media are only able to respond to two-thirds (66%) of the social media interactions they receive.

This issue is actually compounded in businesses where social customer service is part of the wider customer service function.

Channel expertise is at a premium, meaning there is often a lack of structure between the people running the Twitter account and the people on the phone. What should be a beautiful, frictionless experience for a customer becomes a hell of multiple calls, and explaining issues over and over again.

It’s worth remembering that by the time someone is complaining about your business online, it is probably because your other channels have already failed them. You are starting with a customer who is mad as hell and isn’t going to take it anymore.

No amount of brand-building is going to counteract that. And just so we’re clear on the impact, 40% of US consumers have taken their business to a competitor brand based purely on superior customer service.

How do we start providing good service through social?

It would be remiss of me not to mention that I’ve recently finished writing an enormous social media customer service best practice guide on just this subject, which you can access through ClickZ Intelligence, but just like customer service, it would also be bad of me not to at least try to solve the issue in this post.

The most forward-thinking organisations have begun to address these issues by creating roles designed to completely own the customer experience. Rather than separating touchpoints by channel, a Chief Experience Officer or Chief Customer Officer is primarily charged with making sure that the customer has a good time, all of the time.

On the face of it this seems straightforward (It’s not), and there is definitely a school of thought that says it is as much about mindset and culture as it is systems and processes. The realisation that every department is on the same P&L is, perhaps surprisingly, not a common one in business.

Different channels, different metrics

I mentioned channel expertise earlier. The ability to understand how interactions occur on different platforms is key to successful implementation, because it will fundamentally affect how you measure success.

In the case of email or telephone, it was historically common practice to base reporting on ‘number of closed cases’. This obviously does not always motivate the service representative to supply customers with the best answer to an issue. Merely the quickest.

[Image: retailer response times on Facebook, from the ClickZ Intelligence report at https://www.clickz.com/intelligence/report/a-marketers-guide-to-social-media-customer-service/]

This is again compounded by social, where it is not a linear conversation. A phone call may take ten minutes to complete. A contact through Twitter may be answered immediately, but the customer may not respond for several hours. Time-to-resolution is not a fair or useful metric here.

Also, while it is strategically possible to separate customer satisfaction from channels, it is not as easy to separate it from departments. If your marketing team is providing customer service, then you can bet they’ll want that value reflected in their monthly reports.

The fact that at least a third of social media questions go unanswered is also an issue brought on by a failure to apply considered metrics to social customer service. Marketing has often been guilty in the past of ‘everything, everywhere’ approaches to social. We have to be on Snapchat and Pinterest and Twitter and YouTube and…

Hold your horses.

Success in any form of social media is dictated by the quality of service you can provide. Whether that’s an interesting Facebook page or a raft of multimedia omnichannel responses. If you cannot resource for these channels, then the most valuable thing a business can do is work out which channel is most used by their customer base, and concentrate on responding on that channel.

As businesses become more complex, so too does customer service. Monitoring tools are extremely advanced, but if they do not have a native language speaker setting up initial Boolean search terms, then they will miss a huge number of interactions (If you’d like to see this in action, try typing ‘SEO’ into search.twitter.com and see how many returns you get from Korea that have nothing to do with Search Marketing).

Although these systems are still developing, many use tracking and logging processes designed for traditional CRM, where a customer persona is built from their interactions with a business by phone, email, through a website or in person. Social CRM data, by contrast, includes every interaction that customer makes with any business, so it can be far more valuable if collected and utilised properly – but it requires a more comprehensive tracking and response process.

There is no simple way to provide great customer service through social, but it is achievable, and perhaps more importantly, it has clear commercial value. Forrester found that 45% of users will abandon an online purchase if they can’t quickly find answers to their questions.

The trick is to find out where that customer is online and be ready to provide that information.

Cross channel marketing: three tips from our customer journey report

[Image: chart from Mary Meeker’s report]

One of the biggest problems marketers have today is in dealing with the changing nature of the customer journey and figuring out how they should plan their cross channel marketing strategies.

In order to address this, ClickZ Intelligence has just published its Future of the Retail Customer Journey report, available to subscribers.

The customer journey has been changing significantly over the past few years. This was highlighted recently in Mary Meeker’s annual trends report.

In our report we highlight that the traditional linear customer journey that resembles a funnel has changed to one that is much more dynamic with the explosion of digital channels.

The author, Martin Talks, sums this up, saying:

“Today, the process that shoppers go through online is much more complex and varied. The new multi-channel journey is characterised by customers weaving in and out of online and offline using multiple devices and accessing information provided by a variety of sources, including fellow shoppers, wherever and whenever.”

So how can marketers respond? Below are a few key tips.

Accept and facilitate cross channel behaviour

In the report we give an example of how a lack of adaptation contributed to the collapse of Jessops, whose staff were famously unhappy about customers ‘showrooming’ – checking out products in-store before buying them cheaper online.

[Image: a Jessops store sign]

Instead, brands should look to adapt to this by providing great cross-channel, multi-device experiences.

An excellent example of this comes from Neiman Marcus, where customers can use the app not only to shop but to FaceTime with local sales associates in-store. This was driven by research showing that customers who spoke with sales associates typically spent more, so adding digital to this provides a further catalyst for growth.

Another method is to add tablets within your store, as shown by Marks and Spencer.

Integrate your systems and data for better cross channel marketing experiences

Integration of data is a topic that is explored in another ClickZ Intelligence report on Operationalising Customer Experience.

Many businesses do not understand that the ‘soft’ element of experience often requires ‘hard’ data flows to the right systems at the right time. While a click and collect process should appear easy and pleasant for a customer, a lot of thinking and hard work needs to occur on the business side to make that happen.

The author, Andrew Campbell, summarises this in the framework below:

[Image: CRM planning framework]

Focus on seamless service

With so much choice and competition now, retailers should approach cross-channel marketing with an emphasis on seamless service.

As Martin Talks notes:

“Retailers have known for some time that offering a consistent brand experience across channels is crucial for developing trust. This needs to extend to the service experience too. This is often described as the omnichannel approach to retail.”

An innovative example comes from Hobbs and Doddle, where customers can order clothes online to be delivered to Doddle collection points at major travel hubs. The customer can then try them on in changing rooms at these collection points and either wear them on the journey home or return them instantly if they’re not right.

[Image: a Doddle collection point]

For further key takeaways, check out the Future of the Retail Customer Journey report now.

Facebook is testing a new way to make damn sure your friends see your posts

[Image: Miffy on Twitter]

Learn all about a new feature that Facebook is testing that allows you to notify select friends about all your hilarious misadventures and pictures of sausage dogs. But first, a little story…

It began with me discovering that Darren Hayes, the lead singer of 90s Australian pop band Savage Garden, had blocked me on Twitter.

Yes, that’s right, the singer of ‘Truly Madly Deeply’ and ‘To the Moon and Back’ does not wish to hear all the various witty things I have to say on Twitter.

I have no idea why or how this has happened – it may have just been an accident, or perhaps there’s another Christopher Ratcliff out there causing him grief – nor will I go through the boring reasons how I found this out (NO I WASN’T STALKING HIM, although yes that would explain a few things), but this has made me wonder, in my many years as a music writer: how many other minor celebrities have blocked me without me knowing?

Phew, it’s okay, me and Miffy are still tight.

Anyway, the point is, I was amused and confused by this in equal measure, and therefore I felt the need to share this development with my friends. And rather than share it on Twitter and risk upsetting Mr Hayes any further, I thought I’d share it with my close friends on Facebook instead…

[Image: my Facebook status update]

That’s when I noticed a brand new feature on my newly updated status. Check out the bottom of the above image.

I can ‘notify a few friends about this post’ by clicking the Get Started button.

This leads me to a text box, where I can type in up to 10 friends’ names who will each receive a notification about my update, but no knowledge of who else has received it. Unless they tell one another in the comments under the post, which would obviously be terribly embarrassing.

Playing out this whole scenario in my head, I chose only to notify my wife.

[Image: Facebook’s ‘notify a few friends’ dialogue]

And here’s the notification she received…

[Image: the Facebook notification]

Which then led to an awkward conversation where I questioned why she hadn’t then liked my post, but whatever!

Apparently this is just a test rolled out to a select few in the UK, Canada and France. The fact that I have this new feature *maybe* makes up for the fact that I was seemingly the last person in the world to get their ‘Reaction’ emojis.

Previously you could (and still can) just tag your friends in any post and that would guarantee they’d see it. They would also see who else is tagged in the post, and everyone would receive notifications regarding any interaction around it.

You have to use this carefully as it can be a bit spammy, especially if you’re tagging loads of friends, multiple times.

This new feature instead adds a layer of secrecy to proceedings, where nobody has to know who else is notified.

However, again, you have to be really careful about doing this, as it can still be rather spammy. Remember the old ‘poke’ function? Yeah, that’s why that got chucked in the bin.

Anyway, you’ll probably be asking yourself right now, why is this relevant to Search Engine Watch or ClickZ? Well, because there are certain ways you can use it to your advantage…

For instance, I run a small pop culture website, with a struggling Facebook presence. It’s hard to get anyone to take notice of anything on the channel, and it just gets buried. However now if I write an article that I’m particularly proud of (this one? Mmmm, sure) at the very least I can post it on my personal Facebook feed and notify a few friends about the good(!?) work I’ve done.

But AGAIN, this can be terribly spammy, so AGAIN² don’t do this too much.

Final note: I’m aware of the irony of trying to keep the Darren Hayes blocking story away from Twitter in order to keep it a secret, and yet talking about it at length in an article that will be seen by a far wider audience than it would have on Twitter or Facebook, so you don’t have to bring it up.

EU referendum / brexit betting: who’s winning in organic search?

[Image: Betfair’s brexit betting market]

With the referendum on Britain’s membership of the EU coming up this week, there’s obviously been a lot of debate online.

It has also become a major online gambling event, with the value of bets set to exceed the previous biggest political event, the 2012 US election, according to a recent press release from online betting exchange Betfair.

In fact, it has now exceeded that mark, with more than £43 million in bets matched on the exchange. There’s also plenty of betting on the financial markets too, but we’ll stick to the bookmakers for this article.

While the polls predict a close outcome, the bookies are more certain that a brexit vote is unlikely. Odds of 4.1 for a leave vote seem very generous when some opinion polls put the two sides neck and neck.

One interesting aspect of an unusual betting event like this is the opportunity for new customer acquisition. It’s likely to attract customers who wouldn’t normally bet, and should be seen as an opportunity for the betting sites.

So which betting sites are ranking for brexit betting?

Here’s the data from Google Trends, showing the spike in search interest for terms around EU referendum betting.

We can see the spike in interest, which obviously presents an opportunity for traffic and customer acquisition for the betting sites.

Unlike seasonal SEO events like Christmas or major sports championships, this referendum is a one-off (hopefully) so strategy has to be geared towards this one event.

[Image: Google Trends chart for brexit betting search terms]

In the case of the EU referendum, the betting sites have had around a year to prepare for the event, though we can see that interest in betting has only really taken off in the last two to three months.

According to Pi Datametrics, the term ‘EU referendum odds’ was searched on average 1,000 times in November 2015 and is now being searched 40,500 times a month – a 40-fold increase.

We don’t have the data for June yet, but I think it’s safe to assume that we’ll be in six figures, as the spike on the chart above suggests.

Top organic search performers

The top performers (from the gambling sector) are:

  • Odds Checker
  • Paddy Power
  • Ladbrokes

[Image: search performance chart for the top three sites]

All three sites rank highly for the term, and consistently too. Just compare their performance to that of their rivals:

[Image: ‘EU referendum odds’ search performance chart for rival sites]

This points to a lack of a coherent strategy around EU referendum betting. For example, Betfair has had 10 separate pages performing for this term, hindering its ability to hit a high search position.

Again, so much of this is about effective internal linking and creation of single landing pages for high value and high traffic search terms.
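Where several near-duplicate pages are competing for the same term, one common consolidation tactic – sketched here with a placeholder URL – is to choose a single landing page and point the others at it with a canonical tag, backed up by internal links that consistently target that one URL:

    <!-- On each secondary EU referendum betting page -->
    <link rel="canonical" href="https://www.example.com/politics/eu-referendum-betting">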

Clearly, with £43m matched on the exchange alone, Betfair has done well, but could it have done better with the right SEO strategy?

A missed PPC opportunity?

One final side point here – have the betting sites missed a PPC opportunity around referendum betting?

There is just one site buying ads today on the term. Given the spike we can see from Google Trends, and the high cost of customer acquisition for online betting, it seems strange that more sites aren’t using PPC to gain instant visibility here.

[Image: paid search results for an EU referendum betting term]

How to improve your CTR using Google Search Console

[Image: Search Console Search Analytics]

Last month I wrote a comprehensive guide on how to use Google Search Console, covering every aspect of what is essentially a giant toolshed full of useful stuff for all webmasters to use.

It was exhaustive, and probably exhausting. Don’t read the whole thing in one sitting, just dip in and out when you come across something you’re unsure about.

While wading through Search Console’s huge amount of features, I noticed a few elements that deserved to be highlighted, not only because I had no idea they existed or were even accessible to webmasters, but also because they can probably help raise your click-through rate (CTR) on search engine results pages (SERPs), or at the very least, show you where to improve.

The first thing you need to be aware of is this… You can see all the search queries that bring traffic to your site in Search Console.

Yeah, it was a massive pain when Google encrypted your search terms in Google Analytics and replaced them with the ambiguous (not provided), but at least you can still find them here…

Just go to Search Console, then click on Search Traffic > Search Analytics.

There you go, a veritable bounty of delicious search terms, keywords and traffic-generating pages.

Now here’s the really good bit…

How can I use Search Console to help improve my CTR?

As I said in my original Search Console walkthrough, here you can toggle between a variety of options, filters and date-ranges.

Here are the Impressions and CTR for my own website Methods Unsound for April 2016:

[Image: impressions and CTR for April 2016]

Using this simple overview, ordered by number of impressions, I can see which posts have the highest visibility, but also the ones with a relatively low CTR.

Perhaps all these pages need is a tweak of a meta-description or the addition of some structured data?

And that’s what I did. I went through every article in the top 20 and I made a number of changes to them in the CMS, including using the recommendations as featured in my guide to writing meta descriptions and guide to writing title tags.

These included:

• Making sure the most important keywords for the webpage showed up in the meta description.
• Making sure the most important keywords came first in title tags.
• Rewriting meta descriptions so they were more legible and meaningful.
• Making the meta descriptions as compelling and as relevant as possible.
• Keeping meta descriptions to 135–160 characters.
• Keeping title tags to 50–60 characters, including spaces.
• Making sure headlines (the <h1> tag) were different from the title tag.
• Removing duplicate meta descriptions and title tags.
• Using rich snippets, in the form of Schema markup, to add elements such as visible star ratings to my results (see the sketch below).
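Here is a sketch of what those changes look like in a page’s HTML head. The article, rating values and wording are invented for illustration:

    <head>
      <!-- Title tag: most important keywords first, kept under 60 characters -->
      <title>Synthwave Albums: The 10 Best Records of 2016</title>

      <!-- Meta description: legible, compelling, roughly 135-160 characters -->
      <meta name="description" content="Synthwave albums worth your time: we count down the ten best records of 2016, from dark retro film soundtracks to neon-soaked dream pop.">

      <!-- Schema markup for a review, which can surface star ratings in results -->
      <script type="application/ld+json">
      {
        "@context": "http://schema.org",
        "@type": "Review",
        "itemReviewed": { "@type": "MusicAlbum", "name": "Example Album" },
        "reviewRating": { "@type": "Rating", "ratingValue": "4", "bestRating": "5" },
        "author": { "@type": "Person", "name": "Example Author" }
      }
      </script>
    </head>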

The results

Looking at the following 28-day period’s Impressions and CTR, here’s what I achieved in making these small changes…

[Image: impressions and CTR for May 2016]

For some of the individual posts, there’s a negligible difference of 0.02% (and sometimes not in the direction I intended), however for others there’s an improvement of nearly 6%.

Overall, the average CTR for the site has risen from 2.7% to 3.37%, but as you should already be pointing out, this can also be attributed to a wealth of other factors – seasonal changes in traffic, algorithm fluctuations, general site health.

This is far from an exact science, but my intention here is to show you that using Search Console you can clearly see which of your posts are doing well in terms of visibility but poorly for click-through, and that by sticking to a few basic SEO techniques, you can make a difference.