“Ubiquitous and seamless”: The future of voice search

These are interesting times for voice search, both in terms of its adoption among consumers and its technological development.

We have moved beyond seeing voice search as a futuristic concept with rather limited and stilted realization, to viewing it as an increasingly integral part of our interactions with home and handheld devices.

However, voice search brings with it a lengthy list of questions for technology providers, consumers, and marketers alike.

If we are indeed at something of a crossroads for this technology, it seems a good time to address these questions, giving particular thought to how the landscape will change over the next few years.

These questions include, but are certainly not limited to:

  • What types of queries are best suited to voice search?
  • What do people use voice search for?
  • How will voice search be monetized?
  • How will voice search performance be tracked?
  • Is voice search really the end-game for Google, Amazon, et al? Or is it rather a means to an end?

Unfortunately, neither Siri, Alexa, nor Google Now was of much assistance when I posed these questions to them, so we will endeavor to answer them the old-fashioned way.

Let’s start with a quick recap of where we are today.

Voice search: Some background

Voice search allows users to speak directly to a search engine via their mobile or desktop device. The search engine then looks for data that responds to the user’s query based on the voice command.

Voice search providers understand a user’s intent not just through what question is being asked, but also through geo-information, browsing history and past behavior, with the goal of instantly answering that query.

At its apotheosis, this technology should be able to alert us to – and resolve – queries and issues before we even become consciously aware of them. Push notifications from Google Now on Android devices provide a glimpse of just how effective this could be.

Voice search has actually been around for well over a decade, but until recently it has been subordinate to its text-based counterpart, hindered by hilarious but damaging bloopers.

Verbal communication, of course, predates written language and, as such, it feels more natural for us to hold a spoken conversation.

However, when it comes to searching for and retrieving information online, we have experienced this development in reverse, starting with written language and progressing to verbal communication.

As a result, marketers have often been left with the unenviable task of inferring user intent from the simple phrases typed into search engines.

This has come with benefits, too. One of the real defining elements of search marketing has always been the predictability of search queries and volumes.

We set aside budgets based on these numbers and forecast performance by treating them as fact, so it will affect us if search trends become imbued with the inherent fluidity and transience of speech patterns.

That said, it has taken the collective might of Google, Amazon, Baidu, Microsoft and Facebook to get us to a point where voice search is now a viable (and sometimes preferable) way of requesting information, and there is still some way to go before the technology is perfected.

There are many reasons for this staggered roadmap.

First of all, the task of taking meaningful spoken units (morphemes) from a person, converting them to text units (graphemes) on a computer, and finding the corresponding information to answer the original query, is an incredibly complex one.

As such, the list of possible voice commands a search engine can handle is still limited to fairly set, formulaic constructions.

We shouldn’t expect such formulaic constructions to remain as the standard, however.

Industry developments like Google’s Hummingbird algorithm have moved us closer to true conversational search than we have ever been before. Voice search therefore seems, logically, to be the area that will develop in tandem with advances in conversational search.

And for us search marketers, developments like the addition of voice queries within Search Analytics mean we can soon report with at least a modicum of accuracy on our campaigns.

So, as natural language processing improves, the anthropomorphic monikers given to digital assistants like Alexa and Siri will make a lot more sense. They will engage us in conversation, even ask us questions, and understand the true intent behind our phrases.

We are already seeing this with Google Assistant, which can ask the user questions to better understand their intent and then act on the answers – for example, to book train tickets.

This is a fascinating and impressive development that has implications far beyond just search marketing. When combined with Google’s integration of apps into its search index, we can gain a clearer view into just how significant voice search could be in shaping user behavior.

It also moves us a few steps closer to query-less search, where a device knows what we want before we even think to ask the question.

It must be said, nonetheless, that Google is far from monopolizing this territory – Amazon, Apple, Baidu and Microsoft are all investing heavily and there is an ongoing land-grab for what will be very fertile territory.

Why is voice search so important?

We know that voice recognition, natural language processing, and voice search are of strategic importance to the world’s biggest tech companies, and a recent quote from Google reveals exactly why:

“Our goal in Speech Technology Research is twofold: to make speaking to devices around you (home, in car), devices you wear (watch), devices with you (phone, tablet) ubiquitous and seamless.”

To be both ubiquitous and seamless means being driven by a unified software solution.

Digital assistants, powered largely by the technology that underpins voice search, can be the software that joins the dots between all of those hardware touchpoints, from home to car to work.

As Jason Tabeling wrote last week, this is a growing hardware market, and the race is on to secure as much of it as possible.

Amazon and Google won’t always want to invest so heavily in the hardware business, however.

It would be far more sustainable to have other hardware makers incorporate Amazon and Google’s software into their devices, increasing the reach of their respective virtual assistants much more cost-effectively.

For now, winning the hardware race is a sensible Trojan horse strategy to ensure that either Google or Amazon gains a foothold in that essential software market.

Who is using voice search?

Predominantly younger generations, although this trend is even more deeply entrenched in China than in the West, due to the complexity of typing Chinese phrases and a willingness to engage with new technologies. As such, Baidu has seen significant growth in the usage of its voice search platform.

Google voice search and Google Assistant are increasing their recognition accuracy levels significantly, however, which has previously been one of the important barriers to widespread uptake.

In fact, the difference between 95% and 99% accuracy is where use goes from occasional to frequent.

These margins may seem relatively inconsequential, but when it comes to speech they are the difference between natural language and very stilted communication. It is this 4-point increase in accuracy that has seen voice search go from gimmick to everyday staple for so many users.
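To see why those few points matter so much, consider a back-of-the-envelope model (a deliberate simplification, assuming each word in a query is recognized independently):

```python
# Illustrative sketch only: per-word accuracy vs. whole-query accuracy,
# assuming (simplistically) that words are recognized independently.

def chance_of_perfect_transcription(per_word_accuracy: float, words: int) -> float:
    """Probability that every word in a query is transcribed correctly."""
    return per_word_accuracy ** words

# A fairly typical ten-word spoken query:
for accuracy in (0.95, 0.99):
    share = chance_of_perfect_transcription(accuracy, 10)
    print(f"{accuracy:.0%} per-word accuracy -> {share:.0%} of queries error-free")
```

Under these assumptions, a 95%-accurate engine garbles roughly four in ten ten-word queries, while a 99%-accurate engine garbles only about one in ten – which is exactly the difference between an occasional gimmick and an everyday tool.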

Certain types of queries and searches are likely to require more than just one instant answer, as they require a visual element; for example, planning a trip, or deciding which winter coat to buy.

It is imperative that businesses do not over-optimize for voice search without thinking this through, as voice search does not yet lend itself so readily to these more complex answers.

The graph below shows the different ways in which teens and adults have reported using voice search:

This generational gap is telling, as it strongly suggests that voice search will become more prevalent over time; not just because of the improved technology at consumers’ disposal, but also because of an increased number of people who have grown up with voice search and are accustomed to using it.

It is still noteworthy that so much of this increased usage relates to informational queries, nonetheless. The $37 billion per year search industry is predicated on the notion of choice, mainly within commercial queries.

There may be one true answer to ‘What time is it?’, but ‘What should I buy to wear to the party on Saturday?’ opens itself up to any number of possibilities.

Monetizing this new landscape

The biggest challenge facing voice search providers as they try to monetize the increasing demand is that the interface simply doesn’t lend itself to advertising.

We saw this very recently with the Beauty and the Beast ‘ad’ controversy, which was seen as invasive, primarily because if there is only one answer to a question, users are unwilling to accept an advertisement in place of a response.

That issue aside, other questions remain unanswered. If users do start to conduct commercial queries and the response is multifaceted, the traditional SERP seems a much more fitting format than a single-answer interface.

The question of how to monetize voice search has been raised repeatedly on Google’s quarterly earnings calls, so we can surmise that the company is actively working toward a solution.

We can expect Google to continue experimenting with ad formats for however long it takes to devise the right formula, while hopefully keeping its huge user base content along the way.

Is augmented reality the answer?

One prediction is that Google and Amazon will use the advent of augmented reality to provide multiple options in response to a voice-based query.

This would be in keeping with the nature of this more futuristic interaction, as it would feel disjointed to speak to a digital assistant and simply see four PPC ads on a phone screen as the response.

By creating an augmented reality-based search results page, search engines can sell advertising space and keep users satisfied.

We have seen signs that this could be tested soon, with Amazon said to be exploring the possibility of opening augmented reality homeware stores.

The irony of Amazon, the slayer of so many traditional stores, now taking a seemingly retrograde step by opening stores of its own will not be lost on most in the industry.

These will be much more than just traditional brick-and-mortar presences for the online giant, however, and will be more in line with its forays into grocery shopping.

Now if we bring voice search and the Alexa digital assistant back into the frame, this all starts to fit together rather nicely.

Voice search suddenly becomes a vehicle to showcase and provide a wide range of products and services, from timely reminders about appointments, to contextual ad placements in response to commercial queries.

The more data is fed into this machine, the more accurate it becomes and – should privacy concerns be allayed or bludgeoned into submission – the happier the consumer will be with the results.

In summary

Voice search is not, on its own, the future of the search industry.

One real, if slightly lofty, ambition is to arrive at query-less search, requiring neither a text nor a voice prompt for a digital assistant to spring into action.

Another, more tangible and realistic goal, will be to use voice search to unify the varied touchpoints that make up the average consumer’s day. Though tangible and realistic in technological terms, this goal will remain tantalizingly out of reach if consumers use a variety of hardware providers and data is not shared across platforms, of course.

Making all of this a “ubiquitous and seamless” experience will be hard for consumers to resist and will make it even harder for them to move to another provider and start the process over again. This will be the bargaining chip used to persuade consumers to stay loyal with Apple or Google products from home to car to work.

Key points and predictions

  • Search will adopt a more natural, conversational approach.
  • We will be able to report on voice queries through SQR reports and Search Analytics, with digital assistants sharing their data.
  • Long-tail keyword terms will become the focus of content strategy, as voice queries tend to be longer and more detailed.
  • Content will provide direct answers to questions – but the focus will be on accuracy, rather than just brevity.
  • The importance of being the one, correct answer to an informational query will grow.
  • Optimized videos will see a rise in the search results, as this medium fits well with the voice search results interface.
  • Google will experiment with new ways to monetize its Home product, albeit in subtler ways than the Beauty and the Beast faux-pas.
  • Amazon, in particular, will use augmented reality to tie together its offline stores with its e-commerce experience.
  • Google may experiment with augmented reality to provide a voice search interface that allows for paid ads.

We will see the continued rise of query-less search, where digital assistants answer our questions pre-emptively. Think Google Next, rather than Google Now.

How to capitalize on Facebook mobile traffic – even with a poor mobile experience

We all know that Facebook is a viable source of huge amounts of mobile traffic with relatively cheap CPCs (cost per click).

It’s too good an opportunity to ignore in today’s digital landscape – even if your mobile landing-page experience isn’t up to snuff. Maybe you’ve got a completely new mobile experience in the works, but you don’t want to pass up a few months of good traffic while development and launch is underway.

So how do you continue to scale and drive incremental conversions? You use Facebook mobile ads as an “interest indicator”.

What this means is that you’ll still want to create ad sets targeting your audience on mobile. However, the purpose of these ad sets is to run clear-cut creative and copy, so users know what your product or service is and, if interested, click through to your site.

It is crucial that your ads are as transparent as possible about what your product or service is, so that they essentially pre-qualify the user. The following is a good example:

Now, because these are mobile ads, they may not convert well given your less-than-optimal mobile experience – but you now know exactly which users are interested in your offering.

The next thing to do here is create a remarketing ad set on the desktop News Feed and serve your ads to users who have specifically clicked on your ad via mobile. So how do you set this up?

  • When building out your mobile ad sets to prospect for mobile users, add an extra parameter to your URL. For example: device=mobile. This will help in identifying users coming in from your mobile ads.
  • In the Facebook audience section, create a Facebook remarketing audience based on the URL parameter:
  • Next, create your ad sets remarketing to that mobile-specific remarketing list and select the desktop News Feed to ensure that you are only pulling them into your site via desktop.
  • Let’s use an ecommerce scenario as an example.

    Users love to browse around on their mobile devices, but actual transactions are clunky for multiple reasons – shopping experiences are poor, there’s a lot of information to enter on a mobile device, people on mobile devices are in public places and squeamish about typing credit card info, etc.

    The goal shouldn’t be to get them to convert; it should be to get them to come back on a desktop device, where they’re much more likely to buy.

    In this scenario, we’d retarget users with Facebook’s dynamic product ads, which feature products someone has viewed on your site. Create a separate ad set that leverages dynamic product ads on the desktop News Feed and exclusively targets users who have come through your mobile acquisition campaigns.

    In short, even if your mobile experience is sub-par, you can bring mobile users into your funnel and convert them on desktop. (Note that this is a good tactic even if you DO have a good mobile experience.)

    Don’t let weeks or months of mobile opportunity slip past; get ahead of your developers, use the customer journey to your advantage, and keep the conversions coming.
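As a sketch of the first step in the setup above, the extra parameter can be appended to the landing-page URL programmatically (the `device=mobile` name follows the example in the steps; everything else here is hypothetical):

```python
from urllib.parse import urlencode, urlparse, parse_qs, urlunparse

def tag_landing_page(url: str, extra_params: dict) -> str:
    """Append tracking parameters (e.g. device=mobile) to a landing-page URL,
    preserving any query string that is already there."""
    parts = urlparse(url)
    query = parse_qs(parts.query)
    query.update({k: [v] for k, v in extra_params.items()})
    flat = [(k, v) for k, values in query.items() for v in values]
    return urlunparse(parts._replace(query=urlencode(flat)))

# Tag the mobile prospecting ad's destination URL:
print(tag_landing_page("https://example.com/landing?utm_source=facebook",
                       {"device": "mobile"}))
```

Any URL carrying `device=mobile` can then be matched by a website custom audience rule of the form “URL contains device=mobile”.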

    Site speed tactics in a mobile-first world: Why ecommerce brands need to step up their site speed game

    A study of 700 top ecommerce brands found that the majority are underperforming when it comes to optimizing their sites for speed. Find out how you can avoid the same mistakes.

    Web users are not patient. The speed of your site can make a massive difference to whether people will visit your site, whether they’ll stay on it, and whether they will come back. Not to mention whether they’ll make a purchase.

    A massive 79% of shoppers who have been disappointed by a site’s performance say that they’re less likely to use the site again. But what constitutes ‘disappointed’?

    We’re only human after all

    Kissmetrics research on customer reactions to site speed has resounded across the industry, but it’s not something that should be forgotten:

    “If an e-commerce site is making $100,000 per day, a 1 second page delay could potentially cost you $2.5 million in lost sales every year.”

    That’s a 7% reduction in your conversion rate, and 52% of customers say site speed is a big factor in their site loyalty. A one second delay is bad – a two second delay is worse. 47% of consumers expect web pages to load within two seconds.

    But based on the same research, a faster full-site load leads to a 34% lower bounce rate, and an improvement by just one second results in a 27% conversion rate increase.
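    The Kissmetrics headline figure is easy to verify for yourself – it is simply the 7% conversion drop applied to $100,000 of daily revenue over a year:

```python
# Reproducing the Kissmetrics example figure.
daily_revenue = 100_000   # $ per day, as in the quoted example
conversion_drop = 0.07    # a 1-second delay costs ~7% of conversions

lost_per_year = daily_revenue * conversion_drop * 365
print(f"${lost_per_year:,.0f} lost per year")  # ≈ $2.5 million
```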

    It’s because site speed is such a vital part of building a successful ecommerce site that my team at Kaizen and I conducted a study into 700 top UK ecommerce sites, analyzing various aspects of their site speed performance.

    What we found is that the biggest brands have some of the poorest optimization, with outdated web protocols, unresponsive pages, and bloated page size.

    The average web page size is now 2.3MB (that’s the size of the shareware version of the classic game Doom), so we wanted to see whether the ecommerce industry is any better – since their businesses are directly dependent on their website performance.

    Surprisingly, we found that the web pages of the top UK ecommerce sites are 30% larger on average than standard websites – at 2.98 MB.

    Average webpage size according to HTTPArchive

    However, the web page size isn’t the only factor impacting the site speed. Even larger sites load and render quickly if they’re smart about how they deliver.

    My team and I picked the top 700 UK ecommerce sites, based on their estimated monthly traffic with data kindly supplied by SimilarWeb. For each, we analysed them using Google’s PageSpeed Insights API, checked their page size and loading time on Pingdom, and verified their HTTP protocol using https://http2.pro/.

    From this, we found the following data, and used it to determine which websites are best optimised for speed:

    • PageSpeed Insights Desktop Score (not considering third party data)
    • PageSpeed Insights Mobile Score (not considering third party data)
    • HTTP/2
    • Web page size
    • Loading Time
    • Loading Time per MB
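    As a hypothetical sketch of the per-site API call the methodology describes, each request can be built like this (the endpoint and parameter names follow the current v5 PageSpeed Insights API, which may differ from the version available at the time of the study):

```python
from urllib.parse import urlencode

# Endpoint per the PageSpeed Insights API (v5); an assumption on my part,
# as the study may have used an earlier API version.
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def pagespeed_request_url(site: str, strategy: str) -> str:
    """Build the request URL for one site, for strategy 'desktop' or 'mobile'."""
    return API + "?" + urlencode({"url": site, "strategy": strategy})

# One desktop and one mobile score per site, as in the study's data set:
for strategy in ("desktop", "mobile"):
    print(pagespeed_request_url("https://www.example.co.uk/", strategy))
```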

    Desktop vs mobile

    Mobile connections are usually slower than desktop to begin with, so further delays are felt even more keenly. This, together with the fact that Google’s latest mobile update now factors site speed into mobile page ranking, makes it a high value consideration to ensure mobile pages are sufficiently optimized.

    This becomes even more of a consideration when you factor in how much of ecommerce traffic is now mobile – for example Vodafone, the third top-scoring website in our recent research, receives only 20% of their traffic from desktop, with 80% coming from mobile devices.

    Make your site work for you

    Your site speed isn’t simply a dial you can turn up in your page settings; a number of factors contribute to it. Here’s what they are, and how you can start making your site one of the fastest out there.

    Protocol power

    HTTP/1.1 isn’t SPDY enough

    Network protocols are the rules and standards that govern the end points in a telecommunication connection – how data is transmitted over the web. Common examples include IP – Internet Protocol – and HTTP – Hypertext Transfer Protocol.

    The HTTP/1.1 protocol is decades old and doesn’t make full use of newer technologies. Its main downside is it doesn’t allow you to download files in parallel. For each file (or request), the server needs a separate connection.

    HTTP/1.1 allows only one request at a time per connection, while browsers typically support a maximum of six connections per domain. This means that the number of files which can be downloaded and rendered simultaneously is limited – and that costs time.
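    A rough back-of-the-envelope calculation shows what that connection limit costs (the file count and request time below are hypothetical round numbers, ignoring bandwidth and TCP effects):

```python
import math

# Hypothetical round numbers: 30 files on the page, 100 ms per request,
# and the browser's usual cap of 6 connections per domain.
files, per_request_ms, connections = 30, 100, 6

# HTTP/1.1: one request at a time per connection, so files download in rounds.
http1_ms = math.ceil(files / connections) * per_request_ms

# HTTP/2: all requests multiplexed over a single connection, roughly in parallel.
http2_ms = per_request_ms

print(f"HTTP/1.1: ~{http1_ms} ms of request round-trips vs HTTP/2: ~{http2_ms} ms")
```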

    Since then, Google has developed a newer protocol, SPDY (“Speedy”), which multiplexes requests over a single connection and so can serve multiple parts of the website (JavaScript, HTML, images, etc.) in parallel.

    But SPDY isn’t the latest protocol developed by Google. Working closely with the IETF (Internet Engineering Task Force), they’ve developed the new HTTP/2 protocol. HTTP/2 has roughly the same characteristics as SPDY, but it is a binary protocol, allows the server to ‘push’ resources to the requester, and uses more efficient HPACK header compression.

    Despite the clear advantages of the HTTP/2 protocol, only a few websites have made use of it. Our recent research discovered that only 7.87% of the top 700 ecommerce sites use the technology – compared to 11.3% of sites overall. Some examples of websites using HTTP/2 are https://www.vans.co.uk/, https://www.paperchase.co.uk/ or https://www.expedia.co.uk/.

    According to Cloudflare.com, when they implemented HTTP/2, they saw customers’ average page load time nearly halved – from 9.07 seconds for HTTP/1.X falling to 4.27 seconds for HTTP/2. That’s a significant improvement in a key area of website efficiency.

    However, HTTP/2 doesn’t solve everything, and in some cases the results can be disappointing. In our research, many websites achieved only very small speed gains in their loading times when served over HTTP/2 instead of HTTP/1.1.

    Switching to HTTP/2 isn’t enough by itself – many websites fail to optimize for the change and lose out on the maximum speed gains.

    Old-school techniques, such as domain sharding or sprites, can be counter-productive. And using huge CSS or JavaScript files where less than 10% of the rules and code is relevant to pages likely to be visited is a waste of both your user’s time and your server’s time.

    Screenshot from Dareboost comparison analysis of Oliver Bonas’ loading performance

    Even our own measurements showed that the average loading time per 1 MB for websites supporting HTTP/2 was 1.74s, compared to 1.44s for websites not supporting HTTP/2.

    A nice example of a successful HTTP/2 optimisation is Paperchase, who saved a full second of time necessary to load their website, as is shown here:

    Screenshot from Dareboost comparison analysis of Paperchase loading performance

    How to tackle protocols: HTTP/2 and you

    If you want to be at the forefront of network protocols – and at the top of the list of faster sites – get an HTTP/2 protocol in place.

    While HTTP/2 itself only requires the server to support the new protocol (most major servers now do), browsers will only speak HTTP/2 over a TLS connection. This means every browser connection over HTTP/2 will be encrypted, adding an extra layer of security to the web.

    For more information on how you can get started with HTTP/2, have a look at the Kaizen in-depth article here.

    It’s all about size

    The smaller, the better

    If you’re trying to get speed up, you need to keep size down. The less there is to move from the Internet to the user, the less time it takes.

    As I mentioned earlier in this article, the ecommerce sites looked at in our study were bigger on average than other webpages out there – 30% bigger, at 2.98 MB, compared to a global standard of 2.3MB.

    Format, compress, minify

    One of the biggest issues on plus-sized websites is pictures. Unless they’re compressed and resized to suitable formats, they can be over-large and slow page speed to a crawl.

    The solution to that problem explains itself – compress and resize – but less obvious fixes can be found in using the appropriate file type. The .png format makes files smaller if they’re in block coloring and simple – like infographics, illustrations and icons.

    But for photographs, with a wide range of colors and much finer detail, .png can compromise the quality of the image. You might consider using .jpg files instead, or WebP, an open-source image format from Google which supports both lossy and lossless compression.

    Using correctly sized, unscaled images manually can be quite a daunting task for web developers. PageSpeed modules from Google can come in handy, automating many of the tasks necessary for site speed optimization.

    You can also minify the source codes. CSS and JavaScript resources could be minified using tools like http://javascript-minifier.com/ and http://cssminifier.com/ – and should save kilobytes otherwise spent on whitespace.

    The HTML should also be as compact as possible. We recommend stripping out all unnecessary whitespace and empty lines.
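    As a toy illustration of what minification does (real minifiers are far more careful – this sketch would mangle string literals and comments, so treat it as illustrative only):

```python
import re

def naive_minify(source: str) -> str:
    """Collapse runs of whitespace and drop blank lines.

    Deliberately simplified: a real minifier must respect string literals,
    comments and language syntax. Use a proper tool in production.
    """
    lines = (line.strip() for line in source.splitlines())
    return " ".join(re.sub(r"\s+", " ", line) for line in lines if line)

css = """
body   {
    margin: 0;

    padding: 0;
}
"""
print(naive_minify(css))  # body { margin: 0; padding: 0; }
```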

    Time to go mobile

    Not very responsive

    Most retailers in the study had mobile-optimized sites, but 24% of them served their mobile users a separate mobile site – usually on a separate subdomain. While this approach can improve UX, it can be inconvenient for two reasons:

    1) Google handles subdomains as separate domains.

    2) Depending on how the viewport-based redirects are set up, in the new, mobile-first index world the Googlebot (visiting with a smartphone user agent) may have trouble reaching the desktop version of the site.

    A safer solution can be to use a responsive site that delivers the same HTML code to all devices but adapts to the size and shape of the device used. We found that only 76% of the sites took this approach.

    Alarmingly, mobile sites themselves were largely poorly-optimized for mobile; the average mobile site scored 53.9/100 for speed, as opposed to the average desktop score of 59.4/100.

    Hewlett Packard had a massive difference of 29 points between their desktop score (at 77/100) and their mobile (48/100), while the worst offenders were Carat London, who scored zero for both mobile and desktop score.

    Here is the list of the top 10 websites based on Google’s Page Speed Insights:

    [Table: top 10 sites, ranked by Desktop Score, Mobile Score and Total PageSpeed Score]

    Mobile management

    Much of the mobile optimization requires coding and/or web development skills, but worry not – Google have created a guide to delivering a mobile page in under a second.

    AMP it up

    AMP – Accelerated Mobile Pages – is Google’s initiative for producing more efficient webpages for mobile. It’s a work-in-progress, but every day brings new developments and more support, customization and stability.

    AMP pages have a number of benefits for all sites, including being preferred by Google in search rankings, and being faster to load. For ecommerce it’s a technology to implement ASAP, or at least keep an eye on.
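    For orientation, the skeleton of a valid AMP page looks roughly like this (the URL and content are placeholders, and the mandatory `amp-boilerplate` CSS is abbreviated here – copy it verbatim from the AMP project’s required-markup documentation):

```html
<!doctype html>
<html ⚡ lang="en">
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
  <!-- Placeholder: point this at the canonical (non-AMP) version of the page -->
  <link rel="canonical" href="https://www.example.co.uk/product.html">
  <title>Example product page</title>
  <!-- Mandatory boilerplate CSS goes here, copied exactly from the AMP docs -->
  <style amp-boilerplate>/* ...required boilerplate, omitted for brevity... */</style>
  <script async src="https://cdn.ampproject.org/v0.js"></script>
</head>
<body>
  <h1>Example product</h1>
</body>
</html>
```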

    While AMP debuted for publishing sites, recent updates have brought ecommerce sites into the fold, and eBay in particular have been quick on the uptake, now serving over eight million pages through the AMP system. Google is also working with eBay on things like A/B testing and smart buttons.

    “With items like these in place, AMP for ecommerce will soon start surfacing,” says Senthil Padmanabhan, a principal engineer at eBay.

    Customization on ecommerce AMP pages is currently low, but it’s an ideal technology for the industry, allowing customers quicker transitions between products – improving conversion rates and making the website easy to use.

    During testing on the websites in our study, AMP was found to have a 71% faster load speed for blog posts, and a reduced page size from 2.3MB to 632kB.

    Onwards and upwards

    Site speed isn’t a problem that’s going to go away. As time goes by, the technology improves – AMP and HTTP/2 are just the latest steps on the road to real-time loading. 5G is on the horizon, and customers are only going to become less patient with slow-loading pages.

    As a result, it’s increasingly necessary to keep an eye on your site analytics and your customer behavior. A speed improvement of just one second can improve your conversion rate by 27% – and a delay of one second can cost you millions a year.

    Make sure you’re on top of bringing your ecommerce business and site into the modern era with the tips I’ve listed here.

    5 easy ways to launch a local email marketing strategy

    Email marketing is alive and well in 2017, with over 269 billion emails being sent every day.

    Unfortunately, according to Email Monday, only 22% of retail emails are opened. This is significantly lower than the 34% open rate garnered by other types of emails.

    It’s also important to note that 45% of the emails consumers do open are opened on a mobile device. In fact, email marketing is becoming synonymous with mobile marketing.

    If your company’s email marketing campaign isn’t seeing success and your messages find themselves among the 78% of retail emails that are sent to the junk folder without a second glance, you might want to rethink your strategy. Creating or updating your campaign to focus more on local marketing could be the answer you’ve been looking for.

    There’s a lot of evidence to suggest that the future of email marketing is hyperlocal. Below we’ve compiled some tips for how to create a successful localized email marketing strategy.

    1. Make sure your offers are tailored to your customers so they can actually use them

    One strategy to use when creating a local email marketing campaign is to send out coupons and offers for specific geographic areas.

    Sending coupons is a great way to get consumers to open your emails, but if you consistently send offers that they can’t realistically take advantage of, they’re going to get annoyed and eventually start sending those emails straight to the trash.

    Do some research and figure out how your audience best likes to redeem offers. Is it in person in a brick and mortar store, online via a voucher, etc.? If you have to differentiate your emails and offers based on various target groups, then take the extra time to do that.

    Your audience will thank you by consistently opening the emails and taking advantage of the coupons you’re sending.

    2. Add a personal touch to email campaigns, and reach out in person whenever possible

    No one wants to receive an automatic email that seems to have been written by a robot—it’s impersonal and boring and won’t succeed in engaging your audience. Even if you include a great offer, chances are consumers will stop reading before they even notice it.

    Something as simple as a standard border around an image can immediately tip the viewer off that an email is automated. You can make emails sound as personal as possible by sending them from the name of someone in the company, as opposed to the business name itself, and by formatting the email in a more natural way.

    AWeber.com suggests sharing emails on social media since it’s easy for people to like, comment, and share them with others. One of the businesses they work with, Vault Brewing, also saw success when they went out into the community to look for email subscribers. They asked people in person at their business location and at live events via surveys and subscriber apps.

    They saw much greater success with their email marketing campaign after they incorporated these personal touches, and it’s also a good way to connect with the community and learn more about your target audience.

    In the example above, which was sent to a family member of mine, you can see that the email is certainly automated, but that isn’t completely obvious at first glance. This gets you to open the email and engage before making any assumptions, so I thought they did a great job.

    Although this may not necessarily be targeted locally, it gives itself a “local” feel by creating an inclusive environment.

    3. Integrate social media to spread the word

    As I stated above, social media is a great way to connect with a local audience and spread the word about your email marketing campaign. If your business has a newsletter that you email to subscribers, consider posting parts of it on social media with a link to sign up for your email campaign.

    Post offers and rewards on social platforms to encourage people to sign up for emails, and get to know your audience better by investigating what groups and communities they participate in socially online. Just make sure you don’t offer the exact same incentives and materials on social media that you do via email, or consumers won’t have a reason to subscribe to your campaign.

As far as local impact goes, social media is actually very localized when content is shared by individuals as opposed to businesses. Be sure to share these posts directly with your local followers and communities.

    4. Use a subject line that relates to the local area

    When crafting an email, how much thought do you put into the subject line? Of course you want it to be engaging so people will be tempted to open it, but have you thought much past that?

Businesses that are focusing on hyperlocal email marketing have suggested using a specific state or city name in the subject line of emails being sent out, in order to make national content more relevant for a local audience. According to ImaginePub.com, this simple strategy has the potential to increase open rates by as much as 7%.

    Even if you don’t see an improvement at this same rate, most likely your open rates and click-through rates will increase to some degree when you deliberately target audiences in a specific geographic area.

    5. Segment your list by language and region for more targeted marketing

    Don’t be afraid if a section of your target audience speaks a language other than English, or resides in an unfamiliar location. Instead, embrace these differences and target your marketing to meet them.

Campaignmonitor.com calls this strategy a no-brainer, citing a Localization Industry Standards Association study that found $25 returned for every $1 invested in localization.

    First you’ll want to survey your customers to make sure you have accurate data before segmenting your email list(s). Modify your subject lines based on the criteria from your different lists, and don’t forget to consider your calls-to-action. What works in one language and one region might not have the same effect somewhere else.

    If all of this seems a bit overwhelming, consider working with a localization specialist to help get you started. There are also several programs that can help automate your email campaign and free up some of your time so you can focus more on your localization efforts.
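For teams automating this, the segmentation step itself is simple to sketch. The example below is hypothetical (the subscriber fields and subject templates are invented, not taken from any particular email platform): it groups a subscriber list into (language, region) segments, then renders a localized subject line for each segment.

```python
from collections import defaultdict

# Hypothetical subscriber records; in practice these would come from
# your email service provider's export.
subscribers = [
    {"email": "ana@example.com", "language": "es", "region": "Miami"},
    {"email": "joe@example.com", "language": "en", "region": "Boston"},
    {"email": "li@example.com",  "language": "en", "region": "Boston"},
]

# Invented per-language subject templates with a {region} placeholder.
subject_templates = {
    "en": "This week's deals in {region}",
    "es": "Ofertas de esta semana en {region}",
}

def segment(subscribers):
    """Group subscribers into (language, region) segments."""
    segments = defaultdict(list)
    for sub in subscribers:
        segments[(sub["language"], sub["region"])].append(sub["email"])
    return dict(segments)

def subject_for(language, region):
    """Render the localized subject line for one segment."""
    return subject_templates[language].format(region=region)

for (language, region), emails in segment(subscribers).items():
    print(subject_for(language, region), "->", len(emails), "recipient(s)")
```

The same grouping key can be extended with any other field you survey for (preferred store, redemption method, and so on) without changing the structure.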

    The takeaway

    As you can see, there’s a lot that goes into launching a local email marketing strategy. The tips above are just some of the ways you can localize your email marketing; there are many more ways you can do this depending on how much effort and time you’re willing to put in.

    I suggest taking it one step at a time, incorporating one or two strategies, and monitoring their success before going further. Launch 27 recommends many more helpful strategies and case studies in this article published on their blog.

    Also keep in mind that when constructing a localized email campaign, don’t forget about best practices associated with email marketing. You always want to ask a customer’s permission before signing them up for email, and you should also offer them an “opt-out” option as well. NoRiskSEO goes into more detail about these strategies and offers other ways to take your marketing campaign to the next level.

    Image credit 1-3: Screenshots taken by author March, 2017

    Image credit 4: 1.bp.blogspot.com

    Amanda DiSilvestro is a writer for NoRiskSEO, a full service SEO agency, and a contributor to SEW. You can connect with Amanda on Twitter and LinkedIn.

    Google Chrome SSL certificate proposal could affect millions of websites

    Last year, the developers behind Google’s Chrome browser began taking steps designed to protect users and encourage companies to use HTTPS.

    But now, potentially millions of websites that use SSL certificates issued by Symantec and affiliated resellers could find that their certificates are effectively worthless as far as Chrome is concerned, after a member of the Chrome team published a proposal that would make them untrusted over the next 12 months.

    The reason? According to the Google Chrome team, Symantec has not properly validated thousands of certificates. In fact, the Chrome team claims that “an initial set of reportedly 127 [misissued] certificates has expanded to include at least 30,000 [misissued] certificates, issued over a period spanning several years.”

    Ryan Sleevi, the Chrome team member who wrote the announcement, elaborated,

    “This is also coupled with a series of failures following the previous set of misissued certificates from Symantec, causing us to no longer have confidence in the certificate issuance policies and practices of Symantec over the past several years.”

Under the proposal he put forth, the accepted validity period of newly-issued Symantec certificates would be reduced to nine months or less, currently-trusted certificates would be subject to an “incremental distrust,” and recognition of the Extended Validation status of Symantec-issued certificates would be removed.

    A nightmare scenario?

Symantec is currently the largest Certificate Authority (CA) and, by some estimates, has issued a third of the SSL certificates in use on the web.

    So if the Google Chrome team moves forward with its proposal, it will have a huge impact on Symantec and its customers. Symantec would have to reissue potentially millions of certificates, creating a huge headache for customers, who would have to go through the validation process and install replacement certificates.

    What’s more, under the Chrome team’s proposal, Chrome would immediately remove the status indicators for Extended Validation certificates issued by Symantec.

    These certificates, which require companies to provide greater verification that they are who they say they are, are often used by companies running websites that absolutely need to use HTTPS, such as those that handle payments and financial transactions.

    Extended Validation certificates are more costly, and one of the justifications for the greater cost is the fact that most browsers display indicators for websites that use them. If those indicators go away, it could theoretically harm companies that have relied on these indicators to signal trust to their users.
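Site owners wondering whether they might be affected can inspect a certificate’s issuer and validity window programmatically. The sketch below works on the dictionary format returned by Python’s standard `ssl.SSLSocket.getpeercert()`; the sample certificate data here is invented for illustration, not a real Symantec certificate.

```python
import ssl

def issuer_organization(cert):
    """Extract the organizationName from a getpeercert()-style dict."""
    for rdn in cert.get("issuer", ()):
        for key, value in rdn:
            if key == "organizationName":
                return value
    return None

def validity_days(cert):
    """Length of the certificate's validity window, in days."""
    start = ssl.cert_time_to_seconds(cert["notBefore"])
    end = ssl.cert_time_to_seconds(cert["notAfter"])
    return (end - start) / 86400

# Invented sample in the shape getpeercert() produces.
sample_cert = {
    "issuer": ((("organizationName", "Symantec Corporation"),),),
    "notBefore": "Mar 25 12:00:00 2016 GMT",
    "notAfter": "Mar 25 12:00:00 2019 GMT",
}

org = issuer_organization(sample_cert)
days = validity_days(sample_cert)
print(org, round(days))  # a three-year validity window, far beyond nine months
```

A certificate like this one, with a multi-year validity window from a Symantec-branded issuer, is exactly the kind the proposal would progressively distrust.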

    Not surprisingly, given the gravity of the situation, Symantec is disputing the Chrome team’s claims about certificate misissuances. In a response, it called the Chrome team’s proposal “irresponsible” and said the allegations leveled at it are “exaggerated and misleading.”

    Symantec is open to working with the Google Chrome team and while it’s reasonable to hope that both parties will identify a satisfactory resolution that averts disruption, companies with certificates issued by Symantec will want to monitor the situation as it develops.

    Five most interesting search marketing news stories of the week

    Welcome to our weekly round-up of all the latest news and research from the world of search marketing and beyond.

    This week, the Google SERP has got a bit more interactive with the addition of rich results for podcasts, and a new study has found that marketers are still failing to use advanced search tactics in their campaigns.

    Plus, Google has launched a new website to bring all of its open-source projects under one umbrella, and an unlikely partnership has arisen between Google and Chinese search giant Baidu to bring faster mobile web pages to a wider user base.

    Google adds rich results for podcasts to the SERP

    Google has stealthily launched some new guidelines for structured data on its Developers blog, to bring rich results for podcasts to the SERP.

    At the moment the new feature is only available via Google Home (where you can use voice activation to start up a podcast) or in the Google Search app v6.5 or higher on Android, but Google hopes to soon add support for Chrome on Android.

    Google’s blog provided a sample image for how this will look in practice:

    In his article for Search Engine Watch this week, Clark Boyd explains how you can get your podcast indexed on the SERP, and how to add the right structured data to your podcasts.

    Study: Marketers still aren’t using advanced search tactics

    Those of us who keep close tabs on search innovation and strategy – or comment on it – are probably familiar with search tactics like retargeting lists for search ads (RLSA), voice search optimization, ad extensions in paid search listings, and schema markup. We know how to use them, and the benefits that they bring to ROI and visibility.

But a study by Bing and search agency Catalyst has revealed that, among marketers as a whole, very few are yet making use of advanced search tactics like these in their campaigns.

    When asked which of a range of tactics their company used or was planning to use in 2016, only 34% of marketers reported using ad extensions; 30% said they used Product Listing Ads (PLAs); and 28% used retargeting lists for search ads (RLSA).

    Just 28% of respondents reported using voice search optimization, 27% said they used sitelinks, and a dismal 17% reported using schema markup.

    So why are many marketers still failing to tap into the full potential of search? Search Engine Watch spoke to Microsoft’s Rob Wilk and Catalyst’s Kerry Curran to find out what search marketers can do to improve their campaigns.

    Twitter introduces pre-roll ads for Periscope

    Pre-roll ads might just be everyone’s least favorite ad format – so much so that YouTube did away with 30-second unskippable pre-roll ads earlier this year. But Twitter-owned livestreaming platform Periscope announced this week that it will be adding pre-roll ads to live and replay Periscope streams.

    The new ad product is named, unsurprisingly, Ads on Periscope, and is an expansion of Twitter’s existing Amplify ad product. The Periscope ads are expected to share revenue with content creators in the same 70/30 split as Amplify ads.

    Amidst Twitter’s struggle to drive revenue on its social platform, monetizing Periscope could be one way to bolster its flagging fortunes. But the autoplay ads may prove to be unpopular with users, especially with the news that they will run over streaming content – meaning that viewers will miss several seconds while the ad finishes.

    Google’s new site brings all of its open-source projects under one umbrella

    Google has launched a new website this week which is designed to act as a central directory for all of its open-source projects, bringing them together under one umbrella.

    In its blog post announcing the launch, ‘A New Home for Google Open Source‘, Google wrote that the new site:

    showcases the breadth and depth of our love for open source. It will contain the expected things: our programs, organizations we support, and a comprehensive list of open source projects we’ve released. But it also contains something unexpected: a look under the hood at how we “do” open source.

    The site contains the source code for Google’s Accelerated Mobile Pages Project official website, as well as the source code for its Android mobile OS, the Chromium web browser, its Tesseract Optical Character Recognition engine, and hundreds of other Google projects, both well-known and obscure.

    While Google has always made the code for these projects available on GitHub and its self-hosted git service (this being the nature of open source), this is the first time users have been able to browse them from a central location, and is sure to provide Google enthusiasts with plenty of cool material to scour.

    Baidu is working hand-in-hand with Google to accelerate the mobile web

    And speaking of Accelerated Mobile Pages (AMP), an unlikely partnership has arisen in the world of search, as Baidu and Google confirmed that they are teaming up to bring a faster mobile web to a wider user base.

    Google has a rocky history with China. It has had a presence in the country since 2005, but in 2010 decided to stop censoring its searches in accordance with Chinese law in response to a Chinese-originated hacking attack on itself and a number of other US tech companies, redirecting the searches instead to its Hong Kong search engine. Access to Google’s search engine and services has been blocked by the Chinese government on a number of occasions.

    In the wake of this, native Chinese search engine Baidu overtook Google as the main search provider in China, and now enjoys around 80% of the Chinese search market, while Google China only has about 10%. But the two have evidently agreed to set aside their rivalry in order to pursue a higher goal: accelerating the mobile web.

At Google’s first AMP conference in New York, Baidu’s Gao Lei announced Mobile Instant Pages (Chinese-language link), or MIP, Baidu’s answer to Accelerated Mobile Pages. Hermas Ma reported on Search Engine Land that MIP uses very similar technology to AMP, the main difference being that MIP is optimized for the Chinese internet.

Mobile Instant Pages can reportedly reduce the rendering time of above-the-fold content by 30 to 80 percent, and Baidu has been considering giving MIP a ranking advantage in search results (something which AMP doesn’t yet have).

    Ma also notes that the AMP Project website now loads in mainland China where it didn’t before, further pointing to a burying of the hatchet between Google and its Chinese counterpart.

    Study: Why do marketers still struggle with innovative search tactics?

    Many marketers who are seeing flagging returns from their search marketing campaigns might wonder what they’re doing wrong – especially if they’ve already got best practices like accurate site descriptions and keyword optimization covered.

    But a new study commissioned by Microsoft’s Bing and search agency Catalyst, and carried out by Forrester Consulting, may have some light to shed onto why marketers aren’t realizing the full potential of search.

    The study, whose findings are written up in a whitepaper, ‘Prioritize Search to Maximize ROI of Marketing‘, found that more advanced search marketing tactics like local inventory ads, voice search optimization, sitelinks and schema markup have low adoption by marketers, who may not even know about them.

    In addition, marketers struggle to properly integrate search with other channels in order to take advantage of the demand which they themselves have created.

    “We too often see advertisers spending significant dollars in, let’s say, TV, and then failing to fully fund their search campaigns,” says Rob Wilk, Vice President of North America Search Sales at Microsoft.

    “So if a consumer hears a message somewhere and then decides to search on Bing to get more information, many times the advertiser isn’t present, and that consumer ends up taking a different path than what the advertiser would have desired.

    “In a worst case scenario, consumers come to search and end up clicking on a competitor ad. Think about that for a moment – clients are spending their dollars to line the pockets of competitors.”

    So what do Bing and Catalyst think is keeping search marketers from tapping into the full potential of their campaigns, and how can they go about addressing the problem?

    Challenges in allocation and attribution

    The study’s findings drew on online surveys of 300 US-based marketing agencies and B2C advertisers, together with Forrester’s Consumer Technographics data.

    Wilk explained that Bing and Catalyst commissioned the study to “better inform the market about the importance of looking at search not just as an individual, effective marketing channel, but to clearly articulate the benefits of closely aligning all media spend in concert with search advertising investments.”

    Overall, respondents to the survey gave a high rating to the ROI they receive from search marketing, with 74% of respondents who were investing in search giving its ROI a rating of “excellent” or “good”.

    However, 53% of marketers cited cross-media attribution as one of their top three challenges in budget allocation, with another 53% citing a lack of data to inform strategy; 44% also cited measurement as one of their top challenges.

    “Competing business demands force marketers to rely on hard attribution data to develop and support their cross-channel investment strategies,” notes the study.

    “Unfortunately, their attribution models today do not necessarily paint an accurate reflection of the consumer engagement with cross-channel touchpoints, which inhibits them from moving budget fluidly from channel to channel.”

    Kerry Curran, Senior Partner and Managing Director of Marketing Integration at Catalyst, adds:

    “The majority of the data supports that consumers consistently use and value paid search, and marketers find it to be a strong ROI driver; however, adequate budget allocation is still a challenge.

    “With competing business demands and attribution data that does not measure cross-channel impact, paid search marketers are struggling to fully invest in their programs.”

    Search marketers still aren’t being innovative enough

    Those of us who keep close tabs on search innovation and strategy – or comment on it – are fairly familiar with concepts like retargeting lists for search ads (RLSA), voice search optimization, ad extensions in paid search listings, schema markup, and so on.

    But for the majority of marketers, advanced tactics like these go far beyond what they would use for their campaigns. When asked which of a range of tactics their company used or was planning to use in 2016, only 34% of marketers reported using ad extensions; 30% used Product Listing Ads (PLAs); and 28% used retargeting lists for search ads (RLSA).

    Just 28% of respondents reported using voice search optimization in their campaigns, 27% said they used sitelinks, and a dismal 17% reported using schema markup. (Findings like this shed light on why, even now, less than 1% of websites are using schema.org vocabulary to mark up their webpages).

    I asked Wilk and Curran why they thought that marketers weren’t going the extra mile with their search marketing tactics. Was it due to a lack of expertise, or perhaps just budget and time?

    “It’s all of those reasons,” replies Wilk. “Doing all of the tactics well in search requires constant learning, constant testing and of course constant optimization.

    “These days, all marketers are being asked to do more with less, and we don’t see that changing anytime soon. So in a world of squeezed time and resources, clients and agencies are forced to make trade-offs, and often the tactics mentioned tend to get a lower priority.

    “Eventually clients do get to these things but every query we see, whether it’s voice, on desktop or mobile is a perishable good. That “magical” moment of someone expressing clear intent comes and goes in an instant. Getting ahead of these trends, and sticking to them, is where the return on investment lives.”

    Curran adds: “There are so many advanced search tactics already available, and as search engines continue to innovate, they continue to release new options and update existing features.

    “While the advanced tactics can drive campaign improvements, alignment between the search engines, paid search teams, and brand is required to roll out and test new tactics.

    “In addition to the intricacies of day-to-day management, search marketers need to prioritize the opportunities, budgets, and resources to allow for testing in a manner that provides statistical significance.”

    What can marketers do to improve their search campaigns?

    It’s one thing to pinpoint where the problems might be, but if marketers want to take concrete steps to improve their search marketing, where should they begin?

    “One – prioritize their search budget,” says Rob Wilk.

    “Two, when running media campaigns – especially expensive TV commercials – marketers need to make sure they have strong search campaigns so that consumers can easily engage with the brand and find what they are looking for via search engines.”

    “Three, make sure they have full alignment across all channels. Marketers must keep their ear to the ground when it comes to search.

    “We have billions of moments every month where consumers express their desires, and marketers must tap into this wealth of data to inform marketing decisions in terms of what message to deliver, to whom and in what way.”

    The search industry is constantly innovating, and it might seem overwhelming for marketers with limited time and resources to try and keep on top of developments. However, as we’ve seen, there is a large number of advanced search tactics available that most marketers aren’t taking advantage of.

    Investing in even one of these tactics could prove to have significant benefits for search marketing ROI, which would pay dividends in the long run.

    Google adds rich results for podcasts to SERPs

    On its Developers blog, Google stealthily launched some new guidelines for structured data to bring rich results for podcasts to search results.

    To date, this is only available via Google Home or the Google Search app v6.5 or higher on Android devices, but support for Chrome on Android is coming soon.

    This was first noted over on Search Engine Roundtable, and Google provided an image to show how this will look in practice:

    Podcasts can be indexed and embedded in results, which could be a particularly useful functionality for Google Home and smartphones.

    The example above shows just how much SERP real estate can be occupied when this is implemented correctly. An embedded podcast player within the search results also means users won’t even need to click through to a landing page to listen to an episode.

    How can I get my podcasts indexed?

The first step is to get your podcast indexed, and Google has provided very clear and thorough guidelines on how to do this:

    • Expose a valid RSS feed describing the podcast that conforms to the RSS 2.0 specifications as well as the feed requirements described below.
    • The feed must contain at least one episode that conforms to the requirements given on this page.
    • The podcast must have a dedicated homepage with the elements described below. The homepage must have a <link> tag pointing to your RSS feed.
    • The homepage, the RSS feed, and any non-blocked audio files must be exposed to Googlebot; that is, they must not require a login, and must not be protected by robots.txt or noindex tags.

    Adding Structured Data for Podcasts

    Structured data implementation can lead to increased SERP presence and click-through rate, but it also provides search engines with valuable guidance when they crawl your content.

The full list of tags required within a podcast’s RSS feed is provided in Google’s post.

    Tags must be added at both a podcast- and episode-level within the RSS feed in order to show up via rich results. This is an essential consideration and is one that will need a bit of extra time from developers on an ongoing basis.

From these elements and the zebra podcast example provided by Google, we can surmise the typical structure of a podcast listing.
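Based on the requirements above, a minimal feed might look like the sketch below. The show name, URLs, and description are invented; the tag set follows the RSS 2.0 specification that Google’s guidelines require the feed to conform to, with episode-level data inside each item element. The snippet parses the feed with Python’s standard library and checks that at least one episode with an audio enclosure is present:

```python
import xml.etree.ElementTree as ET

# Invented sample feed conforming to RSS 2.0, with one episode.
FEED = """<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Podcast</title>
    <link>https://example.com/podcast</link>
    <description>A show about examples.</description>
    <item>
      <title>Episode 1: Getting started</title>
      <description>Our first episode.</description>
      <enclosure url="https://example.com/audio/ep1.mp3"
                 type="audio/mpeg" length="1234567"/>
    </item>
  </channel>
</rss>"""

def check_feed(xml_text):
    """Return (podcast_title, episode_count), raising if basics are missing."""
    channel = ET.fromstring(xml_text).find("channel")
    title = channel.findtext("title")
    items = channel.findall("item")
    if not items:
        raise ValueError("Feed must contain at least one episode")
    for item in items:
        if item.find("enclosure") is None:
            raise ValueError("Every episode needs an audio enclosure")
    return title, len(items)

print(check_feed(FEED))  # -> ('Example Podcast', 1)
```

Consult Google’s post for the authoritative list of required tags; the point of the sketch is simply that both channel-level and episode-level elements must be present and machine-readable.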

    Are brands taking advantage of this yet?

    These are very early days for this feature, but I have conducted a number of searches via the Google Search app in the US for a range of podcasts and have yet to see this live.

    However, Google has been quite surreptitious about this recent release, and it will take some time for brands to implement the requisite changes too.

Nonetheless, support for Chrome on Android is coming soon and, presumably, other Google software and hardware will follow suit. That will provide quite a wide audience, and any brand that releases podcasts will want to take advantage of this new opportunity to attract visits.

    In summary

    Combined with its developments in voice search, mobile, and personalization, it makes strategic sense for Google to add podcasts to its rich results-enabled assets.

    Data from Edison Research show the significant year-on-year growth for this medium in the US, with further increases expected in 2017:

Column graph from Edison Research showing monthly podcast listening in the US rising from 9% in 2008 to 21% in 2016 (% listening to a podcast in the last month).

    As such, it is worth paying attention to this now for any brand that produces podcasts – especially those that exist in a competitive niche.

    It may take some time to implement the technical changes and see them go live but, as with all such industry updates, the early adopters will reap the greatest rewards.

    Article graphics by Chelsea Herbert

    Understanding voice search: What are the implications for marketers?

Last week, Amazon added its voice search product, Alexa, to its iPhone app.

This is yet another signal, amid a continuing avalanche of them, that voice search is a major part of every major tech company’s strategy. One report by VoiceLabs predicts that voice device growth will quadruple this year, and at last year’s Google I/O conference, Google CEO Sundar Pichai announced that 20% of search queries were coming from voice search.

    What data can help us to understand the impact of voice search when it isn’t yet a reporting field that is provided by most publishers? Also, what are the implications for search marketers?

    What data do we have to potentially understand voice search?

    Without specifically understanding voice search data provided by search engines we have to rely on other indicators. I like to use query keywords (who, what, where, when, why, how) as a way to understand how consumers may be using voice search. While not perfect, I think it helps give some insight into how consumer behavior is shifting.

    People expect questions to be answered via voice search. I took a look at our data from this year vs. the same period in 2016. This data shows some interesting trends. Overall, query search term use as a percentage of total impressions was up 47% year over year.

    This shows that as voice search becomes more mainstream and search engines get better and better at providing answers, people are changing the phrases they use to search.

Breaking this down by the specific query word used gives, I think, a clearer picture of the expectations consumers have when asking questions. In just the last year, queries containing ‘Where’ and ‘When’ have risen by almost 300%.

These two questions lead the way due to the local nature of many voice search queries, and the high likelihood of receiving a direct answer. While a term like ‘how’ is also up 13% year over year, it is still difficult to get answers on ‘how’ to do something via a voice query. However, saying “OK Google, where is the closest burger restaurant?” elicits a fairly specific response.
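As a rough illustration of this kind of analysis, the sketch below classifies search terms by their leading question word and compares each word’s share of queries across two periods. The query lists here are invented stand-ins; in practice the terms would come from a search term report.

```python
QUESTION_WORDS = ("who", "what", "where", "when", "why", "how")

def question_word(query):
    """Return the leading question word of a query, or None."""
    first = query.lower().split()[0] if query.split() else None
    return first if first in QUESTION_WORDS else None

def share_by_word(queries):
    """Share of all queries that start with each question word."""
    total = len(queries)
    counts = {w: 0 for w in QUESTION_WORDS}
    for q in queries:
        w = question_word(q)
        if w:
            counts[w] += 1
    return {w: counts[w] / total for w in QUESTION_WORDS}

# Invented samples standing in for two years of search term reports.
last_year = ["burger restaurant", "where is the nearest atm",
             "shoes sale", "how to tie a tie"]
this_year = ["where is the closest burger restaurant",
             "when does the store open", "where to buy shoes",
             "how to tie a tie"]

for word in ("where", "when", "how"):
    before = share_by_word(last_year)[word]
    after = share_by_word(this_year)[word]
    print(word, f"{before:.0%} -> {after:.0%}")
```

The same classification can be run against an exported AdWords search term report to see whether question-style queries are growing for your own account.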

    The implications of voice search for marketers

    For me there are two key strategic impacts from the growth of voice search, both on mobile and from in-home devices. For each of these, there are several questions you can ask yourself to determine how voice search might affect your brand, and how you can best optimize for it.

    1. How are my consumers finding my brand?

    • Do you need to create more localized content? If ‘where’ and ‘when’ queries continue their growth, there is an opportunity to dominate these queries with both paid and organic rankings.
    • What data is available through my search term report in AdWords? Search term performance in the Dimensions report is a great way to update and optimize your keyword list in general, and to understand the types of questions consumers are asking.
    • Do you need to create more query-driven content to rank in answer boxes?

    2. Consider keyword experiences when question-style queries are asked. Ask yourself these questions:

    • What ad copy and landing pages are being used for question searches? If someone is searching for “where” or “when”, do you give them the same landing page that every other query gets? Is this the correct experience, or would a unique landing page and ad copy be more appropriate?
    • Is my location data current and accurate? The local pack is showing up more and more, especially via mobile devices, so making sure your data is correct in this space is critical.
    • Do I have the proper location extensions and product inventory available? There are ways to help consumers get the proper answers via paid ads. Making sure this content is eligible and aligned to your AdWords campaigns is key.

    If you can think through the consumer experience when these question-style queries are asked, you will understand the gaps in your voice search offering, and the opportunities to provide a better experience.

    A visual history of Google SERPs: 1996 to 2017

    Over the past 20 years, Google has revolutionized how we source information, how we buy products, and how advertisers sell those products to us.

    And yet, one fact remains stubbornly true: the shop-front for brands on Google is still the Search Engine Results Page (SERP).

The original lists of static results, comprised of what we nostalgically term ‘10 blue links’, have evolved into multi-media, cross-device, highly-personalized interfaces that can even adapt as we speak to them. There are now images, GIFs, news articles, videos, and podcasts in SERPs, all powered by algorithms that grow ever more sophisticated through machine learning.

    Nonetheless, in the face of such change, it still matters where our website ranks on those all-important SERPs. The content of those results pages, however, is in constant flux, as a result of 20 years of innovations and new products.

We experience this evolution iteratively and, while we can all appreciate that significant changes have taken place, it can be easy to lose sight of the wider context and of just how radical the overhaul of Google results has been.

    Google hasn’t always got it right along the way, but it has always been willing to put its failures aside and invest again in new initiatives. As such, we thought it a good time to take a step back and look at Google’s evolution over the past two decades, its many successes and its few notable failures, through the lens of the humble SERP.

    Infographic created by Clark Boyd, VP Strategy and Safiya Lawrence, SEO Manager at Croud, and graphic designer Chelsea Herbert.