Six ways to improve your micro-targeted long-tail keyword strategy


Using micro-focused keyword strategies can be a simple, impactful approach to improving your SEO results.

When analyzing websites that are already successful from an SEO perspective, it’s always surprising to see how many easy long-tail wins are possible but aren’t being optimized for.

Long-tail keyword targeting is nothing new. We know users are quickly adjusting their queries to more conversational, semantic searches. I mean, there’s a reason that Amazon, Apple and Google are pushing into voice search.

Why then, are we as SEOs and marketers still targeting ‘motorcycle parts’ and ‘refrigerators’ when we know people are searching for ‘vintage Honda motorcycle parts’ and asking questions like ‘Does LG make a counter depth refrigerator?’ People are looking for answers — not keywords.

And yet we’re seeing a trend of SEOs focusing on head terms, rather than optimizing for long-tail. While head terms have huge search volume, they don’t necessarily drive qualified traffic. It’s time SEOs worked to drive users who are ready and willing to buy.

Apply a different approach: micro-targeting

Micro-targeting is a three-step process. First, discover large numbers of long-tail keywords that share common phrases or expressions. Second, identify very specific opportunity gaps on your site where those phrases are lacking. Finally, make small adjustments to individual components of your site or content to better address long-tail keyword groups, rather than individual keywords, since those groups are easier to rank for.

There are several ways to approach a micro-targeting strategy using different tools, workflows, and processes. Below, you’ll find six points to consider as you execute a long-tail micro-targeting strategy.

1) Don’t completely replace your head term efforts with micro-targeting long-tail keywords.

While I have seen long-tail strategies keep many companies afloat, those strategies shouldn’t be your be-all and end-all. For the most part, long-tail keywords will convert better.

On the other hand, head terms can drive massive volume for your sites – and the more traffic, the higher your opportunity for conversion. Consider long-tail keyword targeting supplemental to the larger goal of head term targeting, and a secondary activity to optimizing for head terms.

2) Not every customer is the same, so don’t assume they search the same.

I’m from Salt Lake City. Living here has its perks, including access to outdoor activities. When I search for outdoor gear, I search for ‘top rated daypacks’ and ‘lightweight gore-tex rain shells,’ not ‘outdoor gear.’ However, someone I’m going hiking with this weekend may search for ‘affordable daypacks online’ and ‘men’s waterproof rain shell.’

There is no single way of optimizing for the differences in customers and their searches, which is exactly why finding trends like ‘daypacks’ and ‘rain shell’ is key to phrase-level micro-targeting.

Find the commonalities or trends among a wide range of searchers’ queries, and optimize for them. Also, realizing you won’t be able to optimize for everything will save you a lot of stress. Instead, look for phrase and expression trends within long-tail terms that are worth your effort or that span a number of queries.

3) Use more than one tool for keyword research and discovery.

The biggest mistake I see when trying to micro-target long-tail keywords is that SEOs stop digging too soon. The more keywords you discover, the more likely you’ll find a trend.

One of my favorite methods for continued discovery is taking an export from Google Keyword Planner for several head terms (for example, ‘outdoor gear,’ ‘camping gear,’ and ‘outdoor clothing’) then plugging that entire export into a tool like Term Explorer’s Bulk Keyword Tool. Term Explorer’s tool helps me to further diversify and expand my keyword pool. Übersuggest can also help diversify keyword focuses.
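If you end up with exports from several tools, it helps to merge them into a single deduplicated pool before looking for trends. Below is a minimal Python sketch of that step; the file names and the ‘Keyword’ column are hypothetical stand-ins for whatever your exports actually contain.

    import csv

    # Hypothetical exports from different keyword tools; adjust the names
    # and column headings to match the files your tools actually produce.
    export_files = ["keyword_planner.csv", "term_explorer.csv", "ubersuggest.csv"]

    keywords = set()
    for path in export_files:
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                kw = (row.get("Keyword") or "").strip().lower()
                if kw:
                    keywords.add(kw)

    # One deduplicated master list to feed into the phrase analysis later on.
    with open("master_keywords.txt", "w", encoding="utf-8") as out:
        out.write("\n".join(sorted(keywords)))

    print(f"{len(keywords)} unique keywords collected")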

4) Find common phrases or expressions to associate between keywords.

The core purpose of micro-targeting is to find similarities between keywords and expressions those keywords contain.

Don’t forget: you aren’t going to be able to target everything for everyone. What you can do is target common expressions or threads between a variety of keywords to narrow your focus on small wins that have value.

There are some really great free or affordable tools for discovering phrase and expression trends. Personally I’ve found Textalyser to be a quick option that allows for a bulk check and then presents the data in an easy-to-understand format.

The data can be pulled into Excel and aggregated to find common two-, three- and four-word expressions that appear across your keyword strings. You may be surprised to find five to ten common expressions recurring across 800–1,200 keywords.
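If you’d rather skip the manual Excel work, the same aggregation can be scripted. The sketch below is one hedged way to count how many keywords contain each two-, three- and four-word expression; it assumes the master keyword list produced earlier and does no stemming or clean-up.

    from collections import Counter

    def ngrams(words, n):
        """Return the n-word expressions contained in a keyword."""
        return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

    with open("master_keywords.txt", encoding="utf-8") as f:
        keywords = [line.strip().lower() for line in f if line.strip()]

    expression_counts = Counter()
    for kw in keywords:
        words = kw.split()
        seen = set()
        for n in (2, 3, 4):
            seen.update(ngrams(words, n))
        # Count each expression once per keyword, so the totals reflect how
        # many keywords share it rather than raw frequency.
        expression_counts.update(seen)

    for expression, count in expression_counts.most_common(25):
        print(f"{count:4d}  {expression}")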

While using tools like this, keep in mind that many of the expressions and phrases, as well as the keywords fed into them, may be grammatically incorrect. It’s up to you to find ways to piece these expressions together into usable combinations, or to simply ignore them for optimization and content creation.

keywords to ignore

5) Analyze and adjust existing content and optimization first.

While jumping into new content creation is all well and good, it’s important to look at the current content on site and optimize existing pages for opportunities to micro-target keyword phrases.

Anything from title tags, H1 tags, and even image alt tags could be under-optimized. Using Screaming Frog’s custom configuration or a similar tool, you can quickly analyze phrase gaps that need better optimization in your current efforts.

If you want to get even more targeted, you can run these phrase crawls against competitors to compare their keyword targeting gaps to yours.

custom search

comparing keywords with competitors
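If you don’t have Screaming Frog to hand, a rough equivalent can be scripted. The sketch below is a simplified, hypothetical version of the same idea, not Screaming Frog’s own configuration: fetch a handful of pages and flag which target expressions are missing from the title, H1 and image alt text. The URL and phrase lists are placeholders.

    import requests
    from bs4 import BeautifulSoup

    # Placeholder inputs: your own pages (or a competitor's) and the
    # expressions surfaced by the earlier phrase analysis.
    urls = ["https://www.example.com/outdoor-gear/"]
    phrases = ["camping equipment", "lightweight daypack", "rain shell"]

    for url in urls:
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")

        title = soup.title.get_text(" ", strip=True) if soup.title else ""
        h1 = " ".join(h.get_text(" ", strip=True) for h in soup.find_all("h1"))
        alts = " ".join(img.get("alt", "") for img in soup.find_all("img"))
        haystack = f"{title} {h1} {alts}".lower()

        missing = [p for p in phrases if p.lower() not in haystack]
        print(url)
        print("  missing expressions:", missing or "none")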

6) Develop new content where necessary, if expression gaps hold enough value.

I know this may sound like the most obvious point of this discussion, but often the biggest mistake with this expression-targeted approach is that SEOs focus too much on optimizing content and elements that are already live on site.

Looking at the last example, it’s clear there is opportunity to target ‘camping equipment’ expressions and terms for the site on the right.

While you could alter content or elements already on site, it could prove more useful to provide specific pieces of content targeting these expressions that simultaneously offer a unique, targeted experience for users searching for them.

If you can group the master keyword list you started with into these expression-focused silos, it will help you better organize your long-tail terms into topical categories. This document can then be handed off to your copywriters or content strategists to help the team develop more collaborative strategies and content pieces.

Final thoughts

Long-tail keyword targeting and micro-targeting expressions can be a great ‘small win, big return’ strategy. Because expressions span such wide varieties of keywords, the opportunity to rank for variations of long-tail terms can be seemingly endless.

Once you get into the mindset of finding and addressing expression and phrase gaps in your long-tail strategy you’ll start seeing content and optimization opportunities you never thought to look for.

Dave Hutton is the SEO Manager at Clearlink, a full-funnel customer acquisition firm based out of Salt Lake City, Utah. You can connect with Dave on LinkedIn.

The Penguin in the room: what to do until Google rolls out its latest update

penguins marching to war

Google’s Penguin 3.0 update affected less than 1% of U.S./English queries in 2014. Granted, Google processes over 40,000 search queries every second, which translates to a staggering 1.2 trillion searches per year worldwide, so Penguin 3.0 ultimately hit 12 billion search queries.

What’s scary though, is that Penguin 3.0 wasn’t too bad. Penguin 1.0 hit 3.1% of U.S./English queries, or 37.2 billion search queries. The quasi-cataclysmic update changed the topography of SEO, leaving digital agencies forever scarred by the memory.

Now, Google is supposedly going to roll Penguin 4.0 out in the imminent future. Everyone expected the monolithic tech company to launch the update in 2015, but the holidays delayed it to 2016. Then, everyone expected it to drop sometime in Q1 2016.

However, the SEO world still waits with bated breath.

Why is everyone so afraid of the Big Bad Penguin?

Google first launched the Penguin Update in April 2012 to catch sites spamming its search results, specifically the ones who used link schemes to manipulate search rankings. In other words, it hunted down inorganic links, the ones bought or placed solely for the sake of improving search rankings.

In the time it took for Penguin 2.0 and 3.0 to come out, digital agencies wised up. They heard the message loud and clear. Once a new Penguin update comes out, they know they have to take action to get rid of bad links.

Google targets links that come from poor quality sites, have little to no relevancy to the backlinked site, have overly optimized anchor text, are paid for, and/or are keyword rich.

However, what makes Penguin truly terrifying isn’t only the impact it can have on a site’s ranking, but on an honest marketing campaign.

Earning backlinks is tough. That’s why some stoop to paying for them or working with shady link networks. The most tried-and-true way to earn backlinks is guest blogging, which is not only difficult, but time consuming, as well.

Although Google usually ignores backlinks earned by guest blogging, that’s not to say they’re completely Penguin-proof. In an unlikely but entirely possible scenario, your guest blogging backlinks may have become toxic.

In other words, people are so afraid of Penguin because it can ruin a lot of the hard work you’ve put into a campaign.

How can you slay the fearsome Penguin?

Luckily, there are a number of preventative measures you can take to avoid Penguin’s wrath.

The first thing you’re going to want to do is look at your backlink profile using Open Site Explorer, Majestic SEO, or Ahrefs. Look at the total number of links, the number of unique domains, the difference between the number of linking domains and total links, the anchor text usage and variance, page performance, and link quality.
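Whichever tool you export from, the raw numbers are easy enough to summarise yourself. This is a rough sketch only; the ‘backlinks.csv’ file and its column names are hypothetical, since every tool labels its exports differently.

    import csv
    from collections import Counter

    domains, anchors = Counter(), Counter()
    total_links = 0

    # Hypothetical export with one row per backlink; adjust the column names
    # to whatever your tool (Open Site Explorer, Majestic, Ahrefs) uses.
    with open("backlinks.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            total_links += 1
            domains[row.get("source_domain", "").lower()] += 1
            anchors[row.get("anchor_text", "").strip().lower()] += 1

    print(f"total links:      {total_links}")
    print(f"unique domains:   {len(domains)}")
    print(f"links per domain: {total_links / max(len(domains), 1):.1f}")
    print("most common anchor text:")
    for anchor, count in anchors.most_common(10):
        print(f"  {count:4d}  {anchor or '(empty)'}")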

If this sounds like too much work, there are tools such as HubShout and Link Detox that will automate the analysis process for you and apply decision rules for a fee.

If you find a bunch of toxic links – the backlinks that came from link networks, unrelated domains, sites with malware warnings, spammy sites, and sites with a large number of external links – you need to take action before the Penguin strikes.

Your next step is to remove the links manually. Contact the site’s owner to request he or she remove the links. Failing that, you can always disavow them. This tells Google not to count the links when it determines PageRank and search engine ranking.
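The disavow file itself is just a plain text list, one URL or domain per line, with optional lines starting with # as comments, which you then upload through Google’s disavow tool. A minimal sketch, assuming you have already compiled the toxic domains and URLs into Python lists:

    # Toxic domains identified during the backlink audit (placeholders).
    toxic_domains = ["spammy-links.example", "paid-network.example"]
    # Individual bad URLs you'd rather disavow one by one.
    toxic_urls = ["http://blog.example.net/cheap-links-post/"]

    lines = ["# Disavow file generated from backlink audit"]
    lines += [f"domain:{d}" for d in sorted(set(toxic_domains))]
    lines += sorted(set(toxic_urls))

    with open("disavow.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")

    print(f"Wrote {len(lines) - 1} disavow entries")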

How can you recover after a Penguin attack?

If Penguin 4.0 does wind up pecking your campaign to the verge of death, don’t worry. You can recover.

Analyzing your backlink profile and removing toxic links – what you should do to prevent a Penguin issue – are also the steps you need to take to recover.

However, the thing about disavowing a link is that it may actually hurt your campaign. No one but the Google hivemind really knows whether or not a link helps or hurts. You can only make an educated guess. Despite this risk, you still need to disavow any links that appear to be toxic.

The next logical step after purging your backlink profile is to build it up again. Although you should never stop trying to earn backlinks, it’s a smart idea to redouble your efforts after a Penguin attack.

Guest blogging isn’t the only way to earn backlinks, either. Entrepreneur offers a great list of creative ways to get people to link to your site, such as:

  • Broken-Link Building: Check a site for broken links and compile them into a list (a minimal script for that first step appears after this list). Then take the list to the webmaster and suggest other websites to replace the links, one of which is yours.
  • Infographics: The thing about infographics is that they’re more shareable than blogs. Research shows that 40% of people respond better to visual information than plain text. The idea here is exactly the same as the idea of content marketing. You create a great piece of content – an infographic, in this case – and people are going to share it. In the case of an infographic, other sites and blogs could repost it. Success isn’t guaranteed, but this method can work.

what makes a good infographic

  • Roundups: Similar to guest blogging, reaching out to bloggers and sites that run weekly or monthly roundups is a great way to get some backlinks. Search your keyword and “roundup,” and limit the results to the past week or month. Once you’ve found a few, send the webmaster a link to one of your guides, tutorials, or other pieces of content (like, say, a new infographic). Sites that run roundups are constantly looking for content, so there’s a good chance they’ll include your work in their next edition.
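For the broken-link building idea mentioned above, the first pass (finding a page’s dead links) is straightforward to automate. A rough sketch, where the target page URL is a placeholder and any status code of 400 or above is treated as broken:

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    page = "https://www.example.com/resources/"  # placeholder target page
    html = requests.get(page, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    broken = []
    for a in soup.find_all("a", href=True):
        link = urljoin(page, a["href"])
        if not link.startswith("http"):
            continue
        try:
            resp = requests.head(link, allow_redirects=True, timeout=10)
            if resp.status_code >= 400:
                broken.append((link, resp.status_code))
        except requests.RequestException:
            broken.append((link, "no response"))

    for link, status in broken:
        print(status, link)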

What’s next?

So long as you take these precautionary steps, you’ll be fine whenever Penguin does rear its beaked head.

What is HTTP/2 and how does it affect us?

HTTP/2 graphic (source: Akamai)

The web is about to get faster, with the introduction of the latest version of the HTTP protocol: HTTP/2.

It’s been 17 years since the last update and so many things have changed in almost two decades. Technology has created more demanding users, sites only got heavier and speed is an important factor for most of us while browsing.

As servers have already started adopting HTTP/2, it’s time to learn more about it and try to understand everything we need to know about this significant change on the web. How does it affect us?

What is HTTP/2?

HTTP/2 is an updated version of HTTP (Hypertext Transfer Protocol) and it is based on Google’s SPDY protocol, which was developed to improve the speed and the performance of the browsing experience.

The history of HTTP

The Hypertext Transfer Protocol (HTTP), or what most of us know as the ‘http://’ in a web address, is the protocol that establishes the connection between a user’s browser and a server’s host.

HTTP was defined back in 1991, while its current version, HTTP/1.1, was introduced in 1999, which means it was only a matter of time before we welcomed the next update. Last February the Internet Engineering Task Force (IETF) formally approved the HTTP/2 draft, and that’s how the standardisation process began.

source: http2.akamai.com

Why should I care?

If you are using the web, then you should probably care. You don’t have to be a developer to be interested in this exciting change, as it promises a faster and more functional browsing experience for everyone.

Sites have significantly changed since the last HTTP protocol update almost 20 years ago and it’s time to face the fact that modern sites consist of more images and data, which affect the loading time for a page.

According to Daniel Stenberg,

“When looking at the trend for some of the most popular sites on the web today and what it takes to download their front pages, a clear pattern emerges. Over the years the amount of data that needs to be retrieved has gradually risen up to and above 1.9MB”

HTTP/2 promises to adapt to the needs of our time, by assisting everyone to access any site as fast as possible, even without having a high speed internet connection.

source: w3techs.com

What’s changing?

We don’t need to dive into technical details to discover the most important changes that HTTP/2 brings, so this is a simplified overview:

Multiplexing

Multiple messages can be sent at the same time over just one TCP (Transmission Control Protocol) connection. This reduces the time needed to process the requests that are sent and received, improving the user experience by speeding up loading times.

Up to now, HTTP/1.1 has only handled one request at a time per connection, which led to queues of requests and slower connections. What’s more, a page load used to require several connections, while HTTP/2 addresses both problems with multiplexed streams over a single connection that stays open while a site is in use.

This leads to a cleaner, faster connection and improved latency, which is expected to be highly appreciated.


source: Cloudflare

Server Push

Server push is about saving time: the server anticipates the client’s next requests and sends additional resources before they are even asked for.

There’s no need to wait for the HTML to finish loading before the browser requests the JavaScript, images and so on, as the HTTP/2 protocol allows the server to speed up data transmission by sending ‘push’ responses alongside the initial response.

No more delays, time for proactively pushed responses!

Prioritization

Prioritization is about understanding the importance of each element and transferring the most important requests first. It’s the browser that suggests the data to be prioritized, but the final decision is made by the server.


source: Google

Binary

HTTP/2 again focuses on boosting sites’ loading speed, this time by transferring data in a binary format, the computer’s native language. This removes the unnecessary step of translating textual messages into binary protocols, which leads to a more efficient result.

Header Compression

HTTP/2 allows headers to be compressed, reducing their size along with the number of round trips needed for each request. This is even more important in mobile browsing, where a page’s assets and the network’s latency are even more challenging.


source: isthewebhttp2yet.com

Is HTTP/2 currently in use?

HTTP/2 may not be the standard protocol yet, but there is growing interest in its use month by month, with 6.6% of all websites currently using it. In fact, the figure rises to 13.5% for websites that rank in the top 1,000.


source: w3techs.com

According to Can I Use, it is supported by 71.14% of browsers globally, with Chrome, Firefox and Opera supporting it only over encrypted connections (HTTPS).

It is promising that several top sites and servers are starting to embrace HTTP/2, with CloudFlare and WordPress having supported it for several months now. Beta support is also available from Akamai, Google and Twitter, while Microsoft and Apple plan to support it in future releases.


source: caniuse.com
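If you want to check whether a particular site already answers over HTTP/2, one option is the Python httpx client, which can negotiate HTTP/2 when installed with its http2 extra (pip install "httpx[http2]"). A small sketch, using example.com as a placeholder URL:

    import httpx

    # Requires the HTTP/2 extra: pip install "httpx[http2]"
    with httpx.Client(http2=True) as client:
        response = client.get("https://www.example.com/")  # placeholder URL
        # http_version is "HTTP/2" when the server negotiated it,
        # otherwise "HTTP/1.1".
        print(response.http_version)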

In case you’re wondering whether it’s still early for HTTP/2, Mark Nottingham is clear about it:

“It’s just important to remember that HTTP/2 is an infrastructure upgrade for the web, and as such it’s going to take time to see the full benefit. That said, there’s still considerable benefit in adopting them now.”


source: isthewebhttp2yet.com

What should I do?

There’s no need to do anything from a user’s point of view, as the change has already started on several sites. As HTTP/2 is backwards compatible with HTTP/1.1, a user won’t notice any difference except for the speed, and as more and more servers and browsers adopt it, we will all enjoy a faster browsing experience.

Here’s an example:

If you’re curious to see the actual performance of HTTP/2, Akamai created a test site for you to compare the latency of each protocol.

Akamai’s HTTP/2 demo

As you can see, there is a difference in the loading time, and according to the initial stats we are generally expecting a speed boost of 20–30%.

Six of the most interesting SEM news stories of the week

london hotel Google Search with right hand side ads

Welcome to our weekly round-up of all the latest news and research from around the world of search marketing and beyond.

Oh you’ve been away all week? Right, okay. Well sit down. We have some news.

Google’s been making some changes…

Google kills its Right Hand Side Ads

The inescapable news this week is of course Google AdWords removing all the ads from the right-hand side of its SERPs.

Now instead of seeing PPC listings, you’ll either see an odd blank space, or Product Listing Ads, or the standard Knowledge Graph for Ryan Gosling (or insert your current crush here – mine’s still Ryan Gosling).

The ramifications of the change are myriad, but the biggest change for users and marketers alike is the increase in PPC ads at the top of the SERP from three possible links to four.

And speaking of Google…

Google launches Accelerated Mobile Pages

Although we had been expecting the launch of AMP – Google’s open source initiative which aims to improve the performance of the mobile web – around now anyway, Google began rolling out AMP at the beginning of this week (Tues 23 Feb).

@sewatch Sure, try this one pic.twitter.com/HVjCg1CEPA

— Mark Chalcraft (@markchalcraft) February 23, 2016

As expected, pages enabled with AMP carry a symbol on their mobile search results to tell users they are faster loading pages. And you can probably be certain that this is a positive ranking factor.

amp pages symbol

There’s lots more information on Google’s AMP project here.

And speaking of Google AGAIN…

Google to close its financial comparison service

For some people this may be the biggest (positive) change of the week. Google is to shut down Google Compare from March 23 in both the UK and US.

Google compare serps

As Graham Charlton reports, “On the face of it, this news will have competitor comparison sites jumping for joy, as Google Compare constituted a major threat to their own business models.”

A spokesperson from Google stated it has decided to focus more intently on AdWords and other future innovations, which will “enable us to provide fresh, comprehensive answers to Google users, and to provide our financial services partners with the best return on investment.”

Facebook rolls out new Like buttons

In non-Google news, Facebook has added to your arsenal of social interactions with a variety of facial expressions and emojis.

They’re called Reactions.


I still don’t have them yet and I’ve got a LOT of anger built up about it and when I eventually get them EVERY SINGLE ONE OF MY FRIENDS is getting the furious face.

Outgoing links probably good for your site’s SEO

Reboot recently carried out a study to prove somewhat definitively whether or not the strength of a site’s outgoing links has an effect on ranking.

The good news is that yes it does.

Reboot created 10 new websites, each targeting the same keyword, only half of which included links to high-authority sites. After five months it concluded that, “Outgoing relevant links to authoritative sites are considered in the algorithms and do have a positive impact on rankings.”

Ranking position graph from the outgoing-link experiment

For a complete guide to the research and lots more well-explained graphs such as the one above, visit the study and be safe in the knowledge that linking to bigger sites will definitely not do you any harm in search.

Call your mum

Finally, Bing has released a few search insights in time for Mother’s Day, revealing that more than half of Mother’s Day retail searches are set to be made from a mobile device.

  • Over 60% of searches are expected to be made on the move via mobile, with search volumes set to increase by five times between 7am-9am on the day itself.
  • Women take the lead in searching for gifts, making up over two thirds (67%) of all searches.
  • Searches will increase by up to four times in the 48 hours leading up to Mother’s Day.

I’m merely including this to remind you that Mother’s Day is 6th March, so you have more than a week to buy a card. You’re welcome.

Say goodbye to Google: 14 alternative search engines

Bing homepage

Well it’s been a big week for search, I think we can all agree.

If you’re a regular Google user (65% of you globally) then you’ll have noticed some changes, both good and bad.

I won’t debate the merits of these improvements; we’ve done that already here: Google kills Right Hand Side Ads and here: Google launches Accelerated Mobile Pages. But there’s a definite feeling of vexation that appears to be coming to a head.

As the paid search space increases in ‘top-heaviness’, as organic results get pushed further off the first SERP, as the Knowledge Graph scrapes more and more publisher content and continues to make it pointless to click through to a website, and as our longstanding feelings of unfairness over Google’s monopoly and tax balance become more acute, now more than ever we feel that there should be another, viable search engine alternative.

There was a point not that long ago when you could easily divide people between those that used Google, Yahoo, Ask Jeeves and AltaVista. Now it’s got to the point where if you’re not using Google, you’re not really using the internet properly.

Remember when The Amazing Spider-Man reboot came out in 2012 and the most unbelievable thing in the movie – which involves a teenager with superhuman spider powers crawling up walls and swinging through New York in lycra – is that Peter Parker uses Bing?

Right now though maybe we should be paying more attention to the alternatives. Maybe our daily lives and, for some of us, careers shouldn’t need to balance on the fickle algorithm changes of the world’s most valuable company.

Let’s see what else is out there in the non-Google world. It’s not that scary, I promise.

Please note: this is an update of an article published on SEW in May 2014. We felt it needed sprucing up, especially as many of the originally listed engines (Blekko, Topsy) are no longer with us.

Bing

Microsoft’s search engine is the second most popular search engine in the world, with 15.8% of the search market.

But why should you use Bing? Lifehacker has some great articles where they try to convince themselves as much as anyone else why Bing is a serious contender to Google. Plus points include:

  • Bing’s video search is significantly better than Google’s, giving you a grid of large thumbnails that you can click on to play or preview if you hover over them.
  • Bing often gives twice as many autocomplete suggestions as Google does.
  • Bing can predict when airfares are about to go up or down if you’re searching for flights.
  • Bing also has a feature where if you type linkfromdomain:[site name] it will highlight the best ranked outgoing links from that site, helping you figure out which other sites your chosen site links to the most.

Also note that Bing powers Yahoo’s search engine.

DuckDuckGo

The key feature of DuckDuckGo is that it doesn’t retain its users’ data, so it won’t track you or manipulate results based on your behaviour. So if you’re particularly spooked by Google’s all-seeing, all-knowing eye, this might be the one for you.

There’s lots more info on DuckDuckGo’s performance here.

Quora

As Google gets better and better at answering more complicated questions, it will never be able to match the personal touch available with Quora.

quora

Ask any question and its erudite community will offer their replies. Or you can choose from any similar queries previously asked.

Dogpile

Dogpile may look like a search engine you cobbled together with clip-art, but that’s rather the point as it pulls in and ‘curates’ results from various different engines including Google, Yandex and Yahoo, but removes all the ads.

Dogpile Web Search

Vimeo

Of course if you’re going to give up Google, then you’ll also have to give up YouTube, which can be a terrifying prospect. But there is an alternative. And a pretty good one at that… Vimeo. The professional’s choice of video-sharing site, which has lots of HD video and no ads.

otis the cat reviews in videos on Vimeo

Yandex

This is a Russian portal, offering many similar products and services as Google, and it’s the dominant search engine in Russia.

As you can see it offers results in a nice logical format, replete with favicons so you can clearly see the various channels for your branded queries.

search engine watch on Yandex

Boardreader

If you want to get into the nitty-gritty of a subject with a variety of different points of view away from the major publications, Boardreader surfaces results purely from forums, message boards and, of course, Reddit.

Boardreader Forum Search Engine

WolframAlpha

WolframAlpha is a ‘computational knowledge engine’, or super clever nerd to you and me. Ask it to calculate any data or ask it about any fact and it will give you the answer. Plus it does this awesome ‘computing’ thing while it thinks about your answer (which can take a short while.)

what really killed the dinosaurs Wolfram Alpha

It’s not always successful; you have to practice to get the best from it. But at least it’s aware of the terrible 90s television show The Dinosaurs.

IxQuick

Another search engine that puts its users’ privacy at the forefront. With IxQuick none of your details are stored and no cookies are used. A user can set preferences, but they will be deleted after 90 days of inactivity.

Ixquick Search Engine

Ask.com

Oh look… Ask Jeeves is still around. Also he’s no longer a Wodehousian butler, but a computer generated bank manager. Weird.

Ask Jeeves

It’s still a slightly mediocre search engine pretending to be a question and answer site, but the ‘Popular Q&A’ results found on the right hand side are very handy if Jeeves himself can’t satisfy your query. And what a good use of the right-hand side space, huh Google.

SlideShare

SlideShare is a really handy place to source information from presentations, slide decks, webinars and whatever else you may have missed from not attending a conference.

You’ll also be surprised what information you can find there.

hamburgers on SlideShare

Addict-o-matic

“Inhale the web” with the friendly looking hoover guy by creating your own topic page, which you can bookmark and see results from a huge number of channels in that one page (including Google, Bing News, Twitter, YouTube, Flickr).

Addictomatic Inhale the Web

Creative Commons Search

CC Search is particularly handy if you need to find copyright free images for your website (as discussed in this post on image optimisation for SEO). Just type your query in then click on your chosen site you want to search.

CC Search

Giphy

Because really, when it comes down to it, we could imagine a worse dystopian future than one in which we all communicate entirely in Gifs.

GIPHY homepage

Why accessibility is key for search and visibility

If you’re involved with SEO, you’ve no doubt thought about all sorts of ways and means to boost your site in the search rankings. But if your site isn’t web accessible, your efforts will be in vain for 1/5 of your potential visitors.

Web accessibility is the name given to making websites and online materials usable to people with disabilities, removing barriers to the way they experience the internet.

From physical disabilities like loss of mobility, blindness and deafness to learning difficulties like dyslexia, a wide range of disabilities affect the way that someone accesses the internet.

People with disabilities make up a significant portion of the population, yet far too little is done to cater towards them online, representing a huge missed opportunity in terms of traffic and visibility for any website or brand.

The need for accessibility

The Global Economics of Disability Annual Report 2014 estimated the global population of people with disabilities as 1.3 billion – nearly 18% of the world’s population, or one in every five people.

These numbers are likely to climb as Generation X ages, making catering to age-related disabilities even more important to anyone targeting the baby boomer generation.

In no other context would it make sense to overlook such a significant demographic, yet making websites and digital materials accessible is far too often seen as a tedious and pointless exercise. But anyone who pays attention to SEO and optimises their website is already part of the way there.

Making your website more accessible to users with disabilities also happens to overlap nicely with improving the all-round user experience, and with boosting your site that much higher up the search rankings.

Accessibility and good user experience go hand-in-hand with search.
Image by Paul Veugen on Flickr; some rights reserved

Why accessibility works for search

A good rule of thumb for accessibility is making sure that all information is delivered to the user in more than one way.

For example, you shouldn’t rely only on the ability to see colour to distinguish the important parts of a web form, or the ability to use a mouse to navigate a website.

Images, audio and video should all have text alternatives available in the form of alt text descriptions, closed captions and transcripts. This makes the content accessible to users with visual or hearing impairments. It also provides more information to search engines, which rely on text to find out about a site.

In a past piece on the SEO benefits of web accessibility, Mark Jackson explained that text browsers, which ignore graphic content, are often used to review how a website appears to a search engine. A website will appear in the same way to users of a screen reader, which can interpret the web for those who are blind, visually impaired, illiterate or learning disabled.

In other words, making the text-based ‘version’ of your website as comprehensive as possible has benefits for both accessibility and search.
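Checking your own pages for the most common gap, images with no alt text, is straightforward to script. A minimal sketch, with the URL as a placeholder; it only looks at img tags and ignores the other text alternatives discussed above.

    import requests
    from bs4 import BeautifulSoup

    url = "https://www.example.com/"  # placeholder page to audit
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    # Collect images whose alt attribute is missing or empty.
    missing_alt = [img.get("src", "(no src)")
                   for img in soup.find_all("img")
                   if not img.get("alt", "").strip()]

    print(f"{len(missing_alt)} images missing alt text on {url}")
    for src in missing_alt:
        print("  ", src)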

The overlap between increased accessibility, SEO and a better user experience can be seen in all sorts of areas.

Providing a site map, for instance, gives a handy point of reference for all users, but particularly those with a screen reader. It also allows search engine robots to quickly crawl a website.

Writing for your website in simple, jargon-free language can benefit users with a learning difficulty or cognitive disability, but it’s also a helpful practice for everyone, especially those for whom English isn’t their first language.

People are more likely to search the web for simple words than for industry jargon, too, so writing for your website simply will help ordinary users to find it in search.

There are also a few search engines, such as Net Guide, specifically designed to promote and rank websites which are highly accessible. Google used to have one such search outlet, but it is no longer supported.

The extra traffic from these search engines might not be huge, but they are worth bearing in mind as another source of visibility for an accessible site.

Putting it into practice

The theory of how accessibility can boost your site’s visibility and traffic is all very well, but how can you carry it out in practice?

I mentioned a few practical steps in the previous section, such as providing text alternatives to visual content and checking out how your website looks in a text browser.

Not everything that makes a site more accessible will align directly with optimising for search, and a lot of it will require extra effort and some new ways of thinking. But it’s a worthwhile exercise, and there are a lot of free tools and resources available to help anyone who sets out to design for accessibility.

A good place to start is Deque System’s guide on Designing for Website Accessibility, part of a series on accessible marketing produced in association with the Whole Brain Group. They divide website design into five key areas for accessibility – complete with a handy checklist you can download.

The W3C’s Web Accessibility Initiative also has a set of tips for getting started with web accessibility, complete with visual reference points. To check how accessible your website is already, you can run it through WebAIM’s web accessibility evaluation tool.

A screenshot of the web accessibility check for the website Search Engine Watch, using WebAIM's WAVE tool. The screenshot shows a total of 33 errors, 126 alerts, 27 features, 53 structural elements, 5 HTML5 and ARIA and 27 contrast errors. These are flagged up across the homepage with various yellow, red and green icons. The accessibility check for Search Engine Watch shows some work still to be done…

A major current development in search technology, voice search, also has its roots in something that is highly beneficial to users with disabilities.

While ways of catering to voice search technology are not yet as refined as other techniques in SEO, there are still ways you can adapt your site to capitalise on voice search, as laid out in depth by Asim Ahmed.

Altogether, adaptations to make your site accessible are well worth adding alongside your usual optimisation for search in order to stay ahead of the curve and ahead of your competitors – and your visitors will thank you for it.

Why companies create content – part two: to gauge public opinion

three everton fc logos

Following on from Part One of this series where the topic of influencing brand perception was discussed, this new instalment looks at how content can help you tap into the mindset of the people you’re trying to sell to.

Part two: to gauge public opinion

Pretty much any content related book, article or conference talk you come across will at some point mention the term ‘audience-focused content’. As a phrase and a concept, it’s a simple one – create stuff that people are going to want – but the cogs that sit behind it can be both complicated and costly.

Rather than researching what people want then creating content to reflect that, there’s an argument to say your content could actually be your research method. By publishing then assessing how people react to what you put out there, the data can be used to create something bigger and better, or to inform other business actions.

Pre-internet (a scary thought) I recall being involved in various focus groups asking me what I thought of this pair of trainers, chocolate bar or gadget. For a handful of spotty-faced opinions, that would have involved the rigmarole of contracting a market research company, finding some willing participants, hiring a venue, recording the proceedings and collating the findings.

These days you can just put up a vote on your Twitter feed.

Which of these packaging designs do you prefer? Should we serve a sour cream dip or salsa with our wrap? Which of these do you think should be our new TV ad? What should our new album be called?

People are used to being asked questions by brands, and companies are using these valuable contributions (along with other forms of data) to validate their marketing and product development efforts; “Our fans prefer the red one” is a perfectly valid rationale to bring up in a board meeting.

Ignore this opportunity at your peril.

Sticky Toffees

Everton Football Club have a proud tradition – despite not winning a league title since the mid-1980s they’re one of the best-supported clubs in England and social ad agency RadiumOne found in 2013 that if fan interactions formed the basis of the league table, they’d be the team taking home the silverware.

With that in mind, it seems all the more strange that they had such a catastrophic fail when it came to a rebranding exercise prior to the 2013/14 season.

A new club crest (below, in the middle) was announced to overwhelming online derision – there are few things that people feel more passionately about than their football club, and the fact that fans hadn’t been consulted caused uproar.

23,000 petition signatures later, the club had their hand forced; they had to do something. What followed was a series of apologies, consultations with fan and community groups and eventually an online vote which was won by the logo that most closely resembled the one they already had.

Rebranding twice within as many years is not a good thing for any business to be doing, let alone one which reaches billions of people across the world every week. Wouldn’t this have all been much simpler if they had put a few options to a vote of their millions of social fans straight away?

That’s what New York City FC did.

Made in NY

This is a club that doesn’t have such an illustrious past – in fact they’ve only been in existence since early 2013, which may be why they seem to have a better grasp of the importance of crowdsourcing opinions to influence business decisions.

The public were asked to pick between these:

CBO Tim Pernetti said:

“Our supporters will always have a voice in our Club at New York City FC. We are truly excited about this opportunity to partner with them on this decision and we are counting on all New Yorkers and fans beyond the city to get involved, cast a vote and make New York City FC history.”

Fellow member of the City Football Group, Manchester City have taken lessons from this process for their own re-badging exercise.

manc city badges

In a statement that has obviously come from the same press office, Chief Executive Officer Ferran Soriano commented:

“We are looking for our fans to share their views as to what they consider to be the most authentic symbols of the Club. The views of our Cityzens are essential to the process; they will have a real say on the future of our badge.”

A host of online questionnaires, lectures on the history of the badge and articles in local newspapers detailing every element from the shape to the importance of each featured symbol was created, all building up to the big reveal.

Supporters consulted, decision made, perfect new badge. #proud @MCFC pic.twitter.com/v6eJj6hY5n

— David Tragen (@David_Tragen) December 26, 2015

What they were essentially doing was building a dossier of evidence to support their launch, using content to give people ample opportunity to offer feedback.

As long as you can keep out online mischief makers looking to derail the voting process (4chan famously managed to get Miley Cyrus to the top of Time Magazine’s 2013 Person of the Year poll), there’s no excuse not to involve those who have actively shown an interest in your business in decisions around the products you want them to buy.

From views to dwell time to shares to comments, every piece of content you publish gives you an idea of how people feel about what you’re saying, which can in turn inform the approach the rest of the business takes.

As an aside to this seven part series, check out Ayima’s free DIY Content Marketing Strategy ecourse, designed to help you improve the ROI of your content.

Stop thinking about long-tail keywords and start focusing on searcher intent

Over the years, the usefulness of certain types of keywords has been debated, analyzed, celebrated, and even disparaged.

Long-tail keywords – those specific phrases of low-volume but perhaps higher-quality queries from searchers who are closer to taking action on procuring the product or service they seek – have certainly received a heck of a lot of recognition for their value to marketers.

However, I am here to declare the demise of these keywords that we held in such high regard only a few short years ago.

Please let me explain…

As the use of search has evolved and search engine optimization has become commonplace, businesses have succeeded in increasing their visibility in search results and made adjustments to be most visible for those queries they care most about.

This, by itself, would be fine; a positive and helpful thing actually, if the end effect was search results pages all containing exactly what the user was searching for. However, the issue we’ve seen is that queries on many broad keywords no longer provide the relevant results that a searcher wants.

A search engine user looking for information now often uses one of these two methods to arrive at the search results they need:

  • They start with a broad search and continue to refine that search until they get to appropriately relevant results.
  • They mentally refine their search, knowing the broad results will not bring what they want, so they begin with a more specific search and refine fewer times.

Certainly, longer search queries are becoming the norm. Part of the issue here is that Google has populated broad queries with many different universal result offerings… News, Images, Videos, Knowledge Graph. This moves those specific, relevant pages that many searchers are actually looking for further down the page – or possibly onto the next page.

Then, on top of those universal results, we have results like Wikipedia and educational or governmental pages that don’t exactly fit the intent of the user’s search either.

Now that searchers find it necessary to refine their queries further and further to reach the precise results they want, I suggest that what we once called long-tail queries are now simply queries. The keywords users commonly rely on have become so lengthy and diffuse that the distinction between head and long-tail terms serves little function.

Thus, the term ‘long-tail’, to me, no longer exists. The once novel concept of paying attention to long-tail keyword queries is now so commonplace that it can go without being said. Long-tail keywords are now just the queries we all use to actually find what we need. Our ability to identify specific combinations of words that lead to our desired results will continue to evolve.

This is why the SEO community is moving away from the targeting of specific keywords or queries to instead thinking about themes and the searcher’s intent.

If we, as marketers, ditch our focus on query length and instead drive the focus toward the theme of the content, we can then start to adjust that content to make sure that what we are truly providing is the search destination that our potential clients and customers have in mind.

Ready to incorporate this mindset? Start with asking these three questions of your business’s website:

  • Are we satisfying the searchers’/customers’ journey?
  • Are we giving them what they need and in the way they need it?
  • What do our customers need from us that we aren’t currently providing, but should?

Kevin Gamache is Senior Search Strategist at Wire Stone, an independent digital marketing agency for global Fortune 1000 brands.

Google has launched Accelerated Mobile Pages

amp pages symbol

Welcome to a speedier mobile web.

It’s been a massive week for the Google SERPs this week and it’s only Tuesday. As well as Google killing Right Hand Side Ads and shutting down its own comparison service, it seems that Google has also launched its Accelerated Mobile Pages project.

Although we had been expecting it around next weekend, as Google stated it would be here “late February”, it seems some people have been seeing results with a little AMP icon today (Tues 23 Feb).

@sewatch Sure, try this one pic.twitter.com/HVjCg1CEPA

— Mark Chalcraft (@markchalcraft) February 23, 2016

The AMP project is an open source initiative which aims to improve the performance of the mobile web. As our own Rebecca Sentance explained in her article on Google’s AMP only yesterday, AMP pages are a “stripped-down version of the mobile web which runs on a reinvented version of HTML.”

Google has stated that a page created with AMP HTML can load anywhere from 15 to 85% faster than the non-AMP version of that page.

What this means for users is a much faster mobile web, and for publishers using AMP a likely boost in search rankings, as site speed and mobile friendliness are both vital for user experience.

It was informally speculated during the announcement of AMP back in December that these pages created using AMP might receive a ‘fast’ label (similar to the ‘mobile friendly’ label), but it seems that Google is going with AMP and a little lightning bolt…

For us within the search industry it may be obvious what this label means, but is that the case for regular users?

From asking around the non-SEW parts of the office, nobody guessed that the symbol meant faster loading pages. One person, who will remain anonymous, replied, “louder pages.”

Fair enough to be honest.

The point is, there perhaps needs to be a lot more education from Google in order to tell people what this symbol means, if it’s not going to use something more obvious like ‘fast’. Especially if AMP pages are meant to benefit the user and are Google’s preferred way of improving the mobile web.

Evolution of call tracking in a mobile first world

Marchex click to call stats

Sponsored content in collaboration with Marchex. Views expressed in this article are those of the guest author and do not necessarily reflect Search Engine Watch’s opinions.

If you answer your phone or want your phone to ring as a result of your marketing, in particular search engine marketing, then you’ll want to read the following interview.

I had the privilege of sitting down with Adarsh Nair, Senior Director, Search Product and Engineering at Marchex. Adarsh filled me in on the latest advances in call tracking and marketing automation that have been baked into their Call Analytics platform and specifically Marchex Search Analytics.

Kevin Lee: Why is tracking and 100% attribution at the keyword level important?

Adarsh: We hear a lot about big data and the power of data. Search is a very sophisticated media channel with hundreds of millions of keywords to bid on, leading to trillions of dollars in sales. This is a big data problem. If you break it down to the basics, each keyword is a market of customers searching for something specific. And marketers have a cost to get their ad to show up in front of customers.

It is our strong belief that for the best ROI optimization, search marketers need to have the most granular data (keyword level), complete data (online and offline sale attribution) and the tools to make sense of data.

Keyword level attribution for online and offline sales is just the beginning to building a great search optimization strategy.

Why do some marketers prefer to take orders over the phone?

For industry verticals such as auto, financial services and travel, the product being sold is sometimes complex and expensive. These verticals also see a fair share of companies competing for the same customer. Getting a consumer on the phone and/or in a store increases the opportunity to convert a prospect into a customer through excellent customer service. The human connection in many cases makes the difference.

Separately, it is also important to note that we are seeing marketers responding to customer choice.

Consumers are choosing to call businesses from the web using their mobile phones. This has pushed Google, Facebook and other publishers to respond with mobile advertising formats that incorporate a click to call.

I get it. For many products, particularly those with lots of options or service offerings with many service levels it’s better to have a rep talk to a potential customer because the conversion rate to sale is higher AND the average sale price is higher. Plus you get fewer returns or customer service issues when the buyer gets the right product or service.

For many marketers, being able to attribute call conversions to search media at the keyword level results in ROI being properly stated and the PPC bid reserve price going up. Can you explain why this is important?

Having the ability to raise one’s keyword bid while maintaining a high measured ROI can facilitate higher positions for the keyword, if the competition doesn’t escalate bids.

Anyone who wants site visitors to call should be tracking phone conversions, but which industry categories are having the greatest success with your platform?

We see two key customer journey paths when it comes to calls from search. First, customers are choosing to click to call from the search ad itself. Second, customers click through to the landing page or the site, and call a number on the site. The key industry verticals for Marchex, where calls make up more than 20% of the overall conversions, are Auto, Financial services, Travel, Home Services, Telco and Cable services.

Can you provide specific examples that might shock readers as to just how big a difference call tracking can make?

One of our customers in financial services faced a challenge. Mobile CPCs were going up, leading to higher cost per acquisition (CPA), and around 40% of their total conversion came through calls. The customer had full visibility into online conversions driven from keywords, but did not have visibility into call conversions.

The lack of a complete picture that showed total conversion from a keyword caused optimization challenges and cost per acquisition remained high. Marchex Search Analytics helped the customer develop a complete picture of total conversions (online and offline) from each keyword.

With call conversion added in, the customer saw the top performing keywords change. CPAs of specific keywords dropped by more than 50%. The customer used the data to revamp their search strategy and in a three month timeframe reduced overall CPA by ~10% and drove ~15% more conversions. And this is for a company with a massive paid search budget.

In addition to simply understanding phone conversions and improving one’s PPC media optimization what other insights can one extract from calls using your platform?

In addition to calls, duration, conversions, there are three categories of data/insights that Marchex Search Analytics provides:

1) Deep call insights through machine learning: CallDNA is a Marchex technology the platform uses to provide deep insights into what happened on the call. Search marketers find a variety of uses for this data.

As an example, one of our customers in the Auto industry looks for a specific phrase in calls as it’s an indicator for future sales. Getting a sense of keywords that drive calls with the specific phrase helps our customer invest in the right keywords to drive demand. Another example is where a customer in the Telco/Cable vertical uses our platform to understand which keyword drives conversation vs. hangups/misdials. This helps the customer invest in keywords that drive the right kind of calls into their call centers.

2) Consumer touch points along the call flow: The call consumer journey is very similar to the website consumer journey. Just as a website’s ease of use and responsiveness determine the consumer experience, the success of the call journey is determined by how intuitive the IVR is and how responsive the agent on the phone is.

Marchex Search Analytics is able to surface the IVR input from the consumer back to the search marketer at the keyword level. Many search marketers use this feature to determine how many new customers are being driven through the call flow by tapping the ‘new customer’ IVR input at the keyword level.

3) Enhanced conversion data: We also support advanced conversion data. Many platforms discuss bringing in conversion counts or using a proxy for conversion like ‘calls above a certain duration’. Marchex Search Analytics partners with our customers to bring in conversion data at the keyword level that includes total sales transactions, revenue driven from those transactions, and the product SKUs that drove the conversion.

Are there any important things marketers should know between click to call originated calls vs. in-ad phone numbers being dialed vs. landing pages with a phone call to action?

In an increasingly mobile world, customers in industry verticals where calls are important are seeing more than 60% of their calls come from in-ad phone numbers. Customers should do the due diligence to understand if the call tracking solution they use can seamlessly provide call conversion data at the keyword level for both in-ad phone numbers (also referred to as call extensions or call only campaigns) and landing page based phone numbers.

For keyword level tracking of calls from in-ad phone numbers, we advise against hacks such as mapping adgroups to single keywords. Such strategies are expensive, do not scale for enterprise customers and could have a negative impact on quality score. Marchex Search Analytics works with the existing search campaign structure of our enterprise customers and we are able to provide a seamless experience by pushing the new keyword level click to call data to bid optimization platforms automatically.

Finally, we are also seeing that keywords that drive calls from in-ad phone numbers typically differ from the keywords that drive calls from landing pages. Having granular attribution for calls from in-ad phone numbers and landing pages will be critical for best in class optimization strategies.

Many marketers want to know the system they pick has been around a while to be robust and stable. How long has Marchex been in the call tracking sector? What are the other reasons that marketers know and trust your system?

Marchex has been in the call tracking space for close to 10 years. Marchex is the trusted partner for Fortune 1000 enterprise brands and we are the largest call analytics provider with more than 300 million calls flowing through our systems every year. Finally, Marchex invests in product innovation and our enterprise customers choose Marchex due to cutting edge innovations such as search, display, video and site analytics for call conversions.

What else should marketers know when they evaluate call tracking solutions?

Call conversions are driven from a variety of media channels, which includes search, display, mobile video, email. While direct attribution of calls from search is important, marketers are now beginning to realize that display and mobile video are influencing call conversions in a big way. Marketers should consider a call analytics platform that has the capability to track view-through call conversions from display and mobile video and can provide cross channel attribution for call conversions.

I’m pleased to have learned a lot about call tracking from Adarsh, and I hope you, the reader, learned something too. Of course, you can learn more here: Marchex Search Analytics.

*Sponsored content in collaboration with Marchex. Views expressed in this article are those of the guest author and do not necessarily reflect Search Engine Watch’s opinions.