What can we learn from the winners and losers of organic search in 2016?

A line graph plotting the winners and losers of SEO visibility in 2016 against the various Google updates throughout the year.

Who were the winners and losers of organic search in 2016?

For the third year in a row, Searchmetrics has published its annual Winners and Losers Report, which reveals how certain sites fared in organic search visibility on Google.com over the course of 2016.

Searchmetrics’ analysis is based on the ‘SEO Visibility’ of each website, an indicator developed by Searchmetrics to measure a webpage’s performance in organic search.

This is not the same as organic search ranking, but aims to give an overview of how often a website shows up in search results, based on “search volume and the position of ranking keywords” (as set out in the Searchmetrics FAQ).

Based on this metric, Searchmetrics then analysed the change in websites’ SEO Visibility during the course of the year, and sorted the top 100 winners and losers by absolute change in visibility.
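
Searchmetrics doesn’t publish its exact formula, but a rough mental model is to weight each ranking keyword’s search volume by an estimated click-through rate for its position and sum the results. The short Python sketch below is purely illustrative – the CTR curve, keyword data and weighting are assumptions, not Searchmetrics’ actual methodology.

```python
# Illustrative only: a toy approximation of an "SEO visibility"-style score.
# The CTR-by-position curve and keyword data are assumptions, not Searchmetrics' formula.

# Rough organic CTR estimates by ranking position (hypothetical values).
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                   6: 0.04, 7: 0.03, 8: 0.025, 9: 0.02, 10: 0.015}

def visibility_score(rankings):
    """rankings: list of (keyword, monthly_search_volume, position) tuples."""
    score = 0.0
    for keyword, volume, position in rankings:
        # Keywords ranking beyond page one contribute nothing in this toy model.
        score += volume * CTR_BY_POSITION.get(position, 0.0)
    return score

# Example: a site ranking for three keywords.
site_rankings = [
    ("black friday deals", 90000, 1),
    ("winter coats", 40000, 3),
    ("cheap scarves", 5000, 12),
]
print(round(visibility_score(site_rankings)))  # weighted sum of volume x estimated CTR
```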

While the results are limited to Google.com – and thus are mostly applicable to websites from the U.S. – they are an interesting insight into how the trends and algorithm changes that we cover throughout the course of the year affect sites in real terms.

So who did well out of 2016, and what was the secret to their success? What caused the downfall of the poor performers?

The winners: social media and shopping

Among the biggest winners, although only making up 10% of the ‘winner’ sites overall, was social media. Instagram, Twitter, Pinterest and Facebook all ranked in the top 10 in terms of absolute gains, with Pinterest the surprise leader of the pack – it came in at #2 in terms of overall visibility (Google, surprise surprise, was #1) and enjoyed a whopping 80% gain in visibility during the year.

We’ve already covered Pinterest’s prowess as a platform for visual search, which goes a long way towards helping keep users on its site by finding Pins which are visually similar to what they’re looking for. Searchmetrics attributes Pinterest’s success in SEO visibility to its forays into deep learning, which “generate more relevant results that evolve based on user behavior.

“Like Apple, Google (and even Searchmetrics), Pinterest’s application of deep-learning techniques is a key step in helping software understand the ultimate intent of a user, thereby generating more loyalty and stickiness online.”

While not every site has the resources or the technology to experiment with deep learning, the idea of generating “stickiness” and giving users an incentive to come back to the site again and again is something all site owners/SEOs can aim for (and many already do). Improving a site’s internal search to help users find what they’re looking for more easily is another way to achieve this.

Shopping was another significant category of site that did well in 2016, with nearly a fifth (19%) of the ‘winning’ sites falling into this category. Searchmetrics attributed this trend to “relatively steady economies” and “strong economic recovery”.

Among the successful ecommerce sites in 2016 were ebay.com, target.com, retailmenot.com and walmart.com. Two of the retail sites which saw the biggest percentage gains in visibility were theblackfriday.com (217%) and blackfriday.com (212%), which illustrates the SEO power of exact-match domains – when used carefully.

Media: the ups and downs

Media and publishing was the largest category of ‘winners’ in 2016, with nearly two-fifths (38%) of sites that gained SEO visibility falling into this category. However, a large number of media sites also lost visibility in the same period, with the likes of Wired, USAToday, NYMag, Time and BuzzFeed all appearing in the ‘losers’ category. So why the discrepancy?

Searchmetrics’ analysis pointed to the ‘Google News-Wave’ update of 2015 as one possible cause. The ‘News-Wave’ update, as dubbed by Searchmetrics, was a mysterious update to Google’s core algorithm which caused a lot of media, magazines and news websites to rise in search visibility.

It’s possible that this bump in visibility has now been reset by other adjustments, causing the affected publishers to sink back down in the rankings, or that the algorithm change introduced more volatility in general for media publishers in search.

Searchmetrics’ graph plotting SEO visibility for theatlantic.com in 2016 – showcasing a huge drop in visibility midway through April, followed by a slow climb back to around half of the site’s previous level

Google’s Panda algorithm update also rewards quality, in-depth content above thinner, more short-form content, which may be the reason behind a rankings drop for some publishers.

However, not all publishers lost out. Among the ‘winners’ in publishing were the New York Times, The Huffington Post and the LA Times, as well as more niche publications like GameSpot, Livescience, Rolling Stone and Women’s Health Mag.

The bigger news titles will likely have benefited from the huge flurry of political events which dominated the headlines in 2016, but for smaller publications, it seems as though focusing on a specific interest area is the way to go for better SEO visibility.

Lyrics out, music in?

Another notable trend to be observed from the data is that dictionary and encyclopaedia websites frequently numbered among the losers in 2016 – though not exclusively: four or five dictionary or encyclopaedia sites were still found among the winners, including Oxford Dictionaries and Macmillan Dictionary.

However, online encyclopaedias were far more likely to lose out in SEO visibility than to succeed, with major sites like Wikipedia, Wiktionary, Urban Dictionary and Thesaurus.com all suffering a loss in visibility.

This trend is probably attributable to Google’s increasing inclusion of encyclopaedia-style information on the front page of search in the form of Quick Answers, featured snippets and Google Knowledge Graph, reducing the need for users to click through to a reference site for more information.

If Google makes user ratings for films and TV shows a permanent feature in search, we might start to see a decline in the visibility of film and TV ratings websites like Rotten Tomatoes as well.

A downward trend in SEO visibility for thefreedictionary.com throughout 2016

Another type of reference site also suffered a drop in visibility in 2016: lyrics sites. Metrolyrics.com and Lyricsmode.com saw big drops in visibility of 43% and 54% respectively, although A-Z Lyrics saw a small rise of 7%. Meanwhile, music providers like Apple, Spotify and Deezer all saw big gains.

The fall in lyrics sites doesn’t mean that lyrics are no longer in demand, but rather that users are getting them from elsewhere. The likes of Amazon Music and Apple Music are increasingly including lyrics with the products they sell, while music publication Genius (which rose in visibility by 34%) also provides lyrics on its site.

While it’s probably too early to declare the death of the lyrics site, sites which only provide one thing will inevitably become obsolete as soon as people start being able to get that thing elsewhere.

What can we learn from the winners and losers of 2016?

Some of the ups and downs of the Searchmetrics Winners and Losers Report are reflective of general online trends, and not necessarily anything that SEOs need to account for (although if you run a lyrics website, take note!). But from the rest, there are a few things we can learn:

  • The ‘stickier’ your website and the more people like to return to it, the better. It also helps if you have put thought into user intent in navigation and internal search
  • Well thought-out, high quality content continues to be a winner with Google’s algorithms
  • When it comes to publishing, targeting a specific niche – rather than trying to spread your coverage over too wide a ground – will also help you rank in search
  • For reference sites, Google’s Quick Answers and featured snippets can be a real blow. But for the rest of us, they can be a huge boost in visibility if you know how to cater to them properly.

Does dwell time really matter for SEO?


What’s the real impact of machine learning on SEO? This has been one of the biggest debates within SEO over the last year.

Please note, this article was originally published on the Wordstream blog; it is reprinted with permission.

I won’t lie: I’ve become a bit obsessed with machine learning. My theory is that RankBrain and/or other machine learning elements within Google’s core algorithm are increasingly rewarding pages with high user engagement.

Basically, Google wants to find unicorns – pages that have extraordinary user engagement metrics like organic search click-through rate (CTR), dwell time, bounce rate, and conversion rate – and reward that content with higher organic search rankings.

Happier, more engaged users means better search results, right?

So, essentially, machine learning is Google’s Unicorn Detector.

Machine Learning & Click-Through Rate

Many SEO experts and influencers have said that it’s totally impossible to find any evidence of Google RankBrain in the wild.

That’s ridiculous. You just need to run SEO experiments and be smarter about how you conduct those experiments.

That’s why, in the past, I ran an experiment that looked at CTR over time. I was hoping to find evidence of machine learning.

What I found: results that have higher organic search CTRs are getting pushed higher up the SERPs and getting more clicks.

Click-through rate is just one way to see the impact of machine learning algorithms. Today, let’s look at another important engagement metric: long clicks, or visits that stay on site for a long time after leaving the SERP.

Time on Site Acts as a Proxy for Long Clicks

Are you not convinced that long clicks impact organic search rankings (whether directly or indirectly)? Well, I’ve come up with a super easy way that you can prove to yourself that the long click matters – while also revealing the impact of machine learning algorithms.

In today’s experiment, we’re going to measure time on page. To be clear: time on page isn’t the same as dwell time or a long click – that is, how long people stay on your website before they hit the back button to return to the search results from which they found you.

We can’t measure long clicks or dwell time in Google Analytics. Only Google has access to this data.

Time on page really doesn’t matter to us. We’re only looking at time on page because it is very likely proportional to those metrics.

Time on Site & Organic Traffic (Before RankBrain)

To get started, go into your analytics account. Pick a time frame before the new algorithms were in play (i.e., 2015).

Segment your content report to view only your organic traffic, and then sort by pageviews. Then you want to run a Comparison Analysis that compares your pageviews to average time on page.

You’ll see something like this:

A Google Analytics content report comparing pageviews with average time on page for the top organic pages, before RankBrain

These 32 pages drove the most organic traffic for us in 2015. Time on site is above average for about two-thirds of these pages, but it’s below average for the remaining third.

See all those red arrows? Those are donkeys – pages that were ranking well in organic search, but in all honesty probably had no business ranking well, at least for the search queries that were driving the most traffic. They were out of their league. Time on page was half or a third of the site average.
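
If you’d rather not eyeball the report, the same check is easy to script against an exported CSV. The following is just a minimal sketch assuming hypothetical column names (page, pageviews, avg_time_on_page) from an organic-segmented content report; adjust it to whatever your analytics export actually contains.

```python
# Minimal sketch: flag "donkey" pages whose time on page is well below the site average.
# Assumes a CSV exported from an organic-traffic content report with these
# hypothetical columns: page, pageviews, avg_time_on_page (seconds).
import pandas as pd

df = pd.read_csv("organic_content_report_2015.csv")

site_avg = df["avg_time_on_page"].mean()
top_pages = df.sort_values("pageviews", ascending=False).head(32)

# Pages at half the site average or less are the likeliest "donkeys".
donkeys = top_pages[top_pages["avg_time_on_page"] <= site_avg * 0.5]

print(f"Site average time on page: {site_avg:.0f}s")
print(donkeys[["page", "pageviews", "avg_time_on_page"]])
```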

Time on Site & Organic Traffic (After RankBrain)

Now let’s do the same analysis. But we’re going to use a more recent time period when we know Google’s machine learning algorithms were in use (e.g., the last three or four months).

Do the same comparison analysis. You’ll see something like this:

The same comparison of pageviews with average time on page for a recent period, after RankBrain

Look at what happens now when we analyze the organic traffic. All but two of our top pages have above average time on page.

This is kind of amazing to see. So what’s happening?

Does Longer Dwell Time = Higher Search Rankings?

It seems that Google’s machine learning algorithms have seen through all those pages that used to rank well in 2015, but really didn’t deserve to be ranking well. And, to me, it certainly looks like Google is rewarding higher dwell time with more prominent search positions.

Google detected most of the donkeys (about 80 percent of them!) and now nearly all the pages with the most organic traffic are time-on-site unicorns.

I won’t tell you which pages on the WordStream site those donkeys are, but I will tell you that some of those pages were created simply to bring in traffic (mission: successful), and the alignment with search intent wasn’t great. Probably someone created a page that matched the intent better.

In fact, here’s one example: Our AdWords Grader used to rank on page 1 for the query “google adwords” (which has huge search volume – over 300,000 searches a month!). The intent match there is low – most people who search for that phrase are just using it as a navigational keyword, to get to the AdWords site.

A small percentage of those searchers might just want to know more about Google AdWords and what it is, kind of using Google as a way to find a Wikipedia page. There’s no indication from the query that they might be looking for a tool to help them diagnose AdWords problems. And guess what? In 2015, the Grader page was one of those top 30 pages, but it had below average time on site.

So at some point, Google tested a different result in place of the Grader – our infographic about how Google AdWords works. It’s still ranking on page 1 for that search query, and it matches the informational intent of the keyword much better.


A Few Caveats on the Data

To be clear, I know these analytics reports don’t directly show a loss in rankings. There are other potential explanations for the differences in the reports – maybe we created lots of new, super-awesome content that’s ranking for higher-volume keywords and they simply displaced the time-on-site “donkeys” from 2015. Further, there are certain types of pages that might have low time on site for a perfectly acceptable reason (for example, they might provide the info the user wants very quickly).

But internally, we know for sure that a few of our pages that had below average time on site have fallen in the rankings (again, at least for certain keywords) in the past couple of years.

And regardless, it’s very compelling to see that the pages that are driving the most organic traffic overall (the WordStream site has thousands of pages) have way above average time on site. It strongly suggests that pages with excellent, unicorn-level engagement metrics are going to be the most valuable to your business overall.


This report also revealed something ridiculously important for us: the pages with below average time on site are our most vulnerable pages in terms of SEO. In other words, those two remaining pages in the second chart above that still have below average time on site are the ones that are most likely to lose organic rankings and traffic in the new machine learning world.

What’s so great about this report is you don’t have to do a lot of research or hire an SEO to do an extensive audit. Just open up your analytics and look at the data yourself. You should be able to compare changes from a long time ago to recent history (the last three or four months).

What Does It All Mean?

This report is basically your donkey detector. It will show you the content that is most vulnerable to future incremental traffic and search ranking losses from Google.

That’s how machine learning works. Machine learning doesn’t eliminate all your traffic overnight the way a Panda or Penguin penalty can. It’s gradual.

What should you do if you have a lot of donkey content?

Prioritize the pages that are most at risk – those that are below average or near average. If there’s not a really good reason for those pages to have below average time on site, put them at the top of your list for rewriting or fixing up so they align better with user intent.

Now go look at your own data and see if you agree that time-on-site plays a role in your organic search rankings.

Don’t just take my word for it. Go look: run your own reports and let me know what you find.

Brand TLDs vs .com: why the world’s biggest brands are making the switch to their own web extension

New Top-Level Domains (TLDs) have become more popular over the last couple of years, so here’s everything you need to know about them.

Please note: this content is produced in association with Neustar.

What are New TLDs?

TLDs are the suffix component of domain names. Recent deregulation means legacy TLDs such as .com and .net, along with extensions such as .co.uk, now face competition from hundreds of new options that allow consumers to register domain names under extensions such as .club, .luxury or even .nyc for New Yorkers.

These changes have also paved the way for over 550 of the world’s largest brands to apply for their own branded extensions – think .ibm instead of ibm.com – where IBM would own all the domains to the left hand side of the dot and create shorter, potentially more memorable URLs.

Who applied for a .brand TLD?

As a result of the hefty application fee (almost US$200,000), you can pretty much guess the types of organizations that have taken the plunge on this digital asset.

From tech giants such as .apple, .google and .microsoft, through to the majority of top tier banks such as .citi, .hsbc and .chase, the big end of town has secured these .brand TLDs.

Other notables include big players in retail (.walmart, .nike, etc.), automotive (.ford, .ferrari, etc.) and sport (.nfl, .mlb, etc.).

Why might .brand TLDs be so important?

.brand TLDs are quite a shift from the simple brand.com model that we’re all accustomed to, but industry experts claim significant benefits both to marketers and to consumers.

These include greater flexibility in domain name selection, simplified calls to action in advertising, and the ability to reduce reliance on third parties, such as social media or search, for customer acquisition.

However, for consumers the benefits perhaps aren’t so clear, and significant education is required.

One potential benefit for consumers is that .brand TLDs offer the kind of simplified navigation we haven’t seen online since the halcyon days of Windows 2000 – back when websites were so much smaller, had two or three navigation options each, and you could find whatever you wanted within one or two clicks (assuming you knew how to get to the website in the first place).

Now even the best-designed corporate website has hundreds of products, variants, and geographic content presentation intricacies which have driven the significant growth in the use of search as the means of navigating the web.

But for the large brands already making this shift, the hope is that .brand TLDs will allow us to simply add a ‘dot’ and get straight to what we want via domains such as jets.nfl or airmax.nike.

Examples of .brand TLDs

There are nearly 600 .brand TLDs, and while many brands have yet to use them in major advertising campaigns, there are numerous live examples on the web.

A number of household brands are among them. Canon announced in May 2016 the launch of its own TLD, transitioning its existing “www.canon.com” domain to “global.canon”. Similarly showing technical leadership is European banking giant Barclays, which transitioned from www.barclays.com to www.home.barclays.

Other .brand TLD owners have begun their TLD usage in more measured steps, such as BMW, which celebrated its 100th anniversary with the launch of a site championing its plans for the next 100 years of car design, cleverly located at www.next100.bmw.

Powerhouse digital organization Google applied for 101 TLDs including matches for the majority of its best known products like .gmail, .chrome, .youtube, .android and lots more. In late 2016, Google started rolling out domains like www.blog.google and www.environment.google as the first steps in its .brand TLD strategy.

Some organizations are also finding ways to introduce their .brand TLDs without transitioning or building new content, while still allowing for clearer, more memorable calls-to-action. An example is Microsoft, whose simplified domain www.surface.microsoft redirects to the existing site at www.microsoft.com/en-ca/surface.

These are just a handful of the thousands of .brand domains currently in use, but trends are already developing, and some of the world’s largest and most recognized organizations are beginning to innovate in the way they build domain names and use them in their marketing and digital strategies.

Stay tuned for part two of our introduction to .brand TLDs as we look deeper at the potential benefits of .brand TLDs and what they mean for the future of digital.

If you want to learn more, join our webinar hosted by ClickZ Intelligence, Neustar and other industry experts on 28th February. We will cover everything you need to know about branded TLDs, exploring their history, benefits, limitations, implications and everything in between.

Investing in video: when and how to succeed


In digital marketing, we’re always trying to keep up with the hottest new thing – advertising methods, ad types, targeting types, etc. – being pitched heavily within the industry in general.

From the “year of mobile” to the “year of RLSA,” there is always another trend to consider investing heavily in.

Over the last year-plus, video has been most frequently cited as the new digital frontier. Whether it’s important to a comprehensive marketing campaign isn’t the question, though. The question is how do we best leverage it?

In this post, we’ll talk about when – and how – to put video in play for your marketing campaigns.

First, you need to determine if video will even be beneficial for your company.

Videos are best used as an educational/informational tool to help relay info to your audience. If you have a business that requires some explanation of the service or product, or have a variety of advanced features that need to be showcased, or even need to establish credibility and trust for the user to move forward, video can be key for your growth.

If you’ve determined that your business would indeed benefit from adding or expanding on video, how should you leverage it in your advertising efforts? What are some strategies to do so?

Well, below are some tips to make the most out of Facebook and YouTube video for direct-response/performance-driven efforts:

Start with Facebook

Facebook is probably the best platform to leverage video from the direct response perspective. You can get extremely granular with its targeting capabilities and ensure that you are reaching highly relevant audiences to whom you can introduce your brand and explain its value proposition.

As a reminder, the best practice is to keep video length less than 30 secs; that’s about as long as you can plan to keep a user’s attention.

Initially, you’ll want to use Facebook videos in your prospecting efforts. These videos will serve as a first touch to audiences who haven’t heard of you or don’t know you well.

The goal of these video ads is twofold: educate the user and also determine which users are actually interested in what you have to offer.

How do you determine that? Well, Facebook creates audience lists based on how much of a video users have viewed. If someone has watched your full video, they likely have a relatively high level of interest in your product or service.

Once you’ve identified this group, take that audience list of users who have completed the video and serve remarketing ads to them, driving them to your site and getting them to convert.

Now perhaps someone came to your site via another channel – paid search, organic, etc. You can also leverage Facebook video ads to further convince users who haven’t converted that you are right for them.

One powerful ad type within Facebook is Carousel Ads, which let you show 3-5 images, concepts, and messages to help get your point across, deliver value props, and get people to convert.

What many people don’t realize is that you can actually incorporate video into one of your carousel cards. This becomes extremely effective with remarketing as it allows you to relay numerous different messages while also providing the user an educational video to further convince them.

Use YouTube for remarketing

We all know about YouTube and its huge traffic numbers. Of course you should consider advertising here, but note that YouTube is often seen more as a branding play than a direct-response one. The way to really make YouTube effective with DR in mind is to leverage it for remarketing.

My recommendation is to develop and segment audience lists of users who visit your site but do not convert based on their interaction with the website (for example, people who get to a sign-up page have shown higher intent than someone who has only gotten to the home page).


Then test various different audiences using different video assets. Essentially, you should aim to further educate these audiences via YouTube and test which videos tend to work better with higher-intent audiences vs. those in research mode.

Video can be an impactful format when trying to reach your audience and scale your business – but not before you determine when to use it, which channels to leverage it on, and how to invest your budget wisely.

All you need to master your site speed without getting overwhelmed


Poor website performance is one of the most widespread problems for business websites, and one of the most damaging, hurting your business on many levels, from lost customers to a bad reputation.

These easy-to-use tools will help you solve the problem.

Despite what some people may think, site speed is not a purely technical issue. Marketers have been talking about the necessity of speeding up your page load for ages. Not only does poor page load time hurt your site usability, but it also hinders your rankings (by screwing up your page engagement metrics), conversions, social media marketing campaign performance and so on.

Fixing the page load time issue is not that easy though. It does take some development budget and good diagnostic tools. Luckily, I can help you with the latter:

Page Speed Insights

Google’s Page Speed Insights measures your page speed and provides PageSpeed suggestions to make your website faster.

The PageSpeed Score ranges from 0 to 100 points. A score of 85 or above means your page speed is optimal. The tool distinguishes two main criteria: how fast your above-the-fold content loads and how fast the whole page loads. Each page is tested for mobile and desktop experience separately.

Each PageSpeed suggestion is rated based on how important it is.
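
If you want to pull these scores programmatically (for example, to track a batch of URLs over time), the tool is also available as an API. The sketch below is a rough illustration that assumes the current v5 runPagespeed endpoint and its Lighthouse performance score; the API version, quota rules and response fields change over time, so check Google’s documentation before relying on it.

```python
# Sketch: query the PageSpeed Insights API for a URL's performance score.
# Assumes the v5 runPagespeed endpoint and the Lighthouse response layout;
# both have changed over the years, so treat the field path as an assumption.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def pagespeed_score(url, strategy="mobile"):
    resp = requests.get(API, params={"url": url, "strategy": strategy})
    resp.raise_for_status()
    data = resp.json()
    # Lighthouse reports performance as 0-1; multiply by 100 for the familiar scale.
    return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

if __name__ == "__main__":
    print(pagespeed_score("https://www.example.com"))
```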

Pingdom

Pingdom monitors your site and reports if it seems slow or down. It operates a network of over 60 servers to test your website from all over the world, which is very important for a global business website because your server location affects how well your site performs in different parts of the world.

Pingdom also has a free tool you can test here. While Pingdom is mostly known as an uptime monitoring solution (you can read about uptime here), it also does performance monitoring.

Because I monitor a lot of metrics for many websites, I use Cyfe to integrate Pingdom stats into my website monitoring dashboard:

A Cyfe dashboard showing Pingdom stats integrated alongside other website monitoring metrics

WP Super Cache

WP Super Cache turns your dynamic WordPress blog pages into static HTML files for the majority of your users. This way your web server serves static files instead of processing the heavier WordPress PHP scripts.

This plugin will help your server cope with traffic spikes. It makes pages load faster and stops traffic overloads from happening in the case of a viral hit.

If you operate a huge database-driven website, a better solution for you would be setting up a content delivery network.

Incapsula

Speaking of your website being slow in remote parts of the world, Incapsula is a premium tool that helps you solve that problem. The platform offers a reliable Content Delivery Network, i.e. a network of servers all over the world that allows your site visitors to load files from the server located closest to them.

This means your site is fast wherever your future customers choose to load it from.

If you want to know more about how CDN works, here’s a very good resource to read and bookmark.


Compressor.io

Compressor.io is a handy tool that optimizes your image sizes to allow for faster page loads. As most web pages have images these days, this is a tool to bookmark and use.

Compressor.io reduces the size of your images while maintaining a high quality. You’ll be surprised to find no difference in your images before and after compression.

The tool supports the following image formats: .jpg, .png, .gif, .svg. I have found it invaluable for animated GIF compression, because all the tools I use produce really huge files.


The tool is absolutely free and there’s no need to register to use it. Your files will be stored on the servers for 6 hours and then deleted, so don’t forget to download your optimized images!
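
Compressor.io is a hosted tool, but if you need to batch-compress images locally you can get a similar (if less sophisticated) result with a few lines of Python and the Pillow library. This is a generic sketch, not how Compressor.io itself works; the folder names and quality setting are assumptions to tune for your own images.

```python
# Generic local alternative (not how Compressor.io works): re-save JPEGs with
# Pillow's optimizer. Folder names and the quality value are assumptions.
from pathlib import Path
from PIL import Image

def compress_jpegs(src_dir, out_dir, quality=85):
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    for path in Path(src_dir).glob("*.jpg"):
        img = Image.open(path)
        img.save(out / path.name, "JPEG", optimize=True, quality=quality)
        print(f"{path.name}: {path.stat().st_size} -> {(out / path.name).stat().st_size} bytes")

compress_jpegs("images", "images_compressed")
```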

Have I missed any essential tool or resource? Please add a comment below!

Content repurposing is your hidden online marketing gem


Every marketer and small business owner knows how much time and effort goes into a piece of content.

It isn’t just writing a blog post—it is coming up with the idea, researching, developing, linking, keyword optimizing, and finally posting and promoting even after the article is live.

Coming up with a theme alone can be challenging, so why not get five or even ten pieces of content out of a single idea?

Content repurposing, which is ultimately using one idea or piece of content to make several different content pieces, is really the hidden gem of online marketing. Not only is this kind of content generation great for SEO, but it is also extremely valuable for online businesses looking to make the most of their marketing efforts.

Once they spend time and effort developing a piece of content, they can use it as a launching point for other types of content that can be easily generated.

WordStream points out that content repurposing does not just mean making the most of your efforts in content development, but also in helping you reach a new audience and bringing back content-favorites and successes.

Let’s look at those two aspects a little more closely:

  • Reach a Different Audience. You may already realize that different social media platforms reach different audiences in terms of content sharing, but the way that content is presented can also have a big impact on the kind of audience that it targets. For example, some users may be more likely to watch a video, while others benefit from an infographic or a webinar. By using the same message across multiple platforms in different forms you are expanding your target audience based on consumption preferences.
  • Use Oldies but Goodies! Sometimes content is just a hit and the kind of success a text piece gets exceeds expectations. If a particular piece of blog content got a lot of shares, likes, and comments, this is a good sign that you should think about repurposing it. While you can start repurposing a piece of content as soon as you post it, there is something to be said for dusty content that got a lot of attention a few months ago. Go with what you think your followers will enjoy in a different format.

Turning Text into Something New & Different

As mentioned above, you can begin to repurpose content as soon as you post it to your blog, or use something you posted a while back that gained a lot of attention.

The key is not necessarily what content you choose to repurpose, as much as it is about turning it into something new and different. Here are just some of the ways that you can turn text blog content into something different:

PDF How-To Guides

You can take existing blog content and turn it into a downloadable how-to guide for your clients. These PDF guides can be more than just text: you can incorporate visually appealing details to take concepts and present them in a new, tangible, and understandable way.

Make a Video

YouTube and Vimeo have become extremely popular platforms for hosting videos online. One of the most productive ways to repurpose content is actually turning your blog post into a video.

If you turn your content into a 3-5 minute script, you are set up nicely for an instructional video that repurposes content you’ve already created! If you already have 20 or even 50 blog posts written, you are nicely set up for an entire YouTube channel.

Since many people prefer watching videos to reading content, you are also setting yourself up for an expanded audience.

Visual Presentation Slides

Using a platform like SlideShare (popular with LinkedIn and other social media sharing sites), you can use your content as a starting point for visual presentations. Turning your content into this specific format allows people to take in the information at their own pace in a visually appealing way.

Create an Infographic

Infographics have really become the trend in marketing visuals for a reason—they present information in a fun, clear, and aesthetically appealing manner. You can easily create these kinds of graphics based on previous blog content you’ve written.

Platforms like Canva make them extremely easy to produce without a background in visual design. You can learn more about how to create awesome infographics here.


Start a Podcast

If you’re feeling like there is no way that you can use one piece of content to create all of these different repurposed forms, then fortunately, you’re wrong. Why? Because people tend to have their personal preferences for how they like to take in new information!

For example, many people really enjoy listening to podcasts over reading or watching a video. Not only that, it is a great way to get your content shared. You can submit your recorded podcast to the following sources to get even more exposure and shares:

  • http://www.apple.com/itunes/podcasts/
  • https://www.podomatic.com/login
  • http://www.podbean.com/
  • http://www.digitalpodcast.com/
  • http://www.blogtalkradio.com/

Host an Interactive Webinar

Having a live audience to share your information with is the trend at the moment (think Facebook Live or Snapchat Stories). Webinars can be promoted to your email-marketing list and social media channels, and can get a lot of attention depending on the kind of information you are providing.

Try to use content that would be most informative and make the biggest impact with your audience. Remember that you will have to create a visual and auditory presentation to share, and although it is live, you will want to have it planned out well in advance.

The key to webinars is getting a large active audience, so you will have to do some promoting well in advance and up to the very last minute to achieve the best results.


The Takeaway

There are so many ways to repurpose content and this list is by no means exhaustive—you can get as creative as you want!

The important thing to remember is that repurposing content is a hidden gem in marketing because it takes your hard work and efforts to the next level, and you are able to get much more reward out of your research and preparation.

Each piece of content is going to have different potential—some may be able to be repurposed seven different ways, while you may only get three out of another. This practice is extremely valuable for marketing efforts, and ultimately SEO and best-content practices!

Do you have any other ways that you have repurposed content? Let us know in the comments section below!

Amanda DiSilvestro is a writer for HigherVisibility, a full service SEO agency, and a contributor to SEW. You can connect with Amanda on Twitter and LinkedIn.

Six most interesting search marketing news stories of the week

The Meta search engine logo, with the Chan Zuckerberg initiative logo in the corner.

Welcome to our weekly round-up of all the latest news and research from the world of search marketing and beyond.

This week, we’ve got a bumper crop of stories from the search and marketing industry, including the Chan-Zuckerberg Initiative’s acquisition of an AI-powered search engine, new ad-targeting features on YouTube, the most popular emoji on Instagram, and the news that mobile search and YouTube are leading growth in Alphabet’s fourth quarter earnings.

Also, you’ll never guess who one of Google’s most prolific advertisers is – it’s Google.

Chan-Zuckerberg Initiative acquires AI-powered search engine, Meta

The Chan-Zuckerberg Initiative, the $45 billion philanthropic organisation founded by Mark Zuckerberg and his wife Priscilla Chan, has made its first acquisition – of a search engine. Meta is a search tool which uses artificial intelligence to make connections between scientific research, making it easier for researchers to search through and link together more than 26 million scientific papers. The Chan-Zuckerberg Initiative intends to make Meta, which was previously partly subscription-based, free for everyone to use after spending a few months enhancing the product.

“But wait!” I hear you cry. “Didn’t Search Engine Watch already run a story recently about a scientific search engine powered by AI?”

You’re absolutely right, astute reader – as Adam Stetzer reported earlier this month, Semantic Scholar is an AI-powered search engine for scientific research which is already free to use. While there’s no reason why the world can’t have more than one AI-powered science search engine, it will be interesting to see how the two different projects interact over the coming months and years.

YouTube adds new ad-targeting features

One of the biggest weapons in Google’s advertising arsenal is the sheer amount of data that it is able to collect about users’ search and browsing histories, in order to better target ads in their direction. Last Friday, it was revealed that Google is bringing that scary amount of knowledge to bear on YouTube by allowing advertisers to target users based on their Google account activity.

A blog post on the Google Inside AdWords blog explained:

Now, information from activity associated with users’ Google accounts (such as demographic information and past searches) may be used to influence the ads those users see on YouTube. So, for example, if you’re a retailer, you could reach potential customers that have been searching for winter coat deals on Google and engage with them with your own winter clothing brand campaign at just the right moment.

Al Roberts reported on the news for ClickZ this week and examined why Facebook could be the driving force behind Google’s decision to give advertisers more flexibility in how they target users on YouTube.

Instagram is making Stories more appealing to brands

In August of last year, Instagram debuted Stories: a new feature on its social network devoted to posts which disappear after 24 hours, and a direct and unashamed copy of the Snapchat feature of the same name. Despite a bit of mockery at first, response to Stories has been positive, with 150 million users enjoying the feature daily – and some saying that Instagram Stories has all but replaced Snapchat for them.

Now, Instagram is bringing in some additions to make Stories a more appealing prospect for brands, with new Business Insights available to users with business profiles, and full-screen photo or video ads appearing in between Stories.

Ads will be initially tested with 30 clients around the world, including Capital One, Buick, Maybelline New York, Nike, Yoox, Netflix, and Qantas.

These are 2016’s most popular emoji on Instagram

We’ve got a two-for-one special on Instagram stories this week, with a study by Quintly which has revealed exactly how and how often emoji have been used on Instagram.

Quintly analysed 20,000 Instagram profiles and 6.2 million posts during 2016 to observe how emojis have been used on the platform over the last year. Among its findings: 56% of Instagram profiles have used emoji so far, and their use increased by 20% during 2016 alone.

Also, the most popular emoji on Instagram is the camera 📷 – commonly used as a way of attributing photos, which might speak to the number of pictures on Instagram that aren’t created by the accounts uploading them.


One of Google’s most prolific advertisers is… Google itself

Google is the single biggest recipient of digital ad spend, with its well-oiled ad machine generating tens of billions of dollars of revenue every year. Now, an analysis by the Wall Street Journal and SEMRush has revealed that “ads for products sold by Google and its sister companies appeared in the most prominent spot in 91% of 25,000 recent searches related to such items. In 43% of the searches, the top two ads both were for Google-related products.”

Al Roberts took a look at the study’s methodology and findings over on ClickZ, and considered what this means in terms of conflicts of interest from the internet’s biggest search engine.

Mobile search and YouTube lead Alphabet’s revenue growth

Yesterday, Google’s parent company Alphabet announced its fourth-quarter earnings for 2016. Quartz reported that Wall Street was expecting Alphabet to post revenue of around $25 billion, but it in fact exceeded this prediction with more than $26 billion in revenue, up 22% over the same quarter the previous year.

A purple column graph showing Alphabet's revenue trending upwards, from roughly $7 billion in 2010 to $26 billion at the end of 2016.
Source: Atlas

In a press release, Alphabet CFO Ruth Porat said that the company’s “exceptional” growth was “led by mobile search and YouTube.” While this is interesting news for the search industry (especially ahead of Google’s mobile-first search index – coming soon to a search engine near you), the earnings report revealed that Alphabet’s non-search prospects haven’t been doing so well. Nearly 99% of Alphabet’s revenue came from Google, while its “Other Bets” – the other projects it is pursuing to diversify its revenue streams – posted a loss of roughly $1.1 billion.

Google is still finding ways to increase its revenue, and the company is by no means struggling to bring in the money. But thus far, its parent company hasn’t been too successful in shifting the focus away from the search and advertising it is best known for.

The ONLY lesson from every social media brand fail example ever


Everyone wants to go viral on social media. But sometimes your brand ends up going viral for the wrong reasons.

After every such social media brand fail, we experience a familiar cycle. Somebody (or multiple somebodies) instantly shames them on social media. The brand (usually) apologizes. The world moves on.

Then a few days later, it happens. A respected industry publication publishes something like this:

X Social Media Lessons From [Brand’s] [Social Media Update About Whatever]

In 2016 – truly a year filled with disasters if there ever was one (and one best summed up by Vice in April after Prince died) – social media blunders still managed to spark swift and bitter outrage.

You see, apparently there are still some lessons that social media managers, directors, and coordinators need to learn.

I disagree. There’s only one lesson. But first…

May the Delete Button Be With You

At the end of the year we were treated to several “Top Social Media Fails of 2016” types of posts. Coming in at number one on just about everyone’s list was Cinnabon.

In case you missed it, Cinnabon caused a big uproar on Twitter after tweeting what it considered to be a tribute to Carrie Fisher. The tweet, posted the same day the actress died (Dec. 27), featured an image of her “Star Wars” character, Princess Leia, along with this message from Cinnabon: “RIP Carrie Fisher, you’ll always have the best buns in the galaxy.”

A tasteful tribute? The Twitterverse didn’t agree.

Shortly thereafter, the tweet vanished. Cinnabon returned humbly to Twitter to say they were “truly sorry.”

Meanwhile, Cinnabon’s buns are as tasty as ever and people are still lining up to fill their bellies with tasty rolls at their local mall or airport.

PR lessons?

Look, your company at some point is going to screw up. An employee at your company will screw up – heck, maybe it will even be YOU who screws up.

Or some person outside your company, whether it’s an existing customer, former customer, or person who would have never been your customer anyway – is going to cause a lot of noise.

It will be scary. But don’t panic. Remember, negative reactions don’t have any more power than positive ones.

Some people will always get outraged. And these people will be as loud about their outrage as they can. Because it’s really about them, not you.

If you want to survive a tweet storm of negativity, follow this simple advice from comedian Ricky Gervais:

“Twitter? It’s like reading every toilet wall in the world. You mustn’t worry about it. It will send you mad.”

Will these poor brands survive the outrage?

What do Comcast, Bank of America, Mylan, McDonald’s, and Wells Fargo all have in common? Well, this year they were all named America’s Most Hated Companies for providing consistently terrible customer service or doing something the general public didn’t like.

But even some of the most beloved brands have done some incredibly questionable things, yet never make these sorts of lists.

  • Apple basically built its powerful brand using Chinese slave labor. Yet Apple devices still sell like crazy every year.
  • Google (allegedly) avoids paying taxes and has a long list of privacy concerns. Google still makes billions of dollars every quarter.
  • Subway spokesperson Jared Fogle got arrested for possession and distribution of child pornography and paying to have sex with minors. Customers are still “eating fresh.”
  • Starbucks even somehow managed to overcome the most horrific scandal of all: changing their cups around the holidays.

Need we even mention the biggest social media disaster of them all? Hint: he was just elected President of the United States.

The only lesson

Here’s the real takeaway for marketers: a great product will always beat an epic brand fail. Customers who truly love you will overlook your faults because of self-justification.

They want to continue to view themselves as a special snowflake, which means doing the mental gymnastics of rationalization or simply ignoring the flaws of the brands they buy and love.

Any effects of what are often dubbed “social media disasters” are usually small and contained.

  • People will still order Papa John’s pizza even if they briefly pissed off Iggy Azalea (and proudly have Peyton Manning as a paid spokesperson).
  • Fans of the New England Patriots will continue to be fans even if their team autotweeted something incredibly racist.
  • And people (even women) will continue to drink beer that once was sold as being “perfect” for removing the word “no” from your vocabulary.

So make sure your product or service is good enough to withstand any mistakes you make on Twitter, Facebook, or other social networks. Build a loyal audience that loves you.

What exactly is PPC keyword management anyway?

PPC keyword management gets a fair amount of attention as a topic of conversation (at least in the world of PPC pros!).

It’s also a topic that sends my brain into overdrive when clients mention it.

Why? Because the phrase is used so loosely it often means different things to different people.

Part of the confusion stems from the fact that PPC keyword management isn’t just one task—it’s a group of tasks. And some are less obvious than others.

In this post, I’ll clarify what keyword management means to us at Group Twenty Seven and describe its many aspects, including:

  • Negative keyword management
  • Keyword trend audits
  • Quality score benchmarks
  • Duplicate keyword management
  • Keyword click-through-rate management
  • Low search volume keyword management

Negative Keyword Management

For many non-PPC experts, keyword management is synonymous with negative keyword management.

It’s true that negative keyword management is an important part of keyword management. But it’s only one part.

Regardless, building negative keyword lists is a good place to start when launching new campaigns or taking over existing campaigns. Because the more robust your negative keyword list, the less wasted ad spend you’ll have.

That’s why we often perform negative keyword management hourly when we launch new campaigns. Then, we’ll gradually perform it less frequently as we identify fewer and fewer negative keywords.

But we never stop managing negative keywords entirely. Things change and new irrelevant words emerge over time. So we continue to perform this task monthly, at a minimum.
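
One practical way to surface candidates is to scan a search query report for queries that spend money without converting. The sketch below assumes a hypothetical CSV export with query, clicks, cost and conversions columns, plus arbitrary thresholds; it only produces a review list, since a human still has to judge relevance before anything is added as a negative.

```python
# Sketch: surface negative keyword candidates from a search query report.
# Column names and thresholds are assumptions - adapt them to your own export.
import pandas as pd

df = pd.read_csv("search_query_report.csv")  # query, clicks, cost, conversions

candidates = df[(df["clicks"] >= 10) & (df["conversions"] == 0)]
candidates = candidates.sort_values("cost", ascending=False)

# Review manually before adding as negatives - zero conversions doesn't
# always mean irrelevant (assist value, low volume, and so on).
print(candidates[["query", "clicks", "cost"]].head(25).to_string(index=False))
```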

Keyword Trend Audits

Another keyword management activity we undertake is trending audits. Basically, we look at actual search queries in Google Analytics and AdWords to see if we can identify trending keywords we can use (or exclude).

You might be surprised at how often new terms emerge to describe existing products and services—terms our clients have never thought of using before.

Quality Score Benchmarks

Generally, we don’t manage our client PPC accounts with the specific purpose of achieving high quality scores. We’ve always found that if an account is well managed, a high (or certainly, rising) quality score will result.

But that doesn’t mean we ignore quality scores entirely. If a quality score is particularly low for a new client, we’ll take a closer look.

Sometimes, we’ll find that the problem lies with the client’s landing page. When a client has one landing page with multiple conversion paths leading to it, the landing page may not reflect all the keywords used. This leads Google to conclude that the page is serving irrelevant information to users, and thus may assign some keywords low quality scores.

Usually, we can fix these hiccups by adding a few “missing” keywords to the landing page.

Duplicate Keyword Management

Having duplicate keywords goes against AdWords recommendations. And we don’t recommend it either.

But sometimes we inherit accounts with duplicate keywords (or inadvertently add them ourselves), especially if the PPC program is large.

Fortunately, duplicate keywords are easy to spot if you look for them. This is a task we perform regularly with AdWords Editor, a free downloadable tool.
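
AdWords Editor will flag duplicates for you, but the same check can also be run on a plain keyword export. Here is a minimal sketch that assumes hypothetical campaign, ad_group, keyword and match_type columns; in this version a keyword only counts as a duplicate if it shares both its text and its match type.

```python
# Sketch: find duplicate keywords in an exported keyword list.
# Column names are assumptions - rename to match your own export.
import pandas as pd

df = pd.read_csv("keywords_export.csv")  # campaign, ad_group, keyword, match_type

# Normalise text so "Winter Coats" and "winter coats" collide.
df["key"] = df["keyword"].str.strip().str.lower() + " | " + df["match_type"].str.lower()

dupes = df[df.duplicated("key", keep=False)].sort_values("key")
print(dupes[["campaign", "ad_group", "keyword", "match_type"]].to_string(index=False))
```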

Keyword Click-Through-Rate (CTR) Management

Normally, we monitor CTRs closely at campaign launch and quickly eliminate keywords that aren’t producing.

That said, it’s also a good idea to monitor CTRs even in more established campaigns. Sometimes CTR stats change suddenly, which could indicate that a hot new competitor has entered the market – and that may require some adjustments to your PPC strategy.
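
A simple way to catch those sudden shifts is to compare each keyword’s CTR across two periods and flag large relative drops. The sketch below assumes two hypothetical exports with keyword, impressions and clicks columns, and an arbitrary 30% drop threshold.

```python
# Sketch: flag keywords whose CTR dropped sharply between two periods.
# File names, column names and the 30% threshold are all assumptions.
import pandas as pd

prev = pd.read_csv("keywords_last_month.csv")   # keyword, impressions, clicks
curr = pd.read_csv("keywords_this_month.csv")

for df in (prev, curr):
    df["ctr"] = df["clicks"] / df["impressions"].clip(lower=1)

merged = prev.merge(curr, on="keyword", suffixes=("_prev", "_curr"))
merged = merged[merged["ctr_prev"] > 0]
merged["ctr_change"] = (merged["ctr_curr"] - merged["ctr_prev"]) / merged["ctr_prev"]

flagged = merged[merged["ctr_change"] <= -0.30].sort_values("ctr_change")
print(flagged[["keyword", "ctr_prev", "ctr_curr", "ctr_change"]].to_string(index=False))
```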

Low Search Volume (LSV) Keyword Management

When keywords fall into LSV territory, it’s tempting to immediately remove them. But in practice, many keywords drift in and out of LSV over time. Sometimes, it pays to leave LSV keywords alone for a little while, to see how they perform.

And sometimes, we can lift a keyword out of LSV status by slightly manipulating the keyword (or keyword phrase). (Pro tip: Playing around with singular vs. plural versions sometimes works.)

As you can see, PPC keyword management is much more than one simple task! So when the topic comes up, I hope you’ll forgive me for peppering you with 20 questions to define precisely what we’re talking about.

Please note, this post was originally published on the Group Twenty Seven blog.

Chan-Zuckerberg Initiative acquires AI-powered search engine Meta


The Chan-Zuckerberg Initiative, the $45 billion philanthropic organisation founded by Mark Zuckerberg and his wife Priscilla Chan, has made its first acquisition – of a search engine.

The Initiative announced on Monday that it would be acquiring Meta, a scientific search engine that uses artificial intelligence to make connections between research papers.

The search start-up, which was founded in 2010, previously charged some users for subscriptions or custom solutions, but the Chan-Zuckerberg Initiative intends to make it free to all after spending a few months enhancing the product.

The Meta search engine is designed to make it easier for researchers to search through, read and link together more than 26 million scientific papers. It also provides free, full-text access to some 18,000 journals and sources of literature.

Meta’s artificial intelligence capabilities allow it to draw connections between papers, recognising where authors and citations overlap in order to surface the most important and relevant research – rather than just what contains the right keywords. It provides an efficient and intuitive way to sort through reams of online studies and locate the most useful papers, in a way that more conventional search engines like Google Scholar can’t replicate.

If all of this sounds familiar, that might be because you’ve heard it before. Semantic Scholar is also a free, AI-powered search engine aimed at helping scientists to sift through mountains of research, using data mining, natural language processing and computer vision to analyse a study’s worth and present its key elements.

Semantic Scholar is also backed by a non-profit organisation: the Allen Institute for Artificial Intelligence, or AI2 for short. The search engine was developed by Paul Allen, co-founder of Microsoft, in conjunction with AI2 and in collaboration with Allen’s other research foundation, the Allen Institute for Brain Science.

Semantic Scholar was only launched last November, while Meta has been around since 2010. Until now, the fact that Semantic Scholar was free to use might have given it an edge, but the intervention of the Chan-Zuckerberg Initiative could change all that.

So which search engine will emerge victorious? Both have the backing of heavyweights in the technology industry – Facebook’s Mark Zuckerberg, and Microsoft’s Paul Allen. Both use artificial intelligence to open up access to scientific research in a whole new way, and both are soon to be free to all.

Semantic Scholar’s field is also quite narrow still, currently only covering 10 million published papers in the fields of neuroscience, biomedicine and computer science. However, it has a huge amount of potential and has grown quickly in the two months since its launch, with 2.5 million people using the service to perform millions of searches.

Maybe the question should be: are the two search engines even competitors? Oren Etzioni, the CEO of AI2, has already dismissed the idea that Semantic Scholar would attempt to compete with Google Scholar, saying that their goal is just to “raise the bar” and provide scientists with more effective options to carry out their research. They may take the same view towards Meta, opting to work with the other company for the ultimate benefit of the scientific community.

For the Chan-Zuckerberg Initiative, Meta is just one step towards their larger goal of helping to “cure, prevent or manage all diseases by the end of the century”. Sam Molyneux, the co-founder and CEO of Meta, wrote in his own announcement on Facebook that,

“Helping scientists will produce a virtuous cycle, as they develop new tools that in turn unlock additional opportunities for faster advancement. The Chan Zuckerberg Initiative’s recognition of this “meta” effect is why Meta can be a key piece of the puzzle to enable the future of human health that we believe to be possible within this century.”

Regardless of whether Meta and Semantic Scholar will be competitors or collaborators, one thing seems certain: artificial intelligence has unlocked a whole new set of possibilities for the way that we engage with scientific research, and there’s no doubt that we will benefit from it.

Whatever happens next, it’s going to be exciting.