New research: The Era of Ecommerce

Ecommerce marketers at US brands are struggling to keep pace with where and how consumers browse and purchase online, according to new research published today.

An ecommerce-focused study by Search Engine Watch and ClickZ, produced in partnership with Catalyst, part of GroupM, has found that 85% of browsing and purchasing activity occurs with non-Amazon retailers, but only 25% of US brands say they have a strategy for ecommerce retailers beyond Amazon.

The Era of Ecommerce: Capitalizing on the New Customer Journey also found a 29 percentage point gap between the proportion of consumers who have visited a given retailer’s site to research products and the proportion of brands that market on that retailer’s website (53% vs. 24%).

Details about the research

The report is based on a survey of more than 750 North America-based consumers and more than 600 business-to-consumer (B2C) client-side marketers across the following nine sectors: appliances, baby care, beauty and personal care, clothing and apparel, consumer electronics, footwear, furniture and home decor, non-perishable goods and beverages, and toys. It also includes in-depth interviews with numerous senior marketers actively engaged in ecommerce advertising for their brands. All surveys and interviews were conducted between June and September 2018.

This research is a follow-up to last year’s Age of Amazon report, in which we told the story of the ecommerce giant and what its ascent meant for marketers. Building on that, this report looks beyond Amazon alone to assess the wider ecommerce industry, highlighting both consumer and advertiser behaviors.

From traditional search engines to ecommerce websites, and from vertical-specific retailers to visual search, this research seeks to understand how and when consumers use specific channels and how advertisers prioritize each channel. Significant opportunity awaits advertisers who think holistically about reaching consumers in each phase of the consumer cycle.

Kerry Curran, Managing Partner, Marketing Integration, Catalyst, commented on the report: “Our new research illuminates a significant disconnect between today’s consumer behavior and ecommerce advertising strategies. We’re hopeful that marketers will be able to use our findings and recommendations to bridge this gap, and ultimately drive better returns for their businesses.”

Downloadable link

The research will debut today at the Transformation of Search Summit held in New York. It is also available for download from ClickZ here (registration required).

Why search marketing matters in 2018

First let me ask you: how many unread emails are in the “promotions,” “updates,” and “other” tabs of your inbox? When I got to work on Monday morning, there were 248. How many of those did I read, you ask? Three at best, and only because they were already my favorite news roundups. The others? Didn’t stand a chance. “Marked as read,” “deleted,” and otherwise wholeheartedly, happily ignored.

Long story short, consumers these days are drowning in emails. We have promotions, “we’ve missed you!”s, “rate our product!”s, and 100 other types of unread newsletters pouring into our inboxes and never getting close to our attention.

For a long time, it was businesses seeking out consumers

And while that obviously is still at play, the tides are shifting. It’s just too much content to keep tabs on. More and more, consumers are ignoring the bombardment and seeking out businesses on their own terms — when and where they want to look.

For a long time, SEO was a small group of nerds (*experts) sitting in a corner doing their thing, trying to convince everyone that search mattered and that there were ways to improve rankings.

For a long time, people kind of let them do their thing while not understanding what SEO actually was or fully grasping their value.

But now, the amount of content is suffocating. I don’t want to read 248 “other” emails to find the information I need. I want it now.

Where do I go? The place where 93% of online journeys begin: I search.

Now, businesses need to be found by searching consumers

As often happens when tech goes mainstream, all of a sudden businesses care a lot more about that group of nerds in the corner.

The question now becomes: “How can I make sure my business is found when and where my customers are looking?”

In a world of customer experience, I don’t want to bother consumers — I want them to happen “serendipitously” upon my product or service. I want to be there when they’re ready.

These days, customer journeys don’t start when a consumer walks into a store or lands on my web page. They start the second a consumer opens a search engine.

Desktop to mobile to voice

And to top it all off, the stakes keep getting higher. When I search on desktop, I probably look at the first ten results. On mobile, maybe I consider five. On voice? One gold spot at the top.

Exciting times for SEO and search marketing.

As such, we’re thrilled to host The Transformation of Search Summit today here in New York City, in partnership with ClickZ and Catalyst.

Topics to consider in search marketing

We’ll be covering all of these and more:

The new customer journey
Blockchain and the decentralized economy, and what they mean for search
Optimizing for voice search
Amazon and Amazon Marketing Services
Visual search and ecommerce
Strategies for search transformation

Needless to say, we’re pretty jazzed about the event. Speakers include some brilliant minds from SAP, Google, Microsoft, Adobe, LEGO, Hertz, Pinterest, Hilton, Conde Nast, and many more.

Mostly, we’re excited to see the continued rise of search marketing and how businesses adapt to better at being found by consumers.

This post also appeared on ClickZ.

The comprehensive guide to voice search keyword research

Informational keyword research is a subject that has been covered thousands of times across every SEO blog, publication, and web design company site.

However, with voice becoming a more prominent way of searching, it’s important that it’s now taken into consideration. With voice usage growing, marketers need to understand how their audience is using this technology and how they can adapt to it. Keyword research has been advancing dramatically over the last couple of years. Gone are the days of simply sorting by the highest search volume and creating a page; it comes down to much more than that: semantics, categorization, ranking difficulty versus reward, questions, featured snippets, People Also Ask. The list goes on.

A straightforward task has become much more complex and time-consuming, and it’s important to get it right the first time, as keyword research tends to influence your strategy, projections and, in some cases, KPIs.

We’ll be using a variety of paid and free tools within this guide. However, even without the paid tools it will give you a large dataset but will require a little more manual work.

Keyword Set

We’re going to be basing this on you already having a website that is established in some form or another, meaning you already have some rankings which can be used as an initial starting point. However, if this isn’t the case you’ll be able to simply skip these steps, although in some cases it could cause you to miss out on some smaller keywords.

Search Console

Search Console is often underrated and heavily critiqued, but it’s a free tool that gives you a lot of data; with recent updates, you’re able to obtain 16 months’ worth. It can sometimes be slightly inaccurate, but that’s still far more data than you’ll get from many paid tools.

To get the most out of this it’s best to try and filter down to the pages which are informational on your site – for example, your blog, a hub or guides.

This will give you a list of top URLs which you can then dig into, providing you with a list of keywords to expand on. We suggest noting the main subjects you find here so you can dig into them further later on.
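If you’d rather pull this data programmatically than export it from the UI, the Search Console API exposes the same report. Below is a minimal sketch, assuming you’ve already set up OAuth credentials and that your informational section lives under a /blog/ path (both the site URL and the path filter are placeholders):

```python
# Sketch: pull query data for informational pages via the Search Console
# (Webmasters) API. The site URL and the "/blog/" filter are placeholders.
from googleapiclient.discovery import build

def informational_queries(credentials, site_url="https://example.com/"):
    service = build("webmasters", "v3", credentials=credentials)
    body = {
        "startDate": "2018-01-01",
        "endDate": "2018-09-30",
        "dimensions": ["page", "query"],
        "dimensionFilterGroups": [{
            "filters": [{
                "dimension": "page",
                "operator": "contains",
                "expression": "/blog/",  # restrict to informational URLs
            }]
        }],
        "rowLimit": 5000,
    }
    rows = (service.searchanalytics()
            .query(siteUrl=site_url, body=body)
            .execute()
            .get("rows", []))
    # Each row has keys [page, query] plus clicks/impressions/ctr/position;
    # sorting by impressions surfaces the subjects worth digging into.
    return sorted(rows, key=lambda r: r["impressions"], reverse=True)
```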

Autocomplete

Another free tool that we can take advantage of? Autocomplete, or Google Suggest, has been around for 14 years now, which simply seems insane. The feature, created on a bus by Kevin Gibbs, has given us bundles of joy throughout the years, as well as causing controversy in other instances, but it makes for a great research tool.

This is something we have created in-house tools to take care of, simply due to the scale of the task. However, it’s something you can also carry out manually by adding your keyword to the search box and grabbing the suggestions.

It’s worth noting here that the monthly search volumes, as well as CPC estimates, are all generated by the keywords everywhere tool which is a must have as it can give you instant feedback on how many people are searching for the keyword as well as commercial intent.

This is something that will work well for some industries but not all. Again, take down all of the keyword suggestions you find and place them into your list of ideas generated from Search Console.

Now, this is where we start to look at voice in more detail. We know that voice search is mainly used for asking questions – the whos, wheres and whats of the world. Having a list of these question modifiers allows you to take the Google Suggest keyword data even further.

  • Where
  • Should
  • Which
  • Is
  • Do
  • Can
  • What
  • Does
  • Why
  • Have
  • When
  • Was
  • Will
  • Are
  • How
  • Who

These question modifiers allow us to really dig down into the questions people are asking, which may not always have much search volume and may not be easy to find. Using wildcards in the search queries alongside the main subject will expand the keyword set even further.

Some questions may not make sense or may not be keywords you want to go after. But doing this for all of the question modifiers, along with your key subjects, will give you a great starting position for your voice-focused keyword research. If you’d rather not run every combination by hand, the process is easy to script.
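A minimal sketch of that script, using Google’s suggest endpoint (note: this endpoint is unofficial and undocumented, so it may change or rate-limit you at any time; the seed topic is a placeholder):

```python
# Sketch: expand a seed topic with question modifiers and wildcards via
# Google's unofficial autocomplete endpoint. Unofficial means it can
# change or throttle you at any time, so keep request volume polite.
import time
import requests

MODIFIERS = ["where", "should", "which", "is", "do", "can", "what", "does",
             "why", "have", "when", "was", "will", "are", "how", "who"]

def suggestions(query):
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": query},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()[1]  # response shape: [query, [suggestion, ...]]

def question_ideas(topic):
    ideas = set()
    for modifier in MODIFIERS:
        ideas.update(suggestions(f"{modifier} {topic}"))
        ideas.update(suggestions(f"{modifier} * {topic}"))  # wildcard variant
        time.sleep(1)  # be polite between requests
    return sorted(ideas)

print(question_ideas("running shoes"))
```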

Related Queries & People Also Ask

Yes, we’re still on the same page yet there are more sections where we can continue to expand the keyword set. Related queries and people also ask are great free sources to further expand on what you already have.

If you do this with all of the question queries you have obtained from the initial Search Console and suggested search data, you will end up with a huge list of questions people are asking about your chosen areas. This gives you a great starting point for targeting people searching via voice as well as by typing.

Competitors

This isn’t anything new but is an important step in making your data set as fool-proof as possible. Entering your competitors into Ahrefs or SEMrush, filtering down by their informational areas, or simply anything with one of the question-related modifiers and adding this to your list is a very quick and simple way of making sure you aren’t missing anything the competition is doing.

Ahrefs

This is one of our favorite tools right now. It has proven to be one of the best for backlink exploration, but it’s also great for keyword data, especially when building up a keyword set.

Creating a keyword list

First of all, you’ll need to set up your keyword list, based on the previous data you have gathered. This will then automatically pull through the search volumes, clicks, difficulty and many more data points.

It’s important to note that some of the data may not have been updated in some time. If this is the case you’ll need to use your credits to re-run this to gather new data.

Questions

As we know, voice search is all about questions. This is where we can further expand on the keyword set by using Ahrefs’ questions section.

This is going to be one of the most important areas of the keyword research. Note that Ahrefs only takes the first 10 keywords from the keyword list, so it may be worth inputting the main categories to begin with, finding questions around those, and gradually digging deeper once you have the main questions.

There are going to be a lot of long-tail questions with very little search volume, depending on the niche. It’s worth filtering through these to see if they are in fact useful. If so, keep them; if they don’t make sense, get rid of them.

Other data gathering tools

As well as the ‘normal’ keyword research tools, there are many other places you can find data on the questions people are asking. Again, depending on the niche, this may or may not be useful to you. Quora is a great place for gathering information around voice-style searches, as it’s a platform for exactly that – asking questions.

Quora’s search bar will simply show you the top results for a seed keyword. Search for some of your main topics, grab the lists of results, and again feed them into your keyword set, either in Ahrefs or your Excel sheet.

The same can be said for Pinterest. This is great for lifestyle and retail websites as Pinterest is, of course, a very visual platform.

A Pinterest search very quickly gives you another list of suggested searches, which you can then take, expand on using the variety of techniques already outlined, and add to the already extensive dataset we have collated. Pinterest is a great platform for finding keyword ideas for retail, informing both offline and online strategies.

Local intent keywords

Voice search is not only about content-driven informational terms. Sometimes it may be as simple as someone asking for opening times, contact details, or business history. It’s important that these are factored into your keyword sets. Also, if you aren’t a business with a lot of branded search volume, it may be difficult to find brand-specific keywords with tools.

However, data from GSC may still provide you with some great insight. We would suggest keeping a separate document to make sure that any questions around your specific brand are answered. This may mean making sure your structured data is up to scratch, your Google local listings are complete, or the content on your site covers what people are looking for.
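On the structured data point, a minimal sketch of schema.org LocalBusiness markup looks like the snippet below; every value is a placeholder to swap for your own NAP details. Keeping this markup consistent with your local listings gives search engines a clean answer for “opening hours” and “contact details” style voice queries.

```html
<!-- Sketch: minimal schema.org LocalBusiness markup; all values are
     placeholders for your own name, address, and phone details. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Coffee Co.",
  "telephone": "+1-555-010-0000",
  "url": "https://example.com",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "New York",
    "addressRegion": "NY",
    "postalCode": "10001"
  },
  "openingHours": "Mo-Fr 08:00-18:00"
}
</script>
```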

The Final Keyword Set

Keyword research for voice is not much different from how you would normally carry out keyword research for informational terms. However, there is much more of a focus on questions: not only high-volume questions, but ones which might get 0-10 searches a month simply due to their long-tail nature.

Running through this process multiple times with your different seed keywords will help you build an extensive list of questions and informational terms people are searching for. This will help influence not only your content strategy but also your sales strategy: knowing what people are looking for is the first step in understanding your user or customer base.

How voice queries are showing up in search data

The rise of voice search is no secret, and many companies are still wondering how to address it. When it comes to search data, how can we monitor which queries are from voice? In this article, Jason Tabeling shows how he finds insights into voice search from his own company’s data.

Alpine.AI estimates that there will be more than 1 billion voice searches completed in 2018.

At this point I’m sure that everyone has seen or done a voice search, even if you just saw an example in a Google Home or Amazon Alexa ad. The power of what can be completed with a voice command is growing by the day. This trend is already having a massive impact on consumer behavior and therefore needs to be a consideration when monitoring or optimizing our search accounts.

Right now Google doesn’t provide specific information on how a search was initiated. For example, was it a Google Home search, a search from the app, or one done via typing or voice?

However, I recently took a dive into our own search data at our company, BrandMuscle. I used Google Ads client data across all verticals, comparing the first nine months of last year with the same period this year (1/1/17-9/30/17 versus 1/1/18-9/30/18).

The data revealed some interesting trends that give insight into how voice searches might be showing up in the data we currently have available.

Looking at the length and type of queries

To start, I looked at the length of search queries. The thought is that with voice search, consumers use more natural language. So instead of searching for “car insurance,” consumers might search for “cheapest car insurance for Toyota Camry.” Our data shows the average number of words in each query has increased for both mobile and desktop queries year over year. Desktop queries are up 7% year over year to 4.3 words per query. Mobile has increased 9% year over year to 4.14 words per query. With mobile increasing more rapidly, I think this shows the impact voice is having, as well as larger devices enabling consumers to input longer queries.

The other area I wanted to dig into was what types of search queries consumers are using. I think voice search has brought an increase in consumers asking questions in more natural language, so I looked into question modifiers like who, what, where, when, why, and how. The use of these terms indicates a more conversational tone. Our data shows a 118% increase in these terms, with the majority of that change coming on mobile phones, which are up 178%.
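If you want to run the same two checks against your own search term report, the analysis takes only a few lines. A sketch assuming a CSV export with query and device columns (adjust the column names to whatever your report actually uses):

```python
# Sketch: measure average query length and question-modifier share from
# a search term report. Assumes a CSV with "query" and "device" columns;
# rename them to match your own export.
import pandas as pd

df = pd.read_csv("search_terms.csv")
df["words"] = df["query"].str.split().str.len()
df["is_question"] = df["query"].str.lower().str.match(
    r"(?:who|what|where|when|why|how)\b"
)

summary = df.groupby("device").agg(
    avg_words=("words", "mean"),
    question_share=("is_question", "mean"),
)
# Run this on each period's export and compare the two summaries
# to get the year-over-year change.
print(summary)
```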

So what actions should you take with the increase in voice search? Here are 3 considerations:

1. Build experiences for question based queries

One of the reasons search is such a great marketing tactic is that consumers are giving you specific clues as to what they want. This is even more true when a question modifier is included. For example, the distinction between what a consumer is looking for when searching “where” + running shoes versus “what” + running shoes is very important: “where” indicates the consumer wants map or location information, whereas “what” indicates more research-based content is needed. Having ad copy and landing page experiences that meet these specific demands will help increase conversion rates.

2. Monitor your search query data

Some of you might be reading this and questioning whether the data matches your own search accounts. I bet it’s directionally correct, but it might vary significantly. The only way to know is to dig into your own data and see what insights you can derive. Are there any queries where your ad appeared when it should not have? This should be a part of regular account maintenance as you update match types and negatives.

3. Test voice queries that are important to your business

Putting yourself into the shoes of one of your customers is still a very helpful lens for account managers. We often take our eyes off the consumer experience, considering how much focus and attention we put into the details of a good search program. Sometimes it’s as simple as seeing what the search results look like as a consumer. Try a search on Google Home. Does it provide an answer? Is it an answer taken from your organic listing? What opportunities can you uncover with this information?

Voice search is having a major impact on our lives as consumers and marketers. This change is happening quickly and will continue to evolve as the technology gets better and better. Following a few of these quick tips and keeping an eye on your data will go a long way toward staying ahead of this trend.

7 common SEO mistakes most WordPress bloggers make

WordPress started out as a blog-only platform, and now that it has grown into a full-fledged content management system, it remains a popular choice for blogging. WordPress.com blogs attract over 409 million monthly viewers, who viewed 22.4 billion pages per month over the past year.

This standalone fact is enough to justify the popularity of WordPress as people’s favorite blogging platform.

WordPress does provide a lot of helpful features for blogging enthusiasts who are looking to start their own blog. However, inexperienced bloggers do commit some mistakes in spite of all the online help available. In this blog post, we will review the most common WordPress SEO mistakes that bloggers commit, whether out of ignorance or sheer carelessness. Regardless of the reason, these mistakes affect the search engine ranking of their blogs and even their online reputation.

So, let’s explore seven of the most common SEO mistakes made by WordPress bloggers.

1. Not using the right SEO optimized blogging theme

If you are new to blogging, you might have missed the fact that WordPress offers SEO optimized themes, which can be highly helpful in the quest for online rankings. If you are not using an SEO optimized blogging theme, you are a step behind those who are. There are a lot of SEO optimized blogging themes for WordPress to choose from, such as Divi, MagPlus, and Jevelin.

2. Missing on an SEO optimized contact form

Even if your WordPress blog is in its initial phase, it needs to provide a point of contact for its followers, even if there are fewer of them than you’d hoped. A contact form serves this purpose just right. Your contact form is a conversion driver, and optimizing it for the right SEO keywords will help your visitors easily find your blog, amplifying your traffic.

3. Not buying a domain

Are you running your free WordPress blog with the default blog address you were allotted? If the answer is yes, you might not be pleased with what we are about to tell you. A blog, or any website, runs well only when it meets the needs of its target audience. A proper domain name gives your blog an identity and sets visitors’ expectations. Not buying a domain can damage your blog’s traffic prospects and hurt its overall search engine ranking.

4. Not optimizing blog images

A great blog comes into being only when relevant content is paired with original, high-quality images. However, a lot of WordPress blog and website owners forget to optimize these images. It is very important to optimize the images you use in your WordPress blog: it helps your site load faster and improves your Google PageSpeed score.

To optimize your blog images, you can seek help from WordPress image optimization plugins such as Smush It, EWWW Image Optimizer, and TinyPNG. These plugins will help you compress your images without affecting their resolution and also take care of their SEO optimization.
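If you’d rather compress images yourself before uploading them, the same idea takes only a few lines of scripting. A sketch using the Pillow library; the 1200px width cap and quality setting are arbitrary illustrative choices, not WordPress defaults:

```python
# Sketch: downscale and recompress images before uploading to a blog.
# The 1200px cap and quality=80 are arbitrary illustrative choices.
from pathlib import Path
from PIL import Image

def optimize(src: Path, dst: Path, max_width: int = 1200, quality: int = 80):
    img = Image.open(src)
    if img.width > max_width:
        ratio = max_width / img.width
        img = img.resize((max_width, int(img.height * ratio)))
    # Convert to RGB so PNGs with transparency can be saved as JPEG.
    img.convert("RGB").save(dst, "JPEG", quality=quality, optimize=True)

for path in Path("uploads").glob("*.png"):
    optimize(path, path.with_suffix(".jpg"))
```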

5. Choosing the wrong keyword

Your blog’s reachability depends entirely on the Keyword chosen by you for its Search Engine Optimization. Keyword Research might be a very extensive concept but it can do wonders for your blog’s SEO if done in the right manner.

Your keyword strategy should target keywords that define the subject of your content and are low in competition, yet are commonly used by visitors to find the information they are looking for. Finding keywords that fit all of these requirements can be quite a task and might overwhelm some users. Demanding as it is, it requires your focus and attention if you want your blog to rank well.

6. Not focusing on loading speed

Your online blog’s loading time will highly affect the traffic on it and also the site abandonment ratio that follows if your blog takes a lot of time to load for its visitors. A loading time above 2-3 seconds can lead to a lot of visitors abandoning your blog.

If you are serious about your blog’s loading speed, get a caching plugin such as W3 Total Cache, WP Fastest Cache, or WP Super Cache. These plugins are easy to use, and they speed up your WordPress blog considerably. Don’t shy away from investing in a reliable web hosting service either: your host tackles your site’s server-side issues and contributes its fair share to your blog’s overall performance and speed.

7. Not focusing on content and readability

Probably the most important aspect of your blog is the content you publish through it. It needs to be of top-notch quality if you want to avoid SEO mistakes in and around it. Make sure of the following things about your blog’s content:

Create original content that is relevant as per the audience.
Make sure that this content is readable and provides a ‘takeaway’ for the target audience.
Blogging consistently will help you maintain steady traffic on your blog. Use plugins like Editorial Calendar to blog regularly.

Conclusion

A lot of experienced blog owners commit technical and on-site SEO errors and then look to SEO agencies and content marketers to take care of their blog’s SEO. However, the most common mistakes can be easily avoided by creating a checklist of the must-haves.

Analyze your WordPress blog today and see if you are committing any of the mistakes mentioned above. Hopefully, you’ll be able to tackle and remove them at the earliest opportunity. Once you have a solid SEO content strategy and a clear plan of action for your blog’s SEO, you will be able to steadily refine and improve the overall SEO performance of your WordPress blog.

The end of Google+ after a data breach and how it affects us

Google has decided to shut down Google+ after discovering a data breach. How should we react to the news?

Not many of us were surprised to hear that Google+ will cease to exist in a few months. The only surprise came in the way the news was revealed, with Google announcing a data breach that led to the decision.

Google published a blog post last week mentioning that it discovered a bug in the Google+ API that gave third-party developers unauthorized access to the data of up to 500,000 users.

What’s interesting is that they didn’t disclose the breach back in March when they discovered it and they only brought it to the public after The Wall Street Journal covered it in a post.

The story became so big that Google knew that they had to respond to it.

They’ve provided more details in their recent blog post about the bug:

As part of our Project Strobe audit, we discovered a bug in one of the Google+ People APIs:

Users can grant access to their Profile data, and the public Profile information of their friends, to Google+ apps, via the API.
The bug meant that apps also had access to Profile fields that were shared with the user, but not marked as public.
This data is limited to static, optional Google+ Profile fields including name, email address, occupation, gender and age. (See the full list on our developer site.) It does not include any other data you may have posted or connected to Google+ or any other service, like Google+ posts, messages, Google account data, phone numbers or G Suite content.
We discovered and immediately patched this bug in March 2018. We believe it occurred after launch as a result of the API’s interaction with a subsequent Google+ code change.
We made Google+ with privacy in mind and therefore keep this API’s log data for only two weeks. That means we cannot confirm which users were impacted by this bug. However, we ran a detailed analysis over the two weeks prior to patching the bug, and from that analysis, the Profiles of up to 500,000 Google+ accounts were potentially affected. Our analysis showed that up to 438 applications may have used this API.
We found no evidence that any developer was aware of this bug, or abusing the API, and we found no evidence that any Profile data was misused.

Google sticks to the word ‘bug’, making it clear that there is no evidence the data was misused.

Google also moved to reassure users by launching more granular Google Account permissions, presented in individual dialog boxes.

Still, it seemed like the right time to shut down Google+, one of its least popular products of the last few years.

The end of Google+

When was the last time you used Google+?

Not many of us can remember the last time we had a meaningful interaction on Google+ or counted it as part of our social media (or search) ROI.

Google’s attempt to launch its own social network was ambitious but the problem was that it never clicked with its audience.

The stats speak for themselves, and they come from Google’s latest blog post:

“The consumer version of Google+ currently has low usage and engagement: 90 percent of Google+ user sessions are less than five seconds.”

Thus, users are either accessing Google+ by mistake or simply find no reason to stay engaged.

By contrast, there seems to be a fit for enterprises using Google+, and they may even get new features built for them:

“At the same time, we have many enterprise customers who are finding great value in using Google+ within their companies. Our review showed that Google+ is better suited as an enterprise product where co-workers can engage in internal discussions on a secure corporate social network. Enterprise customers can set common access rules, and use central controls, for their entire organization. We’ve decided to focus on our enterprise efforts and will be launching new features purpose-built for businesses. We will share more information in the coming days.”

Hence, the end of the consumer product does not necessarily mean the end for its enterprise users.

So when will the platform shut down for consumers?

Google mentioned that there will be a 10-month period during which you can still access the social network before it shuts down. This means that we will all say a final goodbye to Google+ at the end of August 2019.

What does all this mean?

Dr Ben Marder, Senior Lecturer in Marketing at the University of Edinburgh Business School, was skeptical of Google’s endeavor from the beginning, seeing the problem as starting with the network’s positioning:

“Google + was created with a promise that it would solve the ‘multiple audience problems’ an issue that has been shown to cause anxiety within network members as the content is consumed beyond those it is undesired. An issue that Facebook was highly criticized for. Though I commend the effort of Google, unfortunately, I believe it was these arguably good intentions that killed it before it was even born.

People will say that they want to keep their various different social circles separate, no doubt this was what was reported in market research when Google was designing their network. However, what people are reluctant to say is a key component of what makes social media so fascinating is seeing the posts maybe you shouldn’t have, the kind of posts that would spark interest and may be gossiped about. Google+ is best described, like a person you would rather not have a second date with, ‘nice but boring’. At least LinkedIn knows it is boring, but fulfills a specific niche, Google + was essentially a dull jack-of-all trades.”

You may be indifferent to Google+ as a user, but you may have used it in the past as part of your marketing or SEO strategy. There was a time when Google+ was still relevant for professional purposes, and it even brought some sort of ROI for some businesses, especially in niche industries and communities.

What does this change mean for marketers then?

Chances are you haven’t used Google+ for at least a couple of years. However, it’s still interesting to watch how the social media landscape is evolving: even Google’s power wasn’t enough to convince users to adopt its social platform.

It’s a lesson for all of us not to rely on one platform for our marketing strategy, whether it’s Facebook, Instagram, or YouTube, as you can’t predict what the future holds.

It’s always a good idea to look into the future to ensure that your strategy is adapting to the changing consumer habits.

Moreover, another data breach, whether it’s called a breach or a bug, is a matter of concern for users, who are losing their trust in big tech giants. Keep this in mind when creating your next marketing campaigns, to ensure that your brand and your messages respect your audience’s needs. Trust is becoming key on social networks, and we cannot ignore it anymore, especially after a year of multiple data breach scandals.

And what does this change mean for SEO professionals?

Social media and SEO can still make great allies. Google+ used to help companies boost their SEO, with social acting as a useful signal. Although social has never been an official ranking factor, it still contributed to online authority. As Google+ lost its audience, however, its decline didn’t hurt SEO, even though it was Google’s own network.

YouTube can still impact your search rankings and of course, your popularity on other social channels can still affect your position in SERPs.

Still, social media is not the most important factor in your SEO strategy, and Google+ certainly won’t be missed in 2018.

If you want to check your Google+ data, you can visit Google Takeout, where you can download your data from Google’s services.

Google will also provide more details soon on how you can both download and migrate all your Google+ data.

Is hiding paid backlinks from Google actually possible?

Your website’s link profile is basically one of your biggest areas of concern. Given the kind of penalties levied by Google for any misconduct related to your link profile, the concern is well-justified. The truth is that link profile building is a time-consuming activity that requires judicious decisions; especially when you are pondering over the thought of buying backlinks for your own website.

However, one report found that in a sample of 750,000 well-shared articles, over 50% had zero external links. This points to widespread shallow SEO knowledge, with website owners not caring enough to build and earn links for their websites. And sometimes the ones who are doing well don’t necessarily care about ethics.

A lot of website owners struggle with their site’s search engine optimization for several reasons. Since backlinks are a quick and efficient way of boosting your site’s traffic and ranking, many owners resort to unethical ways of getting them, i.e. buying them or exchanging them for other digital favors.

The question is, does it actually work? Can you really fool Google’s algorithms? Will your website ever be penalized for it? If you have always had these questions in mind, this blog post will help you explore them and better understand whether or not you can keep Google from knowing that you are buying links to enhance your site’s SEO.

A basic backdrop on backlinks

If you are new to website building or SEO, you might ask, ‘What is a backlink?’ A backlink is an incoming hyperlink from one web page to another website. Backlinks increase the credibility of a website as well as the business behind it, and backlinks from quality sites with high authority carry an added advantage for your website. There is a whole lot of theory behind the right backlink practices, and before you get there, you need to know how a paid link can affect your website.

What are paid/bought links vs. earned links?

An ideal link profile features earned backlinks rather than paid ones. Earned backlinks happen when other websites and blogs find your website’s content genuinely interesting and useful and choose to link to it. However, some website owners have relied on buying and selling links in order to survive the tough SEO war.

Paid link building is when a website pays a third party domain for a followed backlink that points back to their domain. This is strictly forbidden by search engines and can result in harsh penalties. It can benefit your website for a very short period, but it is never there to stay. Hence, your website should stay away from such unethical practices because sooner or later your website will get penalized for indulging in such misconduct.

A rightfully earned backlink makes the web more resourceful and easier for the online audience, instead of poaching SEO ranks. Earned backlinks point to content that is likable, resourceful, and qualified. Only when your website offers relevant, high-quality content will other websites want to point to it. Yes, that is a lot of work, and that is why many website owners try taking the shorter path, i.e. paid links.

Will you really be able to hide from Google that you are buying links?

Some website owners and inexperienced SEO enthusiasts operate under the assumption that they are smarter than Google, or that Google “would never know” they are buying links. What they underestimate is that Google’s reach and its ability to mine and interpret data far exceed our comprehension. If you think that using a dedicated IP VPN such as PureVPN can help you hide the traces, that is debatable.

When these irresponsible website owners sport paid links on their websites, they forget that they always create a pattern, no matter how hard they try to avoid it. Patterns such as an excess of links from domains or industries that have nothing to do with their own, or links from websites that indulge in excessive link-building schemes, help Google flag the culprits.

People who think they can outsmart the algorithms may even be right for a while. But they also underestimate Google’s human review team, which can one day take a close look at their website and penalize it heavily for paid link building. It is always either an algorithm report, a tip-off from a competitor, or a manual reviewer’s action that takes such websites down. So, there’s no escape.

Why paid links don’t work at all

The answer is short, loud, and clear: paid links don’t work in the long run because they are unnatural, irrelevant, and deceptive, enough to get your website penalized.

By now, we hope we have managed to shed enough light on how bad paid links are and how good earned backlinks are. Now that you know, here are some bonus link-building tips:

Check out your website’s Authority with the help of Website Authority Checker. The tool will help you understand and check the authority of a website to give you an outlook on how a website can perform in Google.
Always focus on the quality of backlinks that you accept for your website rather than the number of backlinks.
It is important to make sure that you are not receiving backlinks from low-quality websites. They hamper your site’s chances of ranking well.
Guest posts are a great way of getting backlinks in exchange for good blog content. However, don’t abuse this tactic, because you will end up getting caught.
Broken links are not an issue to ignore. They lead to a poor user experience and make it difficult for search engines to crawl and index websites efficiently. So, fix them as soon as you find them.

Conclusion

Link building is a big topic to reckon with: it is 2018, and the competition for rankings is fierce. Even if your website is struggling to stay ahead and make a mark, never fall back on the unethical practice of paid links or other black-hat SEO techniques. Google always figures them out, and that can do a lot of harm to your website’s reputation.

With the right backlinking practices, you will be able to win over other marketers because you will know how to implement advanced link building techniques in the most ethical way. The progress might be slow, but all your efforts will be worth the wait when your site’s search engine ranking improves.

Google data breach + Berners-Lee’s Solid — is the power shifting?

I don’t usually go for drastic headlines, but it does seem like some tides have been turning of late.

We’ve all followed the stories of data breaches, new regulations, fake news, hacks, ever-rising privacy concerns. Not to mention this week’s discovery that webmaster Google had a breach exposing private data from as many as 500,000 people. As a result of which, they’ll be shutting down Google+ for consumers.

Facebook and Google faced scandals of no small sort within months of each other. GDPR passed, and subsequent regulations are hedging their way into the US market.

But perhaps most interesting of all, on September 29 Tim Berners-Lee surfaced to announce the next “one small step” for the web. I may not speak for the masses, but when Berners-Lee pipes up about something I tend to lend my ear. Besides being best known as the person who invented the World Wide Web (how about adding that to your LinkedIn), he’s been quite on-point in following its evolution.

Curious footnote: the WWW started as a memo

As he tells the story himself from a TED stage, “I wrote a memo suggesting the global hypertext system. Nobody really did anything with it. But 18 months later — this is how innovation happens — 18 months later, my boss said I could do it on the side, as a sort of a play project…So I basically roughed out what HTML should look like: hypertext protocol, HTTP; the idea of URLs, these names for things which started with HTTP. I wrote the code and put it out there.”

And now look at us, running whole businesses on that one wildly explosive memo.

Anyway. Almost 20 years after the original invention, Tim Berners-Lee appeared on the TED stage to thank people for all their work contributing to the web so far, and to ask for support in pushing the web into its next phase.

From documents to data

Reflecting on the collaborative effort that the web had been thus far, Berners-Lee said in 2009, “I asked everybody, more or less, ‘Could you put your documents on this web thing?’ And you did. Thanks. It’s been a blast, hasn’t it?” He likened that first evolution to the next: from documents to data. In that 2009 talk he asked people, governments, universities, the UN, anyone with large, unused, non-private data sets to open them up on the web.

Through data, we saw the magic of Hans Rosling showing us global development over time. We’ve seen data used to help in hurricane relief, to save a primeval forest, and of course to create entirely new industries, products, customer experiences, and interactions.

From one-way data to read-write data

Happily, we’ve seen open data in troves. But most of it has been one-way, for instance government data that can be viewed but not interacted with.

Which brings us back to: hey Berners-Lee, what have you been up to the last eight years?

Besides teaching computer science at both Oxford and MIT (again, casual), he’s apparently been working on a little side project called Solid, “an open-source project to restore the power and agency of individuals on the web.”

Built using the existing web, Solid is a platform that offers two primary benefits: data empowerment and data interactivity. It gives users the power to decide where data is stored and who can access which parts of it. It lets users link, share, and collaborate on data with whomever they want.

Next: from power with digital giants to power with consumers?

All of this of course brings us back to the original question: have we reached the tipping point?

Some point to the concept of “walled gardens,” where internet, media, advertising, search, and data power are concentrated in the hands of primarily four digital giants: Google, Amazon, Apple, and Facebook.

Those four companies continue creeping into our lives and homes in ways never dreamed of before. But trust is waning. Earlier this year, Edelman found a “37-point aggregate drop in trust across all institutions” in the US, a steeper decline than in any other market.

In the words of Berners-Lee, “For all the good we’ve achieved, the web has evolved into an engine of inequity and division; swayed by powerful forces who use it for their own agendas. Today, I believe we’ve reached a critical tipping point, and that powerful change for the better is possible — and necessary.”

What does internet in the hands of consumers look like?

Well, who’s to say? Right now it still looks rather swayed by those powerful forces who use it for their own agendas.

A platform like Solid, though, would upend that; it’s at odds with the current value exchange. Instead of demanding users hand over personal data to digital giants in order to use the web at all, Solid seeks to take one small step toward restoring the balance of the web as it was originally intended. We would each have control over our own data.

Just as we all “put our documents on this web thing” and “it was a blast,” a platform like Solid seeks data empowerment and data interactivity: two things many of us struggle to imagine.

But then again, as Berners-Lee ended his post, “The future is still so much bigger than the past.”

P.S. In case I haven’t already made this clear, his post is a pretty worthwhile read.

This post also appeared on ClickZ.

A cheat sheet to Google algorithm updates from 2011 to 2018

Google makes changes to its ranking algorithm almost every day. Sometimes we know about them, sometimes we don’t. Some go unnoticed; others turn the SERPs upside down. This cheat sheet covers the most important algorithm updates of recent years, alongside battle-proven advice on how to optimize for each.

Panda

It all started changing in 2011, when Google introduced its first-ever Panda update, whose purpose was to improve the quality of search results by down-ranking low-quality content. Panda thus marked the beginning of Google’s war against grey-hat SEO. For five long years it remained a separate part of the wider search algorithm, until 2016, when Panda became part of Google’s core algorithm. As stated by Google, this was done because the search engine doesn’t expect to make major changes to it anymore.

Main Focus

  • duplicate content
  • keyword stuffing
  • thin content
  • user-generated spam
  • irrelevant content

Best Practice

The very first thing to focus your attention on is internally duplicated content. I recommend carrying out site audits on a regular basis to make sure there are no duplication issues on your site.

External duplication is yet another Panda trigger. So, it’s a good idea to check suspected pages with Copyscape. There are, however, some industries (like online stores with numerous product pages) that simply cannot have 100% unique content. If that’s the case, try to publish more original content and make your product descriptions as outstanding as you can. Another good solution would be letting your customers do the talking by utilizing testimonials, product reviews, comments, etc.

The next thing to do is to look for pages with thin content and fill them with some new, original, and helpful information.

Auditing your site for keyword stuffing is also obligatory if you want to keep Panda away. Go through your keywords in titles, meta description tags, body copy, and H1 tags to make sure you’re not overusing keywords in any of these page elements.
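A quick way to spot-check a page is to count how often a keyword appears in each of those elements. A rough sketch using requests and BeautifulSoup; the URL and keyword are placeholders, and there is no official “safe” count, so treat the numbers as a prompt for judgment rather than a Panda threshold:

```python
# Sketch: count keyword occurrences in the page elements a stuffing
# audit usually checks. The URL and keyword are placeholders; there is
# no official "safe" count, so use the output as a prompt for judgment.
import requests
from bs4 import BeautifulSoup

def keyword_counts(url, keyword):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    kw = keyword.lower()
    meta = soup.find("meta", attrs={"name": "description"})
    elements = {
        "title": soup.title.get_text() if soup.title else "",
        "meta_description": meta.get("content", "") if meta else "",
        "h1": " ".join(h.get_text() for h in soup.find_all("h1")),
        "body": soup.get_text(" "),
    }
    return {name: text.lower().count(kw) for name, text in elements.items()}

print(keyword_counts("https://example.com/post", "running shoes"))
```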

Penguin

The Penguin update, launched in 2012, was Google’s second step towards fighting spam. In a nutshell, the main purpose of this algorithm was (and still is) to down-rank sites whose links it deems manipulative. Just like Panda, Penguin became part of Google’s core algorithm in 2016. It now works in real time, constantly examining your backlink profile to determine whether there’s any link spam.

Main Focus

  • Links from spammy sites
  • Links from topically irrelevant sites
  • Paid links
  • Links with overly optimized anchor text

Best Practice

The very first thing to do here is to identify harmful links. It’s worth saying that different tools use different methods and formulas to determine how harmful a given link is; SEO SpyGlass’s Penalty Risk metric, for instance, is designed to mirror the way Penguin evaluates links.

After you’ve spotted the spammers, try to request removal of the spammy links in your profile by contacting the webmasters of the sites that link to you. But if you’re dealing with tons of harmful links or if you don’t hear back from the webmasters, the only option to go for is to disavow the links using Google’s Disavow tool.

Another thing I can highly recommend making a habit of is monitoring link profile growth. Any unusual spike in your link profile can be a flag that someone is building spammy links to your site. You most probably won’t be penalized for one or two spammy links, but a sudden influx of toxic backlinks can get you in trouble. On the whole, it pays to check all newly acquired links.

EMD

The Exact Match Domain update, introduced by Google in 2012, does exactly what its name suggests. The intent behind this update was to target exact match domains that were also poor-quality sites with thin content. This was done because, back in the day, SEOs could skyrocket in search results by buying domains with exact match keyword phrases and building sites with extremely thin content.

Main Focus

  • Exact match domains with thin content

Best Practice

There’s nothing wrong with using an exact match domain. The only condition for you to be on the safe side is to have quality content on your website. What is more, I wouldn’t advise you to remove low quality pages entirely, try to improve your already existing ones with new, original content instead.

It’s also a good idea to run link profile audits on a regular basis to identify spammy inbound links that have low trust signals and sort them out. After that, it’s only right and logical to start building quality links as they are still the major trust and authority signals.

Pirate

You probably still remember the days when sites with pirated content ranked high in search results and piracy was all over the internet. Of course, that had to be stopped, and Google reacted with the Pirate update, which rolled out in 2012 and aimed to penalize websites with a large number of copyright violations. Note that the update cannot get your website removed from the index; it can only penalize it with lower rankings.

Main Focus

  • Copyright violations

Best Practice

There’s not much to advise here. The best thing you can do is publishing original content and not distributing others’ content without the copyright owner’s permission.

As you may know, the war on piracy is still not won. So, if you’ve noticed that your competitors use pirated content, it’s only fair to help Google and submit an official request using the Removing Content From Google tool. Your request will then be handled by Google’s legal team, who can make manual adjustments to indexed content or sites.

Hummingbird/RankBrain

Starting in 2013, Google set a course for better understanding search intent. It introduced Hummingbird that year and then RankBrain in 2015. These two updates complement one another quite well, as they both serve to interpret the search intent behind a query. However, Hummingbird and RankBrain do differ a bit.

Hummingbird is Google’s major algorithm update that deals with understanding search queries (especially long, conversational phrases rather than individual keywords) and providing search results that match search intent.

And RankBrain is a machine learning system that is an addition to Hummingbird. Based on historical data on previous user behavior, it helps Google process and answer unfamiliar, unique, and original queries.

Main Focus

  • Long-tail keywords
  • Unfamiliar search queries
  • Natural language
  • User experience

Best Practice

It’s a good idea to expand your keyword research paying special attention to related searches and synonyms to diversify your content. Like it or not, but the days when you could solely rely on short-tail terms from Google AdWords are gone.

What is more, with search engines’ growing ability to process natural language, unnatural phrasing, especially in titles and meta descriptions, can become a problem.

You can also optimize your content for relevance and comprehensiveness with the help of competitive analysis. There are a lot of tools out there that provide TF-IDF analysis, which can help a lot with discovering relevant terms and concepts used by a large number of your top-ranking competitors.
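The underlying math is simple enough to try yourself. A minimal sketch with scikit-learn that surfaces the terms weighted most heavily across a handful of top-ranking competitor pages; the document strings are placeholders for scraped page text:

```python
# Sketch: surface the terms that top-ranking competitor pages share,
# using TF-IDF. The strings below stand in for scraped page text.
from sklearn.feature_extraction.text import TfidfVectorizer

competitor_pages = [
    "text of the first top-ranking page ...",
    "text of the second top-ranking page ...",
    "text of the third top-ranking page ...",
]

vectorizer = TfidfVectorizer(stop_words="english", ngram_range=(1, 2))
matrix = vectorizer.fit_transform(competitor_pages)

# Average each term's weight across all pages and print the top 20.
mean_weights = matrix.mean(axis=0).A1
terms = vectorizer.get_feature_names_out()
for term, weight in sorted(zip(terms, mean_weights),
                           key=lambda t: t[1], reverse=True)[:20]:
    print(f"{term:30s} {weight:.3f}")
```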

Besides all the factors mentioned above, don’t forget that it’s crucial to work on improving user experience. It’s a win-win activity: you’ll provide your users with a better experience and you won’t be down-ranked in the SERPs. Keep an eye on your pages’ user experience metrics in Google Analytics, especially bounce rate and session duration.

Pigeon/Possum

Both Pigeon and Possum target local SEO and were made to improve the quality of local search results. The Pigeon update, rolled out in 2014, was designed to tie Google’s local search algorithm closer to the main one; location and distance started to be taken into consideration when ranking search results. This gave a significant ranking boost to local directory sites and created a much closer connection between Google web search and Google Maps search.

Two years later, when the Possum update launched, Google started returning more varied search results depending on the physical location of the searcher. Basically, the closer you are to a business’s address, the more likely you are to see it among the local results. Even a tiny difference in the phrasing of a query now produces different results. It’s worth mentioning that Possum also gave a boost to businesses located just outside the physical city limits.

Main Focus

  • Showing more authoritative and well-optimized websites in local search results
  • Showing search results that are closer to a searcher’s physical location

Best Practice

Knowing that factors applicable to traditional SEO have become more important for local SEO, local business owners now need to focus their efforts on on-page optimization.

To be included in Google’s local index, make sure to create a Google My Business page for your local business. What is more, keep an eye on your NAP (name, address, phone number), as it needs to be consistent across all your local listings.

Getting featured in relevant local directories is of great importance as well. The Pigeon update resulted in a significant boost for local directories, so while it’s always hard to rank in the top results yourself, it’s much easier to get included in the business directories that will likely rank high.

Now that the location you’re checking your rankings from heavily influences the results you receive, it’s a good idea to carry out geo-specific rank tracking. You just need to set up a custom location to check positions from.

Fred

Google Fred is the unofficial name of another Google update, which down-ranked websites with overly aggressive monetization. The algorithm hunts for excessive ads, low-value content, and websites that offer very little user benefit. Websites that exist purely to drive revenue rather than to provide helpful information are penalized the hardest.

Main Focus

  • Aggressive monetization
  • Misleading or deceptive ads
  • User experience barriers and issues
  • Poor mobile compatibility
  • Thin content

Best Practice

It’s totally fine to put ads on your website, just consider scaling back their quantity and consider their placement if they prevent users from reading your content. It would be also nice to go through Google Search Quality Rater Guidelines to self-evaluate your website.

Just as usual, go on a hunt for pages with thin content and fix them. And of course, continue working towards improving user experience.

Mobile Friendly Update/Mobile-first indexing

Google’s Mobile Friendly Update (2015), also known as Mobilegeddon, was designed to ensure that pages optimized for mobile devices rank higher in mobile search and down-rank not mobile friendly webpages. However, soon it has become not enough just to up-or down-rank sites according to their mobile friendliness. So, this year Google introduced mobile-first indexing according to which it started to index pages with the smartphone agent in the first place. Moreover, websites that only have desktop versions have been indexed as well.

Main Focus

  • All sites (both mobile-friendly and not)

Best Practice

If you’re curious whether your site has been migrated to mobile-first indexing or not, check out your Search Console. If you didn’t receive a notification, it means that your website is not included in mobile-first index yet. What is more, make sure that your robots.txt file doesn’t restrict Google bot from crawling your pages.

If you still haven’t adapted your website for mobile devices, the time to join the race is now. There are a few mobile website configurations to pick from, but Google’s recommendation is responsive design. If your website is already adapted for mobile devices, run the mobile-friendly test to see if it meets Google’s criteria.
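At its simplest, responsive design means one URL per page, a viewport meta tag, and CSS media queries that adapt the layout to the screen. A minimal sketch (the class names are illustrative):

    <meta name="viewport" content="width=device-width, initial-scale=1">
    <style>
      .sidebar { float: right; width: 30%; }
      /* On narrow screens, stack the sidebar under the main content */
      @media (max-width: 600px) {
        .sidebar { float: none; width: 100%; }
      }
    </style>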

If you use dynamic serving or separate URLs, make sure that your mobile site contains the same content as your desktop site. Structured data and metadata should also be present on both versions of your site.
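For dynamic serving in particular, Google’s guidance is to send the Vary HTTP response header, so that crawlers and caches know the response changes depending on the requesting user agent:

    Vary: User-Agent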

For more detailed information, consider other recommendations on mobile-first indexing from SMX Munich 2018.

Page Speed Update

And now on to the Page Speed Update, which rolled out in July of this year and finally made page speed a ranking factor for mobile search. According to this update, faster websites are supposed to rank higher in search results. In light of this, our team conducted an experiment to track the correlation between page speed and pages’ positions in mobile SERPs after the update.

It turned out that a page’s Optimization Score had a strong correlation with its position in Google search results. More importantly, slow sites with a high Optimization Score were not hit by the update. That brings us to the conclusion that Optimization Score is exactly what needs to be improved and worked on in the first place.

Main Focus

  • Slow sites with low Optimization score

Best Practice:

Google officially lists nine factors that influence Optimization Score. So, after you’ve analyzed your mobile website’s speed and spotted its weak places (hopefully there are none), work through these nine rules for improving Optimization Score; a short markup sketch illustrating two of them follows the list.

  • Try to avoid landing page redirects
  • Reduce your file’s size by enabling compression
  • Improve the response time of your server
  • Implement a caching policy
  • Minify resources (HTML, CSS, JavaScript)
  • Optimize images
  • Optimize CSS delivery
  • Give preference to visible content
  • Remove render-blocking JavaScript
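To make a couple of these rules concrete, here is a minimal markup sketch, assuming a hypothetical stylesheet /css/main.css and script /js/app.js, showing one common way to remove render-blocking JavaScript (defer it) and optimize CSS delivery (inline the critical styles, load the rest without blocking rendering):

    <head>
      <!-- Critical above-the-fold styles inlined so rendering can start immediately -->
      <style>body { margin: 0; font-family: sans-serif; }</style>
      <!-- Non-critical stylesheet loaded without blocking the first paint -->
      <link rel="stylesheet" href="/css/main.css" media="print" onload="this.media='all'">
      <!-- Deferred script downloads in parallel and executes after parsing -->
      <script defer src="/js/app.js"></script>
    </head>

The media="print" trick is just one pattern for non-blocking CSS; whichever approach you choose, the goal is to keep the initial render path as light as possible.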

On-site SEO for international brands, do’s and don’ts

International brands have their work cut out for them. Building a consistent brand experience across multiple continents and to audiences that speak different languages is no easy task, and the process of translating individual pages from one language to another is time consuming and resource intensive.

Unfortunately, much of this work can go to waste if the right steps aren’t taken to help search engines understand how your site has been internationalized.

To prevent this waste, we’ve put together a list of “do’s and don’ts” to guide your internationalization efforts and ensure that your pages get properly indexed by search engines.

Do conduct language specific keyword research

The direct translation of a keyword will not necessarily be what users are searching for in that language. Rather than simply taking the translation at face value, you will have more success if you take a look at your options in the Google Keyword Planner to see if there are other phrasings or synonyms that are a better fit.

Remember to update your location and language settings within the planner, listed just above the “keyword ideas” field.

Don’t index automatic translation

In some circumstances, automatic translation is better than nothing as far as user experience goes, but users should be warned that the translation may not be reliable, and pages that have been automatically translated should be blocked from search engines in robots.txt. Automatic translations typically look like spam to algorithms like Panda and could hurt your site’s overall authority.
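For example, if your automatically translated pages live under a dedicated directory (the /auto-translate/ path here is hypothetical), a couple of robots.txt lines keep crawlers out:

    User-agent: *
    Disallow: /auto-translate/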

Do use different URLs for different languages

In order to ensure that Google indexes alternate language versions of each page, you need to ensure that these pages are located at different URLs.

Avoid using browser settings and cookies to change the content listed at the URL to a different language. Doing so creates confusion about what content is located at that URL.

Since Google’s crawlers are mostly located in the United States, they will typically only be able to access the US version of the content, meaning that the alternate language content will not get indexed.

Again, Google needs a specific web address to identify a specific piece of content. While different language versions of a page may convey the same information, they do so for different audiences, meaning they serve different purposes, and Google needs to see them as separate entities in order to properly connect each audience to the proper page.
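In practice, this means picking one systematic URL pattern per language. Subdirectories are a common choice (country-specific domains and subdomains also work); the URLs below are placeholders:

    https://example.com/en/products/   (English)
    https://example.com/es/products/   (Spanish)
    https://example.com/it/products/   (Italian)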

We highly recommend using a pre-built solution, such as the Shopify Plus ecommerce platform or the Polylang plugin for WordPress, to ensure that your method for generating international URLs is consistent and systematic.

Don’t canonicalize from one language to another

The canonical tag is meant to tell search engines that two or more different URLs represent the same page. That doesn’t always mean the content is identical: it could cover page alternates where the content has been sorted differently, where the thematic visuals differ, or where other minor changes have been made.

Alternate language versions of a page, however, are not the same page. A user searching for the Dutch version of a page would be very disappointed if they landed on the English version of the page. For this reason, you should never canonicalize one language alternate to another, even though the content on each page conveys the same information.
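Instead, give each language version a self-referencing canonical tag. For example, the Dutch page (URL hypothetical) would point to itself:

    <link rel="canonical" href="https://example.com/nl/page1/" />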

Do use “hreflang” for internationalization

You may be wondering how to tell search engines that two pages represent alternate language versions of the same content if you can’t use canonicalization to do so. This is what “hreflang” is for: it explicitly tells search engines that two or more pages are alternates of one another.

There are three ways to implement “hreflang”: with HTML tags, with HTTP headers, and in your sitemap.

1. HTML Tags

Implementing “hreflang” with HTML tags is done in the <head> section, with code similar to this (the Spanish and Italian URLs are placeholders):

    <head>
      <title>Title tag of the page</title>
      <link rel="alternate" hreflang="en" href="https://example.com/page1/english-url" />
      <link rel="alternate" hreflang="es" href="https://example.com/page1/spanish-url" />
      <link rel="alternate" hreflang="it" href="https://example.com/page1/italian-url" />
    </head>

Here, hreflang="en" tells search engines that the associated URL https://example.com/page1/english-url is the English alternate version of the page. URLs must be absolute, including the http or https protocol and the domain name, not just the path. The two-letter string “en” is an ISO 639-1 code, which you can find a list of here. You can also set hreflang="x-default" for a page where the language is unspecified.

Each alternate should list all of the other alternates, including itself, and the set of links should be the same on every page. Any two pages that don’t both use hreflang to reference each other will not be treated as alternates. This is because alternates are allowed to live on different domains, and sites you don’t own shouldn’t be able to claim to be an alternate of one of your pages.

In addition to a language code, you can add an ISO 3166-1 alpha-2 country code. For example, for the UK English version of a page, you would use “en-GB” in place of “en.” Google does advise having at least one version of the page without a country code. You can apply multiple country codes and a country-agnostic hreflang to the same URL.
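For instance, a page with US and UK English variants plus a country-agnostic default might declare the following (the regional URLs are hypothetical):

    <link rel="alternate" hreflang="en" href="https://example.com/page1/english-url" />
    <link rel="alternate" hreflang="en-US" href="https://example.com/us/page1/" />
    <link rel="alternate" hreflang="en-GB" href="https://example.com/uk/page1/" />
    <link rel="alternate" hreflang="x-default" href="https://example.com/page1/english-url" />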

2. HTTP header

As an alternative to the HTML implementation, your server can send an HTTP Link header. The syntax looks like this (same placeholder URLs as above):

    Link: <https://example.com/page1/english-url>; rel="alternate"; hreflang="en",
          <https://example.com/page1/spanish-url>; rel="alternate"; hreflang="es",
          <https://example.com/page1/italian-url>; rel="alternate"; hreflang="it"

The rules regarding how to use them are otherwise the same.

3. Sitemap

Finally, you can use your XML sitemap to declare the alternates for each URL. The syntax for that is as follows:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <url>
        <loc>https://example.com/page1/english-url</loc>
        <xhtml:link rel="alternate" hreflang="en" href="https://example.com/page1/english-url" />
        <xhtml:link rel="alternate" hreflang="es" href="https://example.com/page1/spanish-url" />
        <xhtml:link rel="alternate" hreflang="it" href="https://example.com/page1/italian-url" />
      </url>
    </urlset>

Note that the English version of the page is listed both within the <loc> tag and as an alternate.

Keep in mind that this example is not complete. For it to be complete, you will also need separate <url> entries for the Spanish and Italian pages, each of them listing all of the other alternates as well.

Don’t rely on the “lang” attribute or URL

Google explicitly does not use the lang attribute, the URL, or anything else in the code to determine the language of the page. The language is determined only by the language of the content itself.

Needless to say, this means that your page content should be in the correct language. But it also means:

  • The main content, navigation, and supplementary content should all be in the same language
  • Side by side translations should be avoided. Don’t display translations on the page, just make it easy for users to switch languages.
  • If your site offers any kind of automatic translation, make sure that this content is not indexable
  • Avoid mixing language use if at all possible, and if it is necessary, make sure that the primary language of the page dominates any others in substance

Do allow users to switch languages

For any international business, it’s a good idea to allow users to switch languages, usually from the main navigation. For example, Amazon allows users to switch languages from the top right corner of the site.

Do not force the user to a specific language version of the page based on their location. Automatic redirection prevents both users and search engines from accessing the version of the site that they need to access. Google’s bots will never be able to crawl alternate language versions of a page if they are always redirected to the US version of the site based on their location.

Turning to Amazon for our example once again, we are not prevented from accessing amazon.co.jp, but we do have the option of switching to English.

Don’t create duplicate content across multiple languages

While you should not canonicalize alternate language versions of one page to another, if you use alternate URLs for pages meant for different locations but the language and content are identical, you should use the canonical tag. For example, if the American and British versions of a page are identical, one should consistently canonicalize to the other. Use hreflang as discussed above to list them as alternates with the same language but for different locations.
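To sketch that advice (the URLs are hypothetical), the British page in an otherwise-identical pair would canonicalize to the American version while both location variants stay declared via hreflang:

    <link rel="canonical" href="https://example.com/us/page1/" />
    <link rel="alternate" hreflang="en-US" href="https://example.com/us/page1/" />
    <link rel="alternate" hreflang="en-GB" href="https://example.com/uk/page1/" />

This is one reading of the guidance above rather than a definitive rule; the key point is that identical-content pages shouldn’t compete with each other in the index.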

Conclusion

Use these guidelines to make sure users from all of your target audiences will be able to find your pages in the search results, no matter where they are located or what language they speak.