Ways to improve your link building

With the right strategy in place, link building can be a hugely effective way of building strong authority to increase longer term, sustainable organic visibility. Unfortunately, it’s very easy to find yourself returning to old, outdated methods. With so many different approaches to link building, it’s important to take a step back and look at the bigger picture to make the greatest impact.

There are a variety of link building tactics that don’t require a huge amount of resource or expense, so whether you’re working for an agency or in-house, dust away the cobwebs that are plaguing your strategy and step up. Below are just a few ways you can improve your approach to link building.

Don’t forget the basics

The first step is to remember the basics – it’s easy to lose sight of them, particularly when you’re constantly being served ‘inspirational content’ that promises to be the best and only method you’ll ever need. Revisiting old, unlinked brand mentions and fixing broken links can have a huge impact, particularly when they come from a high-authority site.
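
At its core, an unlinked-mention check asks one question: does a page mention the brand without linking to its domain? A minimal Python sketch using only the standard library (the sample HTML, brand name and domain are made up for illustration):

```python
from html.parser import HTMLParser

class MentionChecker(HTMLParser):
    """Collects link targets and visible text from an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.append(dict(attrs).get("href", ""))

    def handle_data(self, data):
        self.text_parts.append(data)

def is_unlinked_mention(html, brand, domain):
    """True if the page mentions the brand but never links to its domain."""
    parser = MentionChecker()
    parser.feed(html)
    mentioned = brand.lower() in " ".join(parser.text_parts).lower()
    linked = any(domain in href for href in parser.links)
    return mentioned and not linked

page = '<p>We love the widgets from Acme Corp.</p>'
print(is_unlinked_mention(page, "Acme Corp", "acme.com"))  # True: a mention with no link back
```

In practice you would run this over pages surfaced by a mention-monitoring tool, then reach out to the site asking for the mention to be turned into a link.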

Immerse yourself in the brand

If you are working with an agency, having a ‘brand immersion’ or ‘discovery day’ can be incredibly useful if you approach it correctly. Start out with a full list of everything you’d want to know about a client, their product or brand – and pretty much interrogate them.

A client of ours recently said he’d been running his business for so long that he assumed everyone knew as much about it and its products as he did, when in fact the company was probably only conveying 10% of its USPs digitally. If a client holds their cards close to their chest, a brand immersion day is an opportunity to get a grasp on who they are as a brand and how they work.

Even better, this is a chance to meet with their PR representatives and see how you can work together to make the best of each other’s work. There may be things you uncover that can be used as an asset, things that they would never consider telling you proactively. For example, new product launches or an existing relationship with a site that you’ve been trying to crack for months.

Future-proof your strategy

If only one thing is certain in life, it’s that Google will continually change its algorithm. Unfortunately, we can’t predict the future and may spend a long time securing a link, only for it to suddenly have no value.

Bend fate in your favor by thinking about the bigger picture, and developing strategies that are built solely on authenticity. Build good solid links from authoritative websites. Be real and genuine, provide value in your content and insights. Always come back to why you’re building links, whether it’s for the brand awareness they could build or the referrals they could bring.

Monitor your own backlink profile

Monitoring your own backlink profile is a vital part of growing it, and is surprisingly something a lot of link-builders put to the bottom of their to-do list. It’s essential to see which new sites are linking to you, so you can build that relationship and contribute more great content or insights.

On top of this, a lot of sites will link to you without telling you, so it’s crucial to keep on top of this. It’s also vital to see which sites stop linking to you, as there may be an opportunity to win that link back, or to build a relationship with that site.
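
The new/lost comparison is easy to automate. A small sketch, assuming you export a list of referring domains from your backlink tool each month (the domain names are invented):

```python
def backlink_changes(previous, current):
    """Compare two snapshots of referring domains and report gains/losses."""
    previous, current = set(previous), set(current)
    return {
        "new": sorted(current - previous),    # sites that started linking: build the relationship
        "lost": sorted(previous - current),   # sites that stopped linking: try to win them back
    }

last_month = ["example-blog.com", "news-site.com", "old-directory.com"]
this_month = ["example-blog.com", "news-site.com", "fresh-magazine.com"]
print(backlink_changes(last_month, this_month))
# {'new': ['fresh-magazine.com'], 'lost': ['old-directory.com']}
```

Running this monthly gives you a standing list of sites to thank, pitch, or chase.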

Relationships over anything else

Having a good relationship with a site or influencer is almost as important as how good a piece of content is. Follow them on Twitter, comment on their activity, be a familiar face and a name that is regularly in touch with pitches and ideas. You will find that they start coming directly to you for content and ideas – instead of the other way round.

Keep a close eye on the competition

Monitoring your competitors’ activity is a very cost- and time-effective way of identifying new sites to contact, new content opportunities and outreach methods to use. Using competitors’ links for your own gain is often fruitful and doesn’t require a lot of time or creativity.

To make things even easier, it’s something you can automate by setting up Google Alerts, or backlink alerts and reports in tools like SEMrush. Competitors are always acquiring new links, so this is something that should be continually monitored.

Don’t be afraid of a nofollow link

As mentioned above, we should be focused on the bigger picture and future-proofing link building strategies. Sometimes this means taking a nofollow link or an unlinked citation now and again. Some sites have a nofollow policy; others apply nofollow to links automatically. If a citation is genuinely driving traffic and brand awareness, then the fact that it’s nofollowed or unlinked shouldn’t trouble you too much.

Many of these link building tactics fall under the category of ‘quick wins’, and the results can have a huge impact on your site’s authority and brand awareness. Fundamentally, staying aware of the latest link building developments is key, as an outdated strategy can dilute your wider SEO strategy and hold back the success of your site.

Using behavioral design to reduce bounce rate

It comes as no surprise that humans have terribly short attention spans. In fact, a study by Microsoft put a number on it: 8 seconds – less than the attention span of a goldfish. The implications for online marketing are huge. In a noisy and highly competitive online space, you either grab a visitor’s attention the moment they land on your website or lose them – possibly forever.

Bounce rate is an important metric for measuring how users engage with a website. It indicates the percentage of visitors who navigate away from your site after viewing only one page.
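
In code, the calculation is simply single-page sessions divided by total sessions, expressed as a percentage (the session counts below are illustrative):

```python
def bounce_rate(single_page_sessions, total_sessions):
    """Bounce rate = single-page sessions / total sessions, as a percentage."""
    if total_sessions == 0:
        return 0.0
    return 100 * single_page_sessions / total_sessions

print(bounce_rate(450, 1000))  # 45.0 – i.e. 45% of visitors left after one page
```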

Think of the times when, as a web user, you visited a website and immediately headed for the back button. While the decision to exit the page may have been made unconsciously, the reality is that certain ‘unappealing’ elements on the website influenced that decision. This is the basis of behavioral design. The rationale is that if certain elements of a web page can drive users away, then there must be other characteristics that can make them stay.

Leading behavior scientist BJ Fogg has extensively studied how technological solutions influence behavior, and outlines a three-step method for using design to change behavior. These are:

Getting specific about the desired behavior
Making it easy for users to reach that outcome
Using triggers to prompt the behavior.

If we apply this method to bounce rates, then the first step is clear. The goal is to get your site visitor to click another link. But how do you fulfill the other two obligations? How do you create an environment that encourages users to perform this action? Here are three strategies to implement.

Improve branding

Fogg, along with other researchers, studied 2500 web users to understand how they assess a website’s credibility. They found that the average consumer paid far more attention to the visual design of a site than to its content.

Almost half (46.1%) of the participants judged a website’s credibility based on the ‘design look’. This includes the overall appeal of the visual design, the layout, typography and color schemes.

What does this mean for bounce rates? If users don’t perceive your website as credible, you’ll have a hard time getting them to stick around, let alone click on anything else on your site. Uniform and visually-appealing branding immediately catches the attention of a site visitor, especially if you’re a new brand.

Therefore, it’s important to have a brand identity with uniform branding – not just for your web pages (copy and color), but your entire web presence (including social media and landing pages).

Technological advances grant businesses of all sizes the ability to create their brand’s identity based on data. Tailor Logo, for instance, is a tool for generating logos and branding kits using dedicated machine learning algorithms, enabling businesses to stay consistent across all the touchpoints where users may come across their brand.

In addition, the tool helps users develop the perfect typography for their branding through a series of carefully designed questions that provide insights into the brand’s identity and objective. Typography is critical to a visual design; a Nielsen study found that small font sizes and low contrast are web users’ number one complaint when it comes to reading online.

Reduce cognitive load

Cognitive load refers to the total amount of mental effort required to complete a task that involves processing of information. In practical terms, this is the amount of mental resources users have to dedicate to be able to understand/process the information on your website.

Since the recent GDPR implementation, I’ve lost count of the number of sites where I’m bombarded with two or three pop-ups as soon as I land on the page.

This leaves site visitors with too much choice and too many tasks to perform. What should they do first? Accept privacy policies, read content, subscribe to your newsletter, or pay attention to the flashing ebook download? It’s not difficult to see why users will choose the easiest option – a quick exit.

What you should do is consider every page as a single entity and give some thought to what a user who visits a specific page might want to do. If it’s a blog post, then getting the information they need is likely the user’s main intent. So, do away with unhelpful pop-ups and focus on giving the user a seamless reading experience. Embedding the links to your lead magnets within the content could be far more effective in this context. If you must use a triggered opt-in form, have it come up only when the user attempts to exit the page.

Perfect your triggers

Revisiting Fogg’s three-step model, the last step is to provide a trigger for the desired behavior. In this case, you want users to follow a link on your web page. This could be a glaring CTA button or a subtler link embedded within a blog post. But how do you make it easy for users to act on these cues?

Make the triggers relevant. Suppose a user reads an interesting blog post on how to write web copy and is interested in learning more techniques, but the suggested content and lead magnet on the blog post page are about data mining. What would be the logical next step for this user? Contrast that with a page with links to relevant copywriting content. It’s clear how this user will respond differently.

Place triggers in the right places. Understanding how users interact with spaces is important. If you haven’t heard of the F-pattern yet, then you should. The Nielsen Norman Group conducted eye-tracking research which revealed that people scan web pages and phone screens in the shape of the letter F.

The key takeaway is that for any piece of content, users pay the most attention to the first few paragraphs, then somewhere down the middle and finally take a few glances at the end. In other words, they scan – not read – information.

If you are hiding vital information in between large blocks of text, then that’s bad news. Readers won’t see it. Your content should be easy to scan so that readers can quickly find the information they need. This includes links to more relevant content, offers or contact information.


A good bounce rate is important for online success. By using insights from online user behavior to improve your website design, you can increase engagement, reduce bounce rates and ultimately improve conversion.

Pius Boachie is the founder of DigitiMatic, an inbound marketing agency.

How Alexa and Siri are changing SEO: AI and voice search

The Internet changes rapidly, which means marketers and business leaders must hurry to change with it. While most Internet searches were once done on laptops and desktops, people are now using their smartphones and similar devices to conduct searches for information, local businesses, products, and services.

That shift was closely followed by another somewhat more distinctive shift called artificial intelligence (AI)-assisted voice search.

In the past, a smartphone user would need to type a question or phrase into Google or another search engine to get a set of results to sift through. Now, AIs like Siri and Alexa – which reside in smart speakers and on smartphones, tablets, and laptops – have changed the way users are searching for the information, products, and services they need.

You can conduct searches with nothing more than the sound of your voice. And that’s rapidly changing the SEO landscape.

How voice-assisted search is changing searches

Most people have smartphones these days, and the vast majority of smartphones have voice-assisted search capabilities. According to 2017 data from the Pew Research Center, 77% of Americans now own smartphones. Among 18–29-year-olds, that same figure is 92%.

This means an enormous share of the general public is able to use voice-assisted AI search. When users of smartphones and smart speakers ask those devices for an answer to a query, the job of searching is left to AI assistants like Siri and Alexa.

While Amazon’s Alexa will not deliver the answer to a voice search query unless it has been proven accurate, Google Voice Search tech (Google Home and Android devices) reports top results from Google. It doesn’t report results lower down on the search engine results page (SERP) or on subsequent results pages.

This makes being at the top of Google’s results more important than ever.

The language of voice search

As voice search through AI becomes more prevalent, the language of search changes.

When typing a phrase or question into Google, a searcher might use a non-sentence, such as “Indian restaurant Houston”, but when conducting a voice search through Alexa or Siri, the searcher will likely use full sentences and grammatically correct language:

“Siri, where is a good Indian restaurant in Houston?”

AI platforms try to respond to such queries in a human way, and they use the text of pages in search results to do so. Content should be optimized for conversational language with clear, grammatically correct answers to specific questions, such as who, what, where, when, and why.

Location and navigation searches

Thanks to voice search, mobile-friendly sites are becoming more important than ever. That’s because many people who use voice-assisted search do so on their smartphones.

Owing to the mobile nature of smartphone use, a large portion of voice requests through Alexa, Siri, and similar AI technologies deal with navigation and location. Integration with Google Maps means an opportunity for greater traffic for businesses with a local search presence.

For instance, a person may conduct a voice search for a “dentist near me” rather than doing a typed general search for top-rated dentists.

AIs process the spoken search query while keeping the user’s location in mind. This places further importance on business integration with Google Maps and creating optimized landing pages with location references.

To put it simply, voice requests lead to a SERP, where local businesses will want to rank. Claiming and maintaining Google My Business listings will become more important as voice search gains popularity.

Why FAQ pages work for voice search

Frequently asked questions (FAQ) pages appear to serve voice search purposes well. Long-tail keywords formulated as complete and conversational questions, answers to those questions, or location (“near me”) searches are becoming more important because they often answer voice search queries.

While a text-based search may seek broad information, a voice search generally seeks key information that can be concisely communicated, such as hours of operation, location, and directions.

Creating landing pages with this key information in mind is likely to improve placement in SERPs for AI-assisted voice searches.
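
One widely used way to expose this kind of concise question-and-answer content to search engines is schema.org’s FAQPage structured data. A minimal Python sketch that generates the JSON-LD (the questions, answers and business details are made up for illustration):

```python
import json

def faq_jsonld(qa_pairs):
    """Build schema.org FAQPage structured data from (question, answer) pairs."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in qa_pairs
        ],
    }, indent=2)

print(faq_jsonld([
    ("What are your opening hours?", "We are open 9am-6pm, Monday to Saturday."),
    ("Where are you located?", "123 Main Street, Houston, TX."),
]))
```

The resulting JSON-LD would be embedded in the page inside a `<script type="application/ld+json">` tag, giving voice assistants a machine-readable version of exactly the kind of key information described above.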

Smartphone search vs. smart speaker

Smartphones are everywhere, but smart speakers are gaining traction quickly. In fact, around 39 million Americans own one of these devices, according to a January 2018 poll from Edison Research and NPR. As smart speakers like Google Home and Amazon Echo become more popular and available, people are beginning to use them to conduct searches.

As smart speakers aren’t linked to a screen or display of any kind, users only receive a verbal response to voice searches. That response is often based on a single search result – chosen by the AI assistant in an unseen selection process that takes only a few seconds.

Developers of these devices and AIs want the single result delivered to the user to answer the question or query fully and concisely. A business that is not highly ranked is not likely to be included in the limited results delivered by AI-assisted voice search.

Looking forward

Whether marketers are aware of it or not, AIs like Alexa and Siri are changing SEO, and it’s up to marketers and businesses to adapt. From opting for conversational content to fully integrating businesses with Google Maps, there are plenty of steps to take to capture the benefits of this new type of search.

While AI-assisted voice search brings new goals and challenges to the table, the ultimate goal of SEO remains the same, whether you’re involved in SEO for law firms, restaurants, doctors’ offices, or any other business. To convince AIs to include your content in their very limited answers to voice searches, you still need to occupy the top of the SERPs.

A page two or even top five ranking isn’t what it used to be. As voice search gains traction, being number one becomes more important than ever.

This article was originally published on our sister site, ClickZ.

Something to hide? The rise of privacy-focused search engines

Many people are comfortable opening up their world to others; some are not. This can even extend to use of the Internet: some feel uneasy at the thought of somebody watching and analyzing their every move to build a profile. And ultimately, when users believe this to be the case, they self-censor and think twice about what they search for and how to word it.

In the past the usual question would have been, “what do I have to hide?”. Surprisingly for some (disappointingly for others), the answer is often quite straightforward and benign. For me, it’s simply because I prefer to keep myself to myself, which helps to eliminate a feeling of shyness and preserve the energy it would have otherwise consumed.

Times are changing

The major search engines have increasingly pushed the envelope on user privacy, often expanding their surveillance by stating it somewhere deep within the terms of service. Many individuals, often non-technical, may be completely unaware of the scope and scale of the data mining being carried out on their behavior.

Still, the majority of people continue to sacrifice some of their privacy in order to use free services such as Google. This is understandable as Google remains top in search, but that is changing. A growing number of people are starting to wonder where to draw the line: what is acceptable, and what is too invasive? At what point do they no longer feel comfortable with the level of intrusion that comes with using these ‘free’ services? More people are therefore seeking out alternatives that respect their privacy.

This has led to the rise of search engines such as DuckDuckGo, StartPage and others. These search engines not only provide ‘privacy as a service’, but also burst the ‘filter bubbles’ that use online tracking to target and customize results and content. Without tracking, these bubbles are burst and you are shown content based on what you looked for and not your previous history. This helps to prevent confirmation bias.

Enough is enough

The Snowden leaks, Cambridge Analytica, advertising that follows you everywhere, filter bubbles, advertising/search companies bypassing your phone’s security to more thoroughly track you, personalized pricing based on a profile – where will it end?

It would appear these companies have no real intention to change their ways. If the past is anything to go by then these stories are only going to continue. It will only be stopped by regulation or people saying enough is enough, voting with their clicks, and choosing an alternative that sees and treats them as a customer, and not a product.

Perhaps we’ll reach a crunch point and there will be a mass exodus from these companies, or maybe just enough people will leave to allow healthy competition to flourish. Hopefully this will give rise to a whole new wave of companies that put the people who use them first, and their privacy at the forefront of everything they do.

And where better to start than with search – an activity everyone participates in and which is increasingly seen as the gateway (and unfortunately in some cases, the gatekeepers) to the Internet.

Checklist for performing basic SEO audit

Keeping up with the latest SEO practices can be overwhelming for someone who has just started out as a new website owner. Moreover, making sure that the existing SEO implementations are regularly taken care of as per the latest Google algorithm changes can be a nerve-wracking task for some.

To make sure that you are not missing out on these changes, you should perform an SEO audit of your website or blog at regular intervals. By keeping a checklist handy, you maintain a standard SEO audit task list that ensures none of the crucial points are missed.

In this blog post, we will walk you through a complete checklist that will help you perform an SEO audit of your website effectively and efficiently.

The advantages of having an SEO audit checklist are:

  • Helps you identify the site’s weakest points and work on them
  • Helps you with your site’s off-page and on-page optimization
  • Helps you increase your site’s loading speed once you implement the conclusions drawn from the checklist
  • Enables you to self-audit your own website
  • Helps you increase the organic traffic for your business
  • Helps you scan your website for any SEO omissions or missteps
  • Helps you improve your ranking factors by providing clear checkpoints.

We have tried our best to create a comprehensive checklist so that you don’t miss out on the major aspects of SEO. However, there are a few checkpoints that we have deliberately left out of the checklist.

Now, let’s dive right into the contents of the checklist for your next SEO audit.

  • Determine the objectives of your site’s SEO audit

  • This is the most important task on your SEO audit checklist: determining the objective or purpose behind the audit. These objectives will ultimately define the scope of this checklist and may change how you approach it.

    For example, if your website is not yielding the desired speed results, you might want to conduct an SEO audit to figure out what’s causing the trouble. On the other hand, if there is a sudden drop in traffic, you might want to fix things through an SEO audit. There are many similar objectives behind this activity, and by carefully assessing them, you will be able to perform an SEO audit more effectively.

  • Keyword analysis

  • SEO begins with keywords, so it is a good idea to begin your audit by analyzing the keywords your site is using and checking whether you’re targeting the right ones.

    Make sure that you are using less competitive, long-tail keywords that are very specific to the industry you are dealing in. The right keyword will be relevant, measurable and realistic – i.e. it will actually help you rank better.

    A few tools that you can use are Ubersuggest, SEMrush, Google Keyword Planner and Google Trends.

    You should run a keyword strategy that targets time-bound keywords, simply because as your website grows, you will need to expand your keyword research and look for other keywords that will help you rank better.

    Too many keywords on the website will kill your SEO game. Revamp the keyword count if needed. Include keywords in the URL to help your pages and posts rank better, and make sure that these URLs are short.

    Keyword cannibalization is yet another factor that degrades your site’s SEO. It occurs when two pages of your website compete for the same keyword. Make sure that you take steps to resolve it.
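
Detecting cannibalization can start as a simple grouping exercise. A sketch, assuming you maintain a mapping of pages to their target keywords (the URLs and keywords below are illustrative):

```python
from collections import defaultdict

def cannibalized_keywords(page_targets):
    """Given a mapping of URL -> target keyword, return keywords claimed by more than one page."""
    by_keyword = defaultdict(list)
    for url, keyword in page_targets.items():
        by_keyword[keyword].append(url)
    return {kw: urls for kw, urls in by_keyword.items() if len(urls) > 1}

pages = {
    "/blog/best-running-shoes": "running shoes",
    "/shop/running-shoes": "running shoes",
    "/blog/trail-shoes": "trail shoes",
}
print(cannibalized_keywords(pages))
# {'running shoes': ['/blog/best-running-shoes', '/shop/running-shoes']}
```

Any keyword in the output is a candidate for consolidating pages or re-targeting one of them.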

    Later in the audit, you will be required to analyze the competition for these keywords.

  • Content analysis

  • We all know that the content pushed out by your site must be unique and free of plagiarism. There are several other content analysis checks that must be performed as well.

    Begin by checking for duplicate content on your site with the help of tools such as Siteliner.com, which shows results for pages that match other pages, along with a match percentage. You can also opt for Copyscape’s premium services to see if any of your content has been plagiarized elsewhere on the Internet.
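
If you want a rough, do-it-yourself version of a match percentage before reaching for a paid tool, Python’s difflib can score the similarity of two text blocks (this is only a crude approximation of what dedicated tools compute; the sample sentences are invented):

```python
from difflib import SequenceMatcher

def match_percentage(text_a, text_b):
    """Rough duplicate-content score: similarity ratio of two texts, as a percentage."""
    return round(100 * SequenceMatcher(None, text_a, text_b).ratio(), 1)

a = "Our widgets are handmade from sustainable materials."
b = "Our widgets are handmade from recycled materials."
print(match_percentage(a, b))  # a high score, since only one word differs
```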

    Good content that helps your website rank better is engaging and SEO-optimized. Make sure that you are using the target keywords in all the right places throughout a page on your website:

    • Keyword in the title of the post/page
    • Keyword in the first paragraph
    • Keyword appearances in the H1 Tag
    • Use of keyword in the meta description of the post
    • Keyword appearances in the URL structure
    • Keyword in the alt text of the first image used
    • Keyword appearances towards the end of the post’s content.
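
The placements above are easy to verify programmatically. A sketch, assuming you have already extracted the relevant parts of a page into a dict (the page data is made up, and this covers only a subset of the checklist):

```python
def on_page_keyword_check(keyword, page):
    """Check a few common keyword placements.
    `page` is a dict of extracted page parts (title, h1, url, etc.)."""
    kw = keyword.lower()
    return {
        "in_title": kw in page.get("title", "").lower(),
        "in_h1": kw in page.get("h1", "").lower(),
        "in_meta_description": kw in page.get("meta_description", "").lower(),
        "in_url": kw.replace(" ", "-") in page.get("url", "").lower(),
        "in_first_paragraph": kw in page.get("first_paragraph", "").lower(),
    }

page = {
    "title": "10 Best Running Shoes for 2018",
    "h1": "Best Running Shoes",
    "meta_description": "Our pick of running shoes for every budget.",
    "url": "/blog/best-running-shoes",
    "first_paragraph": "Choosing running shoes can be overwhelming...",
}
print(on_page_keyword_check("running shoes", page))  # every check passes for this page
```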

    To make sure that your content is engaging, keep the user-generated content in the comment sections of your blogs fresh. Keep interacting with the audience so that you are able to churn a fair share of SEO juice from it as well.

    Making sure that your content is of the right length is just as important as using quality content. Create the right content strategy which is well balanced with short-form, mid-length, and long-form articles. Regularly publishing content is also a huge SEO booster. If you feel like your schedule needs a boost, you can revise it during the next SEO audit.

  • UX analysis

  • The UX furnished by a website is a huge determinant of the site’s traffic and the average time spent by visitors on it.

    Google Analytics can help you get a picture of what the UX on your website is likely to be. Beyond that, work out a clear navigation structure for your site’s audience so that they can seamlessly surf through the pages of your website without having to struggle for the information they are looking for.

    Another crucial UX ranking factor is the presence of broken links or pages on your site. Link checker tools can help you find and remove such links and ensure that your users don’t encounter them.
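
The first half of a broken-link check is harvesting every link on a page; a full audit would then request each URL and flag any 4xx/5xx responses. A minimal extraction sketch using Python’s standard library (the sample HTML is invented):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href on a page; a full audit would then request each one
    and treat anything returning a 4xx/5xx status as broken."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

extractor = LinkExtractor()
extractor.feed('<a href="/about">About</a> <a href="https://example.com/gone">Old page</a>')
print(extractor.links)  # ['/about', 'https://example.com/gone']
```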

  • Image optimization check

  • Make sure that you are optimizing the images present on your website. There are various plugins available which can help you optimize images. And don’t forget to implement the following:

    • Use high-resolution images
    • Resize them to keep your site’s database lean
    • Use a relevant image that matches your text
    • Name your image file appropriately, using the target keyword
    • Use image alt text.
  • Content promotion analysis

  • If your website is employing content promotion services to share its updates via social media channels, make sure that they are well integrated and are working as intended.

  • Web hosting check

  • The right web hosting service boosts your site’s SEO efforts through its server reputation, website uptime, and the loading speed it offers. Every SEO audit is an occasion to review the performance of your existing web host and, if there is a need, switch to a better service provider.

    Resources like HostingBooth can be very helpful when you are looking to compare the performance of the existing hosting service providers.

  • Website loading speed check

  • Use a tool like Pingdom to determine the current speed of your website. If it is taking more than 2-3 seconds to load, make sure that you fix it as soon as possible.

    Don’t skip checking the loading time for mobile devices.

  • Site uptime

  • During every SEO audit, you must check your site’s uptime – the duration of time your website is available to its users. A website with frequent downtime will cause traffic to drop and affect your site’s ranking. To keep track of your site’s uptime and maintenance, you can rely on tools like Uptime Robot, Site24x7, and Pingometer.
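
Uptime is usually reported as the percentage of monitoring checks that succeeded. A tiny sketch (the check counts are illustrative):

```python
def uptime_percentage(checks):
    """checks: list of booleans from a monitor (True = site responded OK)."""
    if not checks:
        return 0.0
    return round(100 * sum(checks) / len(checks), 2)

# e.g. 287 successful pings out of 288 five-minute checks in a day
print(uptime_percentage([True] * 287 + [False]))  # 99.65
```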

  • Link analysis

  • Make sure that your backlinks are relevant and of high quality. Majestic is a great link checker if you want to ascertain the relevancy, authority, and quality of the backlinks to your website. Its Backlink History Checker tool will help you determine the number of backlinks detected by its sophisticated web robots for given domains, subdomains or URLs.

    You must also assess the presence of all the outbound and internal links present on your website and get rid of all the low-quality links.

  • URL check

  • The right URL structure is crucial, as it says quite a lot about the SEO of your site. All the URLs on your website must be SEO-optimized, yet not over-optimized. Also watch out for capital vs. lowercase inconsistencies in your URLs.

  • Google penalty check

  • The truth is that Google is at liberty to penalize your website for not adhering to its guidelines and algorithm updates. A Google penalty can do a lot of damage to your site’s online reputation, and your site’s SEO may go haywire. And if you are not aware that such a penalty has been levied on your website, any efforts to rectify a low SEO score can go to waste.

    Hence, during every SEO audit, you must run a Google Penalty check to see if your website has been penalized. There are several tools available that can help you figure that out such as the Panguin Google Penalty Check Tool and the Fruition’s Google Penalty Checker Tool.

  • Sitemaps

  • Sitemaps help Google better understand the structure of your website. Once you have a sitemap set up correctly, you will need to keep it free of any errors.

    Begin by checking for the presence of your site’s sitemap file by adding sitemap.xml or sitemap.html to your domain in the browser. Google Search Console will show you how many URLs were successfully indexed when you previously submitted the sitemap, and will notify you of any problems or issues. Use tools like CodeBeautify and XMLValidation to check your sitemap for errors before submitting it.
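
You can also sanity-check a sitemap yourself: it should be well-formed XML in the sitemaps.org namespace, with one `<loc>` per `<url>` entry. A small Python sketch (the example sitemap is made up):

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Parse a sitemap.xml and return the listed URLs (raises on malformed XML)."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
</urlset>"""
print(sitemap_urls(sitemap))  # ['https://example.com/', 'https://example.com/blog/']
```

If parsing fails or the URL count differs wildly from what Search Console reports as indexed, the sitemap needs attention.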

  • Site security check

  • A site’s security is a decisive factor in the ranking of your website, and if yours is an ecommerce website, you need to be even more careful. Since an SSL certificate is now essential to the impression a website makes, make sure that your websites have one – and if they don’t, take steps to install it during your next SEO audit.


Once you are doing everything right, monthly or quarterly SEO audits will help you revamp your SEO strategy and make sure that you are not left behind. With the checklist furnished above, you will be able to perform a more efficient SEO audit for your site and ensure that it ranks well.

Why AI and international paid media is a match made in hell

When looking back on summer 2018, it’s hard to ignore the optimism that’s been in the air. Sunny weather? Check. England football triumph? Almost! AI as the next big thing in digital marketing? Try and count the number of articles, blog posts and sound bites that you’ve encountered over the last month which cite AI in a hype-tastic way.

Now we’re all for a bit of well-reasoned optimism, and there is no doubt that AI is an extremely powerful toolkit that will positively impact all kinds of socio-economic activity. But we’re not so sure about the true value of AI in the context of digital marketing, and specifically for international paid media.

Back to basics

Cutting through the hype, let’s start by looking at exactly how AI and machine learning work in the context of international paid media. For example, on a keyword level, how much and what kind of data are needed for AI to make a good decision?

Well, Google’s machine learning product Smart Bidding states that it “enables you to tailor bids based on each user’s context. Smart Bidding includes important signals like device, location and remarketing lists for better automation and performance”.

This implies that the signals required by the algorithm can be culled from the sum of users’ behavior, and that its “learning capabilities quickly maximize the accuracy of your bidding models to improve how you optimize the long-tail [by evaluating] patterns in your campaign structure, landing pages, ad text, product information, keyword phrases and many more to identify more relevant similarities across bidding items to effectively borrow learnings between them”.

    This suggests that the ‘go to’ source of data is our own campaign. But what are these patterns, how long is ‘quickly’, and how on earth can landing page data help with bid management?

    Staying with bid management as an example, we think it works like this:

    Primary data: the algorithm looks back at historic direct interactions with a keyword within a client campaign, and makes a cost/position decision based on pre-defined goals like ROI or CTR, provided there is enough data.
    One way to address a possible data volume problem would be to look back a long way. But this would ignore seasonality, promotions and changes in consumer behaviors over time.
    Secondary data: the algorithm has insufficient data to make a ‘good’ decision on the primary basis, so it uses corroborative data (performance indicators from other campaigns with similar characteristics, e.g. same vertical, same language) to make decisions.
    Do we even have enough data?

    The question is whether, aside from very high-volume, big-category campaigns (think car insurance, credit cards), there is enough primary data to power effective AI decision-making. AI needs a huge amount of data to be effective. When IBM’s Deep Blue learned chess, for instance, the developers relied on 5 million data sets. Most industry experts believe that AI’s biggest limitation will be access to high-quality data of sufficient scale.

    We also have no idea what a ‘good’ volume of data looks like. Having enough data is even less likely for international PPC, where campaigns are often very granular, multi-language, and designed to include lots of long tail keywords (which by definition do not have much volume).

    When it comes to secondary data, how relevant can the corroborative data be? For maximum relevance, taking CLIENT X as an example, we’d have to assume that the algorithm is quickly assimilating data from CLIENT X’s direct competitors and using that to better inform the bid management strategy.

    Surely that kind of cross-fertilized data would power all auction players’ bid tactics, creating a loop where no player has an advantage?

    If competitor data is not used, then what kind of secondary data is sufficiently relevant to power good AI decisions? This would be easier to answer if we knew definitively how the rules of the algorithms were constructed, but of course, we never will.

    Time for a reality check

    To recap, if we knew that 10, 100 or even 1,000 interactions were enough to deliver superior efficiency via AI, we’d be delighted. Campaigns could be planned and executed to use the optimum blend of AI and human capabilities, with best results for ad platforms, agencies and clients. AI could focus on brand and category level interactions, with human oversight and detailed management of long tail.

    It seems unlikely that adequate transparency as to how AI actually works, how much data is needed, and how the ‘rules’ work will be forthcoming unless significant changes in business models or practices occur.

    Instead, AI is optimistically overhyped as digital’s next big thing, while blithely ignoring the basic premise of AI and the current practicalities of both domestic and international digital paid media.

    Laying the foundations of good SEO: the most important tasks (part 2)

    SEO is not easy to master. It keeps evolving, with new specifics and techniques added almost daily.

    However, it is imperative to lay the foundations of good SEO by accomplishing time-tested tasks that both beginner and advanced SEOs usually follow in their daily routines.

    In part one of this series, the author explained the most important SEO tasks related to SEO tools, keyword research, and on-site optimization.

    It is now time to tackle technical SEO, content, and off-page optimization.

    Technical SEO

    Technical SEO is vital to your site’s success. For example, one single error in your robots.txt file can prevent your site from being indexed.

    Though technical SEO covers a broad range of subjects concerning the elements needed for optimization, you should primarily prioritize the following areas:

    Crawl errors. If you keep receiving Crawl error reports, this means that Google cannot crawl your site’s URLs and, consequently, cannot rank it. Regularly check Crawl error notifications in the Google Search Console and be sure to fix them as soon as possible
    Google’s access to pages. Unfortunately, crawl error reports do not necessarily indicate that all unreported pages have been crawled and indexed properly. Sometimes Google is unable to access a page at all. Regularly check all of your pages in the Google Search Console to ensure they are visible to Google
    Broken links. Broken links are a big red flag to Google. Regular checks for broken links should become a vital part of your SEO routine. Fortunately, there is no shortage of tools to find and fix broken links (e.g. Netpeak Spider, Serpstat, Screaming Frog SEO Spider)
    HTTPS. HTTPS has been a ranking signal since 2014, so if you want to give your site an SEO boost, implement it. Additionally, it will provide your site with a layer of security, which your visitors will appreciate. Check out this guide to make sure you correctly migrate from HTTP to HTTPS
    Duplicate metatags. Google doesn’t appreciate duplicates of anything — be it content, URLs, or metatags. So access the Google Search Console and go to the HTML Improvements tab to find duplicates of title tags and meta descriptions, and then fix them
    Mobile-friendliness. Google has started to use the mobile version of a page for indexing and ranking, so making your site’s pages mobile-friendly is highly advised. If your site is not optimized for mobile, it can still rank nicely in search, but its chances of topping the SERPs will not be as high. So test your pages and make them mobile-friendly
    Loading speed. Page speed for both desktop and mobile is a key ranking factor. Make sure to check your website’s loading speed with the PageSpeed Insights tool, and then move forward by implementing any optimization suggestions it provides. Otherwise, you may risk appearing lower in the SERPs.
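    The broken-link check in the list above is easy to script. Here is a hedged sketch using only Python’s standard library; the URLs are placeholders, and the HTTP fetcher is injected as a parameter so the filtering logic can be demonstrated (and tested) without network access.

```python
from urllib.request import urlopen
from urllib.error import URLError, HTTPError

def default_fetch(url: str) -> int:
    """Return the HTTP status code for a URL (network access required)."""
    try:
        with urlopen(url, timeout=10) as resp:
            return resp.status
    except HTTPError as err:
        return err.code          # 404, 500, etc.
    except URLError:
        return 0                 # DNS failure, refused connection, etc.

def find_broken_links(urls, fetch=default_fetch):
    """Return the subset of `urls` that do not answer with HTTP 2xx/3xx."""
    return [url for url in urls if not 200 <= fetch(url) < 400]

# Offline demo with a fake fetcher standing in for real HTTP requests.
urls = ["https://example.com/", "https://example.com/old"]
fake_statuses = {"https://example.com/": 200, "https://example.com/old": 404}
print(find_broken_links(urls, fetch=fake_statuses.get))  # prints ['https://example.com/old']
```

    For a real site you would feed in the URL list extracted from your sitemap (or from a crawler like Screaming Frog) and call the function with the default fetcher.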

    Content

    Content is the fuel that feeds Google. Content is an important ranking factor, and sites that consistently craft high-quality content are more likely to have better rankings (but it is not a single decisive ranking factor).

    To succeed with content SEO-wise, you should prioritize the following areas:

    Duplicate content. Duplicate content is a big no-no. Check your site with Copyscape or Siteliner to find all pages that are similar or have content that is partially featured at a third-party website (i.e. plagiarized content). Otherwise, a Google penalty is inevitable
    Keyword use. Optimizing your content around core keywords (or a set of keywords) might be hard, but it is absolutely necessary to get your content seen by your target audiences. Figure out which core and support keywords to place on specific pages of your website, and track their performance regularly
    Content structure. It is hardly a secret that customers skim, rather than read content. For this reason it’s important to properly structure your content:
    Use shorter sentences
    Break down longer paragraphs
    Use subtitles and bulleted lists
    Include multimedia elements, such as images, videos, GIFs and audio files.
    This will make your content easier to digest and should keep visitors on a page for longer.

    Audience personas. Never start putting together a piece of content without having a clear picture of your audience persona in mind, including gender, age, occupation, responsibilities, challenges, and problems they need to solve. Additionally, be mindful of the stage in the buyers’ journey your audience may be at, and enhance your content accordingly
    In-depth, high-quality content. The importance of high-quality content cannot be overemphasized. If you cannot produce quality content, you will lose a considerable portion of your ranking potential in Google and any other search engine. Put out expert content that is supported by data and your own research, such as surveys, reviews, links or traffic analysis. Only high-quality content will matter to your readers, journalists, and eventually to search engines
    Schema markup. Want to help Google understand your content better? Use Schema markup. Though it is not a ranking factor per se, adding rich snippets may increase CTR and, accordingly, benefit your appearance in the SERPs. To do the heavy lifting, Google offers its Structured Data Testing tool
    Multimedia elements. The more images, videos, GIFs, Twitter embeds, and other visuals you use in your content, the better. They allow you to illustrate your point, keep users on the page, and help your site to rank better.
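    To illustrate the Schema markup point above, this is roughly what a JSON-LD snippet for an article looks like when placed in a page’s head. All values here are placeholder assumptions – the properties your pages need depend on the content type, so always validate the final markup with Google’s Structured Data Testing tool.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Laying the foundations of good SEO",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2018-08-01",
  "image": "https://example.com/images/cover.jpg"
}
</script>
```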
    Off-site optimization

    Links are the bread and butter of SEO. No matter how important other ranking signals may become, links will remain crucial to calculating a website’s ranking in the SERPs, since they are viewed as external citation authority.

    Off-site optimization and link building are not easy to master. To increase ranking, you need to attract high-quality backlinks from relevant, trustworthy resources – and links like these are not easy to come by.

    Here are a few practical steps you can take to start driving backlinks:

    Analysis of existing links. Before building new backlinks, look through existing ones. It will help you:
    Understand which sites link back to you
    See which pages attract backlinks and which do not
    Disavow backlinks that negatively impact your appearance in the SERPs because of their lack of relevance and trustworthiness
    Delete broken backlinks.
    To run the analysis, you can use Ahrefs, MajesticSEO, Netpeak Spider, or any other backlink analysis tool you may have access to

    Analysis of competitor links. One of the most efficient approaches to driving backlinks from your niche is to analyze your competitors’ backlinks and emulate their strategy. Your goal is to find which websites and pages are most linked to, and which of the linking sites drive the most traffic. After that, you need to drive backlinks from the top-performing resources
    Guest posting. Featuring your content in established media and on respected sites is one of the best long-term strategies for driving high-quality links and traffic to your website. Get invested in guest posting, and you may never run out of backlinks
    The problem with guest posting is that you need to become a contributor first, and only then will you get a coveted author box with a link to your website. To get on the radar of established media or industry thought leaders, you have to master outreach – featured posts, links, and mentions will follow (provided your guest posts are good enough)
    Directories and listings. Registering your business on directories and listings is the easiest way to improve your site’s positions in local search. All you need to do is identify top-tier business directories in your niche, fill out and optimize your directory accounts, and make sure that all information you submit is consistent across all directories, listings, and CDAs
    Google My Business. A claimed and properly optimized account at Google My Business (and Bing Places for Business) can make all the difference for your company. It will help you secure a spot in Google’s local three-pack, which means more local traffic and improved rankings
    Link-worthy content. Finally, it is worth noting that content can make or break your link building efforts, specifically when it comes to outreach and guest posting. You will not be able to create contributor accounts and garner backlinks if your content is repetitive and does not offer actionable advice to users. You need to stand out to succeed in the content department.

    In this article the author has shared perspectives on the most important SEO tasks with regard to technical SEO, content, and off-page optimization.

    These three areas of SEO knowledge are essential to master if you want to succeed in increasing your site’s SERPs, driving traffic, and attracting valuable leads. However, bear in mind that you need to set up and fine-tune your SEO tools, do your keyword research, and improve your on-page SEO first.

    How to minimize CPU usage in WordPress

    CPU usage problems are widespread on WordPress websites. They become more prevalent when you use shared resources or a hosting plan without many resources. However, high CPU usage can also occur on a good hosting plan. When that happens, it slows down your website considerably, as there are no resources left to serve your site’s content.

    CPU usage problems not only lead to poor user experience but can also quickly impact your website’s ranking.

    In this article, we will learn how to minimize CPU usage in WordPress. All the tricks that we will share are simple and easy to follow. However, you can always hire a developer to do it for you. So, without any delay, let’s get started.

    1. Get rid of the unnecessary plugins

    Plugins offer excellent value when it comes to adding features to a website, and any website requires a set of plugins to work correctly. However, we often install plugins that we don’t need. Sometimes we install plugins to test and then forget to remove them; in other cases, plugins’ features overlap to an extent. Either way, it is always good to remove those plugins and bring down CPU usage.

    Note: Always uninstall plugins according to the official guidelines. If you are not sure, refer to the documentation, as removing a plugin the wrong way can affect the site’s functionality undesirably.

    2. Configure WP Disable

    Another nifty way to reduce CPU usage is to use WP Disable. WP Disable is a plugin that lets you disable WordPress settings that consume unnecessary CPU cycles. For example, you can disable embeds, emojis and much more. Furthermore, it also helps you to reduce HTTP requests that can further improve the website performance. Once you install the plugin, you will get the option to do so with an easy to use interface.

    It also provides dashboard stats so that you can monitor important things right from the go. We recommend disabling things that you just don’t need. Remember, you can always enable them back by toggling on the option. Also, if you are not sure about an option and what it does, it is better to ignore it.

    3. Image optimization

    Another significant bottleneck for modern websites is poor image optimization. Let’s take an example to get a better understanding. When a page loads, it loads different elements, including images. With images, their metadata is also loaded, yet in most cases the metadata serves no purpose. So, you can remove metadata from images and make it easier for the CPU to process the page.

    You can use PNGGauntlet, a free tool, to do the job for you. Alternatively, you can use JPEGmini, a paid compressor that costs $19.99.

    Both of the above-recommended tools also let you compress images. Furthermore, you can use plugins such as WP Smush.it, EWWW Image Optimizer and more.

    4. Configure WordPress crawl rules

    Believe it or not, your website is crawled by a lot of crawlers, and not all of them are useful. Some are there to scrape your data for other uses. As a crawler crawls a site, it uses precious CPU cycles. To prevent this, you can block the unwanted ones and only let the important crawlers access your website.

    You can use the “crawl rate limiting rules” function in Wordfence to ensure that only useful bots crawl your site, and adjust crawler settings so that your website doesn’t slow down due to unnecessary CPU usage. You can also block IP addresses if you think someone is continually hammering your servers for no reason. This will improve the experience of legitimate visitors and enhance CPU performance.
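    The simplest, though weakest, line of defense is robots.txt: well-behaved crawlers honor it, but abusive scrapers often ignore it entirely, which is why server-level tools like Wordfence remain necessary. A sketch (the bot names here are purely illustrative):

```
# robots.txt -- example only; the blocked bot names are illustrative
User-agent: MJ12bot
Disallow: /

User-agent: AhrefsBot
Disallow: /

# Everyone else may crawl normally
User-agent: *
Disallow:
```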

    5. Limit Google and Bing crawl rates

    Having popular search engines crawl your website is critical. However, they are resource-hungry, especially Google’s crawler. The good news is that you can limit the site crawl rate for both Google and Bing without affecting your Google rankings. According to Google, you can customize the crawl rate according to your preference. For example, a news website should have a higher crawl rate than a site that publishes once a day.

    To do so, go to the Google Search Console and search for “site settings”. There you can set the crawl rate to low. Bing offers similar crawl rate control: go to Bing Webmaster Tools and change it in the “Crawl Control” settings.

    6. Database cleaning

    Another good way to improve CPU performance is to clean your database. If a database is bloated with unnecessary information, it takes more CPU time to process even a simple query. Now, imagine how much a bloated database can hurt performance if your website runs multiple queries per second.

    With regular queries, your database stores a lot of data that is not required for proper functioning of the website. Some of the examples include post revisions, trash, transients, and so on. Also, plugin data is stored even after a plugin is uninstalled. This can lead to a bloated database.

    The best approach is to use optimization plugins such as WP-Optimize and WP Rocket regularly. You can use both the plugins to automate the cleaning.
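    For the curious, the queries such plugins run under the hood look roughly like the following. This is an illustrative sketch only – it assumes the default wp_ table prefix, and you should always back up your database before running anything like it.

```sql
-- Illustrative WordPress cleanup queries (default wp_ prefix assumed).
-- Back up the database before running anything like this.

-- Delete old post revisions
DELETE FROM wp_posts WHERE post_type = 'revision';

-- Empty the trash
DELETE FROM wp_posts WHERE post_status = 'trash';

-- Remove transients left in the options table
-- (backslashes escape the _ wildcard in MySQL LIKE)
DELETE FROM wp_options WHERE option_name LIKE '%\_transient\_%';
```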

    7. Avoid high CPU usage plugins

    A lot of plugins can have a massive impact on site performance, and these should be avoided if you want to keep your CPU usage under control. Some of the plugins that can slow your website or eat precious CPU cycles are Jetpack, SumoMe, Visual Composer, and so on. To learn the impact of a plugin on your site, you can install it and then use GTmetrix to check its effect. If it is too much, it is better to avoid it and find an alternative.

    If you are not sure which plugins are CPU intensive, you can use free hosting to try them out. There are various hosting providers, like 000webhost, that offer free hosting for beginners. You can also use localhost and check the CPU usage there. If everything looks good, you can then install the plugin on your live website.


    These seven tips will surely help you minimize CPU usage, and you can also use cache plugins to improve it further. There are more techniques than we were able to cover here, but the above points will get you a decent improvement.

    Google’s speed update now applies for all users – how does it affect you?

    As part of an ongoing user experience crusade that has also included the introduction of AMP, Google has now rolled out its speed update to all users.

    Improving the speed of your website is something you should invest in regardless of SEO. Google’s own data shows that a three-second load time increases bounce rates by 32% (compared to a one-second load time), and a five-second load time can increase bounce rates by 90% – and don’t forget, every bounce represents a potential customer lost.

    Source: Google/SOASTA Research, 2017.

    Bounce rate can also affect organic search rankings in itself

    If Google sees that most of the visitors to a page are going straight back to the search results then it doesn’t consider that page to be giving a good user experience.

    Google sees a bounce as negative to its own user experience, and in 2010 it drew a clear line in the sand for site owners by introducing site speed as a direct ranking factor. This was a desktop-only search update; at the time, Google stated that only 1% of search queries would be affected, and only for searches in English on Google.com.

    At the start of this year, Google introduced site speed as a mobile ranking factor in a limited roll-out, and earlier in July they announced that the update would now apply to all users.

    Google once again stated that the update would only affect a small percentage of queries, but gave no accurate indication of percentages. It’s typically vague and frustrating guidance from Google that creates more questions than answers – but here’s some useful advice to see how Google’s speed update affects you.

    The speed update will only affect the slowest pages

    Or to be more precise, “it will only affect pages that deliver the slowest experience to users”.

    This update is not an opportunity to improve the rankings of pages that are already well optimised for speed. You are unlikely to gain any extra visibility by shaving a couple of hundred milliseconds off your load times.

    It also means that you should be very concerned if you do have pages with poor load times – especially if they have historically been receiving significant amounts of organic traffic.

    But how slow is slow?

    Google’s post on the speed update doesn’t explicitly state what they consider slow to be, but their testing tools may offer some clues.

    PageSpeed Insights considers a page load to be slow if it takes 3 seconds or more to deliver the First Contentful Paint or more than 4.2 seconds to deliver the DOMContentLoaded event.

    I’d recommend treating this as a red line to avoid.

    More red than green should be a cause for concern

    The tool also handily tells you approximately where you rank compared to other pages – if it tells you that your page is in the bottom third then you should do something about it.

    PageSpeed Insights is a great tool but it only allows you to check one page at a time and possibly as a result, some digital marketers make the mistake of only checking their homepage – all of your pages need to be up-to-speed if you want them to rank.

    There are third-party tools available, like Pingdom and GTmetrix, that make checking and monitoring the speed of multiple pages easier – but for a price.

    If you don’t want to use a paid-for tool, Google Analytics is also very useful. If you use it, then you’re probably already aware of the Site Speed tools it offers. Analytics makes it easy to see which pages are underperforming and also helps you identify important trends as to whether things are getting better or worse – rather than seeing a snapshot.

    One thing to be aware of with Google Analytics is that by default it shows the total page load time – i.e. the time from clicking a link to the page having finished loading entirely – and while this can give you some good insights it’s not the metric that I find to be most useful.

    For the more interesting page speed metrics you need to go to the DOM Timings report within GA’s site speed menu, where you can find the average document content loaded time and average document interactive time metrics. These metrics tell you more about when content becomes available to the user – and to put it bluntly – when the page becomes useful. Users are less likely to bounce if they see progress.

    Once you have found pages that you are concerned about, I would recommend testing them using PageSpeed Insights as actually it tells you what you need to do to speed the page up.

    PageSpeed Insights offers useful advice

    Avoiding slow pages

    So, we’ve done the easy part and identified pages that we think might fall afoul of the speed update. What now?

    Often, the easiest and biggest win is optimising images – and if you are a digital marketer who doesn’t know too much about coding, then this is something you can easily do yourself without having to take up valuable developer time.

    There is plenty that can be done – and in my experience plenty that gets neglected. One of the biggest challenges for many digital marketers – especially those on the agency side – can be getting buy-in to use developer time.

    Don’t make the mistake of just sending a list of recommendations copied and pasted from Google to your developers and expecting the changes to be made quickly. One tactic I’ve found very effective is to create a concise one-page document explaining what the problem is, what the potential impact is, and a brief for implementation. This helps to separate the issues out and stop them from getting lost in the day-to-day.

    Don’t forget to keep making great content

    Another useful insight that Google gives us about the speed update is “The intent of the search query is still a very strong signal, so a slow page may still rank highly if it has great, relevant content”.

    Despite the importance of speed, I don’t think it should take precedence over ensuring that your site’s content strategy is strong. Rubbish content that loads quickly still won’t rank.

    Play it safe though – if you have strong content that already performs well and your site speed is lagging, treat it as the threat that it is and sort it out.

    Factors that influence your website’s credibility

    Your website is the digital portrayal of your business. Visitors go through it to get a sense of how your business functions and even of its reputation.

    One of the most decisive factors behind your brand gaining business through its website is the credibility it holds. Potential customers will only rely on your business website once they trust it and the people behind it. If you run an ecommerce website or one that deals with customers’ sensitive information, you will have to put in extra effort to gain customers’ trust.

    As per the Stanford Web Credibility Research, websites become more credible by being useful and easy to access. A website’s usefulness is marked by its features, functionality, and UI, while its ease of use is determined by the implemented web design. Hence, a lot of factors together contribute to enhancing your site’s trust score.

    If you are clueless and would like to make your website more credible, here is a list of the factors that immensely influence your website’s credibility.

    Search engines’ perception of your website

    Search engines employ their algorithms and bots to assess the performance of your website and rank it accordingly. If you are doing things the right way – i.e. white-hat SEO, following Google’s algorithm updates and content policies – your website keeps moving up the search engine ranking ladder. That makes a great first impression on your website visitors.

    With tools like the Alexa Traffic Rank tool, the online audience has access to your site’s global and location-based ranking. Your site’s search engine ranking can heavily impact its credibility, so make sure you put effort in that direction.

    Social proof of your business

    Establishing social proof for your business website is crucial to its credibility. You need to put up links to all of your business’s social media pages on your website. Apart from that, a list of the clients and brands that have been associated with your business can be impactful.

    Client/customer testimonials and reviews

    Regardless of the nature of your business operations, i.e. whether it deals in product sales or services, you will always have customer reviews and testimonials coming in. Based on the customer experience granted by the business, these customers will either have a positive or a negative opinion of your business.

    You can, however, put up positive client/customer testimonials or reviews for your audience to see. But refrain from putting up fake testimonials, because the internet audience is smart enough to spot them. To make these testimonials and reviews more reliable, you can link them to the social profiles of the related client/customer, if permitted.

    The presence of adverts on the website

    Many blogging websites choose to put up targeted advertisements in order to make ad money. As beneficial as this might seem to website owners, these ads irritate the audience, who have to deal with them popping up every now and then.

    To be honest, ads make your website look less credible. Since it takes only 50 milliseconds for users to form a first impression of a website, you wouldn’t want them to see these ads, at least on your homepage.

    An updated blog

    If you are starting a career as a blogger and have just set up a website, you will obviously need your audience’s trust to help your blog grow. And if you are not a blogger but a mainstream business, you would still need a blog, because it will eventually build your site’s credibility.

    So, an updated blog – one that posts fresh content regularly and engages with the comments made on it – is seen as more credible by the audience.

    Consistent website updates

    If your website was set up in 2005 and has managed to look the same ever since, you have successfully killed its purpose and probably its audience engagement as well.

    It is very important to keep updating the website’s content, because it sends the message that your business is moving ahead and growing. Without any updates, the site’s audience is free to conclude that your business simply doesn’t care.

    Accessible contact information

    If your business or brand isn’t accessible to its audience, people will suspect something is fishy. It is very important for your website to make it easy for your audience to contact you. Hence, to build credibility, put your phone number, physical address, and an email address on the website.

    Even if your website is accessible strictly on a membership basis, it should make the contact information public for all the audience to see.

    A great web design that encourages seamless navigation

    A study mentioned that 94% of the negative website feedback was design related. If your website design is such that the visitors are having a hard time navigating through its pages or if they get lost while browsing through different sections, your business is in for a loss.

    On the other hand, if your audience is able to figure out the navigation and is able to quickly get to the part they are looking for, your site’s trust meter will go up.

    A fast loading website

    The truth is, if your website takes more than 2-3 seconds to load, visitors will be swift to abandon it. If they do so, they will never have the opportunity to go through the other content which would mean that you will miss out on those visitors who would have otherwise contributed to your site’s credibility score.

    Spelling and grammar

    My personal favorite on the cringe-o-meter: a website with bad grammar and misspellings. A website with these flaws is a big-time blunder. I personally stay away from such websites because they don’t appear trustworthy; the owner simply doesn’t seem to care.

    If you are not a language expert, rely on professional services to get your content crafted. Hiring an expert content writer will eliminate glaring mistakes and help you create content that offers real value.

    Detailed product information

    If you run an ecommerce website, put up detailed information about the products you showcase: physical attributes, usage, variants, customer reviews, and images, so that customers find it easy to make a choice. When customers don’t have to turn to external resources for information about the products listed on your site, your site looks more credible.

    Trust seals and website security certificates

    To build your site’s credibility, consider getting trust seals: third-party badges that are highly trusted by online audiences. Intended to display the trust score or sales counter of a particular business, these seals are vouched for by third-party internet security organizations. Having an SSL certificate for your website is also essential if you want to come across as a credible online business.

    It is never a bad idea to invest in trust seals and SSL certificates for your websites.
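    If you want to verify the SSL side yourself, Python’s standard `ssl` module can fetch a site’s certificate and report how long it remains valid. A minimal sketch (the hostname in the commented example is just a placeholder for your own domain):

    ```python
    import socket
    import ssl
    import time

    def fetch_cert(host: str, port: int = 443) -> dict:
        """Connect over TLS and return the server's validated certificate."""
        ctx = ssl.create_default_context()
        with socket.create_connection((host, port), timeout=10) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                return tls.getpeercert()

    def days_until_expiry(cert: dict) -> float:
        """Days remaining before the certificate's notAfter date."""
        expires = ssl.cert_time_to_seconds(cert["notAfter"])
        return (expires - time.time()) / 86400

    # Live check (requires internet), e.g.:
    # cert = fetch_cert("example.com")
    # print(f"Certificate valid for another {days_until_expiry(cert):.0f} days")

    # Offline demo with a synthetic certificate dict:
    sample_cert = {"notAfter": "Jan  5 09:34:43 2030 GMT"}
    print(f"{days_until_expiry(sample_cert):.0f} days of validity left")
    ```

    A scheduled check like this can warn you before an expired certificate starts showing browser security warnings to your visitors.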

    Team members’ bios and photos with their social profiles

    If your business relies on its website to generate leads, you should share information about the team members, the people behind your business.

    Putting up bios, professional-looking photographs, and links to social profiles is a good way to make your business website look more credible.

    Errors and links

    Broken links and pages that return an error message every time a visitor clicks them are a deadly combination: they kill your site’s user experience and, in turn, make your site appear less reliable. Find and fix them before your site visitors come across them.