How advertisers can score big on the next device launch

On the cusp of the latest iPhone release, columnist Purna Virji from Bing offers insights to help advertisers cash in on new device launches.

We’ve all seen the news headlines. They queue, they camp, they chant. And sometimes, they even stampede. Just to get their hands on a new device.

And while those are the more extreme cases, it’s clear that consumers love technology and are ready to cast off last year’s phone for one with a better battery, more pixels, and a blush-gold case.

But of course, the decision journey begins long before launch day, creating a significant opportunity for retail advertisers – as long as you get in on the action at the right time.

What makes this new device season even more interesting is the rise in opportunity for brands outside of Apple.

iPhone users are becoming less loyal to Apple over time, according to research by UBS analysts Steven Milunovich and Benjamin Wilson.

The UBS data is particularly surprising because it shows Apple’s customer retention rates heading toward parity with those of Android phones, including Samsung’s (which suffered last quarter due to the recall of its flagship device, the Galaxy Note 7).

How can iPhone retailers win back sales? And how can other brands benefit from this launch season? Here’s your optimization plan based on research compiled by Bing Ads (disclosure: my employer) into the customer journey that surrounds new device launches.

Before the announcement

Bing Ads research shows that 33 percent of users start their device journeys well before the official announcement – around 90 days before the official release date – with phone searches peaking on the announcement day.

What does that mean for you?

Pre-launch announcement, your advertising campaign should focus on providing and linking to helpful, informative content and reviews that detail the features and benefits of the device or device plans.

Research and awareness are the name of the game, and your focus right now should be on building the all-critical top-of-mind awareness.

After the announcement

On the day of the announcement, Bing Ads data shows that searching peaks, with about 32 percent of searchers starting their research that very day.

Looking at last year’s data, we saw that the announcements impacted behavior across all demographics, with most searches coming from the 35-49 age range and women searching far more online than men.

(Chart: Microsoft internal data – search volume in selected categories related to the iPhone launch; all devices, U.S., September 1 – October 31, 2016.)

1. Be sure to layer on demographic bidding on top of your existing bid modifiers.

2. Your next order of business is to make it easy for shoppers to pre-order, upgrade, trade-in, and find accessories for their new device. Key tips:

  • Bid on non-brand and competitor brand keywords to capture switchers.
  • Update keywords and bids periodically as new phone information is revealed to capture incremental traffic volume.
  • Use Sitelink Extensions that point to different phone and plan options.
  • Best of all, for wireless carriers that subsidize the price of phones with contracts, Bing Shopping Campaigns will now accept $0 price products for mobile and tablet devices that are paid for in installments or as part of a contract.

3. You’ll also want to give searchers the ability to comparison shop. Bing Ads research shows that device shoppers like to compare new-to-new, new-to-old, and old-to-old models as well as brand-to-brand. As a frame of reference, last year’s top 10 iPhone-related comparison queries were (a short sketch after the list shows how to expand these combinations programmatically):

  • iphone 7 vs iphone 7 plus
  • iphone 7 vs iphone 6
  • iphone 6 vs 6s
  • iphone 6 vs iphone 7
  • iphone 7 vs 7 plus
  • iphone 6s vs iphone 7
  • iphone 7 vs galaxy s7
  • iphone 7 vs iphone 6s
  • iphone 7 vs samsung s7
  • iphone se vs iphone 6s
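
These comparison terms follow a predictable pattern, so it can help to generate candidate keyword lists programmatically and then vet them against real query data. Below is a rough, illustrative TypeScript sketch – the model list is a placeholder, not Bing Ads data:

```typescript
// Illustrative only: expand a list of device names into "X vs Y" comparison
// keywords for keyword research. Swap in the models relevant to your campaign.
const models = ["iphone 7", "iphone 7 plus", "iphone 6s", "galaxy s7"];

function comparisonQueries(devices: string[]): string[] {
  const queries: string[] = [];
  for (let i = 0; i < devices.length; i++) {
    for (let j = 0; j < devices.length; j++) {
      if (i !== j) {
        queries.push(`${devices[i]} vs ${devices[j]}`);
      }
    }
  }
  return queries;
}

console.log(comparisonQueries(models));
// e.g. "iphone 7 vs iphone 7 plus", "iphone 6s vs galaxy s7", ...
```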

When the device goes on sale

Devices typically go on sale a week or so after pre-order begins. Once that happens, shoppers are usually ready to buy, so you’ll want to make it easy for them to purchase online (or in-store).

  • Test out shorter ads with clear calls to action that take shoppers directly to the item for purchase.
  • Be sure to run Local Inventory Ads as part of your Shopping Campaigns to make it easier for nearby shoppers to find local store information.
  • When you’re setting up your campaign, be sure to set bids in anticipation of peaks when the device announcement hits; last year’s launch data shows just how sharply CPCs and CTRs were affected.
  • To drive foot traffic for in-store purchases, be sure to use location extensions and targeting.

When pre-orders ship

    Once the device starts to ship, you’ll want to continue advertising the device, plans, and accessories.

    You’ll also want to have budgets ready in anticipation of competing device launches. For example, the research shows that iPhone searches spiked when the Google Pixel was announced.

Continue to test ad copy and image variations. And be sure to go above and beyond to populate your feed with as many recommended attributes as possible for each product offer.

    In summary, use the data from last year to help shape your strategy most effectively this season. For more information, the full Bing Ads insights deck includes a handy new device launch checklist and many other valuable tips and insights.

    Baidu SEO: How to optimize for China’s biggest search engine

Speak to pretty much anyone about SEO and the conversation will largely revolve around the “Big G”.

    It makes sense. Google has a worryingly large share of the search engine market, especially in Europe and the US, so why would SEOs spend additional time trying to capture traffic from the lesser-used search engines?

However, unless you’ve been a conspiracy-theory-style recluse for the last two decades, you will have noticed that China is a country with a rather large number of people, and that comes with new opportunities.

The Chinese market represents significant opportunity across the board. Admittedly, the opportunities within ultra-high-growth manufacturing businesses may not be what they were a decade ago, but for those willing to make the jump, there is the chance to tap into one of the world’s largest economies.

The problem? The Chinese search market is one of the only places in the world where Google is not king; they’re not even the heir apparent (Google almost totally exited China a number of years ago).

    Baidu rules the roost in China, so if you want to tap into the Chinese search market, you best get acquainted.

    First of all: Get used to the differences

China is a very different country from those in ‘the West’, to the point that a number of businesses specialize in helping companies bridge the gap between the regions.

Heavy state censorship is but one of the not-so-subtle differences. Much like doing business with, or living in, a country with different rules and culture, your best tactic is to accept the circumstances and adapt. For instance, Baidu’s heavy-handed inclusion of its own sub-products within SERPs would potentially fall foul of competition laws in the EU, but not in China.

    New laws in China have significantly reduced the amount of ads in Baidu’s SERPs, but there are still quite a few. Just take it for what it is and, in that most annoying recent British export: Carry on.

    The good news: there are similarities

I’ll put my hands up: I put off looking into Baidu SEO for longer than I care to admit. My assumption (assumption being the mother of all f**k ups) was that I would literally have to learn my craft all over again. Nothing would be the same.

    How wrong I was. Thinking about it, I don’t know why I thought everything would be different. Binary hasn’t changed; yes, there are differences in coding languages and website platforms, but why would Baidu reinvent the wheel? Google arguably didn’t reinvent the wheel; they just added some shiny alloy rims to it.

As such, your SEO 101 fundamentals – metadata, information architecture, canonical URLs – are all still relevant. Baidu may apply different weightings and slightly different rules to some aspects (meta descriptions are taken into account, for instance), but at least the fundamentals are similar.

    There are differences, so as a starter pack of sorts we have included some items for consideration when looking to conduct SEO for Baidu:

    Translator/native speaker

This is absolutely critical. Without a professional translator or a native speaker on your SEO team, you are going to struggle.

    As with Google, using an automated translation tool will result in content that is understandable, but there will inevitably be holes in it. Search engines (including Baidu) care about content and the quality of said content, so you are going to need someone who can create content to the required standard.

Furthermore, Baidu’s Webmaster Tools are not available in English, so hopefully it is obvious why you need a native speaker!

    Mobile

    The mobile trend does not stop with the public’s use of mobile devices, or Google’s mobile first indexing and accelerated mobile pages (AMP). In fact, mobile is even more important in China, where owning a desktop or even a laptop was never really commonplace; the mobile is most people’s first and only portal to the online world.

    As such, Baidu care deeply about mobile. They have their own version of AMP (called Mobile Instant Page – MIP) and you can bet that much like Google’s mobile first indexing, Baidu will continue to bake mobile deeper and deeper into their algorithm.

    Load speed and display on mobile devices will be crucial both now and into the future for ranking on Baidu, so make this a priority for your website.

    Simplified characters

Whilst Baidu will index traditional Chinese characters, the search engine favors simplified Chinese characters. Don’t make Baidu work harder than it has to: use the simplified version.
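
If you want a quick sanity check that your copy really is in simplified characters, even a tiny hand-picked mapping can flag obvious traditional forms. The sketch below is a toy illustration only – a real workflow should use a full conversion library and, as noted above, a native speaker’s review:

```typescript
// Toy example: flag a few common traditional characters and suggest the
// simplified form. Not a substitute for a proper conversion library.
const traditionalToSimplified: Record<string, string> = {
  "國": "国", // country
  "學": "学", // study
  "體": "体", // body
  "發": "发", // issue/develop
};

function flagTraditional(text: string): string[] {
  return [...text]
    .filter((ch) => ch in traditionalToSimplified)
    .map((ch) => `${ch} -> ${traditionalToSimplified[ch]}`);
}

console.log(flagTraditional("中國的學生")); // ["國 -> 国", "學 -> 学"]
```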

    HTTPS

    In 2015 Baidu announced that HTTPS would be included as a ranking signal, and looking at the updates from Baidu in the latter half of 2016, there is a definite focus on security, especially mobile security. Look up the IceBucket and Skynet updates from Baidu in reference to their focus on mobile security.
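
A quick way to spot-check the basics is to confirm that your plain-HTTP URLs end up on HTTPS after redirects. This sketch assumes a runtime with a global fetch (Node 18+ or a browser) and uses example.com as a stand-in for your own domain:

```typescript
// Sketch: check whether a plain-HTTP URL resolves to HTTPS after redirects.
async function resolvesToHttps(url: string): Promise<boolean> {
  const response = await fetch(url, { redirect: "follow" });
  return response.url.startsWith("https://");
}

resolvesToHttps("http://www.example.com/")
  .then((ok) => console.log(ok ? "Serving over HTTPS" : "Still on HTTP"))
  .catch((err) => console.error("Request failed:", err));
```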

    Inbound links

    A few years ago it was generally accepted that Baidu was some way behind Google in terms of their ability to decipher link signals. Think pre-Penguin Update style link building tactics: forget about quality, more is better (not that those tactics were ever justifiable in the long term).

    This is not the case in 2017. Baidu are clearly upping their game when it comes to link metrics. Again, look at the fantastically named ‘Green Radish’ update as an example; it is reminiscent of the Penguin update, targeting spammy link building practices.

    It stands to reason that Baidu will continue to observe Google’s updates and learn from them, subsequently implementing their own updates that focus on preventing such manipulation.

    Do not be afraid

This may be somewhat controversial, given that there are definite differences in tactics for SEO teams targeting Google or Baidu. However, both search engines appear to be on a similar trajectory. In the end, they both offer the same service: providing the searcher with the most valuable and relevant result for their search term.

    Yes, Google and Baidu may have different strategies when it comes to monetizing their traffic, but they still want their users to keep coming back. Both Google and Baidu are constantly improving their ability to highlight the best quality content, alongside an ever quickening shift towards mobile devices as the most critical priority.

As mentioned previously, I completely understand the reluctance of ‘Google’ SEOs to embark on the learning curve required for Baidu – and regardless of the similarities, there absolutely will be one.

    Even though there is considerable overlap, you will have to get to grips with the prominence attached to certain elements by Baidu and therefore assess where your time is best spent. Don’t forget that you’ll need a native speaker as well!

    What is clear, though, is that Baidu’s development has been significant in recent years, providing a platform that is focused on very similar core principles to Google. If you keep these core principles in mind, rather than looking to take advantage of potential gaps in Baidu’s algorithm, you will have a far more sustainable and long term Baidu SEO strategy.

    How to make sure your dealer locator tools are optimized for local SEO

    Brands with many distributor locations across the country use dealer locator tools to provide consumers information on how to find their products at local stores.

But more often than not, these dealer locators are missing the local SEO optimizations needed to make their distributor locations appear prominently on search results pages. These brands are missing out on a huge opportunity to drive more visitors to their website, generate more retail traffic at these physical locations, and capture additional revenue as a result.

    Here are a few simple ways to optimize your dealer locator for local search, to capitalize on this opportunity and quit leaving local money on the table.

    Create dealer pages with search-friendly navigation

Why do you need to worry about targeting individual dealer locations when you’re a big brand?

    • Local branding – Consumers prefer to patronize brands the old-fashioned way: by choosing local locations and stores they know and trust.
    • Effective, geo-targeted content – In order for Google to index and deliver your content to your intended audience, you need to make it locally relevant.
    • No penalties involved – If executed correctly, local SEO for companies with multiple store locations will help you avoid link networking penalties.

Create location landing pages for each of your dealer locations with an organized linking hierarchy based on city, region, or state. It’s best practice to have fewer than 100 links on a single page, but you’ll probably want to pare back the internal linking even further to improve the user experience on these location pages.

    If you have more than 100 distributor locations, start out at the state level. The state will link to participating cities, which then provide links to each individual location within that city. Make sure that the links in your navigation structure are search engine crawler-friendly; the shorter, the better.
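
As an illustration of that state → city → location hierarchy (with hypothetical field names, since every dealer feed looks different), the grouping step might look something like this:

```typescript
// Illustration only: group a flat list of dealer locations into a
// state -> city -> stores hierarchy for locator landing pages.
interface DealerLocation {
  name: string;
  city: string;
  state: string;
  url: string;
}

type LocatorTree = Record<string, Record<string, DealerLocation[]>>;

function buildLocatorTree(locations: DealerLocation[]): LocatorTree {
  const tree: LocatorTree = {};
  for (const loc of locations) {
    tree[loc.state] ??= {};
    tree[loc.state][loc.city] ??= [];
    tree[loc.state][loc.city].push(loc);
  }
  return tree;
}

// Each state page links to its city pages, and each city page links only to its
// own stores, keeping the number of links on any single page manageable.
```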

Example: See’s Candies’ locator breadcrumb structure.

    Give consumers meaty local info: NAP & map

    For each of these location pages, provide relevant information about the dealer’s store or service location, including address, phone number and hours of operation. Make sure you include a map!

    Make each store’s profile page as unique as possible. For example, you can mention nearby landmarks, which helps the search engines understand your relevance to the local area. Include photos of each location, if possible.
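
One common way to make NAP details explicit to search engines is LocalBusiness structured data. Here is a minimal, hypothetical sketch – the values are placeholders, and the output would be embedded in the page as JSON-LD:

```typescript
// Minimal LocalBusiness structured data for a single dealer page.
// All values are placeholders; emit the output in a
// <script type="application/ld+json"> tag on the location page.
const dealerSchema = {
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  name: "Dad's Tools Hardware Store",
  telephone: "+1-555-555-0100",
  address: {
    "@type": "PostalAddress",
    streetAddress: "123 Example Ave",
    addressLocality: "San Diego",
    addressRegion: "CA",
    postalCode: "92101",
  },
  openingHours: "Mo-Sa 09:00-18:00",
};

console.log(JSON.stringify(dealerSchema, null, 2));
```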

    Optimize your dealer location page titles for people & crawlers

    Draw from your search query data within Search Console to better understand how users search for your business, and use that insight to craft your dealer location page titles.

    Use your brand name, the city, and business type within the title. For instance, “Dad’s Tools Hardware Store in San Diego, CA.”
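
If your dealer pages are generated from a data feed, a simple template keeps those titles consistent. A minimal sketch using the example above (field names are placeholders):

```typescript
// Simple title template for generated dealer pages (fields are placeholders).
function dealerPageTitle(brand: string, businessType: string, city: string, state: string): string {
  return `${brand} ${businessType} in ${city}, ${state}`;
}

console.log(dealerPageTitle("Dad's Tools", "Hardware Store", "San Diego", "CA"));
// "Dad's Tools Hardware Store in San Diego, CA"
```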

    Help Google associate your dealer locations in maps to your landing pages

Finally, using Google’s Business Location Bulk Upload tool, you can deliver your dealer locator information directly to Google Maps. This allows Google Maps to associate each of your locations with a unique landing page.

    Use speciality pages for visibility across more relevant queries

    How do you reach people who don’t already know your brand? Good local SEO takes advantage of unbranded keywords and demonstrates your relevance for queries related to certain products and services at the local level.

    For example, imagine you’re a pet retailer trying to reach pet owners actively seeking out services and items tailored to their pet’s needs, but not specifically for your store. These people might be searching for “dog grooming,” or “pet vaccinations near me.”

    Creating and optimizing speciality pages for services within each of your brand’s locations allows you to reach those potential customers while they’re actively looking for local services.

    Just how important are speciality pages? A national pet retailer and service provider used hyperlocal speciality pages to promote their dog training, grooming, vaccinations and aquatics services. These pages drove an additional 1.4 million site visits and helped the retailer achieve a 42% lift in search volume year-over-year.

    Optimizing all of your store locations can be time consuming, but has major benefits. Using a store location tool automates the process of creating these local listings to save you time, reduce errors, and help each location rank prominently when searchers are looking for businesses like yours.

    Google Search Console: What the latest updates mean for marketers


    Google Search Console has long been a go-to platform for SEOs on a daily basis.

    It provides invaluable insight into how people are finding our websites, but also allows us to monitor and resolve any issues Google is having in accessing our content.

    Originally known as Google Webmaster Tools, Search Console has benefited from some significant upgrades over the past decade. That said, it is still far from perfect and few would argue that it provides a complete package in its current guise. A raft of industry updates, particularly those affecting mobile rankings, has left Search Console’s list of features in need of an overhaul.

    Therefore, Google’s recent announcement of some ongoing and upcoming changes to the platform was very warmly received by the SEO community. These changes go beyond the cosmetic and should help site owners both identify and rectify issues that are affecting their performance. There have also been some tantalizing glimpses of exciting features that may debut before the end of the year.

    So, what has changed?

    Google categorizes the initial Search Console changes into the following groups: Insights, Workflow, and Feedback Loops.

    Within the Insights category, Google’s new feature aims to identify common “root-cause” issues that are hampering the crawling and indexation of pages on a website. These will then be consolidated into tasks, allowing users to monitor progress and see whether any fixes they submit have been recognized by Google.

    This should be hugely beneficial for site owners and developers as it will accelerate their progress in fixing the big ticket items in the platform.

    On a broader level, this is in line with Google’s drive to use machine learning technologies to automate some laborious tasks and streamline the amount of time people need to spend to get the most out of their products.

    The second area of development is Organizational Workflow which, although not the most glamorous part of an SEO’s work, should bring some benefits that make all of our lives a little easier.

    As part of the Search Console update, users will now be able to share ticket items with various team members within the platform. Given how many people are typically involved in identifying and rectifying technical SEO issues, often based in different teams or even territories, this change should have a direct and positive impact on SEO work streams.

    Historically, these workflows have existed in other software packages in parallel to what occurs directly within Search Console, so bringing everything within the platform is a logical progression.

    The third announcement pertains to Feedback Loops and aims to tackle a longstanding frustration with Search Console. It can be difficult to get everyone on board with making technical fixes, but the time lag we experience in verifying whether the change was effective makes this all the more difficult. If the change does not work, it takes days to realize this and we have to go back to the drawing board.

    This lag is caused by the fact that Google has historically needed to re-crawl a site before any updates to the source code are taken into account. Though this will remain true in terms of affecting performance, site owners will at least be able to see an instant preview of whether their changes will work or not.

    Feedback is also provided on the proposed code changes, so developers can iterate very quickly and adjust the details until the issue is resolved.

    All of the above upgrades will help bring SEO to the center of business discussions and allow teams to work together quickly to improve organic search performance.

    In addition to these confirmed changes, Google has also announced some interesting BETA features that will be rolled out to a wider audience if they are received positively.

    New BETA features

    Google has announced two features that will be tested within a small set of users: Index Coverage report and AMP fixing flow.

Google has also previewed how the Index Coverage report will look, demonstrating its dedication to providing a more intuitive interface in Search Console.


    As Google summarized in their announcement of this new report:

    “The new Index Coverage report shows the count of indexed pages, information about why some pages could not be indexed, along with example pages and tips on how to fix indexing issues. It also enables a simple sitemap submission flow, and the capability to filter all Index Coverage data to any of the submitted sitemaps.”

Once more, we see the objective of going beyond simply displaying information to a deeper level that explains why these issues occur. The final, most challenging step is to automate the prescription of advice to resolve them.

    Other platforms have stepped into this arena in the past, with mixed success. SEO is dependent on so many other contingent factors that hard and fast rules tend not to be applicable in most circumstances. Automated advice can therefore either be too vague to be of any direct use, or it can provide specific advice that is inapplicable to the site in question.

    Technical SEO is more receptive to black and white rules than other industry disciplines, however, so there is cause for optimism with this new Google update.

    The second BETA feature is the AMP fixing flow. AMP (Accelerated Mobile Pages) is Google’s open source initiative to improve mobile page loading speeds by using a stripped-back version of HTML code.

With the weight of one of the world’s biggest companies behind it, AMP has taken hold in an increasing number of industries and looks set to widen its reach soon across both ecommerce and news publishing.

    Google has bet on AMP to see off threats from the likes of Facebook and Snapchat, so it stands to reason that they want to help webmasters get the most out of its features. Any new coding initiative will bring with it a new set of challenges too, and some developers will find a few kinks as they translate their content to AMP HTML.

The AMP fixing flow will allow users to identify and tackle any anomalies in their code, before receiving instant verification from Google on whether the proposed fix is sufficient.

    What’s next?

The one aspect of Search Console that all marketers would love to see upgraded is the lag in data processing time. As it stands, the data is typically 48 hours behind, leading to some agonizing waits as marketers hope to analyze performance at the search query level. Compared with the real-time data in many other platforms, including Google Analytics and AdWords, Search Console takes around two days to gather and process its data.

    That may change someday, however. As reported on SE Roundtable, Google’s John Mueller has stated that they are investigating ways to speed up the data processing. Although Mueller added, “Across the board, we probably at least have a one-day delay in there to make sure that we can process all of the data on time”, this still hints at a very positive development for SEO.

    With so many changes focused on speed and efficiency, a significant decrease in the data lag time on Search Console would cap this round of upgrades off very nicely.

    How to optimize featured snippets for voice search

    SEO is an exceptionally fast-paced industry, and sometimes keeping on top of the latest updates and imminent changes can be a full-time job in itself.

    One factor that is having an unprecedented effect on organic search is voice search. The combination of an increase in mobile searches and the rise in voice assistants has meant that the way in which people are searching for information online is changing dramatically.

    Whether it’s Siri, Cortana, Amazon Echo, Google, or another robot friend, there is no questioning the importance of voice search. According to Google, more than half of queries will be voice search by 2020, and this requires a refreshed approach to SEO.

    Featured snippets

    One of the key aspects of this is featured snippets, as these are the results which are read aloud in response to voice searches. Featured snippets are often referred to as ‘Position Zero’, a phrase coined by Pete Meyers. They are the direct answer results that appear at the top of the SERPs in a box, and they often include a link back to the source of the answer.


    According to Stone Temple Consulting, nearly 30% of 1.4 million Google queries tested now show Featured Snippets. That’s a lot of affected searches. Updating your search strategy to include optimization for featured snippets should therefore be a priority.

The key point to remember with featured snippets is that a searcher using voice search and expecting a verbal reply will not be presented with a choice of results. Instead, only one result is read out, and where there is a featured snippet, that is the result the voice assistant will choose.

You could have the most mind-blowing and enticing meta descriptions and title tags in the world, but if you’re not in that sought-after position zero, then your lovingly crafted content will not be read aloud and will therefore remain both unseen and unheard by the searcher.

    Long-tail keywords

    Of course the key difference between voice search and standard search is the use of more natural, conversational language. This new style of search and query formats must be factored into your strategy. Cue long-tail keywords!

    The first step in your journey to the exclusive realm of featured snippets is to identify the informational queries related to your product or services. Use keyword tools, but also ask customers and consider the frequently asked questions you receive on a regular basis.

Answer the Public is a fantastic tool for these more conversational search queries. The tool allows you to dig deeper into user intent by separating the results into question starters, such as ‘who’, ‘why’ and ‘how’. Build on this research by uncovering other similar queries using Google’s ‘People also ask’ feature.

Test the questions that you generate by typing them into Google and analyzing the featured snippets that appear. Use these as a basis, but work out what you could add or improve to make the result even better. Perhaps the source article does not provide much further information, or perhaps the answer currently displayed isn’t long enough.
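
Before reaching for the tools, you can also mechanically combine question starters with your seed topics to build a rough list of candidate queries to research; the topics below are placeholders:

```typescript
// Rough sketch: combine question starters with seed topics to build a list of
// candidate conversational queries, then vet them against real search data.
const starters = ["what is", "how much is", "where can I find", "why do I need"];
const topics = ["dog grooming", "pet vaccinations"]; // placeholder topics

const candidateQueries = topics.flatMap((topic) =>
  starters.map((starter) => `${starter} ${topic}`)
);

console.log(candidateQueries);
// "what is dog grooming", "how much is dog grooming", "where can I find pet vaccinations", ...
```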

    Optimizing content

    The next step is to write a response to these questions, which is likely to be in the form of a blog post. Sure, Google only shows a small percentage of the text from an article in the featured snippet, but this does not mean that your article should only answer the question directly.

    Include a direct response, but expand on this further in the article and provide depth. We all know that Google loves depth of information and given that featured snippets provide the option to click through to the source, you need to be offering additional information that could be of use to the reader.

Adopt the formula of answering the question directly, then follow it up by covering other related search queries. This should help you cover all the bases and achieve that coveted spot. Ultimately, you may need to do a bit of testing and iterating to see what works best.

It is also worth integrating more Q&A-style formats into your content. These do not have to be limited to an FAQ page, and it is worth revisiting how you can optimize some of your existing content to fit this Q&A style. The easier you can make it for Google to pull the featured snippet from your content, the more likely you are to appear in that prized position zero and therefore benefit from voice search as well.
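
One hedged way to make that Q&A content easier for machines to parse is FAQPage structured data from Schema.org (the article touches on schema markup again below). A minimal sketch with placeholder copy, emitted as JSON-LD on the page:

```typescript
// Minimal FAQPage structured data for a Q&A-style page (placeholder copy).
// Emit the output in a <script type="application/ld+json"> tag.
const faqSchema = {
  "@context": "https://schema.org",
  "@type": "FAQPage",
  mainEntity: [
    {
      "@type": "Question",
      name: "How often should my dog be groomed?",
      acceptedAnswer: {
        "@type": "Answer",
        text: "Most breeds benefit from grooming every four to eight weeks, depending on coat type.",
      },
    },
  ],
};

console.log(JSON.stringify(faqSchema, null, 2));
```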

    More than a third of featured snippets and knowledge boxes contain an image, so another tactic to experiment with is to utilize different formats, such as tables or graphs. Try using numbered points to break up the content into simple steps, as this helps to optimize the content for voice search.

    Plus, if there are too many steps to display in the featured snippet, then Google will include a read more button that links through to your website, which can be an effective way of converting people from voice search into website traffic.

    Local searches

    Voice search is characterized by its prevalence on mobile devices and its focus on local searches. This is of paramount importance to local search strategies, especially given that 50% of mobile visitors who perform a local search will visit a store within one day.

Although these may not bring up featured snippets, they do surface Google My Business profiles, which can be read aloud as directions. In fact, according to Google’s official blog, directions are among the most popular voice search queries – unsurprising given the push towards hands-free use, particularly when driving.


As far as local searches go, you need to make sure that all of your contact information is as accessible as possible on your site. Make Google’s life easy when it comes to crawling your site; you’ll be more likely to appear in those featured results and therefore win the voice search game. Ensure your contact information isn’t buried within an image, or this will certainly damage your chances of appearing.

    You know what else you can do to make the Google bots even happier? Schema markup – it will significantly increase the chances of obtaining a featured snippet.

It goes without saying that you need to keep your Google My Business listing up to date. Again, this should be part of your SEO efforts anyway, so no sweat there. Once your contact information is updated and where it should be, help the process along by submitting a sitemap to Google. Also submit any new content that you add to your site, just to speed up the process.

    Mobile ready

    Don’t forget to focus on mobile. Given that the vast majority of voice searches occur on mobile, it is essential that the mobile version of your site is fully optimized.

I know what you’re thinking: with voice search, only the featured snippet is read aloud, so what’s the point of a fancy-pants mobile-friendly website? Well, Google likes fancy-pants mobile-friendly websites, that’s why. Don’t believe me? Riddle me this: why is Google moving to a mobile-first index?

Even with voice search, there is always the chance that the user decides to delve further into their query and read more. In that case they’ll head straight to your site, and if it doesn’t deliver in terms of user experience, you won’t stay in that top spot for long.

    Conclusion: User intent is key

    When it comes to trying to get that valuable featured snippet place, there is no special technique that is any different to standard SEO. In short, if you are thinking carefully about user intent then you should be fine.

Given recent updates like Hummingbird and RankBrain, user intent should already be a priority in your search strategy. So if you’re doing this, then your content should already be optimized for featured snippets and voice search.

Using voice search with commercial intent is only in its infancy at the moment. However, we can expect this to evolve and develop over time as people become more and more comfortable with voice search.

    So you’d best be ready!

    What does Web 3.0 mean for search?


    The signs of fundamental change are all around us.

    Digital assistants reside within our living rooms, we consume Internet-based services everywhere, and we are creating data every second of the day.

There is a pervasive sense of being constantly connected through devices that communicate with each other. The experience of using the Internet is therefore markedly different from what it was 10 years ago.

    What we don’t quite have is a universally accepted label for this era of digital development.

The phrase “Web 3.0” was first coined back in 2006. Viewed by some industry insiders back then as an “unobtainable dream”, the idea of Web 3.0 has remained elusive.

    However, as technology catches up and the tech giants figure out ways to make sense of the reams of unstructured data we create every second, the dream seems much more obtainable than ever before. In fact, many argue it is already a reality.

    So what exactly is Web 3.0? What makes it so different from Web 2.0? And what do marketers need to do today to prepare for this revolution?

    What is Web 3.0?

    This is a more contentious question than it might at first seem. Many opinions exist on the topic, but the general consensus is that Web 3.0 ushers in an entirely new way of creating websites, of interacting with them, and of utilizing the data that these interactions generate.

    Techopedia’s definition contains a clear depiction of how big this change is:

    “Web 3.0 will be a complete reinvention of the web, something that Web 2.0 was not. Web 2.0 was simply an evolution from the original Web.”

Web 1.0 was essentially a repository of information that people could read passively, without being able to shape the information or add their own. The move to Web 2.0 was given concrete shape in everyday aspects of online life, such as submitting product reviews on Amazon or launching a personal blog. People became very active participants online, whether on social media or on reputable news sites.

    An overhaul in how the Web functions is necessary, if we look at the raw statistics. Global Internet traffic has passed one zettabyte (that’s one trillion gigabytes); over 4 billion people will have Internet access by 2020; over 60,000 searches are performed on Google every second.

    All that data creates possibilities, albeit only if we are equipped to harness them. We imagine hyper-personalized, fluid, targeted online interactions between brands and consumers, but bringing this idea to fruition is a very complex logistical task.

    By converting unstructured data into structured data (simple updates like Schema.org have helped with this), and by ensuring all databases communicate with each other in the same language, lots of new opportunities arise.

    Put succinctly, Web 3.0 will allow us to make sense of all the data that digital devices create.

    It can be seen as a Web that thinks for itself, rather than just following commands.

    This is built on a decentralized, secure platform that allows much more privacy for consumers than they currently have.

    It is easy to spot some threads within this narrative: the use of artificial intelligence, the potential for a blockchain-based solution for storing and sharing data, and the evolution of the semantic web to provide personalized experiences.

    We can summarize our definition by identifying five key factors that set Web 3.0 apart from its earlier incarnation:

    Artificial intelligence

    AI will be used in every walk of life to carry out computational tasks humans are incapable of completing. It will also make decisions for us, whether in driverless cars or in our digital marketing strategies.

    Virtual & augmented reality

    Brands are tapping into the possibilities these technologies bring, providing an entirely new way of connecting that goes far beyond what a static screen can provide.

    The semantic web

    By finally understanding the data each individual creates, technology companies can gain insight into context. This has been a significant push for Google for some time, particularly with the respective launches of Hummingbird and RankBrain. The aim is to go beyond the dictionary definition of each word and comprehend what consumers are using phrases to mean at that particular moment.

    Internet of things

    A true defining feature of Web 3.0 is the proliferation of Internet of Things (IoT) ‘smart’ devices. Examples such as Amazon Echo are well-known, but there are plans to add Internet connectivity to every aspect of our lives.

    Seamless connectivity

    Until now, data has been stored in various formats and communication between data sets can be challenging. Web 3.0 really comes into its own when data exchanges are seamless and ubiquitous.

    This is achieved when Internet-connected devices are omnipresent, from the home to the workplace and everywhere in between; but those devices need to be able to communicate with each other. When that happens, the digital assistant in your car can ask the fridge if you’re out of milk and if so, to order some from Amazon.

    How will Web 3.0 change online interactions?

    The way we source information and find products is still far from frictionless. For example, consider the planning of an upcoming holiday. We could buy a package deal and that would remove a lot of the administrative tasks, but it would be far from a tailored product.

    In reality, most of us will search for deals on flights, research hotels, read travel guides, and talk to people who have been to the destination before via social media.


    That is a vast improvement on the holiday-booking process pre-Internet. However, Web 3.0 will take this much further.

Instead of conducting multiple searches in different places, one prompt would be sufficient to pull together all the relevant information. To take our holiday example, we could say to an Internet-connected device, “I’m looking for a holiday in Italy later this year with the family, what are my options?” The digital assistant would then dip into its vast, interconnected set of databases to retrieve and organize the relevant information based on the query, and present the best options in one interface.

    Everything from flights to meals to cultural attractions will be pulled together into a truly personalized list of recommendations.

    How will Web 3.0 affect search marketing?

    The example above provides a clear indication of how much things are changing. Optimizing title tags for a higher click-through rate won’t really cut it when an AI-powered digital assistant is bypassing these signals to identify the right content to answer a query.

    Search marketers’ focus should shift towards understanding the different preferences of their user base and creating multimedia content that responds to this. As people become more comfortable with using voice-based digital assistants, we can expect search trends to move away from the likes of [italy holidays 2017] and towards more specific, long-tail queries.

    Searcher behaviors are deeply entrenched and slow to change, but they do change. Recent research from Google showed the drop-off in “near me” queries as users come to expect that results will be local, without adding a geo-modifier.


Added to August’s news that Microsoft’s speech recognition system has reached a new accuracy milestone, we get a sense that these long-heralded changes are finally coming to pass. Voice search is on the rise, mobile device usage shows no sign of relenting, and search engines are using this data to create better interactions.

    Search marketers need to keep up. The first step is to ensure that all content is clearly labeled for search engines. Microdata can be used to achieve this and Schema.org mark-up remains just as vital as it has been for the past few years.

    The core objective when we create new content should be to facilitate its serving to users, no matter where they are or which device they are using. Keyword targeting still matters, but we need to maintain a more nuanced idea of what our consumers really mean.

    Google’s Quick Answers initiative is a particularly telling development in this sense. On the face of it, it seems a rather innocuous and helpful change, but at a deeper level it tells us a lot more. We are moving away from screen-based interfaces that provide lots of choices; consumers want the right answer to their query.

    Performance measurement will continue to change, of course. The idea of tracking keyword level ranking positions remains attractive, but its use as an accurate barometer of how a site is performing has waned significantly. SEO goals should be much more closely aligned to business objectives, which can only be a healthy development.

We are moving into an age of flux, where the comforting-but-illusory constants of old are replaced by shifting and slippery notions of ‘meaning’ and ‘context’. Those that are ready to adapt soonest will profit most.

    Web 3.0: What do search marketers need to know?

    • Web 3.0 will change how people search, how search engines process their queries, and how results are displayed. These changes have been in process for years now, but they are starting to have tangible impacts on how we find information online.
    • This is driven by improvements in how search engines understand the meaning of queries by harnessing huge amounts of unstructured data and transforming it into something structured and significant.
    • Web 3.0 will also bring with it a new way of creating digital assets. The old ideas of creating a static website will be replaced by hyper-personalized experiences that vary in their messaging and their media formats.
    • AI-powered digital assistants are starting to usher in new behaviors. What search marketers should focus on is creating the right digital assets for their consumers and ensuring that any search engine can locate and serve this content as seamlessly as possible.

    [Report] Who owns the flights market in search?

    Which brands dominate the US flights market in search?

    A new report by Pi Datametrics has analyzed the entire US flights market to discover the most organically valuable search themes and players with the greatest share of voice across the market.

    The search data was collected from across Google US with a view to identifying the search terms with the most commercial opportunity over the last four years, and trended to reveal demand peaks and declines across the travel industry.

‘International’ flights: trended search themes, May 2016 – May 2017 (source: Pi Datametrics Market Intelligence)

    So what does the data show, and what can marketers learn from it about the state of the flights market?

    The difference between organic value and search volume

    Trended search volume data is a strong indication of research and demand phases, but to determine when a search is most likely to actually convert, Pi has applied their proprietary Organic Value Score.

Search volume alone doesn’t always indicate value. Pi’s Organic Value Score averages out all of the metrics critical to conversion – including AdWords data – to reflect the true value of individual search terms and their overarching search themes.

Looking at the search volume data in isolation, ‘Latin America & Caribbean’ appears to be one of the most important search themes on which to focus strategy within the ‘International flights’ market.

    But, if we overlay commercial value, the data tells a slightly different story. ‘Latin America & Caribbean’ devalues significantly, while ‘Europe & Middle East’ retains its competitive edge.

    Share of voice: Top sites across the entire ‘Flights’ market

Date: 7th June 2017 | Top 20 sites (source: Pi Datametrics Market Intelligence)

    Using a datapool of the most valuable ‘International’ and ‘Domestic’ search terms, Pi generated a vast snapshot of the entire US ‘Flights’ market (12,286 sites), to reveal the players dominating the industry.

    Kayak own the US ‘Flights’ market

    Kayak perform best both internationally and domestically, closely followed by Tripadvisor – which has recently transformed into an integrated review / booking site.

    Here are just a few key insights:

    • The top 3 performers own 57% of the entire ‘Flights’ category.
    • All ‘Others’ beyond the top 20 own 10.1% of the ‘Flights’ market. Kayak alone owns more than double this.
    • The top 11 performers consist of online travel agencies, aggregators or integrated review and booking sites. These sites own 86% of the entire market.
    • An airline doesn’t appear until position 11, and only owns 0.6% of the category.


Which travel groups own the entire ‘Flights’ category?

    • Priceline Group owns 33.5% of the entire market – more than four times the share owned by the entire remaining market beyond the top 20
    • Expedia Inc owns 25.6% of the entire market
    • All ‘Others’, beyond the top 20, own a tiny 7.7% of the market
    • Airline providers can use this market share data to establish the best aggregators to resell their ‘Flights’

    When combined, Expedia Inc and Priceline Group own nearly 60% of the entire US ‘Flights’ market. This is astronomical, and has created an ‘illusion of choice’ across the digital travel landscape.

    • Priceline is the 6th largest internet company by revenue ($10.64bn USD).
    • Expedia is the world’s 10th largest internet company by revenue ($8.77bn USD).

    These revenue statistics just prove the success of their digital duopoly.

    What can marketers and SEOs in the travel industry learn from the data about the most valuable search terms? Knowing their most valuable content gives businesses the foresight to dictate strategy.

    From Pi’s trend chart, we can see that Europe and Middle Eastern flights have the highest Organic Value across the US ‘International flights’ market.

    Aggregators, airlines and integrated booking sites can use this data to plan marketing activity around the most valuable flights.

    Why is the online flights market so heavily dominated by just two companies?

    Priceline group and Expedia own significant search real estate, and dominate the flights industry.

    We can’t know exactly how these groups achieve their success, but we can presume that each brand prioritizes search throughout the business.

    What’s more, these groups have an array of interrelated digital assets, which provide greater opportunity for comprehensive link infrastructures. This would only serve to boost their presence across the search landscape.

    Based on the data, we can also see that online travel agencies, aggregators and booking sites decisively outrank airlines themselves in almost all cases. So why is this?

    Based on their business offering, aggregators and OTAs offer a variety of content covering all areas of the flights market.

    As direct providers, airlines may have less opportunity to match this offering, which could in turn impede market share.

    The full report can be downloaded from the Pi Datametrics website.

    Google Analytics: Misunderstandings that hold marketers back

    Google Analytics (GA) has done more than any other platform to bring the practice of data analytics to the center of organizations.

    By offering a free-to-use, intuitive solution to businesses of any size, it has offered the promise of full transparency into customer behavior.

Moreover, as part of the broader marketing analytics movement, it has helped shape the language we use daily. Our handy guide explains some of the most frequently heard, but at times confusing, terms GA has brought into everyday parlance in the marketing world.

    Pitch decks and strategy sessions abound with references to “data-driven decisions” nowadays, which is a healthy trend for businesses overall. Beyond the buzzword status this phrase has attained, it is true that businesses that integrate analytics into the decision-making process simply get better results.

Google reports that business leaders are more than twice as likely to act on insights taken from analytics.

    As Google continues to improve its offering, with Optimize and Data Studio available to everyone, and an ever more impressive list of paid products via the Analytics 360 suite, marketers need to understand the data in front of them.

    Unfortunately, there are some common misunderstandings of how Google collects, configures, processes, and reports data.

Below are some of the most commonly misunderstood metrics and features within the core Google Analytics interface.

    By avoiding these pitfalls, you will enable better decisions based on data you can trust.

    Bounce rate

    What is it?

Bounce rate is a simple, useful metric recorded when a user has a single-page session on a website. That is to say, they entered on one URL and left the site from the same URL, without interacting with that page or visiting any others on the site.

It is calculated as a percentage by dividing the aggregate number of single-page sessions by the total number of entries to that page. Bounce rate can also be shown at a site-wide level to give an overview of how well content is performing.
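
For a concrete sense of the arithmetic, here is a simplified sketch; it deliberately ignores the engagement-hit subtleties discussed below:

```typescript
// Simplified bounce rate calculation: single-page sessions divided by entrances.
// Real GA classification depends on engagement hits, as discussed below.
function bounceRate(singlePageSessions: number, entrances: number): number {
  if (entrances === 0) return 0;
  return (singlePageSessions / entrances) * 100;
}

console.log(bounceRate(450, 1000).toFixed(1) + "%"); // "45.0%"
```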

    As such, it makes for a handy heuristic when we want to glean some quick insights into whether our customers like a page or not. The assumption is that a high bounce rate is reflective of a poorly performing page, as its contents have evidently not encouraged a reader to explore the site further.

    Why is it misunderstood?

    Bounce rate is at times both misunderstood and misinterpreted.

    A ‘bounce’ occurs when a user views one page on a site and a single request is sent to the Analytics server. Therefore, we can say that Google uses the quantity of engagement hits to classify a bounced session. One request = bounced; more than one request to the server = not bounced.

    This can be problematic, given that any interaction will preclude that session from counting as a bounce. Some pages contain auto-play videos, for example. If the start of a video is tracked as an event, this will trigger an engagement hit. Even if the user exits the page immediately, they will still not be counted as a bounced visit.

    Equally, a user may visit the page, find the exact information they wanted (a phone number or address, for example), and then carry out their next engagement with the brand offline. Their session could be timed out (this happens by default after 30 minutes on GA and then restarts), before they engage further with the site. In either example, this will be counted as a bounced visit.

    That has an impact on the Average Time on Page calculations, of course. A bounced visit has a duration of zero, as Google calculates this based on the time between visiting one page and the next – meaning that single-page visits, and the last page in any given session, will have zero Time on Page.

    Advances in user-based tracking (as opposed to cookie-based) and integration with offline data sources provide cause for optimism; but for now, most businesses using GA will see a bounce rate metric that is not wholly accurate.

    All of this should start to reveal why and how bounce rate can be misinterpreted.

First of all, a high bounce rate is not always a problem. Often, users find what they want by viewing one page, and this could actually be a sign of a high-performing page. This occurs when people want very specific information, but can also occur when they visit a site to read a blog post.

    Moreover, a very low bounce rate does not necessarily mean a page is performing well. It may suggest that users have to dig deeper to get the information they want, or that they quickly skim the page and move on to other content.

    With the growing impact of RankBrain, SEOs will understandably view bounce rate as a potential ranking factor. However, it has to be placed in a much wider context before we can assume it has a positive or negative impact on rankings.

    How can marketers avoid this?

    Marketers should never view bounce rate as a measure of page quality in isolation. There really is no such thing as a ‘good’ or ‘bad’ bounce rate in a universal sense, but when combined with other metrics we can get a clearer sense of whether a page is doing its job well.

    Tools like Scroll Depth are great for this, as they allow us to see in more detail how a consumer has interacted with our content.

We can also make use of Google Tag Manager to adapt the parameters for bounce rate and state, for example, that any user who spends longer than 30 seconds on the page should not be counted as a bounce. This is useful for publishers, who tend to receive a lot of traffic from people who read one post and then go elsewhere.

    This is commonly known as ‘adjusted bounce rate’ and it helps marketers get a more accurate view of content interactions. Glenn Gabe wrote a tutorial for Search Engine Watch on how to implement this: How to implement Adjusted Bounce Rate (ABR) via Google Tag Manager.
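
The linked tutorial covers the Google Tag Manager setup; stripped to its core idea, the classic on-page version simply fires an interaction event after a delay, so that engaged readers no longer count as bounces. A minimal sketch, assuming the standard analytics.js ga() command queue is already loaded in the browser and using a 30-second threshold:

```typescript
// Sketch of the adjusted bounce rate idea with analytics.js: after 30 seconds on
// the page, send an interaction event (not flagged as non-interaction) so the
// session is no longer classified as a bounce.
declare function ga(command: string, hitType: string, category: string, action: string): void;

const THRESHOLD_MS = 30 * 1000;

window.setTimeout(() => {
  ga("send", "event", "Adjusted Bounce Rate", "30 seconds or more on page");
}, THRESHOLD_MS);
```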

    Bounce rate can be a very useful metric, but it needs a bit of tweaking for each site before it is truly fit for purpose.

    Channel groupings

    What is it?

    Channels are sources of traffic and they reflect the ways that users find your website. As a result, this is one of the first areas marketers will check in their GA dashboard to evaluate the performance of their different activities.

    There are many ways that people can find websites, so we tend to group these channels together to provide a simpler overview of traffic.

    Google provides default channel groupings out of the box, which will typically look as follows:

    You can find this by navigating this path: Admin > Channel Settings > Channel Grouping.

    Anything that sits outside of these sources will fall into the disconcertingly vague ‘(Other)’ bucket.

    From Google’s perspective, this is a reasonably accurate portrayal of the state of affairs for most websites. However, this is applied with broad brush strokes out of necessity and it shapes how marketers interpret very valuable data.

    Why is it misunderstood?

    Default channel groupings are often misunderstood in the sense that they are taken as the best solution without conducting further investigation.

    Vague classifications like ‘Social’ and ‘Referral’ ignore the varying purposes of the media that fall under these umbrellas. In the case of the former, we would at the very least want to split out our paid and organic social media efforts and treat them separately.

    We want channel groupings to provide a general overview, but perhaps it needn’t be quite so general.

    Leaving these groupings as they are has a significant impact, particularly when it comes to the eternal riddle of channel attribution. If we want to understand which channels have contributed to conversions, we need to have our channels correctly defined as a basic starting point.

    How can marketers avoid this?

    Make use of custom channel groupings that accurately reflect your marketing activities and the experience your consumers will have with your brand online. It is often helpful to group campaigns by their purpose; prospecting and remarketing, for example.

    Custom channel groupings are a great option because they only alter how data is displayed, rather than how it is processed. You can modify the default channel groupings if you feel confident about the changes you plan to make, but this will permanently affect how data is processed in your account. Always add a new view to test these updates before committing them to your main account dashboard.

    For most, custom channel groupings will be more than sufficient.

    Through the use of regular expressions (commonly known as regex), marketers can set up rules that create new channels or alter the pre-defined groupings Google provides. Regex is not a particularly complex language to learn and follows a clear logic, but it does take a little getting used to, and introductory guides are easy to find online.
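    As a rough, hypothetical illustration (the source and medium values below are assumptions – yours will depend on how you tag your campaigns), a custom ‘Paid Social’ channel might be defined with rules along these lines:

        Include in 'Paid Social' when:
          Source matches regex (facebook|instagram|linkedin|twitter|t\.co)
          AND Medium matches regex ^(cpc|ppc|paidsocial)$

        Include in 'Organic Social' when:
          Source matches regex (facebook|instagram|linkedin|twitter|t\.co)
          AND Medium matches regex ^(social|social-network|referral)$

    The order matters: each session is assigned to the first channel whose rules it matches, so paid rules should sit above the organic ones.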

    Your new channel groupings will be applied to historical data, so you can easily assess the difference they make. These alterations will prove particularly valuable when you compare attribution models within GA.

    Custom Segments

    What are they?

    The array of segmentation options available is undoubtedly one of Google Analytics’ most powerful advantages. Custom segments allow us to view very specific behavioral patterns across demographics, territories and devices, among many others. We can also import segments created by other users, so there is a truly vast selection of options at our disposal.

    By clicking on ‘+ New Segment’ within your GA reports, you will be taken to the Segment Builder interface.

    Google provides a very handy preview tool that shows us what percentage of our audience is included under the terms we are in the process of defining. This will always begin at 100% and decrease as our rules start to home in on particular metrics and/or dimensions.

    This is where it starts to get tricky, as the segment builder can start to produce unexpected results. A seemingly sound set of rules can return a preview of 0% of total users, much to the marketer’s chagrin.

    Why are they misunderstood?

    The underlying logic in how Google processes and interprets data can be complex, even inconsistent at times.

    When we set up a set of rules, they will be treated sequentially. A session will need to pass the first condition in order to reach the second round, and so on. We therefore need to consider very carefully how we want our experiments to run if we want them to be sound.

    To take a working example, if I want to see how many sessions have included a visit to my homepage and to my blog, I can set up an advanced condition to cover this. I filter by Sessions and include a condition for Page exactly matching the blog URL and Page exactly matching the homepage.

    This creates what seems like a valid segment in the preview.

    Logically, I should be able to take this up one level to see what proportion of users meet these conditions. Within the GA hierarchy, users are a superset of sessions, which are in turn a superset of hits.

    However, this is not how things play out in reality. Just by switching the filter from ‘Sessions’ to ‘Users’, the segment is rendered invalid.

    Why does this occur?

    Google applies a different logic to each scope, which can of course be quite confusing.

    In the session-scoped version, the AND condition is evaluated across the whole session, so a session qualifies if it includes a visit to each page at any point. As such, some sessions meet the requisite conditions.

    In the user-scoped version, the AND rule is applied to a single hit, meaning a pageview would have to match both URLs at once. This is impossible, as a user cannot be on two pages at the same time.

    We can still arrive at the same results, but we cannot do so using the AND condition. By removing the second condition and adding it as a separate filter instead, we can see the same results for Users that we received for Sessions.

    In other words, we need to be very specific about what exactly we mean if we want accurate results from segments created for users, but not quite so explicit with sessions.

    It is better to err on the safe side overall, as the logic employed for Users was rolled out more recently and it is here to stay. The complexity is multiplied when a segment contains filters for users and for sessions, so it is essential to maintain some consistency in how you set these up.

    How can marketers avoid this?

    By understanding the hierarchy of User – Session – Hit, we can start to unpick Google’s inner workings. If we can grasp this idea, it is possible to debug custom segments that don’t work as expected.

    The same idea applies to metrics and dimensions too, where some pairings logically cannot be met within the same segment. Google provides a very comprehensive view of which pairings will and will not work together, which is worth checking out.

    Although it does involve quite a bit of trial and error at first, custom segments are worth the effort and remain one of the most powerful tools at the analyst’s disposal.

    The 10 most common WordPress SEO challenges and how to solve them

    If you’re new to the business of SEO and are just figuring out how to optimize your WordPress site for search, navigating the landscape of SEO can seem like a nightmare.

    You’ll have seen a thousand different articles on SEO online – on-page optimization tips, off-page optimization tips, SEO basics, email marketing tips and so on – and implemented them, only to see them fail or, worse, backfire.

    Don’t worry, you’re not alone. SEO can be tricky, and there is always huge scope for overdoing or underdoing things. While I can’t fully solve this problem for you, I’ll attempt to round up the most commonly faced SEO challenges with WordPress so that you can look into your site and make some adjustments.

    The important thing to understand here is that the same factors can prove to be a boon as well as a challenge when it comes to SEO. The key is to understand your own website intricately and devise plans depending upon what works best for you. Listed below are a few things that are commonly done wrong.

    1. Finding the right theme

    Ask yourself, how did you choose your WordPress theme while creating your website? Odds are you picked the most visually attractive theme that you thought would appeal to your customers.

    Another common mistake people make is picking the most premium or commonly-used themes, as they think these are shortcuts to success. This is where you’re going wrong. Many complicated themes are filled with poor code that slows down your website. And loading time is a small but significant factor that affects your SEO rankings.

    So pick a theme that works best for the nature of your website. Minimalist themes can be just as effective as complicated themes. And remember to check how often these themes are updated; you do not want an outdated theme dragging your site down.

    2. The plugin game

    WordPress plugins can truly be a boon for website SEO. But people tend to overdo it by adding too many of them and as a result, the website becomes heavier and slower to load. In order to improve user experience and your website ranking, it is imperative to pick and install only the right plugins for your website.

    Multiple plugins also tend to consume excessive server resources, which is why many managed WordPress hosts place limits on, or refuse to host, websites that consume too many resources.

    3. The sitemap issue

    As a basic WordPress installation doesn’t give you many features and controls, you’re bound to install SEO plugins, most of which can generate XML sitemaps for you. You can even create multiple sitemaps by installing additional plugins that give you further control over your site.

    But here’s the problem. Many people forget to submit their sitemaps to Google Search Console. If you skip that step, search engines may never discover your sitemaps and, needless to say, your pages will be crawled and indexed far less reliably, despite all your customized plugins.
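    For reference, an XML sitemap is simply a list of URLs in a standard format. A minimal, hypothetical example looks like the snippet below, and it is this file’s address (often something like /sitemap.xml or /sitemap_index.xml, depending on your plugin) that you paste into the Sitemaps section of Search Console:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>https://www.example.com/sample-post/</loc>
            <lastmod>2017-06-15</lastmod>
          </url>
        </urlset>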

    4. Link stuffing gone wrong

    Adding links to your site is one of the most important SEO tactics, and can do wonders for your website ranking. Many themes come with pre-set links to help you out. But there are two ways this can go wrong:

    Over stuffing – Nothing overdone is attractive, and adding links is no exception. Adding too many links can distract your user and also turn them off your site. A good rule of thumb to go by is using up to 20 links. This way you’re well within your bounds.
    Stuffing nonsense – The relevance of the content you feed to your customers is more important than you think. Offer original and relevant content that is useful to your customers so that they spend more time on your site, thus improving your rankings.
    5. Schema gone wrong

    Schema markup is the primary code that allows Google (and other search engines) to understand what your website is about. You showcase your Name, Address and Phone Number (NAP) so that Google can run it through its algorithms and display your site if it has local relevance. So this is the single most important thing that helps Google understand who you are and what kind of services you provide.

    This obviously improves your rankings and visibility. But if you get this wrong, it could work against you, as it confuses Google.

    The best way out of this challenge is to gain a deep understanding of how schema works. Various online resources can help you learn it. For starters, you can check out the ‘Organization of Schema’ page for a list of the most common types of schema markup, and the ‘Full Hierarchy’ page for the schema types you will need.
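    To make this concrete, here is a minimal, hypothetical JSON-LD sketch for the NAP details mentioned above (the business name, address and phone number are placeholders); it sits in a script tag in the page’s head:

        <script type="application/ld+json">
        {
          "@context": "https://schema.org",
          "@type": "LocalBusiness",
          "name": "Example Store",
          "telephone": "+1-555-0100",
          "address": {
            "@type": "PostalAddress",
            "streetAddress": "123 Example Street",
            "addressLocality": "Springfield",
            "addressRegion": "IL",
            "postalCode": "62701"
          }
        }
        </script>

    Running the page through Google’s Structured Data Testing Tool will flag any syntax or hierarchy mistakes before they can confuse the crawler.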

    6. Underestimating alt text for images

    People often focus all their attention on optimizing text content and miss out on visual content, i.e. images. That’s a big blunder. Without fail, make time to give your images proper names and alt text descriptions. This will go a long way toward improving your site’s functionality, accessibility and ranking.
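    As a simple, made-up illustration, the difference is between markup like the first line below and the second:

        <img src="IMG_0427.jpg" alt="">
        <img src="yellow-raincoat-kids.jpg" alt="Child's yellow raincoat with hood, front view">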

    So if you’ve overlooked this, rename all your images now and add proper descriptions. Another shortcut to do this is using the SEO Friendly Images WordPress Plugin.

    7. Wrongly done permalinks

    Despite the huge amount of information available about permalinks on the internet, they are one of the most difficult things to get right. And your website takes a really big hit if you get them wrong.

    So here’s an over-simplified tip for you. The ideal permalink structure includes two very basic yet important things: the post name and its category. It should look something like this: /%category%/%postname%/.

    What this does is allow search engines, as well as your site visitors, to understand at a glance what each page is about. With that structure, a post called ‘Summer Sale’ in a ‘news’ category lives at /news/summer-sale/ rather than at an opaque default URL like /?p=123.

    8. Ignoring H1 tags

    As your website grows, you might end up with a lot more duplicate content than is advisable. Even if this doesn’t affect you initially, it will in the long run.

    With growing popularity and content, you might feel you have no option but to reuse the same H1 tags across multiple pages. But this makes search engines wary of, and eventually averse to, your site. So as far as possible, be precise and inventive, and keep your headings and content unique across your site.

    And don’t even think of employing the age-old technique of stuffing keywords into your meta tags. It might have worked in the past, but Google is smart enough to spot it now.

    9. The sin of using duplicate content

    It’s not an exaggeration when I say it’s a sin to use duplicate content. The problem is that you might be doing this without even knowing that you are.

    The most common mistake in this department is over-categorizing and over-tagging: Google identifies content with multiple common tags and flags it as duplicate content. As a rule, a post should typically sit in no more than one or two categories, and tagging should be limited to the most relevant topics covered in the post.

    Furthermore, if you find no obvious way in which you can tag a specific post, don’t tag it. Not every post needs tagging.

    However, it is easy to tackle this. Plugins like All in One SEO or Yoast SEO help you avoid this error by adding ‘noindex’ robots meta tags to the archive pages you choose, telling search engines not to treat those near-duplicate listings as pages worth indexing.
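    Under the hood, these plugins typically do this by outputting a robots meta tag in the head of the excluded archive pages, along the lines of:

        <meta name="robots" content="noindex, follow">

    The ‘noindex’ part keeps the archive out of the search results, while ‘follow’ still lets crawlers pass through its links to the posts themselves.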

    10. Forgetting internal links and related posts

    Linking one article to other relevant content across your site increases the average time spent by a user on your website, and also acts as a search engine ranking signal. However, adding unrelated links or poor-quality content will do the opposite and put them off.

    If you do not wish to use too many internal links, another smart way to go about it is to add related posts. Get a plugin to pick the right kind of posts to display as related content, keeping your relevance and integrity intact. The best approach is to strike the right balance between internal links and related posts.

    So read this article through again and thoroughly examine your SEO practices to identify how many of these things you are getting wrong, and how many you are doing right.

    Another factor that significantly affects your user experience is your host. A slow host will increase your loading time and therefore affect users. Keep this in mind while picking your web hosting company.

    Your goal should always be to give your visitors rich quality and relevant content, delivered in the right manner and at the right speed. That is the only true way to keep your customers happy and run a thriving website.

    5 SEO features to make sure your ecommerce platform supports

    When choosing an ecommerce platform to power your online store, it’s important to consider search engine optimization (SEO) features in your decision-making process.

    No matter how experienced you are with SEO, when you put the power into a platform’s hands, you may or may not end up with the ability to control elements of your site that are essential to your ranking success.

    Let’s take a look at five core features that your ecommerce platform needs to have, along with examples of platforms that offer each feature. Hopefully this guide will make it easier for you to select a home for your merchandise that is capable of dominating search.

    Editable robots.txt files

    The robots.txt file allows you to tell search bots which pages and directories to ignore when they crawl your site to index it for the search engines. I bet you’re thinking it’s not a big deal if you can’t control this file, but let me show you why it matters.

    Let’s say that when a customer makes a purchase from you, you send them to a thank-you page offering them the chance to subscribe to your exclusive newsletter, where they can hear about sales and promotions before the general public finds out. Reaching this page is an indication that someone has successfully completed a purchase – a signal you can use to track, analyze and optimize buyer journeys.

    That’s definitely a page you wouldn’t want indexed in the search engines – and unless you block crawlers from it via the robots.txt file, it can end up discoverable by the general public. Basically, any page you don’t want users to see without completing a certain action should be blocked from crawlers.
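    As a minimal sketch (the path is hypothetical; use whatever URL your thank-you page actually lives at), the rule itself is only two lines:

        User-agent: *
        Disallow: /thank-you/

    Bear in mind that robots.txt only stops compliant crawlers from fetching the page; if the URL is linked from elsewhere it can still surface, so for anything genuinely sensitive a noindex tag or a login is the safer option.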

    While many ecommerce platforms don’t allow their users to directly access this file, BigCommerce does.

    BigCommerce also allows you to easily integrate your store with Google Shopping, Facebook, and eBay, as well as a range of other shopping comparison sites, so you can get additional SEO boosts automatically, without having to manually submit your product listings to various engines.

    Independent page titles and URLs

    To avoid issues with duplicate content, and to ensure you have the best chance of ranking for certain keywords and phrases, it’s best to make sure you can control these metadata elements at the page level. Some ecommerce platforms don’t give you this control, meaning you end up with a generic title on every single page of your shop.

    The URLs may be different for each page, but they may end up being generated as something like /randomcharacters989j.htm rather than something readable like /pink-t-shirts.

    Chances are you’ll want to use specific keywords for each page on top of the ones you use site-wide. You’ll definitely have long tail keywords to use on each product category page as well. Unless you have the ability to control the titles and URLs at the page level, you’re stuck essentially targeting the same keywords on every single page in your store.
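    To make the distinction concrete, page-level control means each page can carry its own title tag and a readable path rather than the site-wide defaults (the names below are made up):

        <title>Pink T-Shirts | Example Store</title>
        <!-- served at https://www.example.com/store/pink-t-shirts rather than /randomcharacters989j.htm -->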

    Magento is a robust ecommerce platform that allows users to control the page titles and URLs so they can make the most of their SEO efforts with keywords.

    Magento is, of course, not the only platform that does this, but it is a key feature to look for. That said, Magento isn’t the most intuitively user-friendly option available, and unless you’re already an experienced developer, you may find yourself hiring one to make it work for you.

    An integrated blogging platform

    Because it’s the best way to publish rich, dynamic, link-worthy content on your domain, blogging is an integral part of ecommerce marketing today. Businesses that blog 11 or more times per month receive two to three times the traffic compared to those who blog less often or not at all.

    It can be complex to set your shop up on one platform and your blog on another, and then figure out the best way to link the two together – not just for user experience, but also for SEO purposes.

    You could have your shop on the main domain and install WordPress in the /blog directory. That works, but it can be a bit of a hassle in terms of unified design as well as ongoing management workflow. You’ll have to log in to your ecommerce platform to handle products, orders and general store management, then log in to WordPress (or another blogging platform) to add, edit and manage your blog content. As you grow, that becomes harder to handle at scale.

    Shopify makes it easy by including a blog in the ecommerce platform. This way you can keep everything streamlined. Shopify’s SEO features are solid overall, and the platform also allows for independent page titles and URLs.

    Canonical URLs

    Canonical URLs allow your content to be syndicated in various places online while telling Google to pay attention to only one URL. The tag helps Google determine which page you want to rank, with syndicating sites using it to signal to search bots that the original deserves the SEO credit. This solution is also useful in cases where you want to use multiple URLs for the same product or category.

    For instance, if you want people to see a list of all the yellow dresses in your store, the URL could be:

    http://www.domain.com/store/dresses/yellow/yellowdresses instead of something like: http://www.domain.com/store/dresses/formal?gclid=98675.

    Canonical URLs allow you to tell the search engines that similar URLs are the same – allowing you to have products that are accessible under multiple URLs. For instance, that yellow dress can be found at /yellowdresses, /formaldresses, and of course on its individual product page. It may also be found on other pages depending on the other filters you make available to your customers.

    When you choose your canonical URL, pick the page that you believe is the most important. Then add a rel=canonical tag to the head of each non-canonical page, pointing at the canonical one. Essentially, this tells search engines to consolidate ranking signals on the important page, so it is more likely to rank – without directing users away from any of the pages. If you have to do this manually, it can become a painstakingly time-consuming task.
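    In practice the tag is a single line in the head of each non-canonical page; using the hypothetical dress example from above, every variant URL would carry:

        <link rel="canonical" href="http://www.domain.com/store/dresses/yellow/yellowdresses" />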

    WooCommerce is an ecommerce platform that allows you to convert your WordPress installation into a full-fledged shopping experience. It’s available for free, but the premium version comes with additional features and themes. Support for canonical URLs is built into WooCommerce, regardless of which version you choose to use.

    Automatic redirect management

    If your business uses multiple domains, then you’re going to have to spend some time setting up canonical URLs or 301 redirects to forward users and search engines to the right place. If you don’t, you’ll detract from the user experience and risk losing ranking with the search engines because you’re sending traffic to broken links.

    If figuring out your 301 redirects and canonical URLs is driving you crazy, then a platform that handles redirects automatically could be a priority for you.
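    For context, this is what such a platform is saving you from doing by hand. On a self-hosted Apache setup, each redirect is a one-line rule in the .htaccess file (the paths here are hypothetical):

        Redirect 301 /old-store/pink-t-shirts https://www.domain.com/store/pink-t-shirts

    Multiply that by every retired URL and domain variation, and the appeal of automatic handling becomes obvious.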

    Squarespace is a hosted ecommerce platform like BigCommerce and Shopify. It will automatically redirect users and search engines to your primary domain and use canonical URLs to help you. It’s this automatic redirect that allows you to use a custom domain without your built-in Squarespace domain showing up in the search engine results.

    Which one is the best?

    Honestly, what works best for you will depend on what your budget is, and what your ecommerce products are.

    The bottom line is that regardless of what you choose, you need a platform that is helpful for SEO. If it’s not set up for SEO-friendly URLs, then there’s not much point in using it, because if you can’t get it ranking to bring in organic traffic, you’ll spend a great deal more on customer acquisition.