By Dan Taylor
It’s estimated that 46 percent of all searches performed on Google have a local intent, and the Map Pack appears for 93 percent of these.
In September 2016, Google unveiled a new local search algorithm, dubbed Possum, and it went largely unnoticed in comparison to the real-time Penguin update released in the same month.
In short, Possum made it harder for businesses to fake being in locations that they’re not (through the likes of virtual offices), as well as tackling Google My Business spam.
Possum, however, isn’t a single algorithm update: it affected both localized organic search results and the Map Pack, which are two separate algorithms, both triggered by search queries that are interpreted as having a local search intent.
The Google “Fred” update, which hit SERPs back in March, has also had an impact on local search, much like the Phantom updates before it.
A lot of local SERPs are extremely spammy, full of websites that have been built cheaply, with location names liberally applied to every menu link and keyword on the page – for example, a home page sidebar menu listing dozens of near-identical location-name links. Spam such as this still ranks on page one, because Google still has to provide results to its users.
Take advantage of the market conditions
A lot of locally focused websites aren’t built by agencies; the vast majority are self-built or produced by bedroom-level developers who can churn out a full website for £300 (or less).
Some verticals have seen some significant online investment in recent years, while others lag behind considerably. By investing in a good website and avoiding the same spammy tactics of your competitors, you can create a powerful resource offering user value that Google will appreciate.
Directory submissions and citations
Just to be clear: I’m not talking about backlinks alone. Recent studies have shown that citations with a consistent NAP (name, address and phone number) are important to both local algorithms.
There is no magic number to how many directory submissions you should have, but they need to be relevant.
I’ve worked on local campaigns in the UK where the site had previously been submitted to directories in Vietnam, Thailand and Australia. Yes, it’s a backlink, but it’s not relevant in the slightest.
Think local with your directories, and exhaust those before moving on to national ones. Where possible, the number of local directories should also outweigh the nationals. To do this properly and maintain quality, it has to be a manual process – it can’t be automated.
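To illustrate what “consistent NAP” means in practice, the sketch below (hypothetical business details, Python assumed) normalizes each name, address and phone number so that cosmetic differences in case, punctuation and spacing don’t hide a genuine mismatch between citations:

```python
import re

def normalize_nap(name, address, phone):
    """Normalize a Name/Address/Phone triple so cosmetic differences
    (case, punctuation, spacing) don't mask a genuine mismatch."""
    clean = lambda s: re.sub(r"[^a-z0-9 ]", "", s.lower())
    squash = lambda s: " ".join(clean(s).split())
    return (squash(name), squash(address), re.sub(r"\D", "", phone))

# Hypothetical canonical listing and two directory citations
canonical = normalize_nap("Acme Plumbing Ltd", "12 High St., Leeds", "0113 496 0000")
citations = [
    normalize_nap("ACME Plumbing Ltd", "12 High St, Leeds", "(0113) 4960000"),  # cosmetic differences only
    normalize_nap("Acme Plumbing", "12 High Street, Leeds", "0113 496 0000"),   # name and address drift
]

for i, citation in enumerate(citations):
    status = "consistent" if citation == canonical else "INCONSISTENT"
    print(f"citation {i}: {status}")
```

Here the first citation normalizes to an exact match, while the second is flagged because the business name and street wording have drifted – the kind of inconsistency an audit should surface.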
Reviews
Review volume, velocity and diversity are all important factors, and in my opinion they’re going to become more important in the coming months – particularly following the recent release of verified customer reviews for online businesses.
In Google’s Search Quality Evaluator Guidelines, the evaluators are instructed to research a website/brand’s online reputation from external sources in order to assess the quality of the website.
This is why reviews on your Google My Business listing, Facebook page, Yell and TripAdvisor profiles – and even positive tweets – are all valuable. Having testimonials and reviews on your own website is great for users, but you wouldn’t publish bad reviews on your own website, would you?
Google accepts that negative reviews happen, and as long as the good outweighs the bad, you shouldn’t have anything to worry about. If you do get a negative review, demonstrate your customer service and respond to it. You can set up Google Alerts to monitor your brand and flag up any external reviews.
Google My Business & Bing Places
Believe it or not, Google My Business is considered a directory, as is Bing Places. If you’re a local business, it’s important that you have a listing and that you’ve optimised it correctly: use the correct business name, address and phone number (keep your NAP as consistent as possible), choose an appropriate category and include a thorough description.
LocalBusiness structured data mark-up
Structured data mark-up (or schema) is an addition to a website’s code that enables Google and other search engines to better understand a page’s context by providing them with explicit, machine-readable information.
Not all websites are currently utilizing this schema (or any schema), and Google wants you to use it.
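For a local business, this typically means a schema.org LocalBusiness block embedded as JSON-LD. A minimal sketch, with hypothetical business details, might look like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Acme Plumbing Ltd",
  "url": "https://www.example.com/",
  "telephone": "+441134960000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "12 High Street",
    "addressLocality": "Leeds",
    "postalCode": "LS1 1AA",
    "addressCountry": "GB"
  },
  "openingHours": "Mo-Fr 09:00-17:30"
}
</script>
```

Note that the NAP details in the mark-up should match your Google My Business listing and directory citations exactly.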
If you don’t have developer resource to hand and you’re not a coder, you can use Google’s Data Highlighter to mark up content – you will need a verified Google Search Console property, however, to make this work.
As well as focusing locally, you also need to consider other ranking factors, such as SERP click-through rates.
Optimizing your meta title and description to appeal to local users can have a huge impact on click-through rates, and the change could be as simple as including the phone number in the title tag.
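For instance, a title tag and meta description localized for a hypothetical Leeds plumber might look like the following (all values are illustrative only):

```html
<head>
  <title>Emergency Plumber in Leeds | Call 0113 496 0000 | Acme Plumbing</title>
  <meta name="description"
        content="Local plumbers covering Leeds and the surrounding areas. 24/7 call-outs and free quotes from a family-run Leeds business.">
</head>
```

Putting the phone number in the title lets searchers act immediately, and naming the location twice in the description reinforces local relevance without stuffing.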
You also need to be on HTTPS and have a secure website. Getting hacked, suffering a SQL injection or having malware placed on your site can seriously damage your reputation with Google and take a long, long time to recover from.
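If your site runs on Apache, one common way to force HTTPS once a certificate is installed is a rewrite rule like this sketch (assumes mod_rewrite is enabled; adapt to your own server setup):

```
# .htaccess sketch: redirect all HTTP requests to HTTPS with a 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

A permanent (301) redirect ensures that any link equity earned by the old HTTP URLs is passed to their HTTPS equivalents.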
Source: Search Engine Watch RSS