Wednesday, 25 February 2015

Google: Top Heavy Update

Top Heavy was launched in January 2012 by Google as a means to prevent sites that were “top heavy” with ads from ranking well in its listings. Top Heavy is periodically updated. When a fresh Top Heavy Update happens, sites that have removed excessive ads may regain lost rankings. New sites deemed too “top heavy” may get caught.

Google Updates Its Page Layout Algorithm To Go After Sites “Top Heavy” With Ads


Google’s head of search spam, Matt Cutts, announced that Google has released a refresh of its Page Layout Algorithm. The filter, also known as the Top Heavy algorithm, downgrades the ranking of a web page with too many ads at the top or if the ads are deemed too distracting for users.
Cutts announced in a tweet that the algorithm was refreshed last Thursday, February 6.
This would be the third confirmed update to the Top Heavy algorithm, with the full release schedule as follows:
  • Top Heavy 1: Jan. 19, 2012 (impacted less than 1% of English searches)
  • Top Heavy 2: Oct. 9, 2012 (impacted 0.7% of English searches)
  • Top Heavy 3: Feb. 6, 2014 (impact not stated)

Background On & Recovering From Top Heavy

What is the page layout algorithm? As we quoted from Google originally:
We’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away.
So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.
Such sites may not rank as highly going forward.
See also our original article from when Top Heavy was first released for recovery advice; a site that's caught may have to wait until the next refresh for any changes it has made to restore its rankings.
We have not seen many complaints within the SEO community around February 6th or 7th about any update like this, which suggests it impacted fewer sites than when Google updates other filters like the Panda or Penguin algorithms.

The Top Heavy Update: Pages With Too Many Ads Above The Fold Now Penalized By Google’s “Page Layout” Algorithm


Do you shove lots of ads at the top of your web pages? Think again. Tired of doing a Google search and landing on these types of pages? Rejoice. Google has announced that it will penalize sites with pages that are top-heavy with ads.

Top Heavy With Ads? Look Out!

The change — called the “page layout algorithm” — takes direct aim at any site with pages where content is buried under tons of ads.
From Google’s post on its Inside Search blog today:
We’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away.
So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.
Such sites may not rank as highly going forward.
Google also posted the same information to its Google Webmaster Central blog.
Sites using pop-ups, pop-unders or overlay ads are not impacted by this. It only applies to static ads in fixed positions on pages themselves, Google told me.

How Much Is Too Much?

How can you tell if you've got too many ads above-the-fold? When I talked with the head of Google's web spam team, Matt Cutts, he said that Google wasn't going to provide any type of official tool, the way it does for telling whether your site is too slow (site speed is another ranking signal).
Instead, Cutts told me that Google is encouraging people to make use of its Google Browser Size tool or similar tools to understand how much of a page’s content (as opposed to ads) is visible at first glance to visitors under various screen resolutions.
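If you want to script that kind of check yourself, here's a minimal sketch (my own illustration, not a Google tool) using Selenium: it loads a page at a chosen resolution and estimates what fraction of the first screen is covered by ads. The CSS selectors are assumptions based on common AdSense-style markup and would need adjusting per site.

```python
# Minimal sketch: estimate the fraction of the first screen covered by ads.
# The selectors below are assumptions (common AdSense-style markup); adjust
# them for whatever ad containers a given site actually uses.
from selenium import webdriver
from selenium.webdriver.common.by import By

AD_SELECTORS = "ins.adsbygoogle, iframe[id^='google_ads'], div[id^='div-gpt-ad']"

def above_fold_ad_ratio(url, width=1366, height=768):
    driver = webdriver.Chrome()
    try:
        driver.set_window_size(width, height)
        driver.get(url)
        ad_area = 0
        for ad in driver.find_elements(By.CSS_SELECTOR, AD_SELECTORS):
            r = ad.rect  # {'x': ..., 'y': ..., 'width': ..., 'height': ...}
            # Clip each ad's box to the viewport; ignore anything below the fold.
            w = max(0, min(r["x"] + r["width"], width) - max(r["x"], 0))
            h = max(0, min(r["y"] + r["height"], height) - max(r["y"], 0))
            ad_area += w * h  # note: overlapping ads would be double-counted
        return ad_area / (width * height)
    finally:
        driver.quit()

print(above_fold_ad_ratio("https://example.com"))  # e.g. 0.42 = 42% ads
```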
But how far down the page is too far? That’s left to the publisher to decide for themselves. However, the blog post stresses the change should only hit pages with an abnormally large number of ads above-the-fold, compared to the web as a whole:
We understand that placing ads above-the-fold is quite common for many websites; these ads often perform well and help publishers monetize online content.
This algorithmic change does not affect sites who place ads above-the-fold to a normal degree, but affects sites that go much further to load the top of the page with ads to an excessive degree or that make it hard to find the actual original content on the page.
This new algorithmic improvement tends to impact sites where there is only a small amount of visible content above-the-fold or relevant content is persistently pushed down by large blocks of ads.

Impacts Less Than 1% Of Searches

Clearly, you’re in trouble if you have little-to-no content showing above the fold for commonly-used screen resolutions. You’ll know you’re in trouble shortly, because the change is now going into effect. If you suddenly see a drop in traffic today, and you’re heavy on the ads, chances are you’ve been hit by the new algorithm.
For those ready to panic, Cutts told me the change will impact less than 1% of Google’s searches globally, which today’s post also stresses.

Fixed Your Ads? Penalty Doesn’t Immediately Lift

What happens if you’re hit? Make changes, then wait a few weeks.
Similar to how last year’s Panda Update works, Google is examining sites it finds and effectively tagging them as being too ad-heavy or not. If you’re tagged that way, you get a ranking decrease attached to your entire site (not just particular pages) as part of today’s launch.
If you reduce ads above-the-fold, the penalty doesn't instantly disappear. Instead, Google will make note of it when it next visits your site. But it can take several weeks until Google's next "push" or "update," when the changes it has found are integrated into its overall ranking system, effectively removing penalties from sites that have changed and adding them to new ones that have been caught.
Google’s post explains this more:
If you decide to update your page layout, the page layout algorithm will automatically reflect the changes as we re-crawl and process enough pages from your site to assess the changes.
How long that takes will depend on several factors, including the number of pages on your site and how efficiently Googlebot can crawl the content.
On a typical website, it can take several weeks for Googlebot to crawl and process enough pages to reflect layout changes on the site.
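To make that refresh behavior concrete, here is a toy model of the logic as described above; the class, threshold and scores are all invented for illustration. Re-crawling updates what Google knows about a site, but the site-wide flag only changes when the filter itself is re-run.

```python
# Toy model of the refresh cycle: crawling updates a site's stored score,
# but the site-wide demotion flag only changes when the filter is re-run.
AD_HEAVY_THRESHOLD = 0.5  # hypothetical fraction of the first screen in ads

class SiteIndex:
    def __init__(self):
        self.crawl_scores = {}  # site -> latest crawled ad-heaviness score
        self.demoted = set()    # sites currently carrying the demotion

    def record_crawl(self, site, ad_ratio):
        # A re-crawl records the new layout but does NOT lift the penalty.
        self.crawl_scores[site] = ad_ratio

    def run_top_heavy_refresh(self):
        # Only at refresh time are flags added and removed in bulk.
        for site, score in self.crawl_scores.items():
            if score > AD_HEAVY_THRESHOLD:
                self.demoted.add(site)
            else:
                self.demoted.discard(site)

index = SiteIndex()
index.record_crawl("example.com", 0.7)
index.run_top_heavy_refresh()
index.record_crawl("example.com", 0.2)  # ads reduced after being hit...
print("example.com" in index.demoted)   # True: still demoted
index.run_top_heavy_refresh()
print("example.com" in index.demoted)   # False: lifted at the next refresh
```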
Our Why Google Panda Is More A Ranking Factor Than Algorithm Update article explains the situation with Panda, and how it took time between when publishers made changes to remove “thin” content to when they were restored to Google’s good graces. That process is just as applicable to today’s change, even though Panda itself now has much less flux.

Meanwhile, Google AdSense Pushes Ads

Ironically, on the same day that Google’s web search team announced this change, I received this message from Google’s AdSense team encouraging me to put more ads on my site:
This was in relation to my personal blog, Daggle. The image in the email suggests that Google thinks content pretty much should be surrounded by ads.
Of course, if you watch the video that Google refers me (and others) to in the email, it promotes careful placement, urges that user experience be considered and, at one point, shows a page top-heavy with ads as an example of what not to do.
Still, it's not hard to find sites using Google's own AdSense ads that are definitely pushing content as far down on their pages as they can or trying to hide it. Those pages, AdSense or not, are subject to the new rules, Cutts said.

Pages Ad-Heavy, But Not Top-Heavy With Ads, May Escape

As a searcher, I’m happy with the change. But it might not be perfect. For example, here’s something I tweeted about last year:
Yes, that's my finger being used as an arrow. I was annoyed that the actual download link I was after was surrounded by AdSense-powered ads telling me to download other stuff.
This particular site was heavily used by kids who might easily click on an ad by mistake. That's potentially bad ROI for those advertisers. Heck, as a net-savvy adult, I found it a challenge.
But the problem here wasn't that the content was pushed "below the fold" by ads. It was that the ratio of ads to the content (a single link) was so high, plus the misleading nature of the ads around the content.

Are Google’s Own Search Results Top Heavy?

Another issue is that ads on Google’s own search results pages push the “content” — the unpaid editorial listings — down toward the bottom of the page. For example, here’s exactly what’s visible on my MacBook Pro’s 1680×1050 screen:
(Side note, that yellow color around the ads in the screenshot? It’s much darker in the screenshot than what I see with my eyes. In reality, the color is so washed-out that it might as well be invisible. That’s something some have felt has been deliberately engineered by Google to make ads less noticeable as ads).
The blue box surrounds the content, the search listings that lead you to actual merchants selling trash cans, in this example. Some may argue that the Google shopping results box is further pushing down the “real content” of listings that lead out of Google. But the shopping results themselves do lead you to external merchants, so I consider them to be content.
The example above is pretty extreme, showing the maximum of three ads that Google will ever show above its search results (with a key exception, below). Even then, there’s content visible, with it making up around half the page or more, if you include the Related Searches area as content.
My laptop’s screen resolution is pretty high, of course. Others would see less (Google’s Browser Size tool doesn’t work to measure its own search results pages). But you can expect Google will take “do as I say, not as I do” criticism on this issue.
Indeed, I shared this story initially with the main details, then started working on this section. By the time that was done, I could see this type of criticism already happening, both in the comments here and over on my Google+ post and Facebook post about the change.
Here’s a screenshot that Daniel Weadley shared in my Google+ post about what he sees on his netbook:
In this example, Google’s doing a rare display of four ads. That’s because it’s showing the maximum of three regular ads it will show with a special Comparison Ads unit on top of those. And that will just add fuel to criticisms that if Google is taking aim at pages top-heavy with ads, it might need to also look closer to home.
NOTE: About three hours after I wrote this, Google clearly saw the criticisms about ads on its own search results pages and sent this statement:
This is a site-based algorithm that looks at all the pages across an entire site in aggregate. Although it’s possible to find a few searches on Google that trigger many ads, it’s vastly more common to have no ads or few ads on a page.
Again, this algorithm change is designed to demote sites that make it difficult for a user to get to the content and offer a bad user experience.
Having an ad above-the-fold doesn’t imply that you’re affected by this change. It’s that excessive behavior that we’re working to avoid for our users.

Algorithms? Signals?

Does all this talk about ranking signals and algorithms have you confused? Our video below explains briefly how a search engine's algorithm works to rank web pages:
Also see our Periodic Table Of SEO Ranking Factors, which explains some of the other ranking signals that Google uses in its algorithm:

Name The Update & More Info

Today’s change is a new, significant ranking factor for our table, one we’ll add in a future update, probably as Va, for “Violation, Ad-Heavy site.”
Often when Google rolls out new algorithms, it gives them names. Last year’s Panda Update was a classic example of this. But Google’s not given one to this update (I did ask). It’s just being called the “page layout algorithm.”
Boring. Unhelpful for easy reference. If you’d like to brainstorm a name, visit our posts on Google+ and on Facebook, where we’re asking for ideas.
Now for the self-interested closing. You can bet this will be a big topic of discussion at our upcoming SMX West search marketing conference at the end of next month, especially on the Ask The Search Engines panel. So check out our full agenda and consider attending.
Postscript: Some have been asking in the comments about how Google knows what an ad is. I asked, and here’s what Google said:
We have a variety of signals that algorithmically determine what type of ad or content appears above the fold, but no further details to share. It is completely algorithmic in its detection–we don’t use any sort of hard-coded list of ad providers.

Google: EMD Update

The EMD Update — for "Exact Match Domain" — is a filter Google launched in September 2012 to prevent poor quality sites from ranking well simply because they had words that match search terms in their domain names. When a fresh EMD Update happens, sites that have improved their content may regain good rankings. New sites with poor content — or those previously missed by EMD — may get caught. In addition, "false positives" may get released. Our latest news about the EMD Update is below.

Deconstructing The Google EMD Update


Well, it's official: no more free lunch for EMDs, now that the Google EMD Update has launched. The tactic worked well for a long time. A whole industry of exact match domain tools and brokers sprang up, and already-high premium prices for good names went through the roof whenever the name was a real "money keyword."
For quite a while, it was possible for an EMD to rank in the top three with literally no backlinks, compared to non-EMDs, often after only a few days, in practically every niche you can dream of.
Exact Match Domains (AKA Keyword Domains) are, in general, domain names that exactly match the keyword a website wants to compete for. For example, if a website wants to rank for the term [minor weather report], an exact match domain would be [MinorWeatherReport.com].
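The matching itself is simple enough to sketch: strip spaces and punctuation from the keyword and compare it to the domain's first label. This is purely illustrative; Google's actual detection logic is not public.

```python
# Illustrative EMD check: normalize the keyword and compare it to the
# domain label. Google's real matching logic is not public.
import re

def is_exact_match_domain(domain, keyword):
    label = domain.lower().split(".")[0]                     # "minorweatherreport"
    normalized = re.sub(r"[^a-z0-9]", "", keyword.lower())   # "minorweatherreport"
    return label == normalized

print(is_exact_match_domain("MinorWeatherReport.com", "minor weather report"))  # True
print(is_exact_match_domain("weather.com", "minor weather report"))             # False
```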
For years, it was a free lunch for those in-the-know, and now this loophole is closed, just like other loopholes have been closed years before. But instead of complaining about a free lunch being taken away, SEOs should be thankful for having had it. This cheap way to get traffic, thanks to Google being pretty slow to close the loophole, is now gone.

What The EMD Bonus Included

Two of the first SEO techniques that are taught are on-page keyword optimization and link building. Interestingly, those two things are very closely related to typical webspam techniques. Perhaps it is time to focus SEO efforts on abiding by some standards, and to make websites for people, not search engines.
EMDs are more than likely being targeted for keyword stuffing and other simple webspam techniques, and for not being quality sites. The keyword phrase is right there in the exact match domain name, which makes the technique seem easy breezy: start with the keyword from the exact match domain and repeat it throughout the page.
The same is true for links. For a long time, the huge EMD bonus was that the website's name is the money keyword one wants to rank for. It seemed OK to really overdo linking for the money keyword, as we thought Google couldn't tell whether that keyword was a brand name or not. Having lots of links and mentions in the body for your brand name (i.e., CEMPER) makes a lot of sense.
However, if you had a website like www.BuyCheapSomething.com, it seemed that Google took Buy Cheap Something to be a brand name, too, and ranked you fast for it.

Google Knew It, We Knew It

Don’t think for a minute that this EMD algorithm just came out of the blue. The patent for the EMD algorithm, Systems and methods for detecting commercial queries, was filed way back in September of 2003, and finally approved a year ago on October 25, 2011. Matt Cutts even talked about how they were going to change the EMD game in a video on March 7, 2011. Is Google being transparent by warning us that they are giving too much weight to EMDs?
We went through the weekly winners and losers list from SEOLytics to look for EMDs that had dropped sharply from rather stable rankings from the week before the update. We also did the opposite and found EMDs that actually gained in rankings after the update.

An Actual Example Of An EMD Loser

So, let’s just take a quick look at a few things a top loser did, and see if we can find if the site violated any of the guidelines.
We randomly picked one of the analyzed losers and found this website:
www.businessliabilityinsurance.org
At first glance, this site looks professional and features some reputable insurance company logos. Of course, we see the keyword phrases [business liability insurance] and [liability insurance] throughout the homepage. Perhaps it isn't overdone.
However, the FAQs page returns 38 matches for [business liability insurance] and 47 matches for [liability insurance]. Wow! The Guides page returns a total of 100 matches for [business liability insurance].
Are SEOs creating these types of text areas for users or search engines?
[Screenshot: Google EMD loser 01]
If you click through those articles, you see that they are just spun content throughout the whole site, targeting every possible location with the keyword [business liability insurance]. It's not surprising that this site tanked, if you just look at how far the on-page factors overdo the commercial keyword.
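Counts like the ones above are easy to reproduce yourself. Here is a rough sketch using requests and BeautifulSoup; the FAQ URL path in the comment is a guess rather than the site's confirmed structure.

```python
# Count how often a phrase appears in a page's visible text.
import requests
from bs4 import BeautifulSoup

def phrase_count(url, phrase):
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(" ").lower()
    return text.count(phrase.lower())

# Hypothetical usage against the page discussed above:
# phrase_count("http://www.businessliabilityinsurance.org/faq",
#              "business liability insurance")
```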

What About The Link Profile?

Looking at the domain’s Power*Trust, we see a poor value of 6. More on that in a bit.
The anchor texts they used are striking. There is not even an attempt to mention the URL in substantial amounts (only 3%, to be precise). The rest of the anchors are all money keywords in various combinations.
[Chart: Link Profile Pie Chart]


The Trust of those links is less than mediocre. There is only one business.com link at 5, with the remainder mostly below 3.
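The pie chart boils down to bucketing anchor texts into brand/URL mentions versus money keywords. Here is a sketch with invented sample data; a real audit would run over the site's full backlink export.

```python
# Bucket anchor texts into brand/URL vs. money-keyword mentions and report
# percentages. The sample anchors below are invented for illustration.
from collections import Counter

def anchor_distribution(anchors, brand_terms):
    buckets = Counter()
    for anchor in anchors:
        a = anchor.lower()
        if any(term in a for term in brand_terms):
            buckets["brand/URL"] += 1
        else:
            buckets["money keyword"] += 1
    total = sum(buckets.values())
    return {k: round(100 * v / total, 1) for k, v in buckets.items()}

links = ["business liability insurance", "cheap liability insurance",
         "businessliabilityinsurance.org", "liability insurance quotes"]
print(anchor_distribution(links, ["businessliabilityinsurance"]))
# {'money keyword': 75.0, 'brand/URL': 25.0}
```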

Other Trust Factors

An interesting "Unable to connect" error came up when trying to access the Get Quotes page, which sends visitors to the netquotes.com site; this could also be a factor. I would classify it as a mix between a sneaky redirect and cloaking. Perhaps netquotes.com is not a trusted site; while we were doing our research, it did not appear to load.
[Screenshot: Google EMD loser 02]
Lastly, I really had to laugh at this deceptive maneuver shown below. As users, we value testimonials on websites.

[Screenshot: Google EMD loser 03]
When we see a photo of the person next to the testimonial, the value and perception that this company is trustworthy is even greater.
However, I know these people and they are not who they say they are. LOL! They are stock photos. This is not cool in my book!
Even the testimonials are keyword optimized.

What Is Trust & Why Is It Important?

Following the recent Penguin 3 update, I went back and reread the Another step to reward high-quality sites blog post from Tuesday, April 24, 2012. Again, I am highlighting this because I believe it is important to read what Google tells us it is doing, and not read a bunch of conspiracy and hate comments.
The fact is that Google wants highly-trusted sites in its rankings. And any attempt at manipulating the game will be dealt with. If you don’t believe me that Google is taking webspam seriously, here is a valuable quote from a paper that was written in 2005.
“Please note that according to our definition, all types of actions intended to boost ranking (either relevance, or importance, or both), without improving the true value of a page, are considered spamming.” (Web Spam Taxonomy by Zoltán Gyöngyi & Hector Garcia-Molina)
It is probably worth noting that Zoltán Gyöngyi is a research scientist at Google who went to Stanford and studied with Professor Hector Garcia-Molina, the principal investigator for the Stanford Digital Library Project, from which the Google search engine emerged.
We know from our research that Google is by now definitely targeting webspam and low-quality sites with its algorithms. So, what does it take to develop a high-quality site?
Following are some excerpts from our updated EMD Case Study; you can get the full report here.

Domains Compared By Power * Trust™

Since Google is always mentioning the overall quality of a site, I could think of no better metric than the CEMPER Power*Trust metric. (Admittedly, I am slightly biased here.)
Power means the strength based on the number and power of links (better than PageRank™). Trust indicates the implied Trust of the page in Google, according to a system similar to the Trust Rank patent. By combining both metrics, you can easily rate the overall quality of a domain.
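As a rough illustration only (the actual Power*Trust formula and scales are proprietary to LinkResearchTools and not published here), the idea is to fold the two scores into a single quality figure:

```python
# Illustrative only: combine a link-strength score and a trust score into
# one quality figure. The real Power*Trust formula/scales are proprietary.
def power_trust(power, trust):
    return power * trust

print(power_trust(3, 2))  # 6: the kind of low value the example domain shows
```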
[Chart: CEMPER Power*Trust]
The above chart is very clear: the average winner has double the Power*Trust of the losers. This results from a large number of highly-trusted and very strong backlinks. While most of the losers' backlinks are potentially low quality, it is pretty clear that the winners have far more links with high Power*Trust.
Our example, businessliabilityinsurance.org, has an even lower Power*Trust value (6) for its domain than the average loser (8).

Domains Compared By Facebook Shares

Social media activity is (and should be) an ever increasing factor in rating the quality of a website. It's a pretty clear signal of whether the audience likes the content and wants to share it with other people.
In general, a very popular website or brand automatically grows in social networks as soon as it reaches higher rankings in Google.
[Chart: Facebook Shares]
This chart shows the huge gap: an average winner's site has up to 180 shares, whereas the losers have only 17. That might result from the growing importance of social networks, as well as the fact that high quality websites in many verticals simply tend to attract more shares than low quality websites.

Domains Compared By Domain Popularity Growth

Domain Popularity Growth (i.e., the growth of new linking root domains per month) is one of the most important factors in a natural link profile. A healthy website automatically grows over time, because more and more webmasters tend to place links to a higher quality site.
[Chart: Average Domain Popularity Growth, EMD Winners vs. Losers]
What we see here is that the winners have consistently more than double the linking-domain growth per month of the losers. While the losers built an average of 1239 new linking domains over the last 2 years (51 per month), the winners built an average of 3193 new linking domains over the same period (133 per month).
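For reference, those per-month figures are just the two-year totals averaged over 24 months:

```python
# The per-month averages cited above, from the two-year (24-month) totals:
losers_total, winners_total = 1239, 3193
months = 24
print(losers_total // months, winners_total // months)  # -> 51 133
```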

Is Google Really Evil?

Many SEO professionals and webmasters consider Google to be a giant, ruthless regime. Many say it doesn't follow its own "Don't be evil" mantra when it comes to SEOs; Google is the Internet police, etc., etc. Like most regimes, Google constantly tells us that it is trying its best to be transparent. That is what the people want to hear. Google is a transparent corporation that aims to serve its people.
Surprisingly, most governments or corporations are transparent. You just need to read the correct information or listen in a different way. Large, hidden secrets are not implemented without first being written down.
Therefore, it should not be a surprise that the ranking-demotion algorithms Google develops are sort of spelled out. Actually, they really are. Mainly, you just need to not violate the quality guidelines, and to build higher-quality websites.
"Organize the world's information and make it universally accessible and useful." That is Google's mission.
Stop for a moment, and dream of a day when all search results on the first page are relevant to what you desire. Is it too much to ask for people to read and abide by some laws?

Google EMD Update – A Shock?

So, it comes as a bit of a shock when I hear outrage and panic about Google’s recent Exact-Match Domain algorithm. SEOs are angry, scared, and confused about how this update is harming their sites. Following the Sept. 29th EMD update, it was amusing to read all the caps and cursing at Google in the blogs.
What did people expect? Learn a few basic SEO techniques, abuse the guidelines, and expect not to get caught?

Key Takeaways

  • Review Google's Quality Guidelines and make sure you are not in violation of any obvious tactics. For EMDs, keyword stuffing appears to be very common. Similar rules apply to normal brand domains. Many who got away with webspam techniques in the past lost their free EMD lunch bonus.
  • The overall domain strength is still a very important factor, which means the winners simply got higher quality links, and more of them. This makes sense. The typical quick-EMD strategy didn't need high quality links to rank very high.
  • For all you social fans, it does appear that shares could provide a boost in rankings. Further research will need to be done to confirm this. However, I believe that Google takes the social signals and reads them as evidence of a high-quality site. See Google's bullet point, "Are the topics driven by genuine interest of readers of the site?"
  • The domain popularity growth of the winners is so much higher and reflects the overall link and site quality.
  • It’s not enough to just build links. More links and better quality lead to a better site, even if it’s an EMD. At least, after the long overdue EMD update.
Finally, Matt Cutts is really sticking to his word to crack down on webspam in 2012. He closed a lot of loopholes that, to my surprise, were still working during the early part of 2012. I wonder if his secret 30-day challenge is to crack down on one old tactic per month.
Personally, I think there’s still a lot more to fix. Based on data we get from our Link Detox technology, which is part of our Link Research Tools software suite (independently reviewed here on SEL), there are still many sites with really bad link profiles and outdated SEO practices that are very easy to detect. I suppose that is also why we should expect these updates to continue on a regular basis.
Again, people, you should thank Matt and Google for letting you get away with EMDs for so long.

Google: Pirate Update


Google’s Pirate Update is a filter introduced in August 2012 designed to prevent sites with many copyright infringement reports, as filed through Google’s DMCA system, from ranking well in Google’s listings. The filter is periodically updated. When this happens, sites previously impacted may escape, if they’ve made the right improvements. The filter may also catch new sites that escaped being caught before, plus it may release “false positives” that were caught.

Google: Payday Update

Launched on June 11, 2013, the "Payday Update" was a new algorithm targeted at cleaning up search results for traditionally spammy queries, such as [payday loan], pornographic terms and other heavily spammed queries.

Google is now rolling out the 3rd version of the PayDay Loan algorithm.
Last night we reported that Google is going to be launching PayDay Loan 3.0 and Google’s Matt Cutts posted on Twitter moments ago that it is now rolling out.
Cutts, Google's head of search quality, said, "it's rolling out now!"
To catch you up, PayDay Loan 2.0 launched a few weeks ago around May 17th and 18th. That specifically targeted very spammy sites in the porn, pills and casino markets.
Payday 3.0 specifically targets spammy queries, versus spammy sites. What exactly that means is not 100% clear. But the types of queries this targets includes terms like [payday loans], [casinos], [viagra] and other forms of highly spammy queries.

Google: Pigeon Update

What Is The Google Pigeon Update?

Launched on July 24, 2014 for U.S. English results, the “Pigeon Update” is a new algorithm to provide more useful, relevant and accurate local search results that are tied more closely to traditional web search ranking signals. Google stated that this new algorithm improves their distance and location ranking parameters.

Google: Penguin Update

What Is The Google Penguin Update?

Google launched the Penguin Update in April 2012 to better catch sites deemed to be spamming its search results, in particular those doing so by buying links or obtaining them through link networks designed primarily to boost Google rankings. When a new Penguin Update is released, sites that have taken action to remove bad links (such as through the Google disavow links tool) or to remove spam may regain rankings. New sites not previously caught might get trapped by Penguin. "False positives," sites that were caught by mistake, may escape.

Google: Panda Update

What Is The Google Panda Update?

Google's Panda Update is a search filter introduced in February 2011 meant to stop sites with poor quality content from working their way into Google's top search results. Panda is updated from time to time. When this happens, sites previously hit may escape, if they've made the right changes. Panda may also catch sites that escaped before. A refresh also means "false positives" might get released.

Google: Hummingbird

What Is Google Hummingbird?

"Hummingbird" is the name of the new search platform that Google has used since September 2013; the name comes from its being "precise and fast," and it is designed to better focus on the meaning behind the words. Read our Google Hummingbird FAQ here.
Hummingbird pays more attention to each word in a query, ensuring that the whole query — the whole sentence or conversation or meaning — is taken into account, rather than particular words. The goal is that pages matching the meaning do better, rather than pages matching just a few words.
Google Hummingbird is designed to apply this meaning technology to billions of pages from across the web, in addition to Knowledge Graph facts, which may bring back better results.

Google: Algorithm Updates
Google has a long history of famous algorithm updates, search index changes and refreshes.
Below are links to some of the most important resources for search marketers:

Local SEO In 2015 – Look At The Big Picture

When you're doing Local SEO, columnist Greg Gifford reminds us, you must take a step back and look at the big picture if you want to be successful.

[Image: You can't do Local SEO with a microscope]
Local SEO is getting more and more complicated, and as we roll into the new year, I want to share my biggest, most important tip:

STOP Using Your Microscope In Local Search!

What do I mean by that? Put simply: A microscope is a great research tool, but it's a flat-out awful marketing tool.
It seems like we're hearing more and more of the same kind of question over the past few months: "I've done X on my site, why isn't it working?" or "I read your post about Y, does it make sense for me to drop everything and go do it?"
Local SEO is not, nor has it ever been, based upon a single tactic. Recently, it seems like many business owners will read a post or watch a video, realize that they're not utilizing the tactic mentioned, and immediately drop everything and concentrate on the shiny new object.
Local SEO is not just citations. Local SEO is not just Google My Business (aka Places) optimization. Local SEO is not just inserting your city and state in title tags and H1 headings.
When you use your microscope to focus on a single tactic, your field of view is so zoomed in that you're blinded to the other factors that are just as important. It's a new year, so try a new process — toss that microscope in the trash, take a step back, and look at the big picture.

Recent Updates Shed Light On The Big Picture

You've got to spread out all the pieces and figure out how they fit together if you want to assemble the puzzle. Let's take a look at some important recent updates in the Local Search arena that will help you get a better view of the big picture in 2015:
• Moz's Local Search Ranking Factors study is a great place to start. David Mihm's study shows a simple pie chart that arranges the signal groups by weight in the algorithm. Check out the factors and compare them to your current efforts. If your workflow and task list aren't a fairly close match to the weighted areas of the pie chart, you might want to shift your priorities a bit.
• The most important Google update this year for local businesses was Pigeon. Google drastically changed the way that local results are calculated, and business owners and Local SEOs all across the country collectively freaked out.
• Last week, Pigeon rolled out in Canada, Australia, and the UK, and the collective freak-out went international. If you're still holding out hope that Pigeon was just a temporary test, you need to let it go. SEOs in all three countries are reporting the same results that we've been seeing for the last 5 months. Pigeon is here to stay.
• Google threw us for another loop a few weeks ago when it dropped the Local Carousel for many types of businesses. Instead of the Carousel, Google is now displaying an expanded 3-pack.
• The new expanded 3-pack still shows a thumbnail photo like the Carousel, but the listing only shows the business name, review stars, and a short description. Like the Carousel, if you click on one of the listings, you're taken to another SERP for that specific business… instead of the business' website or Google My Business page.
• Another critical recent update: Google's updated Google My Business guidelines. Google is cracking down on category selections, removing the option to add a descriptor to business names, and finally flat-out disallowing virtual offices.

So What Does It All Mean? What's The Big Picture?

Google is putting significant effort into local search. Whether we like Pigeon or not, it's a part of the game now, and we all have to adapt.
A few years ago, you could rock some citations and do nothing else, and you'd still rank at the top of the map pack. Now, there might not even be a map pack for your vertical.
With Pigeon and the new expanded 3-pack, it's clear that mobile behavior is having a huge impact on how Google is approaching local search.
As the scales tip and users switch to mobile devices, local results will be increasingly unique because Google knows exactly where each user is located. You've got to have the right (current) relevancy signals both on your site and off your site if you want to compete in 2015.
You've got to stop focusing on one or two tactics and instead work on everything.
• Make your content amazing and informative, but optimize it with local signals
• If you've got location pages, make sure they're unique and well-written
• Create local content for your blog
• Optimize the heck out of your Google My Business listing
• Get some good links to your site
• Clean up your citations and go grab any that you're lacking
• Get more positive reviews on your GMB listing, and on other third party sites
• Use social media to interact with your customers
In a nutshell, you should click the "local" button up in the menu and read everything. Twice.
In 2015, you've got to be a jack of all trades if you want to succeed in Local SEO. You can't just bang out a bunch of citations and win any more. You can't just put your city and state in a few title tags and H1 headings. You can't just spam a few Google Places categories.
You have to look at the big picture and do it all if you want to succeed.