
Friday, 15 May 2015

How To Remove Backlinks From Your Website



Backlinks are an essential part of any website: they increase traffic as well as visibility in search engines. It's good to have quality backlinks from sites with high PageRank or domain authority, and better still if those links come from sites in your own niche. Removing backlinks, however, is not an easy task. There are three approaches, which we'll discuss below:

1. Manual Method

2. Email to webmasters

3. Disavow Method

Before we start, a word about Google Penguin. This is the Google algorithm that penalises websites with bad or spammy backlinks. If your website has been affected by the Penguin algorithm, don't worry: you can remove your backlinks with the following steps.




1. Manual Method: In this method, we first collect all the backlinks from Webmaster Tools or from external tools such as Ahrefs, Backlink Watch, and so on.

Maintain an Excel sheet of those backlink websites and find out which links or domains are still working and which are not; the sketch below automates part of that check.
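If your list runs to hundreds of links, checking each one by hand gets tedious. Here is a minimal Python sketch of the idea, not a polished tool: it assumes a hypothetical backlinks.csv export with one URL per line and uses the third-party requests library.

```python
# Sketch: flag which exported backlink URLs still resolve.
# "backlinks.csv" (one URL per line) is a hypothetical export file.
import csv
import requests

def check_backlinks(path="backlinks.csv"):
    results = {}
    with open(path, newline="") as f:
        for row in csv.reader(f):
            url = row[0].strip() if row else ""
            if not url:
                continue
            try:
                # HEAD keeps the check light; some servers only answer GET.
                resp = requests.head(url, allow_redirects=True, timeout=10)
                results[url] = resp.status_code
            except requests.RequestException:
                results[url] = None  # dead or unreachable
    return results

if __name__ == "__main__":
    for url, status in check_backlinks().items():
        print(f"{status or 'DEAD'}\t{url}")
```

Mark the dead domains in your Excel sheet; those can go straight into the disavow list in step 3.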


Contact the people who did the off-page SEO for your website, get the credentials from them, log in to each site, and manually delete your ad or link. If a website doesn't offer a login, see step 2. This is the ideal and simplest method for removing backlinks from particular websites.


2. Email to Webmasters: This is the second, and more challenging, method of backlink removal.


After collecting the list of websites and maintaining the Excel sheet, the only thing left to do is search for the contact details of each site's webmaster and email them, requesting removal of your link from their website.
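Hunting down contact details can also be partly automated. The sketch below is purely illustrative (the paths it tries are common guesses, not a standard): it scans a site's homepage and a few likely contact pages for email addresses.

```python
# Sketch: scrape likely pages of a domain for email addresses.
import re
import requests

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def find_contact_emails(domain):
    emails = set()
    for path in ("", "contact", "contact-us", "about"):
        try:
            html = requests.get(f"http://{domain}/{path}", timeout=10).text
        except requests.RequestException:
            continue  # page missing or site down; try the next path
        emails.update(EMAIL_RE.findall(html))
    return emails

print(find_contact_emails("example.com"))
```

If nothing turns up, a WHOIS lookup on the domain will often list a registrant email address.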

A few examples of how the template should look:

Template 1
From: yourname@mysite.com
Subject line:  Please remove a link.
Hi,
I’m working on cleaning my website, and I need your help in removing some links from your site. Your site is probably perfectly legitimate, but I’m just trying to eliminate as many links as possible.

Here’s the page on your site with the link:  www.example.com/randompage

Here’s the page on my site that you’ve linked to:  www.mysite.com/randompage
The links need to be actually removed, rather than just disavowed. Even if they are “nofollow,” I’d still like them removed.
Once you’ve removed the link, please send me a quick note so I can create a record of it.
Thanks in advance! I hope to hear from you soon.
Kind Regards,
Yourname

Template 2

Hello,

My name is [Insert name] and I am responsible for the website [Insert URL].
The reason why I am contacting you is the content of the following link that points towards [Name of company]:
[Insert Link]
I would like to thank you very much for writing about our company but at the same time ask you to add a rel=”nofollow” attribute or eliminate the link.
I would also like to clarify that you are not the only person that we contact and we just want to make sure that our website’s reliability will not be decreased due to the new Penguin updates.
Please get in touch as soon as possible to confirm that you received the email and removed the link. The alternative solution for us would be to send a Disavow Link report to Google which is something that would also decrease your rankings.
Thank you for the understanding.
[Insert Signature]


Send the webmaster your first mail and wait a month for a reply. If you don't get any reply, send the second mail, which serves as something of a warning.
If you still haven't received any reply from the webmasters, refer to step 3.

Note: before reporting any link, go through the website completely first. The goal is to identify only the websites that are genuinely spam.


Some examples of websites that can be considered spam:

1. Websites that open in a different language and are not translatable (e.g. Chinese-language websites)

2. Websites that contain nothing except a search box

3. Websites that automatically open another website, so that the URL you entered and the URL displayed are different (see the redirect-checking sketch below)
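For example 3, you don't even have to load the site in a browser. Here is a small sketch of the check, again using the requests library; note it only catches HTTP redirects, not JavaScript-based ones.

```python
# Sketch: does this URL silently hand you off to a different address?
import requests

def redirects_elsewhere(url):
    try:
        resp = requests.get(url, allow_redirects=True, timeout=10)
    except requests.RequestException:
        return None  # unreachable; treat as a dead link instead
    # Compare the URL we asked for with the one we ended up on.
    return resp.url.rstrip("/") != url.rstrip("/")

print(redirects_elsewhere("http://example.com"))
```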



3. Disavow Method: This is the last and final method of backlink removal, and it should be tried only after the methods above. Here we report domains or links that are dead, that are spamming, or whose owners are unwilling to remove our link despite repeated requests.


So, how does the disavow method work?

domain:ABC.com
domain:bcd.eu
domain:123.net

This is the format in which domains are reported to the Disavow tool in Webmaster Tools.

Collect the full list of domains you are going to report and save it as a .txt file encoded in UTF-8.
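If you already have your spam domains in a list, generating the file is trivial. A minimal sketch (the domain list is a placeholder for your own):

```python
# Sketch: write a disavow file in the format shown above, as UTF-8.
spam_domains = ["abc.com", "bcd.eu", "123.net"]  # your collected list

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Domains that ignored repeated removal requests\n")
    for domain in spam_domains:
        f.write(f"domain:{domain}\n")
```

Lines starting with # are treated as comments by the Disavow tool, so you can annotate why each batch was reported.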


So be careful about removing backlinks, and don't go to the Disavow tool directly without following the first two steps.


Tuesday, 24 March 2015

Google To Launch New Doorway Page Penalty Algorithm

Google will take algorithmic action on more doorway pages in the near future. The new algorithm adjustment will impact pages created to increase a site's search footprint.



Google announced they are releasing a new “ranking adjustment” to their doorway page classifier to better handle doorway pages in the search results.
In short, Google does not want to rank doorway pages in its search results. The purpose behind many of these doorway pages is to maximize a site's search footprint by creating pages either externally on the web or internally on the existing website, with the goal of ranking multiple pages in the search results, all leading to the same destination.
Google’s Brian White said:
Over time, we’ve seen sites try to maximize their “search footprint” without adding clear, unique value. These doorway campaigns manifest themselves as pages on a site, as a number of domains, or a combination thereof. To improve the quality of search results for our users, we’ll soon launch a ranking adjustment to better address these types of pages. Sites with large and well-established doorway campaigns might see a broad impact from this change.
Google will be launching this new ranking adjustment shortly, and those who have created doorway pages may see its effects very soon.
How do you know if your web pages are classified as “doorway pages”? Google suggests asking yourself these questions:
  • Is the purpose to optimize for search engines and funnel visitors into the actual usable or relevant portion of your site, or are they an integral part of your site’s user experience?
  • Are the pages intended to rank on generic terms yet the content presented on the page is very specific?
  • Do the pages duplicate useful aggregations of items (locations, products, etc.) that already exist on the site for the purpose of capturing more search traffic?
  • Are these pages made solely for drawing affiliate traffic and sending users along without creating unique value in content or functionality?
  • Do these pages exist as an “island?” Are they difficult or impossible to navigate to from other parts of your site? Are links to such pages from other pages within the site or network of sites created just for search engines?

Wednesday, 25 February 2015

Google: Top Heavy Update


Top Heavy was launched in January 2012 by Google as a means to prevent sites that were “top heavy” with ads from ranking well in its listings. Top Heavy is periodically updated. When a fresh Top Heavy Update happens, sites that have removed excessive ads may regain lost rankings. New sites deemed too “top heavy” may get caught.

Google Updates Its Page Layout Algorithm To Go After Sites “Top Heavy” With Ads



Google’s head of search spam, Matt Cutts, announced that Google has released a refresh of its Page Layout Algorithm. The filter, also known as the Top Heavy algorithm, downgrades the ranking of a web page with too many ads at the top or if the ads are deemed too distracting for users.
Cutts said the algorithm was refreshed last Thursday, February 6.
This would be the third confirmed update to the Top Heavy algorithm, with the full release schedule as follows:
  • Top Heavy 1: Jan. 19, 2012 (impacted less than 1% of English searches)
  • Top Heavy 2: Oct. 9, 2012 (impacted 0.7% of English searches)
  • Top Heavy 3: Feb. 6, 2014 (impact not stated)

Background On & Recovering From Top Heavy

What is the page layout algorithm? As we quoted from Google originally:
We’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away.
So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.
Such sites may not rank as highly going forward.
See also our original article, from when Top Heavy was first released, for advice on how a site that's been caught may have to wait until the next release for any changes it has made to restore its rankings.
We have not seen many complaints within the SEO community around February 6th or 7th about any update like this, which suggests it impacted fewer sites than when Google updates other filters like the Panda or Penguin algorithms.

The Top Heavy Update: Pages With Too Many Ads Above The Fold Now Penalized By Google’s “Page Layout” Algorithm


Do you shove lots of ads at the top of your web pages? Think again. Tired of doing a Google search and landing on these types of pages? Rejoice. Google has announced that it will penalize sites with pages that are top-heavy with ads.

Top Heavy With Ads? Look Out!

The change — called the “page layout algorithm” — takes direct aim at any site with pages where content is buried under tons of ads.
From Google’s post on its Inside Search blog today:
We’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away.
So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.
Such sites may not rank as highly going forward.
Google also posted the same information to its Google Webmaster Central blog.
Sites using pop-ups, pop-unders or overlay ads are not impacted by this. It only applies to static ads in fixed positions on pages themselves, Google told me.

How Much Is Too Much?

How can you tell if you’ve got too many ads above-the-fold? When I talked with the head of Google’s web spam team, Matt Cutts, he said that Google wasn’t going to provide any type of official tools similar to how it provides tools to tell if your site is too slow (site speed is another ranking signal).
Instead, Cutts told me that Google is encouraging people to make use of its Google Browser Size tool or similar tools to understand how much of a page’s content (as opposed to ads) is visible at first glance to visitors under various screen resolutions.
But how far down the page is too far? That’s left to the publisher to decide for themselves. However, the blog post stresses the change should only hit pages with an abnormally large number of ads above-the-fold, compared to the web as a whole:
We understand that placing ads above-the-fold is quite common for many websites; these ads often perform well and help publishers monetize online content.
This algorithmic change does not affect sites who place ads above-the-fold to a normal degree, but affects sites that go much further to load the top of the page with ads to an excessive degree or that make it hard to find the actual original content on the page.
This new algorithmic improvement tends to impact sites where there is only a small amount of visible content above-the-fold or relevant content is persistently pushed down by large blocks of ads.
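Since Google offers no official "too many ads" checker, the closest you can get is to estimate the numbers yourself. The sketch below is purely illustrative, not Google's method: given rough pixel heights for the blocks at the top of a page (hypothetical values), it estimates what fraction of the first screenful is ads.

```python
# Illustrative sketch: fraction of the above-the-fold area taken by ads.
FOLD_PX = 600  # visible height for a common screen resolution

# (kind, height_px) for page blocks, from the top of the page downward.
blocks = [("ad", 250), ("nav", 80), ("ad", 200), ("content", 900)]

def above_fold_ad_ratio(blocks, fold=FOLD_PX):
    used, ad_px = 0, 0
    for kind, height in blocks:
        visible = min(height, fold - used)
        if visible <= 0:
            break  # everything below this point is under the fold
        if kind == "ad":
            ad_px += visible
        used += visible
    return ad_px / fold

print(f"{above_fold_ad_ratio(blocks):.0%} of the fold is ads")
```

With the sample numbers above, 75% of the first screenful is ads, which is exactly the kind of layout this update targets.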

Impacts Less Than 1% Of Searches

Clearly, you’re in trouble if you have little-to-no content showing above the fold for commonly-used screen resolutions. You’ll know you’re in trouble shortly, because the change is now going into effect. If you suddenly see a drop in traffic today, and you’re heavy on the ads, chances are you’ve been hit by the new algorithm.
For those ready to panic, Cutts told me the change will impact less than 1% of Google’s searches globally, which today’s post also stresses.

Fixed Your Ads? Penalty Doesn’t Immediately Lift

What happens if you’re hit? Make changes, then wait a few weeks.
Similar to how last year’s Panda Update works, Google is examining sites it finds and effectively tagging them as being too ad-heavy or not. If you’re tagged that way, you get a ranking decrease attached to your entire site (not just particular pages) as part of today’s launch.
If you reduce ads above-the-fold, the penalty doesn't instantly disappear. Instead, Google will make note of it when it next visits your site. But it can take several weeks until Google's next “push” or “update,” when the changes it has found are integrated into its overall ranking system, effectively removing penalties from sites that have changed and adding them to new ones that have been caught.
Google’s post explains this more:
If you decide to update your page layout, the page layout algorithm will automatically reflect the changes as we re-crawl and process enough pages from your site to assess the changes.
How long that takes will depend on several factors, including the number of pages on your site and how efficiently Googlebot can crawl the content.
On a typical website, it can take several weeks for Googlebot to crawl and process enough pages to reflect layout changes on the site.
Our Why Google Panda Is More A Ranking Factor Than Algorithm Update article explains the situation with Panda, and how it took time between when publishers made changes to remove “thin” content to when they were restored to Google’s good graces. That process is just as applicable to today’s change, even though Panda itself now has much less flux.

Meanwhile, Google AdSense Pushes Ads

Ironically, on the same day that Google’s web search team announced this change, I received this message from Google’s AdSense team encouraging me to put more ads on my site:
This was in relation to my personal blog, Daggle. The image in the email suggests that Google thinks content pretty much should be surrounded by ads.
Of course, if you watch the video that Google refers me (and others) to in the email, it promotes careful placement, that user experience be considered and, at one point, shows a page top-heavy with ads as something that shouldn’t be done.
Still, it's not hard to find sites using Google's own AdSense ads that are definitely pushing content down as far on their pages as they can, or trying to hide it. Those pages, AdSense or not, are subject to the new rules, Cutts said.

Pages Ad-Heavy, But Not Top-Heavy With Ads, May Escape

As a searcher, I’m happy with the change. But it might not be perfect. For example, here’s something I tweeted about last year:
Yes, that’s my finger being used as an arrow. I was annoyed that to find the actual download link I was after was surrounded by AdSense-powered ads telling me to download other stuff.
This particular site was heavily used by kids who might easily click on an ad by mistake. That's potentially bad ROI for those advertisers. Heck, as a net-savvy adult, I found it a challenge.
But the problem here wasn’t that the content was pushed “below the fold” by ads. It was that the ratio of ads was so high in relation to the content (a single link), plus the misleading nature of the ads around the content.

Are Google’s Own Search Results Top Heavy?

Another issue is that ads on Google’s own search results pages push the “content” — the unpaid editorial listings — down toward the bottom of the page. For example, here’s exactly what’s visible on my MacBook Pro’s 1680×1050 screen:
(Side note, that yellow color around the ads in the screenshot? It’s much darker in the screenshot than what I see with my eyes. In reality, the color is so washed-out that it might as well be invisible. That’s something some have felt has been deliberately engineered by Google to make ads less noticeable as ads).
The blue box surrounds the content, the search listings that lead you to actual merchants selling trash cans, in this example. Some may argue that the Google shopping results box is further pushing down the “real content” of listings that lead out of Google. But the shopping results themselves do lead you to external merchants, so I consider them to be content.
The example above is pretty extreme, showing the maximum of three ads that Google will ever show above its search results (with a key exception, below). Even then, there’s content visible, with it making up around half the page or more, if you include the Related Searches area as content.
My laptop’s screen resolution is pretty high, of course. Others would see less (Google’s Browser Size tool doesn’t work to measure its own search results pages). But you can expect Google will take “do as I say, not as I do” criticism on this issue.
Indeed, I shared this story initially with the main details, then started working on this section. After that was done, I could see this type of criticism already happening, both in the comments and over on my Google+ post and Facebook post about the change.
Here’s a screenshot that Daniel Weadley shared in my Google+ post about what he sees on his netbook:
In this example, Google’s doing a rare display of four ads. That’s because it’s showing the maximum of three regular ads it will show with a special Comparison Ads unit on top of those. And that will just add fuel to criticisms that if Google is taking aim at pages top-heavy with ads, it might need to also look closer to home.
NOTE: About three hours after I wrote this, Google clearly saw the criticisms about ads on its own search results pages and sent this statement:
This is a site-based algorithm that looks at all the pages across an entire site in aggregate. Although it’s possible to find a few searches on Google that trigger many ads, it’s vastly more common to have no ads or few ads on a page.
Again, this algorithm change is designed to demote sites that make it difficult for a user to get to the content and offer a bad user experience.
Having an ad above-the-fold doesn’t imply that you’re affected by this change. It’s that excessive behavior that we’re working to avoid for our users.

Algorithms? Signals?

Does all this talk about ranking signals and algorithms have you confused? Our video below explains briefly how a search engine's algorithm works to rank web pages:
Also see our Periodic Table Of SEO Ranking Factors, which explains some of the other ranking signals that Google uses in its algorithm:

Name The Update & More Info

Today’s change is a new, significant ranking factor for our table, one we’ll add in a future update, probably as Va, for “Violation, Ad-Heavy site.”
Often when Google rolls out new algorithms, it gives them names. Last year’s Panda Update was a classic example of this. But Google’s not given one to this update (I did ask). It’s just being called the “page layout algorithm.”
Boring. Unhelpful for easy reference. If you’d like to brainstorm a name, visit our posts on Google+ and on Facebook, where we’re asking for ideas.
Now for the self-interested closing. You can bet this will be a big topic of discussion at our upcoming SMX West search marketing conference at the end of next month, especially on the Ask The Search Engines panel. So check out our full agenda and consider attending.
Postscript: Some have been asking in the comments about how Google knows what an ad is. I asked, and here’s what Google said:
We have a variety of signals that algorithmically determine what type of ad or content appears above the fold, but no further details to share. It is completely algorithmic in its detection–we don’t use any sort of hard-coded list of ad providers.

Google: EMD Update


The EMD Update — for “Exact Match Domain” — is a filter Google launched in September 2012 to prevent poor-quality sites from ranking well simply because they had words that match search terms in their domain names. When a fresh EMD Update happens, sites that have improved their content may regain good rankings. New sites with poor content — or those previously missed by EMD — may get caught. In addition, “false positives” may get released. Our latest news about the EMD Update is below.

Deconstructing The Google EMD Update


Well, it's official: no more free lunch for EMDs, now that the Google EMD Update has launched. The tactic worked well, for a long time. A whole industry of exact match domain tools and brokers sprang up, and premium prices for good names went through the roof when the name contained a real “money keyword.”
For quite a while, it was possible to rank in the top 3 with literally no backlinks compared to non-EMDs, often after only a few days, in virtually every niche you can dream of.
Exact Match Domains (AKA Keyword Domains) are, in general, domain names that exactly match the keyword a website wants to compete for. For example, if a website wants to rank for the term [minor weather report], an exact match domain would be [MinorWeatherReport.com].
For years, it was a free lunch for those in-the-know, and now this loophole is closed, just like other loopholes have been closed years before. But instead of complaining about a free lunch being taken away, SEOs should be thankful for having had it. This cheap way to get traffic, thanks to Google being pretty slow to close the loophole, is now gone.

What The EMD Bonus Included

Two of the first SEO techniques that are taught are on-page keyword optimization and link building. Interestingly, those two things are very closely related to typical sorts of webspam techniques. Perhaps it is time to focus SEO efforts on abiding by some standards, and make websites for people, not search engines.
EMDs are more than likely being targeted for keyword stuffing, for other simple webspam techniques, and for not building quality sites. The keyword phrase is right there in the exact match domain name, and this seems like an easy, breezy technique: start with the keyword from the exact match domain and repeat it throughout the page.
The same is true for links. For a long time, the huge EMD bonus was that the website's name was the money keyword one wanted to rank for. It seemed OK to really overdo linking for the money keyword, as we thought Google couldn't differentiate between that keyword being a brand name or not. Having lots of links and mentions in the body for your brand name (i.e., CEMPER) makes a lot of sense.
However, if you have a website www.BuyCheapSomething.com, it seemed that Google took Buy Cheap Something also as a brand name and ranked you fast for it.

Google Knew It, We Knew It

Don’t think for a minute that this EMD algorithm just came out of the blue. The patent for the EMD algorithm, Systems and methods for detecting commercial queries, was filed way back in September of 2003, and finally approved a year ago on October 25, 2011. Matt Cutts even talked about how they were going to change the EMD game in a video on March 7, 2011. Is Google being transparent by warning us that they are giving too much weight to EMDs?
We went through the weekly winners and losers list from SEOLytics to look for EMDs that had dropped sharply from rather stable rankings from the week before the update. We also did the opposite and found EMDs that actually gained in rankings after the update.

An Actual Example Of An EMD Loser

So, let’s just take a quick look at a few things a top loser did, and see if we can find if the site violated any of the guidelines.
We randomly picked one of the analyzed sites and found this website:
www.businessliabilityinsurance.org
At first appearance, this site looks professional and has some reputable insurance company logos. Of course, we see the keyword phrases [business liability insurance] and [liability insurance] throughout the homepage. Perhaps, it isn’t overly done.
However, the FAQs page returns 38 matches for [business liability insurance] and 47 matches for [liability insurance]. Wow! The Guides page returns a total of 100 matches for [business liability insurance].
Are SEOs creating these types of text areas for users or search engines?
[Screenshot: Google EMD loser 01]
If you click through those articles, you see that they are just spun content throughout the whole site, targeting every possible location with the keyword [Business Liability Insurance]. It's not surprising that this site tanked if you just look at the on-page factors of overdoing the commercial keyword.
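If you want to run this kind of on-page count yourself, here is a deliberately crude sketch (a real audit would parse the HTML properly instead of regex-stripping tags; the URL is a stand-in for the page you want to audit):

```python
# Sketch: count occurrences of a phrase in a page's text.
import re
import requests

def phrase_count(url, phrase):
    html = requests.get(url, timeout=10).text
    text = re.sub(r"<[^>]+>", " ", html)  # crudely drop markup
    return len(re.findall(re.escape(phrase), text, re.IGNORECASE))

print(phrase_count("http://example.com/faq", "business liability insurance"))
```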

What About The Link Profile?

Looking at the domain’s Power*Trust, we see a poor value of 6. More on that in a bit.
The anchor texts they used are striking when you look at this. There is barely any attempt to mention the plain URL in substantial amounts (only 3%, to be precise). The rest of the anchors are all money keywords in various combinations.
[Figure: link profile pie chart showing the anchor text distribution]
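A distribution like that pie chart is easy to reproduce from any backlink export. A minimal sketch, with a hypothetical anchor list standing in for your own data:

```python
# Sketch: anchor text distribution as percentages of all backlinks.
from collections import Counter

anchors = [
    "business liability insurance", "liability insurance",
    "businessliabilityinsurance.org", "cheap liability insurance",
    "business liability insurance",
]

counts = Counter(anchors)
total = sum(counts.values())
for anchor, n in counts.most_common():
    print(f"{n / total:5.1%}  {anchor}")
```

A healthy profile shows the brand name and plain URL dominating; a profile that is nearly all money keywords is what got this site into trouble.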


The Trust of those links is less than mediocre. There is only one business.com link at 5, and the remainder are mostly below 3.

Other Trust Factors

An interesting “Unable to connect” error came up when trying to access the Get Quotes page. This error on the link to the netquotes.com site could also be a factor. I would classify this as a mix between a sneaky redirect and cloaking. Perhaps the netquotes.com site is not a trusted site. While doing our research, it did not appear to load.
[Screenshot: Google EMD loser 02]
Lastly, I really had to laugh at this deceptive maneuver shown below. As users, we value testimonials on websites.

[Screenshot: Google EMD loser 03]
When we see a photo of the person next to the testimonial, the value and perception that this company is trustworthy is even greater.
However, I know these people and they are not who they say they are. LOL! They are stock photos. This is not cool in my book!
Even the testimonials are keyword optimized.

What Is Trust & Why Is It Important?

Following the recent Penguin 3 update, I went back and reread the Another step to reward high-quality sites blog post from Tuesday, April 24, 2012. Again, I am highlighting this because I believe it is important to read what Google tells us they are doing, and not read a bunch of conspiracy and hate comments.
The fact is that Google wants highly-trusted sites in its rankings. And any attempt at manipulating the game will be dealt with. If you don’t believe me that Google is taking webspam seriously, here is a valuable quote from a paper that was written in 2005.
“Please note that according to our definition, all types of actions intended to boost ranking (either relevance, or importance, or both), without improving the true value of a page, are considered spamming.” (Web Spam Taxonomy by Zoltán Gyöngyi & Hector Garcia-Molina)
It is probably worth noting that Zoltán Gyöngyi is a research scientist at Google who went to Stanford and studied with Professor Hector Garcia-Molina, the principal investigator for the Stanford Digital Library Project, from which the Google search engine emerged.
We know from our research that by now Google is definitely targeting webspam and low-quality sites with its algorithms. So, what does it take to develop a high-quality site?
Following are some excerpts from our updated EMD Case Study; you can get the full report here.

Domains Compared By Power * Trust™

Since Google is always mentioning the overall quality of a site, I could think of no better metric than the CEMPER Power*Trust metric. (Admittedly, I am slightly biased here.)
Power means the strength based on the number and power of links (better than PageRank™). Trust indicates the implied Trust of the page in Google, according to a system similar to the Trust Rank patent. By combining both metrics, you can easily rate the overall quality of a domain.
[Chart: CEMPER Power*Trust, winners vs. losers]
The above chart is very clear; the average winner has double the amount of the Power*Trust compared to the losers. This results from a huge number of highly-trusted and very strong backlinks. While most of the losers’ backlinks are potentially low quality, it is pretty clear that the winners have way more links with high Power*Trust.
Our example, businessliabilityinsurance.org, has an even lower Power*Trust value (6) than the average loser (8).
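The exact Power*Trust formula is proprietary to CEMPER, so the following is only an illustrative sketch of the stated idea: combine a link-strength score and a trust score (both assumed here to sit on a 0-10 scale) into one quality rating.

```python
# Illustrative only: combine power and trust into a single 0-10 rating.
# The real Power*Trust metric is proprietary; this is an assumption.
def power_trust(power, trust):
    # A product rewards domains that score on BOTH dimensions; a strong
    # but untrusted domain (or vice versa) still rates poorly.
    return power * trust / 10.0

print(power_trust(power=4, trust=1.5))  # 0.6: weak and untrusted
```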

Domains Compared By Facebook Shares

Social media activity is (and should be) an ever-increasing factor in rating the quality of a website. It's a pretty clear way to figure out whether the audience likes the content and wants to share it with other people.

In general, a very popular website or brand automatically grows in social networks as soon as it reaches higher rankings in Google.
[Chart: Facebook shares, winners vs. losers]
This chart shows the huge gap: an average winner's site has up to 180 shares, whereas the losers have only 17. That might result from the growing priority of social networks, as well as the fact that high-quality websites in many verticals simply tend to attract more shares than low-quality websites.

Domains Compared By Domain Popularity Growth

The Domain Popularity Growth (i.e., growth of new linking root domains per month) is one of the most important factors for a common natural link profile. A healthy website automatically grows over time because more and more webmasters tend to place a link to a higher quality site.
[Chart: average domain popularity growth, EMD winners vs. losers]
What we see here is that the winners have consistently more than double the linking-domain growth per month of the losers. While the losers have built an average of 1,239 new linking domains over the last 2 years (51 per month), the winners have built an average of 3,193 new linking domains over the last 2 years (133 per month).

Is Google Really Evil?

Many SEO professionals and webmasters consider Google to be a giant, ruthless regime. Many say it doesn't follow its own “Don't be evil” mantra when it comes to SEOs. Google is the Internet police, etc. etc… Like most regimes, Google constantly tells us that they are trying their best to be transparent. That is what the people want to hear. Google is a transparent corporation that aims to serve its people.
Surprisingly, most governments or corporations are transparent. You just need to read the correct information or listen in a different way. Large, hidden secrets are not implemented without first being written down.
Therefore, it should not be a surprise that the ranking-reduction algorithms Google develops are sort of spelled out. Actually, they really are. Mainly, you just need to not violate the quality guidelines, and build higher-quality websites.
Organize the world’s information and make it universally accessible and useful. That is Google’s mission.
Stop for a moment, and dream of a day when all search results on the first page are relevant to what you desire. Is it too much to ask for people to read and abide by some laws?

Google EMD Update – A Shock?

So, it comes as a bit of a shock when I hear outrage and panic about Google’s recent Exact-Match Domain algorithm. SEOs are angry, scared, and confused about how this update is harming their sites. Following the Sept. 29th EMD update, it was amusing to read all the caps and cursing at Google in the blogs.
What did people expect? Learn a few basic SEO techniques, abuse the guidelines, and expect not to get caught?

Key Takeaways

  • Review Google’s Quality Guidelines and make sure you are not in violation of any obvious tactics. For EMDs, keyword stuffing appears to be very common. Similar rules apply for normal brand domains. Many who got away with webspam techniques in the past lost their free EMD lunch bonus.
  • The overall domain strength is still a very important factor, which means the winners simply got higher quality links and more of them. This makes sense. The typical Quick EMD strategy didn’t need high quality links to rank very high.
  • For all you social fans, it does appear that shares could provide a boost in rankings. Further research will be needed to confirm this. However, I believe that Google takes the social signals and reads them as evidence of a high-quality site. See Google’s bullet point, “Are the topics driven by genuine interest of readers of the site.”
  • The domain popularity growth of the winners is so much higher and reflects the overall link and site quality.
  • It’s not enough to just build links. More links and better quality lead to a better site, even if it’s an EMD. At least, after the long overdue EMD update.
Finally, Matt Cutts is really sticking to his word to crack down on web spam in 2012. He closed a lot of loopholes that, to my surprise, were still working during the early part of 2012. I wonder if it's his secret 30-day challenge to crack down on one old tactic per month?
Personally, I think there’s still a lot more to fix. Based on data we get from our Link Detox technology, which is part of our Link Research Tools software suite (independently reviewed here on SEL), there are still many sites with really bad link profiles and outdated SEO practices that are very easy to detect. I suppose that is also why we should expect these updates to continue on a regular basis.
Again, people, you should thank Matt and Google for letting you get away with EMDs for so long.

Google: Pirate Update


Google’s Pirate Update is a filter introduced in August 2012 designed to prevent sites with many copyright infringement reports, as filed through Google’s DMCA system, from ranking well in Google’s listings. The filter is periodically updated. When this happens, sites previously impacted may escape, if they’ve made the right improvements. The filter may also catch new sites that escaped being caught before, plus it may release “false positives” that were caught.

Google: Payday Update

Launched on June 11, 2013, the “Payday Update” was a new algorithm targeted at cleaning up search results for traditionally “spammy queries” such as [payday loan], as well as pornographic and other heavily spammed queries.

Google is now rolling out the 3rd version of the PayDay Loan algorithm.
Last night we reported that Google is going to be launching PayDay Loan 3.0 and Google’s Matt Cutts posted on Twitter moments ago that it is now rolling out.
Cutts, Google’s head of search quality, said, “it’s rolling out now!”
To catch you up, PayDay Loan 2.0 launched a few weeks ago around May 17th and 18th. That specifically targeted very spammy sites in the porn, pills and casino markets.
Payday 3.0 specifically targets spammy queries, versus spammy sites. What exactly that means is not 100% clear, but the types of queries this targets include terms like [payday loans], [casinos], [viagra] and other highly spammed queries.