Monday, 2 March 2015

SEO Interview Questions

Search Engine Optimization and Google Analytics Interview Questions



Hi All,
I have visited many websites that list SEO and Google Analytics interview questions. All of them give good information, but none go deep into SEO and GA. So I have decided to share a deeper analysis of SEO and GA with you. I hope this helps you increase your knowledge of both.
  1. How will you set up a goal for pageviews?
  2. How will you do SEO for a single product?
  3. How will you rank your keywords without on-page optimization?
  4. What is the User ID feature in Google Analytics?
  5. How will you track a visitor across different devices?
  6. How will you add a view to a profile?
  7. What is data sampling?
  8. How will you set up a roll-up property for your different profiles?
  9. Is it possible to use the User ID feature without changing the GA code?

How will you track clicks on your website's search box when it is not connected to your site search?
Go through all of them and let me know your valuable feedback. Looking forward to your early response. Have a great day.

Advanced Senior SEO Interview Questions And Answers 2015

Junior SEO Interview Questions And Answers 2015

Q.1. What tools do you use for doing SEO?
Ans. Generally I use free tools like Small SEO Tools, Backlink Checker, Google Webmaster Tools, Keyword Planner, Google Analytics, Alexa, Website Grader, etc. Many paid SEO tools are also available, such as Moz.

Q.2. What is Google Sandbox?
Ans. The Google Sandbox is an imaginary area where new and less authoritative sites are kept for a specific period until they establish themselves enough to be displayed in Google's search results. It generally happens when a site builds a high volume of links in a short time.

Q.3. What is PR or Page Rank?
Ans. PR, or PageRank, is a score provided by Google that helps you understand the value of a website or of every webpage. PageRank is calculated by a Google algorithm based on the number of quality backlinks received by a particular page, unique visitors' reactions, bounce rate, etc.

Q.4. What do you mean by backlinks?
Ans. All the links created for your website on other websites are called backlinks. Ideally you take dofollow links from high-authority sites, such as .org and .edu domains, as well as from other popular sites like Squidoo, Tumblr, Digg, Disqus, etc.

Q.5. What do you mean by outbound links?
Ans. Links that point from our website to another website or webpage.

Q.6. What is Keyword Density?
Ans. Keyword density tells you how many times you have used a particular keyword or keyword phrase in a webpage, relative to the total number of words on the page.
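The idea can be sketched in a few lines of Python. Note that there are several competing definitions of keyword density; this sketch uses a common one (words in matched phrases divided by total words), and the sample page text is made up for illustration.

```python
# A minimal keyword-density sketch: share of the page's words that
# belong to non-overlapping occurrences of the keyword phrase.
import re

def keyword_density(text, phrase):
    """Return the density (%) of `phrase` in `text`."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    count = 0
    i = 0
    while i <= len(words) - len(phrase_words):
        if words[i:i + len(phrase_words)] == phrase_words:
            count += 1
            i += len(phrase_words)  # skip past the matched phrase
        else:
            i += 1
    return round(100.0 * count * len(phrase_words) / len(words), 2)

page = "cheap shoes online - buy cheap shoes at the best cheap shoes store"
print(keyword_density(page, "cheap shoes"))  # prints 50.0
```

A density that high would normally be considered keyword stuffing; most practitioners aim far lower.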

Q.7. What is Anchor Text?
Ans. Anchor text is the clickable text in a hyperlink. For example: <a href="http://kaushikitsolutions.blogspot.in">Seo Company In Faridabad</a>.

Q.8. What do you mean by organic results?
Ans. Webpages that appear in the search engine results page on their own merit, rather than through paid placement, are known as organic results.

Q.9. What is Googlebot?
Ans. Googlebot is Google's own crawling software. Googlebot crawls, indexes, and caches webpages, and updates its collected data from time to time.

Q.10. What are the limitations of title and description tags in Ask.com, Google, Yahoo, Bing ?
Ans. Every search engine has its own algorithm and its own length limits. The meta tag limits for different search engines are as follows:
Ask.com: title length 70 characters & description length 312 characters.
Google.com: title length 50-60 characters & description length 150-160 characters.
Yahoo.com: title length 60-72 characters & description length 155-165 characters.
Bing.com: title length 65 characters & description length 155 characters.
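A quick way to check your own tags against these limits is a small script like the one below. The limits are approximate (they vary by engine and change over time), and here each range is simplified to its upper bound.

```python
# Approximate display limits per search engine (upper ends of the
# ranges above); treat these as rough guides, not hard rules.
LIMITS = {
    "google": {"title": 60, "description": 160},
    "bing": {"title": 65, "description": 155},
    "yahoo": {"title": 72, "description": 165},
    "ask": {"title": 70, "description": 312},
}

def check_meta(title, description, engine="google"):
    """Return a list of warnings for tags that may get truncated."""
    limits = LIMITS[engine]
    warnings = []
    if len(title) > limits["title"]:
        warnings.append(f"title is {len(title)} chars (limit ~{limits['title']})")
    if len(description) > limits["description"]:
        warnings.append(
            f"description is {len(description)} chars (limit ~{limits['description']})"
        )
    return warnings

print(check_meta("Seo Company In Faridabad", "Affordable SEO services."))  # prints []
```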

Q.11. How will you increase the PageRank of a page?
Ans. It’s build high quality link from relevant web pages try to take link from high pr website to increase the PR of your website or webpage and you use always quality content and so defiantly increase PR of webpage.


Q.12. How will you treat Web standards while optimizing a website?
Ans. I personally follow the web standards from the W3C whenever I optimize a website, which helps the site earn more points from search engines such as Google and improves the webpage's rank.

Q.13. What is 301 redirect?
Ans. It’s use to permanent redirect users from old page webpage to new page webpage.

Q.14. What is Cloaking?
Ans.  It’s procedure of using some deceptive techniques that allows user with a different version of the website than that presented to the search engines.

Q.15. What is robots.txt file?
Ans. It’s a text file which allow search engines to which page, directory, domain have to index or not search engine.

Wednesday, 25 February 2015

Google: Top Heavy Update


Top Heavy was launched in January 2012 by Google as a means to prevent sites that were “top heavy” with ads from ranking well in its listings. Top Heavy is periodically updated. When a fresh Top Heavy Update happens, sites that have removed excessive ads may regain lost rankings. New sites deemed too “top heavy” may get caught.

Google Updates Its Page Layout Algorithm To Go After Sites “Top Heavy” With Ads



Google’s head of search spam, Matt Cutts, announced that Google has released a refresh of its Page Layout Algorithm. The filter, also known as the Top Heavy algorithm, downgrades the ranking of a web page with too many ads at the top or if the ads are deemed too distracting for users.
Cutts said the algorithm was refreshed last Thursday, February 6. Here’s his tweet:
This would be the third confirmed update to the Top Heavy algorithm, with the full release schedule as follows:
  • Top Heavy 1: Jan. 19, 2012 (impacted less than 1% of English searches)
  • Top Heavy 2: Oct. 9, 2012 (impacted 0.7% of English searches)
  • Top Heavy 3: Feb. 6, 2014 (impact not stated)

Background On & Recovering From Top Heavy

What is the page layout algorithm? As we quoted from Google originally:
We’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away.
So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.
Such sites may not rank as highly going forward.
See also our original article for when Top Heavy was first released, for advice about how a site that’s caught may have to wait until the next release for any changes it’s made to restore rankings.
We have not seen many complaints within the SEO community around February 6th or 7th about any update like this, which suggests it impacted fewer sites than when Google updates other filters like the Panda or Penguin algorithms.

The Top Heavy Update: Pages With Too Many Ads Above The Fold Now Penalized By Google’s “Page Layout” Algorithm


Do you shove lots of ads at the top of your web pages? Think again. Tired of doing a Google search and landing on these types of pages? Rejoice. Google has announced that it will penalize sites with pages that are top-heavy with ads.

Top Heavy With Ads? Look Out!

The change — called the “page layout algorithm” — takes direct aim at any site with pages where content is buried under tons of ads.
From Google’s post on its Inside Search blog today:
We’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away.
So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.
Such sites may not rank as highly going forward.
Google also posted the same information to its Google Webmaster Central blog.
Sites using pop-ups, pop-unders or overlay ads are not impacted by this. It only applies to static ads in fixed positions on pages themselves, Google told me.

How Much Is Too Much?

How can you tell if you’ve got too many ads above-the-fold? When I talked with the head of Google’s web spam team, Matt Cutts, he said that Google wasn’t going to provide any type of official tools similar to how it provides tools to tell if your site is too slow (site speed is another ranking signal).
Instead, Cutts told me that Google is encouraging people to make use of its Google Browser Size tool or similar tools to understand how much of a page’s content (as opposed to ads) is visible at first glance to visitors under various screen resolutions.
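The kind of check such tools enable can be sketched as follows. This is illustrative only (Google has not published how it measures ads): given the bounding boxes of a page's ad slots, it estimates what fraction of the above-the-fold viewport the ads cover.

```python
# Estimate the share of the above-the-fold viewport covered by ads.
# Ad boxes are hypothetical (x, y, width, height) rectangles in page
# pixels, with y measured down from the top of the page.
def ad_fraction_above_fold(ad_boxes, viewport_width, fold_height):
    ad_area = 0
    for x, y, w, h in ad_boxes:
        # Clip each rectangle to the visible viewport.
        visible_h = max(0, min(y + h, fold_height) - max(y, 0))
        visible_w = max(0, min(x + w, viewport_width) - max(x, 0))
        ad_area += visible_w * visible_h
    return ad_area / float(viewport_width * fold_height)

# Two 728x90 leaderboards stacked at the top of a 1024x600 viewport:
ads = [(0, 0, 728, 90), (0, 90, 728, 90)]
print(round(ad_fraction_above_fold(ads, 1024, 600), 2))  # prints 0.21
```

What threshold counts as "excessive" is exactly what Google left unstated, so a number like this is only useful for comparing your own layouts against typical sites.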
But how far down the page is too far? That’s left to the publisher to decide for themselves. However, the blog post stresses the change should only hit pages with an abnormally large number of ads above-the-fold, compared to the web as a whole:
We understand that placing ads above-the-fold is quite common for many websites; these ads often perform well and help publishers monetize online content.
This algorithmic change does not affect sites who place ads above-the-fold to a normal degree, but affects sites that go much further to load the top of the page with ads to an excessive degree or that make it hard to find the actual original content on the page.
This new algorithmic improvement tends to impact sites where there is only a small amount of visible content above-the-fold or relevant content is persistently pushed down by large blocks of ads.

Impacts Less Than 1% Of Searches

Clearly, you’re in trouble if you have little-to-no content showing above the fold for commonly-used screen resolutions. You’ll know you’re in trouble shortly, because the change is now going into effect. If you suddenly see a drop in traffic today, and you’re heavy on the ads, chances are you’ve been hit by the new algorithm.
For those ready to panic, Cutts told me the change will impact less than 1% of Google’s searches globally, which today’s post also stresses.

Fixed Your Ads? Penalty Doesn’t Immediately Lift

What happens if you’re hit? Make changes, then wait a few weeks.
Similar to how last year’s Panda Update works, Google is examining sites it finds and effectively tagging them as being too ad-heavy or not. If you’re tagged that way, you get a ranking decrease attached to your entire site (not just particular pages) as part of today’s launch.
If you reduce ads above-the-fold, the penalty doesn’t instantly disappear. Instead, Google will make note of it when it next visits your site. But it can take several weeks until Google’s “push” or “update” until the new changes it has found are integrated into its overall ranking system, effectively removing penalties from sites that have changed and adding them to new ones that have been caught.
Google’s post explains this more:
If you decide to update your page layout, the page layout algorithm will automatically reflect the changes as we re-crawl and process enough pages from your site to assess the changes.
How long that takes will depend on several factors, including the number of pages on your site and how efficiently Googlebot can crawl the content.
On a typical website, it can take several weeks for Googlebot to crawl and process enough pages to reflect layout changes on the site.
Our Why Google Panda Is More A Ranking Factor Than Algorithm Update article explains the situation with Panda, and how it took time between when publishers made changes to remove “thin” content to when they were restored to Google’s good graces. That process is just as applicable to today’s change, even though Panda itself now has much less flux.

Meanwhile, Google AdSense Pushes Ads

Ironically, on the same day that Google’s web search team announced this change, I received this message from Google’s AdSense team encouraging me to put more ads on my site:
This was in relation to my personal blog, Daggle. The image in the email suggests that Google thinks content pretty much should be surrounded by ads.
Of course, if you watch the video that Google refers me (and others) to in the email, it promotes careful placement, that user experience be considered and, at one point, shows a page top-heavy with ads as something that shouldn’t be done.
Still, it’s not hard to easily find sites using Google’s own AdSense ads that are definitely pushing content down as far down on their pages as they can or trying to hide it. Those pages, AdSense or not, are subject to the new rules, Cutts said.

Pages Ad-Heavy, But Not Top-Heavy With Ads, May Escape

As a searcher, I’m happy with the change. But it might not be perfect. For example, here’s something I tweeted about last year:
Yes, that’s my finger being used as an arrow. I was annoyed that to find the actual download link I was after was surrounded by AdSense-powered ads telling me to download other stuff.
This particular site was heavily used by kids who might easily click on an ad by mistake. That’s potentially bad ROI for those advertisers. Heck, as net-savvy adult, I found it a challenge.
But the problem here wasn’t that the content was pushed “below the fold” by ads. It was that the ratio of ads was so high in relation to the content (a single link), plus the misleading nature of the ads around the content.

Are Google’s Own Search Results Top Heavy?

Another issue is that ads on Google’s own search results pages push the “content” — the unpaid editorial listings — down toward the bottom of the page. For example, here’s exactly what’s visible on my MacBook Pro’s 1680×1050 screen:
(Side note, that yellow color around the ads in the screenshot? It’s much darker in the screenshot than what I see with my eyes. In reality, the color is so washed-out that it might as well be invisible. That’s something some have felt has been deliberately engineered by Google to make ads less noticeable as ads).
The blue box surrounds the content, the search listings that lead you to actual merchants selling trash cans, in this example. Some may argue that the Google shopping results box is further pushing down the “real content” of listings that lead out of Google. But the shopping results themselves do lead you to external merchants, so I consider them to be content.
The example above is pretty extreme, showing the maximum of three ads that Google will ever show above its search results (with a key exception, below). Even then, there’s content visible, with it making up around half the page or more, if you include the Related Searches area as content.
My laptop’s screen resolution is pretty high, of course. Others would see less (Google’s Browser Size tool doesn’t work to measure its own search results pages). But you can expect Google will take “do as I say, not as I do” criticism on this issue.
Indeed, I shared this story initially with the main details, then started working on this section. After that was done, I could see this type of criticism already happening, both in the comments or over on my Google+ post and Facebook post about the change.
Here’s a screenshot that Daniel Weadley shared in my Google+ post about what he sees on his netbook:
In this example, Google’s doing a rare display of four ads. That’s because it’s showing the maximum of three regular ads it will show with a special Comparison Ads unit on top of those. And that will just add fuel to criticisms that if Google is taking aim at pages top-heavy with ads, it might need to also look closer to home.
NOTE: About three hours after I wrote this, Google clearly saw the criticisms about ads on its own search results pages and sent this statement:
This is a site-based algorithm that looks at all the pages across an entire site in aggregate. Although it’s possible to find a few searches on Google that trigger many ads, it’s vastly more common to have no ads or few ads on a page.
Again, this algorithm change is designed to demote sites that make it difficult for a user to get to the content and offer a bad user experience.
Having an ad above-the-fold doesn’t imply that you’re affected by this change. It’s that excessive behavior that we’re working to avoid for our users.

Algorithms? Signals?

Does all this talk about ranking signals and algorithms have you confused? Our video below explains briefly how a search engine's algorithm works to rank web pages:
Also see our Periodic Table Of SEO Ranking Factors, which explains some of the other ranking signals that Google uses in its algorithm:

Name The Update & More Info

Today’s change is a new, significant ranking factor for our table, one we’ll add in a future update, probably as Va, for “Violation, Ad-Heavy site.”
Often when Google rolls out new algorithms, it gives them names. Last year’s Panda Update was a classic example of this. But Google’s not given one to this update (I did ask). It’s just being called the “page layout algorithm.”
Boring. Unhelpful for easy reference. If you’d like to brainstorm a name, visit our posts on Google+ and on Facebook, where we’re asking for ideas.
Now for the self-interested closing. You can bet this will be a big topic of discussion at our upcoming SMX West search marketing conference at the end of next month, especially on the Ask The Search Engines panel. So check out our full agenda and consider attending.
Postscript: Some have been asking in the comments about how Google knows what an ad is. I asked, and here’s what Google said:
We have a variety of signals that algorithmically determine what type of ad or content appears above the fold, but no further details to share. It is completely algorithmic in its detection–we don’t use any sort of hard-coded list of ad providers.

Google: EMD Update


The EMD Update — for "Exact Match Domain" — is a filter Google launched in September 2012 to prevent poor-quality sites from ranking well simply because they had words that match search terms in their domain names. When a fresh EMD Update happens, sites that have improved their content may regain good rankings. New sites with poor content — or those previously missed by EMD — may get caught. In addition, "false positives" may get released. Our latest news about the EMD Update is below.

Deconstructing The Google EMD Update


Well, it’s official – no more free lunch for EMDs, now that the Google EMD Update has launched. It worked well, for a long time. A whole industry of exact match domain tools and brokers came up. Premium prices for good names went through the roof when the name was a real “money keyword.”
For quite a while, it was possible to rank in the top 3 with literally no backlinks, compared to non-EMDs, often after only a few days, in literally every niche you can dream of.
Exact Match Domains (AKA Keyword Domains) are, in general, domain names that exactly match the keyword a website wants to compete for. For example, if a website wants to rank for the term [minor weather report], an exact match domain would be [MinorWeatherReport.com].
For years, it was a free lunch for those in-the-know, and now this loophole is closed, just like other loopholes have been closed years before. But instead of complaining about a free lunch being taken away, SEOs should be thankful for having had it. This cheap way to get traffic, thanks to Google being pretty slow to close the loophole, is now gone.

What The EMD Bonus Included

Two of the first SEO techniques that are taught are on-page keyword optimization and link building. Interestingly, those two things are very closely related to typical webspam techniques. Perhaps it is time to focus SEO efforts on abiding by some standards, and to make websites for people, not search engines.
EMDs are more than likely being targeted for keyword stuffing, other simple webspam techniques, and for not building quality sites. The keyword phrase is right there in the exact match domain name, and this seems like an easy-breezy technique: start with the keyword from the exact match domain and repeat it throughout the page.
The same is true for links. For a long time, the huge EMD bonus was that the website’s name is the money keyword one wants to rank for. It seemed OK to really overdo linking for the money keyword, as we thought Google couldn’t differentiate between that keyword being a brand name or not. Having lots of links and mentions in the body for your brand name (i.e., CEMPER) makes a lot of sense.
However, if you have a website www.BuyCheapSomething.com, it seemed that Google took Buy Cheap Something also as a brand name and ranked you fast for it.

Google Knew It, We Knew It

Don’t think for a minute that this EMD algorithm just came out of the blue. The patent for the EMD algorithm, Systems and methods for detecting commercial queries, was filed way back in September of 2003, and finally approved a year ago on October 25, 2011. Matt Cutts even talked about how they were going to change the EMD game in a video on March 7, 2011. Is Google being transparent by warning us that they are giving too much weight to EMDs?
We went through the weekly winners and losers list from SEOLytics to look for EMDs that had dropped sharply from rather stable rankings from the week before the update. We also did the opposite and found EMDs that actually gained in rankings after the update.

An Actual Example Of An EMD Loser

So, let’s just take a quick look at a few things a top loser did, and see if we can find if the site violated any of the guidelines.
We randomly picked one of the analyzed losers and found this website:
www.businessliabilityinsurance.org
At first appearance, this site looks professional and has some reputable insurance company logos. Of course, we see the keyword phrases [business liability insurance] and [liability insurance] throughout the homepage. Perhaps, it isn’t overly done.
However, the FAQs page returns 38 matches for [business liability insurance] and 47 matches for [liability insurance]. Wow! The Guides page returns a total of 100 matches for [business liability insurance].
Are SEOs creating these types of text areas for users or search engines?
Google EMD loser 01
If you click through those articles, you see that they are just spun content throughout the whole site, targeting every possible location with the keyword [Business Liability Insurance]. It’s not surprising that this site tanked — just look at the on-page factors of overdoing the commercial keyword.

What About The Link Profile?

Looking at the domain’s Power*Trust, we see a poor value of 6. More on that in a bit.
The anchor texts they used are striking. The naked URL is barely used as an anchor at all (only 3%, to be precise). The rest of the anchors are all money keywords in various combinations.
Link Profile Pie Chart
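The breakdown behind such a pie chart is easy to compute once you have a list of backlink anchor texts. The anchors in the sketch below are made up for illustration, not the site's actual link data.

```python
# Compute each anchor text's share (%) of a backlink profile.
from collections import Counter

def anchor_distribution(anchors):
    counts = Counter(a.lower() for a in anchors)
    total = len(anchors)
    return {text: round(100.0 * n / total, 1) for text, n in counts.items()}

anchors = [
    "business liability insurance", "business liability insurance",
    "liability insurance", "cheap liability insurance",
    "businessliabilityinsurance.org",  # only one naked-URL anchor
]
print(anchor_distribution(anchors))
```

A natural profile tends to be dominated by brand and URL anchors; a profile dominated by money keywords, as here, is a classic over-optimization signal.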


The Trust of those links is less than mediocre. There is only one business.com link with a Trust of 5, and the rest are mostly below 3.

Other Trust Factors

An interesting “Unable to connect” error came up when trying to access the Get Quotes page. This broken redirect to the netquotes.com site could also be a factor. I would classify it as a mix between a sneaky redirect and cloaking. Perhaps the netquotes.com site is not a trusted site. While doing our research, it did not appear to load.
Google EMD loser 02
Lastly, I really had to laugh at this deceptive maneuver shown below. As users, we value testimonials on websites.

Google EMD loser 03
When we see a photo of the person next to the testimonial, the value and perception that this company is trustworthy is even greater.
However, I know these people and they are not who they say they are. LOL! They are stock photos. This is not cool in my book!
Even the testimonials are keyword optimized.

What Is Trust & Why Is It Important?

Following the recent Penguin 3 update, I went back and reread the Another step to reward high-quality sites blog post from Tuesday, April 24, 2012. Again, I am highlighting this because I believe it is important to read what Google tells us they are doing and not read a bunch of conspiracy and hate comments.
The fact is that Google wants highly-trusted sites in its rankings. And any attempt at manipulating the game will be dealt with. If you don’t believe me that Google is taking webspam seriously, here is a valuable quote from a paper that was written in 2005.
“Please note that according to our definition, all types of actions intended to boost ranking (either relevance, or importance, or both), without improving the true value of a page, are considered spamming.” (Web Spam Taxonomy by Zoltán Gyöngyi & Hector Garcia-Molina)
It is probably worth noting that Zoltán Gyöngyi is a research scientist at Google who went to Stanford and studied with Professor Hector Garcia-Molina, the principal investigator for the Stanford Digital Library Project, from which the Google search engine emerged.
We know from our research that by now Google is definitely targeting webspam and low-quality sites with its algorithms. So, what does it take to develop a high-quality site?
Following are some excerpts from our updated EMD Case Study, you can get the full report here.

Domains Compared By Power * Trust™

Since Google is always mentioning the overall quality of a site, I could think of no better metric than the CEMPER Power*Trust metric. (Admittedly, I am slightly biased here.)
Power means the strength based on the number and power of links (better than PageRank™). Trust indicates the implied Trust of the page in Google, according to a system similar to the Trust Rank patent. By combining both metrics, you can easily rate the overall quality of a domain.
Cemper Power*Trust
The above chart is very clear; the average winner has double the amount of the Power*Trust compared to the losers. This results from a huge number of highly-trusted and very strong backlinks. While most of the losers’ backlinks are potentially low quality, it is pretty clear that the winners have way more links with high Power*Trust.
Our example, businessliabilityinsurance.org, has an even lower Power*Trust value (6) for its domain than the average loser’s value of 8.

Domains Compared By Facebook Shares

Social media activity is (and should be) an ever increasing factor in rating the quality of a website. It’s a pretty clear factor to figure out if the audience likes the content and wants to share it with other people.
In general, a very popular website or brand automatically grows in social networks as soon as it reaches higher rankings in Google.
Facebook Shares
This chart shows the huge gap: an average winner’s site has up to 180 shares, whereas the losers have only 17. That might result from the growing priority of social networks, as well as the fact that high-quality websites in many verticals simply tend to attract more shares than low-quality websites.

Domains Compared By Domain Popularity Growth

The Domain Popularity Growth (i.e., growth of new linking root domains per month) is one of the most important factors for a common natural link profile. A healthy website automatically grows over time because more and more webmasters tend to place a link to a higher quality site.
Avg Domain Popularity Growth EMD Win Lose
What we see here is that the winners consistently have more than double the linking-domain growth per month of the losers. While the losers have built an average of 1,239 new linking domains over the last 2 years (51 per month), the winners have built an average of 3,193 new linking domains over the same period (133 per month).

Is Google Really Evil?

Many SEO professionals and webmasters consider Google to be a giant, ruthless regime. Many say it doesn’t follow its own “Don’t be evil” mantra when it comes to SEOs. Google is the Internet police, etc., etc. Like most regimes, Google constantly tells us that it is trying its best to be transparent. That is what the people want to hear. Google is a transparent corporation that aims to serve its people.
Surprisingly, most governments or corporations are transparent. You just need to read the correct information or listen in a different way. Large, hidden secrets are not implemented without first being written down.
Therefore, it should not be a surprise that the ranking-demotion algorithms Google develops are sort of spelled out. Actually, they really are. Mainly, you just need to not violate the quality guidelines and to build higher-quality websites.
Organize the world’s information and make it universally accessible and useful. That is Google’s mission.
Stop for a moment, and dream of a day when all search results on the first page are relevant to what you desire. Is it too much to ask for people to read and abide by some laws?

Google EMD Update – A Shock?

So, it comes as a bit of a shock when I hear outrage and panic about Google’s recent Exact-Match Domain algorithm. SEOs are angry, scared, and confused about how this update is harming their sites. Following the Sept. 29th EMD update, it was amusing to read all the caps and cursing at Google in the blogs.
What did people expect? Learn a few basic SEO techniques, abuse the guidelines, and expect not to get caught?

Key Takeaways

  • Review Google’s Quality Guidelines and make sure you are not in violation of any obvious tactics. For EMDs, keyword stuffing appears to be very common. Similar rules for normal brand domains apply. Many who got away with webspam techniques in the past, lost their free EMD lunch bonus.
  • The overall domain strength is still a very important factor, which means the winners simply got higher quality links and more of them. This makes sense. The typical Quick EMD strategy didn’t need high quality links to rank very high.
  • For all you social fans, it does appear that shares could provide a boost in rankings. Further research will need to be done to confirm this. However, I believe that Google takes the social signals and reads them as evidence of a high-quality site. See the Google bullet point, “Are the topics driven by genuine interest of readers of the site.”
  • The domain popularity growth of the winners is so much higher and reflects the overall link and site quality.
  • It’s not enough to just build links. More links and better quality lead to a better site, even if it’s an EMD. At least, after the long overdue EMD update.
Finally, Matt Cutts is really sticking to his words to crack down on Web spam in 2012. He closed a lot of loopholes, which surprised me, that were still working during the early part of 2012. I wonder if it’s his secret 30-day challenge to crack down on one old tactic per month?
Personally, I think there’s still a lot more to fix. Based on data we get from our Link Detox technology, which is part of our Link Research Tools software suite (independently reviewed here on SEL), there are still many sites with really bad link profiles and outdated SEO practices that are very easy to detect. I suppose that is also why we should expect these updates to continue on a regular basis.
Again, people, you should thank Matt and Google for letting you get away with EMDs for so long.