If your site has experienced a sudden, significant drop in organic traffic for key search terms you once ranked highly for, you may have been hit by a Google penalty. According to Matt Cutts, the former head of Google’s webspam team, more than 400,000 manual penalties are applied every month to websites that violate Google’s webmaster guidelines and are deemed egregious enough to trigger sanctions from the Google Search Quality team.
As Matt Cutts has explained, there are two types of search engine penalties: manual and algorithmic.
If a manual penalty has been applied to your site, you’ll receive a notification from Google’s spam team within your Google Search Console account. If the team has found something they believe is manipulative or against the Google webmaster guidelines, it can impose a penalty.
In most cases, manual penalties are caused by off-site factors such as spammy backlinks. In extreme cases, your entire domain may be removed from Google’s index. The only way to respond to a manual penalty is via a reconsideration request, which Google must approve before the penalty is removed.
Manual Review Penalty Symptoms
Manual review penalties are usually more severe and tougher to recover from than algorithmic penalties. In extreme cases, Google recommends retiring the domain name and starting over. When a manual penalty has been applied to your site, you need to respond by removing as many manipulative links to your site as possible.
This often involves auditing your entire inbound link profile, contacting the webmasters of sites that link to yours, and politely asking them to remove the links. If a webmaster ignores you, you may have to resort to the disavow links tool to neutralize the spammy links. Thereafter, you can send a well-written reconsideration request to Google.
An algorithmic penalty typically occurs after Google updates its algorithms. These penalties are harder to detect because you don’t get any notification from Google in Search Console. However, you can analyse your website’s traffic data in Google Analytics to see whether your rankings (and traffic) took a noticeable dive on or around the time of a specific algorithmic update. If so, there’s a good chance your site was affected by that update.
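As a rough sketch of that comparison, the snippet below flags an update if average daily traffic in the week after its date drops sharply versus the week before (the traffic numbers and the update date are hypothetical; in practice the data would come from a Google Analytics export):

```python
from datetime import date, timedelta

# Hypothetical daily organic-traffic export (date -> sessions).
traffic = {date(2017, 3, 1) + timedelta(days=i): 1000 for i in range(14)}
for i in range(14, 28):  # traffic roughly halves after the update date
    traffic[date(2017, 3, 1) + timedelta(days=i)] = 480

# Known update dates to test against (e.g. from Moz's Google
# Algorithm Change History); "Fred" is used as an example.
updates = {"Fred": date(2017, 3, 15)}

def drop_around(update_day, traffic, window=7):
    """Fractional drop in average sessions: week before vs. week after."""
    before = [v for d, v in traffic.items()
              if update_day - timedelta(days=window) <= d < update_day]
    after = [v for d, v in traffic.items()
             if update_day <= d < update_day + timedelta(days=window)]
    if not before or not after:
        return 0.0
    return 1 - (sum(after) / len(after)) / (sum(before) / len(before))

for name, day in updates.items():
    pct = drop_around(day, traffic)
    if pct > 0.2:  # flag drops of more than 20%
        print(f"Possible {name} impact: traffic down {pct:.0%} around {day}")
```

A real audit would of course use daily data over a longer window and check several update dates, but the before/after comparison is the core of the diagnosis.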
The best-known algorithmic updates are Google Panda, which targets on-page infractions such as poor content quality and over-optimization of specific on-page elements, and Google Penguin, which focuses mainly on your backlink profile and anchor text distribution.
The following are symptoms of an algorithmic penalty:
A specific group of links suddenly stops providing value. If you find that some of your web pages have suddenly dropped out of the search engine results pages altogether, those pages may have been affected by a recent algorithmic penalty.
This is what happens when an entire private blog network that was feeding those pages link juice is identified as spam and deindexed. Consequently, your site no longer receives the link juice those penalized links were providing.
Entire domain starts ranking lower for all or most of its target keywords or phrases. This is a clear indication that the site has breached Google’s webmaster guidelines. In this case, every keyword ranks 30 to 50 spots below where it did before the penalty. In effect, Google pushes the site down roughly 30 positions whenever it comes up in the search results, dropping it three or more pages.
If you notice a drop in organic search traffic that corresponds with a known algorithm update, you can be fairly confident which update affected your site. To confirm, compare your analytics data to Moz’s Google Algorithm Change History.
Identifying Search Engine Penalties
The first step to recovering from a manual or algorithmic penalty is to find out exactly which penalty has been applied to your site. SEO recovery tools make this diagnosis much easier.
Manual on-page penalties
This set of penalties is applied to sites with issues discovered on the site itself. Webmasters hit with these penalties get a message in Search Console.
Thin Content Penalty
In 2013, Google introduced the thin content manual penalty, designed to de-index websites whose content offers little or no value. This includes long-form, keyword-rich articles that provide no real value and are primarily designed to rank for specific keywords. If Google detects that a page’s content is keyword-stuffed, duplicated, or has a very high bounce rate, it may be categorized as thin content.
It is important to note that once thin content is detected on a website, the entire site could be penalized and removed from the search results. To be re-included, you’ll have to identify every page that qualifies as thin content and replace it with original, valuable content.
Here are examples of thin content as defined by Google:
- Automatically generated content
- Thin affiliate pages
- Scraped content or low quality blog posts
- Doorway pages
In this video, Google’s Matt Cutts clarifies what is meant by thin content:
If you receive a message that the penalty is sitewide, it means that Google considers the entire site in violation of its quality guidelines, and the whole site is penalized. A partial match penalty means only a portion of the site is considered in violation.
For more info on thin content, check out this article by Moz’s Dr Pete.
Major Spam Problems
If you have received the manual action notification highlighting “major spam problems”, it means Google has identified the site as entirely spammy, with no value whatsoever to users. In a majority of cases, this type of manual action results in a complete removal of the website from the Google index. The major spam penalty is most often applied to sites with scraped content and/or gibberish sites.
When Google issues a notification highlighting spam problems (rather than major spam problems), it means the website isn’t considered entirely bad. The notification refers to a set of pages on the site with thin, duplicate, or low-quality content, and weighs how useful and engaging the landing pages’ content is to users. This is not a site-wide penalty; only the offending pages are penalized.
This penalty does not result in a complete removal from the Google index, but it will be much less visible in Google search results until the offending pages are removed. Once the offending content is removed, a reconsideration request must be submitted to Google.
User-Generated Spam
User-generated spam tends to affect large, user-driven sites that have been exploited by spammers; Google issues the message as a warning to the site to stamp out the offending content. In this case, Google considers the site useful but neglected. The message generally includes a sample URL where user-generated spam has been detected, and the entire site isn’t penalized.
Hacked Content Spam
If your site has been hacked due to poor security, Google will hit your site with the hacked content spam penalty. In its message to you, Google will include a sample URL, which will give you an idea of where to start the investigation and what type of content to check while cleansing the site of spam.
The site will also get a prominent label from Google in the search results that warns users of the possible threat if they open the website. This will lead to loss of potential traffic from Google search. Submitting a compelling reconsideration request is the first step toward resolving the problem and removing the “hacked” SERP label.
Spammy Structured Markup Penalty
The prospect of getting a rich snippet is really enticing, and attempts to game the system through the use of deceptive or inflated structured data is very much on Google’s radar. If you violate Google’s structured data markup guidelines, you’ll get a notification in Google Search Console highlighting spammy structured data, and your rich snippets will no longer appear in search results.
In March 2015, Google updated its rating and reviews Rich Snippet policies, stating that these types of snippets must be placed only on specific items, not on “category” or “list of items” landing pages.
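Under that policy, rating markup belongs on the page for the specific item being reviewed, never on a category or list page. A minimal illustrative JSON-LD example, using hypothetical product data:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  }
}
```

This block would sit only on the Example Widget product page, and the rating data must reflect real user reviews of that item.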
Here is an example of a manual Structured Data penalty message sent by Google in the Search Console.
The penalty message reads as follows:
Spammy structured markup
Markup on some pages on this site appears to use techniques such as marking up content that is invisible to users, marking up irrelevant or misleading content, and/or other manipulative behavior that violates Google’s Rich Snippet Quality guidelines.
A penalty can be algorithmic or manual. A manual penalty can be partial or site-wide. Google has stated:
In cases where we see structured data that does not comply with these standards, we reserve the right to take manual action (e.g., disable rich snippets for a site) in order to maintain a high-quality search experience for our users.
Recovering from this penalty requires submitting a reconsideration request; however, even after Google removes the penalty, your rich snippets may not reappear right away.
Unnatural Outbound Links
The unnatural outbound link penalty was first issued by the Google manual actions team in April 2016, and it is designed to penalize sites that show patterns of “unnatural, artificial, deceptive or manipulative outbound links”. In general, this penalty is aimed at blogs specifically set up to sell links, linking out to all manner of sites that are not editorial references. Google penalizes the site by devaluing its outbound links, which means none of the sites it links to receive any SEO benefit.
Unnatural Inbound Links
This is the most frequently experienced Google manual penalty linked to the Penguin algorithm, and it is a pretty severe one. You’ll receive the dreaded “unnatural inbound links” notification in your Search Console. Affected sites are considered to be engaging in link schemes against Google’s webmaster guidelines such as being a member of manipulative link networks. The penalty’s impact can be partial or affect the entire domain. If you get hit, you could lose all of your organic traffic from Google literally overnight.
It is vital to conduct a full and in-depth backlink audit, and you must identify all of the toxic links. This means you will need to use different backlink analysis tools so you can find all of the links.
Recovering from this penalty is not easy, and how hard it is depends on how bad your backlink profile is. Your first reconsideration request will most likely fail even if you have done everything right; Google does this to push webmasters to look even deeper and clean up links that might be borderline.
Documenting everything you’ve done to fix the situation is essential; Google doesn’t take reconsideration requests seriously without it. Show Google that you’ve done everything you can to correct the problem with a detailed report of your cleanup effort: list every dofollow link, its contact information, and the action taken.
Upload the disavow file first (and receive confirmation of the change) before submitting documentation outlining all the steps you took to resolve the issues Google highlighted. For each link, record the outcome of your removal attempts (no reply, removed, 5th try, pending, no contact info, etc.) and the type of link (directory, blog post, comment, article, etc.). All of this demonstrates to Google that you have made a genuine effort.
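For reference, the disavow file itself is a plain-text file: lines starting with “#” are comments, a `domain:` prefix disavows every link from a domain, and a bare URL disavows a single page. A small illustrative example (the domains are hypothetical):

```text
# Contacted owner on 2016-06-01 and 2016-06-14; no reply.
domain:spammydirectory.example
# Paid link in a blog post; removal request pending.
http://lowqualityblog.example/post-with-paid-link.html
```

Google states that comment lines are ignored by the tool, so keep your detailed outreach log in the accompanying documentation as well.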
Note that the domain may never fully recover unless you replace the bad links with links of comparable ranking power. Once most of the affected links have been re-crawled, the algorithm must then pick up the new information.
Now that Penguin is part of the more than 200 signals used in Google’s core algorithm, Penguin-style updates happen much more frequently.
Google Panda
The Google Panda algorithm was released on February 24th, 2011, and is built entirely around the concept of content. It is essentially a content quality filter that analyses the quality of an entire website’s content, specifically designed to lower the rank of sites with low-quality or thin content. It targets sites with duplicate, plagiarized, or thin content; user-generated spam; and keyword stuffing.
As Matt Cutts, then head of Google’s webspam team, put it in a blog post announcing the first iteration of Panda in 2011:
“This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”
In a nutshell, Google wants webmasters to focus on delivering the best user experience possible on their websites so they can send users to the most relevant pages with the highest quality available on the web. If your site has a certain amount of what Google describes as poor quality content that provides little or no value, the entire site will be categorized as a low quality site, and it will be filtered from ranking high in the search results. Panda rollouts have become more frequent, so both penalties and recoveries now happen faster.
Panda also made “user engagement” a ranking factor. Here are some of the engagement signals the algorithm is thought to consider:
- how does the visitor engage with the website’s content when they receive it?
- does the visitor share it?
- does she comment on it?
- does she stay on the site a long time?
- does she access more than a page on that site?
- does she leave within 30 seconds of arriving on the site?
- does she return to the site or mention it independently afterward?
Today, even pages that are a perfect keyword match may now be filtered from the search results due to weak user engagement. So, ranking at the top of the first page of Google is not just about creating high quality content and getting more social signals and relevant backlinks pointing to your site. It’s also about how visitors to your site engage with it.
Panda is a site-wide penalty, not a page penalty. This means that if a certain proportion of a site’s content is judged poor quality by Panda, the entire site falls below Panda’s quality threshold, and the whole site is filtered out of the top ten search results.
Note that unless you have a considerable amount of low-quality content, you won’t get hit with a manual penalty from Google; your site simply won’t rank well no matter what you do. If you’re experiencing ranking issues, don’t automatically assume that you need more links. Rather, consider performing a comprehensive content audit to identify whether your site does have a lot of what Google defines as thin, low-quality content that provides little value.
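A very rough first pass at such a content audit can be automated, for example by flagging pages that fall under a word-count threshold or that duplicate another page’s text. The pages and the 300-word threshold below are illustrative assumptions, not Google’s criteria:

```python
import hashlib

# Hypothetical crawl output: URL -> page body text.
pages = {
    "/guide": "A detailed, original guide " * 100,  # substantial content
    "/tag/widgets": "Posts tagged widgets.",        # thin page
    "/tag/gadgets": "Posts tagged widgets.",        # duplicate of the above
}

MIN_WORDS = 300  # a common (but arbitrary) thin-content threshold

def audit(pages, min_words=MIN_WORDS):
    """Flag pages that are thin (too few words) or exact duplicates."""
    seen, report = {}, {}
    for url, text in pages.items():
        issues = []
        if len(text.split()) < min_words:
            issues.append("thin")
        digest = hashlib.sha1(text.strip().encode()).hexdigest()
        if digest in seen:
            issues.append(f"duplicate of {seen[digest]}")
        else:
            seen[digest] = url
        if issues:
            report[url] = issues
    return report

print(audit(pages))
```

A full audit would also weigh engagement data and near-duplicate similarity, but even this crude pass surfaces the tag and archive pages that commonly trip Panda.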
Google Penguin
Google Penguin looks for spammy and irrelevant links. The algorithm analyzes the inbound link profile of every website for over-optimized anchor text. If a backlink profile is dominated by keyword-rich anchors, with few branded anchors, naked URLs, or universal anchors (i.e., “click here”, “more info”, “read more”, or “here”), the profile looks heavily optimized and the site is susceptible to a Penguin penalty. Mixing in non-descriptive text links such as “read more”, “click here”, “check out this website”, and “visit us here” is a good way to keep your profile looking natural and diverse.
In addition, there are three main backlink factors that can be used to identify these types of link patterns:
- Link quality – Sites with a natural link profile include both high- and low-quality links. Manufactured link profiles tend to consist of either mostly low-quality links or only high-authority links (such as from a private blog network).
- Link growth – Sites with manufactured link profiles tend to build lots of links within a very short period. Sites that build links naturally tend to build links steadily over time. Avoid unusual spikes in link growth.
- Link diversity – Legitimate sites attract links from diverse sources (contextual, blog comments, news sites, resource sites, etc.). However, links from very few sources (such as blog comments and directories) are considered manipulative.
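The anchor-text side of this analysis can be sketched as a simple classifier that buckets each anchor and measures what share of the profile is keyword-rich. The brand name and anchors below are hypothetical:

```python
import re
from collections import Counter

GENERIC = {"click here", "read more", "more info", "here",
           "visit us here", "check out this website"}
BRAND = "examplebrand"  # hypothetical brand name

def classify(anchor):
    """Bucket an anchor text into the categories described above."""
    a = anchor.lower().strip()
    if a in GENERIC:
        return "generic"
    if re.match(r"^https?://|^www\.", a):
        return "naked-url"
    if BRAND in a:
        return "branded"
    return "keyword"  # exact/partial-match commercial anchors

anchors = ["ExampleBrand", "click here", "buy cheap widgets",
           "buy cheap widgets", "buy cheap widgets",
           "https://examplebrand.example"]
dist = Counter(classify(a) for a in anchors)

# A profile dominated by keyword anchors is the over-optimization
# signal Penguin is said to look for.
keyword_share = dist["keyword"] / len(anchors)
print(dist, f"keyword share: {keyword_share:.0%}")
```

In practice you would feed in anchors exported from a backlink tool; the useful output is the distribution, not any single anchor.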
Rather than affecting the ranking of an entire website, Penguin now devalues spam by affecting the ranking of individual pages based on spam signals.
Mobile-Friendly Update
On April 21, 2015, Google released the mobile-friendly ranking algorithm, designed to boost mobile-friendly pages in Google’s mobile search results. Because the update primarily boosts the rankings of mobile-friendly sites, a site that is not mobile-friendly isn’t penalized as such, but it will be pushed down in the search results.
One of the best ways to prepare is to test that Google considers your web pages to be mobile-friendly by using its Mobile-Friendly Test tool.
Here are a few general mobile-friendly principles to keep in mind:
- Avoid software that most mobile devices can’t render, e.g., Flash.
- Use responsive design
- Use a text size that is easily readable on a small screen (typically 16px or more)
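As a minimal illustration of the last two points, a responsive page typically declares a viewport and adapts its layout with CSS media queries (the class name here is hypothetical):

```html
<!-- Tell mobile browsers to use the device width, not a desktop canvas -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  body { font-size: 16px; }        /* readable on small screens */
  @media (max-width: 600px) {
    .sidebar { display: none; }    /* collapse secondary layout on phones */
  }
</style>
```

Google’s Mobile-Friendly Test tool will confirm whether a page with this kind of setup passes.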
Top Heavy Update
Google’s Top Heavy update looks at your page layout, and if it finds that the ads above the fold are excessive, it can penalize your site and downgrade it in the search results.
According to Google’s Webmaster Central Blog when the first update came out in 2012, Google stated that they had received “complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content “above-the-fold” can be affected by this change. Such sites may not rank as highly going forward.”
This is a site-based penalty. That means that either all of your content is penalized or none of it is. Google has also confirmed that they will not penalize all sites with above-the-fold ads, but just those sites that occupy too much real estate vs. useful content in the top section of a webpage.
Google released a special tool at browsersize.googlelabs.com to help you visualize whether your site may have been impacted by this update.
Payday Loan Update
Google released the Payday Loan update to identify and penalize websites that use black-hat techniques to improve their rankings for heavily trafficked keyword queries like “payday loans,” “Viagra,” “casinos” and various pornographic terms. The update targeted spammy queries mostly associated with shady industries such as super-high-interest loans, porn, and other heavily spammed niches. The first Payday Loan update occurred in June 2013; Payday Loan 2.0 followed on May 16, 2014, and Payday Loan 3.0 shortly thereafter in June 2014.
Google Pirate
The “Pirate” algorithm was released in 2012, and was specifically designed to algorithmically penalize the growing number of torrent sites mainly used for pirating media and software. Google took a strong stance on piracy, which is essentially the theft of copyrighted content.
The algorithm works based on copyright reports. If a site has a lot of copyright violations, it will be penalized by this algorithm. While new torrent sites can be established, they will be removed from the search results each time the algorithm is run if they have accumulated enough violations.
Google Fred
Google Fred was a fairly significant algorithm update released in March 2017. The majority of affected sites were blogs with thin, low-quality content on all sorts of topics. These sites carried a large number of ads or affiliate links spread throughout, and seemed to have been created for the express purpose of generating revenue rather than solving problems for users.