What is Google Penguin?
It’s the code name for a Google algorithm update first announced on 24 April 2012. The update is aimed at lowering the search rankings of sites that violate Google’s Webmaster Guidelines by using so-called black-hat SEO techniques: artificially inflating a page’s ranking by manipulating the number of links pointing to it. These tactics are commonly called link schemes. According to Google staffer John Mueller, Google has announced all Penguin filter updates to the public.
Google Penguin 4.0
On 23 September 2016, Google announced that Penguin had become part of Google’s core algorithm, which means it now updates in real time. Google will therefore no longer make announcements about future refreshes. Real time also means that sites are evaluated in real time and rankings are impacted in real time as well.
In previous years, webmasters instead had to wait for the next update roll-out to escape a Penguin penalty. Penguin 4.0 is also more granular than its predecessors: it can affect individual URLs rather than demoting a whole site.
Penguin 4.0 also differs from earlier versions in that it doesn’t demote a site when it finds bad links; it discounts them instead. In other words, it ignores those links, and they no longer count towards ranking.
As a result, there is less need to use the disavow file, since Google uses both human reviewers and the algorithm to identify artificial (unnatural), deceptive, or manipulative links, and flags them in a site’s Manual Actions report. Although links remain a highly relevant ranking factor, not all links are treated in the same manner, and link quality has therefore gained importance since this update.
What are the Triggers for Penguin?
Penguin targets two specific practices:
• Keyword Stuffing
This means populating a page with repeated keywords, or an unusually large number of keywords, in an effort to manipulate rankings by appearing relevant to specific search phrases.
• Link Schemes
The purchase, acquisition or development of backlinks from unrelated or low-quality sites, creating an unnatural picture of relevance and popularity in an attempt to manipulate Google into awarding high rankings. For instance, a company could flood the internet with spam comments linking back to itself and claiming it is the best company. A company might also pay for links stating it is the best, placed in unrelated third-party articles on entirely different topics.
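Google has never published a threshold at which a page counts as keyword-stuffed, but a rough self-audit can start with simple keyword density. The sketch below is a minimal illustration in plain Python, using a fabricated page snippet; the 5% cutoff is an illustrative assumption, not an official Google figure.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the page's words that are exactly `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Fabricated example of a stuffed page snippet.
page = ("Cheap shoes! Buy cheap shoes online. Our cheap shoes are "
        "the best cheap shoes for buying shoes cheap.")

density = keyword_density(page, "cheap")
print(f"'cheap' density: {density:.1%}")       # 5 of the 18 words
if density > 0.05:   # illustrative cutoff, not a Google threshold
    print("warning: possible keyword stuffing")
```

A real audit would also look at keyword placement (titles, headings, alt text) and whether the phrasing reads naturally, which a density number alone cannot capture.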
How Can I Discover if I Have Been Hit by Penguin?
First, it is essential to recognize the difference between Penguin and a manual penalty for artificial linking. In short, Penguin is a Google index filter that applies to all sites, while a manual penalty is always specific to a single site that Google has determined to be spamming. Manual penalties may result from a site being reported for spam by Google users. It is also speculated that Google manually monitors some industries (such as payday loan companies) more closely than others (such as cupcake bakeries).
If your website analytics show a decline in traffic or rankings on a date connected to a Penguin update, there is a possibility that you’ve been affected by the Penguin filter. First rule out traffic fluctuations that are expected from phenomena such as seasonality (for example, Christmas tree farms in April). Then thoroughly evaluate whether your linking practices or keyword optimization would be considered spammy by Google, leaving your website vulnerable to an update such as Penguin.
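One way to check the timing is to compare average organic traffic in the days just before and after a known Penguin date. The sketch below uses fabricated daily session counts of the kind you might export from an analytics tool; the 7-day window and the numbers are illustrative assumptions, not a diagnostic rule.

```python
from datetime import date, timedelta
from statistics import mean

PENGUIN_4_0 = date(2016, 9, 23)  # Penguin 4.0 announcement date

def drop_ratio(daily, update_day, window=7):
    """Fractional drop in mean traffic across `window` days before vs. after `update_day`."""
    before = [v for d, v in daily if update_day - timedelta(days=window) <= d < update_day]
    after = [v for d, v in daily if update_day <= d < update_day + timedelta(days=window)]
    if not before or not after:
        return None
    return 1 - mean(after) / mean(before)

# Fabricated data: steady 1000 sessions/day, falling sharply on the update date.
daily = [(PENGUIN_4_0 + timedelta(days=i), 1000 if i < 0 else 550)
         for i in range(-7, 7)]

print(f"traffic dropped {drop_ratio(daily, PENGUIN_4_0):.0%} after the update")
```

A sharp step change aligned with an update date is only suggestive; seasonality and other fluctuations still need to be ruled out as described above.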
How Can I Recover from Penguin?
With a manual link penalty, you have to file a reconsideration request with Google once you have cleaned house. You don’t file such a request to have a Penguin penalty lifted. Rather, taking action to remedy the problems will normally earn you ‘forgiveness’ the next time Googlebot crawls your website.
These recovery steps include the following:
• Removing any artificial links you control, including links you have built yourself and links you have caused to be placed on third-party sites
• Disavowing spammy links that you cannot control or remove
• Revising your website’s content to remedy over-optimization, making sure that keywords are used naturally rather than robotically, nonsensically or repetitively, and never on pages where there is no relationship between the keywords and the topic
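For the disavow step, Google Search Console accepts a plain-text file with one URL or domain per line; lines starting with # are comments, and the domain: prefix disavows every link from an entire site. A minimal example, with hypothetical domains:

```
# Removal request sent to the webmaster, no response
http://spam-directory.example.com/listing/my-site
# Disavow every link from an entire link farm
domain:link-farm.example.net
```

Disavowing is a last resort for links you cannot get removed; as noted above, Penguin 4.0 already discounts most bad links on its own.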
Penguin was created to close a critical weakness in Google’s system that allowed the algorithm to be deceived by large numbers of low-quality links and by keyword over-optimization of web pages. To avoid having your site devalued by Google for spam practices, all the content you publish should reflect natural language, and your link-earning and link-building practices have to be regarded as “safe.”