The release of Google Penguin 2.0 was a much-anticipated event this year, as webmasters watched for the impact of the modified Google algorithm. It has been quite some time since Penguin 2.0 went live on 22 May 2013, and many sites have already felt its effects on their SEO performance. The updated algorithm targets spammy websites that use ‘black hat’ techniques, such as advertorials, link spamming and other nefarious approaches, to achieve a higher rank in Google’s Search Engine Result Pages (SERPs).
Google Penguin 2.0 digs deeper into website content to decide which sites deserve higher ranks and which ones should be pulled down in the SERP rankings. Compared with its predecessors, this new version of Penguin delves into the details of backlink building: how the backlinks work, their indexing patterns, where they are fetched from and so on. According to Google’s distinguished engineer Matt Cutts, websites that have original, meaningful content and natural links need not worry about being hit by Penguin 2.0. Following are some early findings about the effects of Google Penguin 2.0 on SEO:
Google Penguin 2.0 goes deeper into spammy links:
Just like its earlier versions, Penguin 2.0 targets unnatural links. In addition, it takes a closer look at more than just the home page. A study of the web content and link profiles of 13 websites that lost their rankings after the Penguin 2.0 release showed that heavy use of exact-match anchor text was the key reason their links were identified as unnatural.
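To make the study’s finding concrete, here is a minimal sketch of how one might measure whether a single anchor text dominates a backlink profile. The data and the 30% threshold are purely hypothetical illustrations, not anything Google has published:

```python
from collections import Counter

def anchor_text_profile(anchors):
    """Return each anchor text's share of the backlink profile."""
    counts = Counter(anchors)
    total = len(anchors)
    return {text: count / total for text, count in counts.items()}

# Hypothetical backlink profile: the target keyword dominates.
anchors = ["cheap shoes"] * 70 + ["Example Store"] * 20 + ["click here"] * 10
profile = anchor_text_profile(anchors)

# Flag any anchor text used in more than 30% of links (illustrative threshold).
flagged = [text for text, share in profile.items() if share > 0.30]
print(flagged)  # → ['cheap shoes']
```

A natural link profile tends to spread across branded names, bare URLs and generic phrases, so a distribution skewed this heavily toward one commercial keyword is the pattern the study associates with devaluation.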
Devaluation of websites:
Websites with poor-quality content are being penalised by Google Penguin 2.0. Blogs where unnatural links and irrelevant content are published together are being significantly devalued by Google’s new algorithm.
Devaluation of backlinks:
Traditional SEO methods are gradually losing importance after the release of Google Penguin 2.0. These include keyword-rich text links in content, links in the author bio section and content with embedded links. Today, Penguin 2.0 favours original, informative content, since it naturally garners links, which results in a better SEO ranking.
Author rank:
This is another method Google now uses to keep a constant watch on content quality. If an author continuously publishes fluff, filler or spun content, Google can track those links and sites and depreciate the author’s rank, along with devaluing the spammy sites and unnatural links.
Social media signal:
Penguin 2.0 pays less attention to the volume of social shares of a commercial product or service. Rather, it places more emphasis on the particular profiles that share the content on social media sites. This is expected to eventually curb traffic inflated by paid sharing services and encourage natural promotion of a product.
Enhanced co-occurrence for improved ranking:
Anchor text still plays an important role in influencing web rankings. But after the release of Penguin 2.0, Google places much more emphasis on the co-occurrence of words, phrases or links within search queries and textual content. This means that words, phrases and brand mentions should co-occur naturally within web content, at a normal frequency, even when no hyperlink is present.
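As a rough illustration of what “co-occurrence” means here, the sketch below counts how often two terms appear in the same sentence of a page’s copy, with no hyperlink involved. The sample text and term pairing are hypothetical, and this is only a toy model of the idea, not Google’s actual signal:

```python
import re

def cooccurrence_count(text, term_a, term_b):
    """Count sentences in which both terms appear together."""
    sentences = re.split(r"[.!?]+", text.lower())
    return sum(1 for s in sentences if term_a in s and term_b in s)

# Hypothetical page copy mentioning a brand alongside a topic, with no links.
copy = ("Acme Widgets publishes a guide to widget maintenance. "
        "Widget maintenance tips from Acme Widgets are cited often. "
        "An unrelated sentence about something else.")

print(cooccurrence_count(copy, "acme widgets", "widget maintenance"))  # → 2
```

The point of the signal is that a brand repeatedly mentioned near a topic in plain text can associate the two even without keyword-rich anchor links.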
The scope of Penguin 2.0 varies from one language to another, as languages with more web spam face a greater impact. As of now, about 2.3% of US-English queries have been affected to a degree noticeable by regular users. In short, Penguin 2.0 is an effective way for Google to punish ‘black hat’ SEO tactics while promoting quality content and natural links up Google’s SERPs.
Kevin Cull takes a keen interest in Google Penguin 2.0 and its effects on SEO. He loves to research the latest SEO trends and the changes required after every Google algorithm update.