29 Sep Penguin 4.0 is finally here, Google confirms
After a couple of years of waiting, and various algorithm fluctuations described as ‘normal turbulence’, Google has finally confirmed today that its Penguin algorithm update is rolling out in all languages.
The last update in 2014 – Penguin 3.0 – affected less than 1% of US/UK searches, but that still translated to roughly 12 billion queries.
Here we’ll detail all the changes you can expect from Penguin 4.0 according to Google’s blog post. But first a little refresher…
According to Adam Stetzer in his post on the delayed Penguin update, Google first launched the Penguin update in April 2012 to catch sites spamming the search results. Specifically the ones who used link schemes to manipulate search rankings.
Penguin basically hunts down inorganic links; the ones bought or placed solely for the sake of improving search rankings.
Before Penguin, bad links were simply devalued and needed to be replaced in order to recover search rankings.
But according to Chuck Price, after Penguin, bad links became ‘toxic’: recovery required a link audit and the removal or disavowal of spammy links, and a Penguin refresh was usually needed before any signs of improvement appeared. This could take a while.
Thankfully, this is one of the things addressed in today’s update…
What to expect from Penguin 4.0
The following improvements were among webmasters’ top requests to Google:
Penguin is now real-time
As we stated earlier, the list of sites affected by Penguin was previously only refreshed periodically, all at the same time.
According to Google:
“Once a webmaster considerably improved their site and its presence on the internet, many of Google’s algorithms would take that into consideration very fast, but others, like Penguin, needed to be refreshed.”
But now, Penguin data is refreshed in real time, so any changes will be made as soon as the affected page has been recrawled and reindexed.
Google also states that it is not going to comment on future refreshes.
Penguin is now more granular
Penguin now devalues spam by adjusting ranking based on spam signals, rather than affecting ranking of the whole site.
So any penalties will be applied to a specific page rather than an entire domain, which seems much fairer in the long run.