Google’s new update (Penguin 2.0) and what it means for your website’s SEO
Google’s Penguin 2.0 update will be the biggest update to its search engine since Penguin 1.0 back in April 2012.
The update is aimed at decreasing the search engine rankings of websites that violate Google’s Webmaster Guidelines by using black-hat SEO techniques to artificially increase the ranking of a webpage by manipulating the number of links pointing to it. Such tactics are commonly described as link schemes.
Google is referring to the update as a webspam update, which will be better at detecting black-hat SEO whilst increasing the ranking of sites with authority in their field, keeping users flowing to relevant and useful sites.
So what does Google consider ‘black-hat SEO’?
- Automatically generated content – rubbish churned out by software rather than written for readers
- Cloaking – showing different content to users than to Google
- Scraping content from others – your website should be full of organic, rich content, not rubbish copied from everyone else
- Hidden text or keyword stuffing – relevant keywords should be found organically in the content on your site
- User-generated spam – from forums, guestbooks and user profiles
- Thin content with little or no value – words for words’ sake
- Unnatural links to a site – the result of purchasing links to increase ranking
- Spammy free web hosts – what else is your website sharing its IP address with?
- Sneaky redirects – sending users somewhere other than where they expected to land
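To make ‘cloaking’ concrete, here is a minimal, hypothetical sketch of the kind of server-side trick it describes: serving one page when the visitor looks like Google’s crawler and a different page to everyone else. The function and file names are purely illustrative, not taken from any real site.

```python
# Hypothetical sketch of "cloaking": serving different content to
# Google's crawler than to real visitors. This is exactly the sort of
# trick Penguin 2.0 is designed to detect and penalise.

def choose_page(user_agent: str) -> str:
    """Return which page a cloaking site would serve for this visitor."""
    if "Googlebot" in user_agent:
        # Keyword-stuffed page shown only to the search engine crawler.
        return "keyword_stuffed_page.html"
    # Ordinary page shown to human visitors.
    return "real_page.html"

# A crawler and a browser receive different content:
print(choose_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))     # keyword_stuffed_page.html
print(choose_page("Mozilla/5.0 (Windows NT 10.0) Firefox/115.0")) # real_page.html
```

Because the crawler never sees what real users see, the rankings it assigns are based on a lie, which is why Google treats any user-agent-based content switching of this kind as webspam.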
But how will this affect your website?
Well, if your website is optimised correctly and rich in content, then the answer is that it won’t affect it; in fact, it may help it beat other ‘cheating’ websites up the rankings. If you have high-quality organic content then you have nothing to worry about.
Google has even put in extra effort to ensure that sites with authority in their industry rank higher, and additional work on link analysis will be arriving in the next few months. Alongside the update, Google will also be changing the way some of its results work past the first page. For example, a user is less likely to stumble upon a clump of results from the same domain on the first page, but is more likely to in the following pages as results get more specific; results from that domain will then thin out once again.