I used the following methods to protect my sites in advance from Google Panda and Penguin - major algorithm changes Google rolled out to penalize and de-index over-optimized websites, most recently during the summer of 2012.
If you got hit by these massive updates recently, the following SEO tips should help improve your site's suddenly rocky standing with big G over time...
First, let's understand Panda and Penguin. The two main offenses these giant Google algo updates targeted were duplicate content and unnatural link schemes - mainly over-optimized anchor text on inbound links.
So, if your site had a bunch of low-quality content and a blatant inbound link scheme, you probably suffered.
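If you want a rough read on whether your anchor text profile looks unnatural, here's a minimal Python sketch, assuming you have a CSV export of your backlinks (most link tools provide one) with an "anchor" column - the filename, column name, and 30% threshold are my own placeholder assumptions, not anything Google publishes:

```python
# Tally anchor-text distribution from a hypothetical backlink export.
# "backlinks.csv" and its "anchor" column are assumptions -- adjust them
# to match whatever your backlink tool actually exports.
import csv
from collections import Counter

anchors = Counter()
with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchors[row["anchor"].strip().lower()] += 1

total = sum(anchors.values())
for anchor, count in anchors.most_common(10):
    share = count / total
    # 30% is an arbitrary rule-of-thumb threshold, not a published limit
    flag = "  <-- suspiciously concentrated" if share > 0.3 else ""
    print(f"{share:6.1%}  {anchor!r}{flag}")
```

If one exact-match phrase dominates the list, that's the kind of pattern Penguin was built to catch.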
If your site had a sizable amount of duplicate content on multiple subpages, take the time to make some onsite content changes:
- Switch up the meta descriptions to make each subpage unique (doesn't have to be Shakespeare, just a basic description of the page - these are often left blank at first; see the audit sketch after this list)
- Edit duplicate image ALT text if you use the same picture on multiple pages
- Add a few new subpages with completely unique, informative content
- Insert meta keyword tags on each page - I realize these have been devalued for SEO purposes, but it surely can't hurt to have a different set on each subpage vs. leaving them blank
- Improve your internal link structure by adding unique links on each subpage to other pages of your site
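To catch the first two items above automatically, here's a small Python sketch, assuming your pages are live and you have the third-party requests and beautifulsoup4 packages installed; the example.com URLs are placeholders for your own subpages:

```python
# Flag meta descriptions and image ALT text reused across pages.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

PAGES = [  # placeholders -- swap in your own subpages
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/contact/",
]

descriptions = defaultdict(list)  # meta description -> pages using it
alt_texts = defaultdict(list)     # ALT text -> pages using it

for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    meta = soup.find("meta", attrs={"name": "description"})
    if meta and meta.get("content"):
        descriptions[meta["content"].strip()].append(url)
    else:
        print(f"Missing meta description: {url}")

    for img in soup.find_all("img"):
        alt = (img.get("alt") or "").strip()
        if alt:
            alt_texts[alt].append(url)

for desc, urls in descriptions.items():
    if len(urls) > 1:
        print(f"Meta description shared by {len(urls)} pages: {desc[:60]!r}")

for alt, urls in alt_texts.items():
    if len(urls) > 1:
        print(f"ALT text shared by {len(urls)} pages: {alt!r}")
```

Anything the script flags is a quick, low-effort rewrite target.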
A prime offender likely to have suffered reduced rankings is a local business website targeting each suburb of a metro area with a specific landing page for each city/town.
It's easy to get lazy and crank out repeated content with just a few words (such as the targeted city name) changed on each page, but it's not worth it long term.
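As a quick sanity check on those city pages, here's a rough Python sketch that compares the visible text of two landing pages with difflib; the URLs and the 90% similarity threshold are placeholder assumptions on my part, not a known Google cutoff:

```python
# Compare two landing pages' visible text for near-duplication.
# Requires the third-party requests and beautifulsoup4 packages.
import difflib

import requests
from bs4 import BeautifulSoup

def page_text(url: str) -> str:
    """Fetch a page and return its visible text, whitespace-normalized."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()  # strip script/style content that isn't visible copy
    return " ".join(soup.get_text().split())

# Placeholder city landing pages -- swap in two of your own.
a = page_text("https://www.example.com/plumber-springfield/")
b = page_text("https://www.example.com/plumber-shelbyville/")

ratio = difflib.SequenceMatcher(None, a, b).ratio()
print(f"Similarity: {ratio:.0%}")
if ratio > 0.9:  # arbitrary rule of thumb for "basically the same page"
    print("Near-duplicates -- rewrite one of these with genuinely unique copy.")
```

If two pages score that high, the swapped city name is the only real "content" difference - exactly the kind of thin page Panda was designed to devalue.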
Search engines are only going to get better at detecting duplicate content (both within your own site and across multiple sites) and, as with ANY search demotion your site experiences, the solution here is simple: play by the rules.
Create quality unique content and write for an audience, not search spiders crawling your site.