How Duplicated Web Content Affects Clicks and Impressions, and a Possible Fix

This is a fun experiment I ran on how duplicated web content (content copied from other websites) affects the ranking of your website, ultimately leading to a loss of clicks and impressions. While observing the penalty from Google, I managed to “fix” it and found a possible solution.

I started a website and wrote about 20 original articles, and clicks and impressions grew steadily. But when I later added another 20 duplicated articles, clicks dropped to under 10 and impressions dropped all the way down to around 180, and they could never climb much higher again; it seemed as if the site was being held at that level.

A friend told me the duplicated content was the cause, and that this was a penalty from Google. Although some people said such a downgrade in ranking could be permanent, I tried hiding the duplicated content from Google’s search engine to see if that worked.

I could either delete the content or disallow search spiders from crawling those pages. I chose the second, since most people would want to keep their content if they can; it might also increase pageviews when visitors browse the site. And it seems to work! I simply added “Disallow” rules for those pages in robots.txt.
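As a minimal sketch, assuming the duplicated articles all sit under one directory such as /copied/ (a hypothetical path; substitute whatever your site actually uses), the robots.txt at the site root would contain:

```
User-agent: *
Disallow: /copied/
```

If the duplicated pages are scattered rather than grouped under one path, you can instead list one Disallow line per page. Note that robots.txt only blocks crawling; it is not a guarantee that already-indexed pages disappear immediately.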

Although Google now reports errors saying those pages are not crawlable (which is exactly what I intended, so I ignore them), impressions and clicks started to rise again two to three weeks after I made the change.
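If you want to sanity-check the rules yourself before waiting weeks for Google to re-crawl, Python’s standard-library urllib.robotparser can evaluate a robots.txt locally. The /copied/ path and example.com domain below are hypothetical placeholders matching the sketch above:

```python
# Verify locally that the robots.txt rules block the intended pages.
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Feed the rules directly as lines instead of fetching them over HTTP.
rp.parse("""
User-agent: *
Disallow: /copied/
""".splitlines())

# Duplicated page: should be blocked for Googlebot.
print(rp.can_fetch("Googlebot", "https://example.com/copied/article-1"))  # False
# Original article: should remain crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/my-original-post"))  # True
```

This is the same logic well-behaved crawlers apply, so a `False` here means the page is excluded from crawling once the spider re-reads your robots.txt.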
