How Duplicated Web Content Affects Clicks and Impressions, and a Possible Solution

by Oscar

This is a very fun experiment I did on how duplicated web content (content copied from other websites) affects the PageRank of your website, ultimately leading to a loss of clicks and impressions. While observing the penalty from Google, I managed to “fix” it and found a possible solution.


I started a website and wrote about 20 original articles, and the clicks and impressions grew steadily at a good pace. But later I added another 20 duplicated articles, and the clicks dropped to fewer than 10 while impressions dropped all the way down to around 180, and they could never climb much higher again. It felt as if they were being capped at that level.

A friend told me it was the duplicated content that caused this, and that it was a penalty from Google. Although some people said the downgrade in page rank could be permanent, I tried hiding the duplicated content from Google’s search engine to see if that would work.

I could either delete the content or disallow search spiders from crawling those pages. I chose the second option, since most people would want to keep their content if they can; it might also increase pageviews when visitors browse the website. And it seems to work! I simply added “Disallow” rules for those pages in robots.txt.
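As a rough sketch of what that looks like (the paths below are made up for illustration, not the actual pages on my site), the robots.txt sitting at the root of the domain would contain rules like:

# Block all crawlers from the duplicated articles (example paths only)
User-agent: *
Disallow: /duplicated-article-1/
Disallow: /duplicated-article-2/

This tells crawlers not to fetch those URLs, while the pages themselves stay live for human visitors browsing the site.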

Although I am getting errors from Google saying these pages are not crawlable (which is what I intended anyway, so I ignore them), impressions and clicks started to rise again two to three weeks after I made that change.
