How Google Views And Treats Duplicate Content Issues

There is an ongoing controversy among webmasters as to how Google and other search engines view and treat duplicate content issues.

Before we reveal the results of our in-depth research, we must place “duplicate content” in its proper perspective.

What is duplicate content?

Duplicate content is more or less identical content appearing on the same site or on different sites.

The definition above immediately makes clear that duplicate content is primarily of two types:

A) More or less identical content appearing on the same site

Google classifies these into two types:

1. Duplicate content, deceptive in origin and with malicious intent, on the same site.

Falling within this category are webmasters who consciously duplicate content on their websites with a view to manipulating search engine rankings and channelling web traffic to their advantage.

2. Unintentional duplicate content without any deceptive intent, on the same site.

This occurs unintentionally in a number of instances, for example:

* Discussion forums that can generate both regular and stripped-down pages targeted at mobile devices

* Store items shown or linked via multiple distinct URLs

* Printer-only versions of web pages

How do Google and other search engines view and treat the scenario where more or less identical content appears on the same site?

Our in-depth research has revealed the following:

Where, as in the first scenario, the duplication is premeditated, malicious, or deceptive in origin, Google frowns on it and will take steps to sanction the erring sites, since their action constitutes a violation of Google’s webmaster guidelines.

Such sanctions may include complete removal from the Google index.

Where, on the other hand, as in the second scenario, the duplication arises unintentionally and without malicious intent, Google will not penalize the webmaster but will instead take steps to index only the one version of the duplicated pages it considers ideal for that content.

Contrary to what is often touted, the site’s listing on the search engine results pages (SERPs) will therefore not be placed in the supplemental listing.

Duplication may, however, indirectly influence rankings: if inbound links to the webmaster’s pages are split among the various duplicate versions, each version ends up with a lower per-page PageRank.

Webmasters are therefore advised to take proactive steps to address duplicate content issues on their websites and to ensure that visitors see the content intended for them.
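By way of illustration only, and as a rough sketch rather than anything mandated by Google, one such proactive step is to check that each page declares a preferred URL with the rel="canonical" link element, which search engines use to consolidate duplicate versions of a page. The short Python script below audits a few hypothetical URLs for that element; the addresses shown are placeholders, not real pages.

```python
# A minimal sketch (not an official Google tool) of one proactive step: checking
# whether each page declares a canonical URL via <link rel="canonical">.
# The URLs used below are hypothetical placeholders.
from html.parser import HTMLParser
from urllib.request import urlopen


class CanonicalFinder(HTMLParser):
    """Collects the href of any <link rel="canonical"> element on a page."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical = attrs.get("href")


def audit(urls):
    """Report which pages declare a canonical URL and which do not."""
    for url in urls:
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
        except OSError as exc:
            print(f"{url}: could not fetch ({exc})")
            continue
        finder = CanonicalFinder()
        finder.feed(html)
        if finder.canonical:
            print(f"{url}: canonical -> {finder.canonical}")
        else:
            print(f"{url}: no canonical declared (potential duplicate-content risk)")


if __name__ == "__main__":
    # Hypothetical example: a regular page and its printer-only duplicate.
    audit([
        "https://www.example.com/article",
        "https://www.example.com/article?print=1",
    ])
```

Another commonly recommended remedy is a permanent (301) redirect from the duplicate versions of a page to the preferred one, so that both visitors and search engines land on a single URL.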
