First of all, what is duplicate content? In simple terms, it refers to any text content on a website that has been wholly or substantially copied from an existing web source.
Why shouldn’t you copy content from someone else or duplicate content across multiple pages on your own website? The answer is simple – it’s Google. Over the years, Google has repeatedly said it will penalize websites that have duplicated content by lowering their rankings in the search results or removing them completely. This is an unthinkable situation to be in, especially for the many businesses that rely heavily on traffic generated via Google search.
Why does Google penalize websites that copy content? There are many reasons, but three stand above the rest.
1. Copyright
By copying someone else’s content and placing it, in whole or in part, on your website, you are breaking copyright law. Almost all content on the web is the copyright of the website owner, and using that content in any form requires permission.
2. Spam
Over the years, web spammers have become more sophisticated. These days, spammers use ‘scrapers’ to automatically copy content from legitimate websites and populate their own sites. These websites are often used for malicious activity such as fraud and widespread virus distribution.
3. User value
Websites that contain duplicate content across several pages or copy content from other websites are adding no value for the user. Providing the same information as another source is redundant and Google realizes this – it aims to provide the most relevant, useful information to its users via the search results.
To combat copyright infringement and spam, and to improve the quality of its search results, Google penalizes sites that duplicate content. Google uses an advanced algorithm to identify what it believes are websites containing duplicate content. At its most basic level, Google deems the website that has hosted the content the longest to be the original, and all other sites with the same content are treated as duplicates.
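Google does not publish the details of its duplicate-detection algorithm, but the general idea of comparing two pages for near-identical text can be illustrated with a common technique: break each text into overlapping word "shingles" and measure the overlap with a Jaccard score. The snippet below is a simplified sketch of that idea, not Google's actual method; the function names and the similarity threshold are illustrative assumptions.

```python
def shingles(text, k=3):
    """Split text into overlapping k-word sequences (shingles)."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard_similarity(text_a, text_b, k=3):
    """Jaccard score between two texts: shared shingles / total shingles."""
    sa, sb = shingles(text_a, k), shingles(text_b, k)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

# Illustrative threshold: a score near 1.0 suggests duplicated text,
# a score near 0.0 suggests the texts are unrelated.
original = "our widget is the best value widget on the market today"
copied = "our widget is the best value widget on the market today"
unrelated = "a completely different page about gardening tips and tools"

print(jaccard_similarity(original, copied))     # identical text scores 1.0
print(jaccard_similarity(original, unrelated))  # no shared shingles scores 0.0
```

Real systems at web scale use far more efficient variants (such as hashing shingles and sampling them), but the underlying comparison is the same: pages whose text overlap exceeds a threshold are flagged as duplicates.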
Over the past 10 to 12 months, Google has publicly released information regarding changes to its duplicate content algorithm – these changes have been dubbed Panda. The Panda update was originally rolled out in February 2011 with several smaller updates occurring since then. The update aimed to lower the rank of “low-quality sites” and return higher-quality sites near the top of the search results and has reportedly affected the rankings of almost 12% of all searches.
Who does Panda affect, and should you be worried about it? As stated above, Panda affected searches – not 12% of websites, but 12% of all searches performed on Google, which now return slightly different results. The biggest losers after the rollout were spam websites.
However, if you run an e-commerce website that duplicates product descriptions from the manufacturer’s website or run a content site that cross-posts content across multiple domains then you should be worried. Some of the civilian casualties of the Panda update included large e-commerce sites with duplicate product descriptions as well as many content-based websites.
The easy way to avoid these penalties is to ensure you are writing new, unique content for your website. Overall, the Google Panda update has been a godsend for search results. Low-quality, spam websites have all but disappeared from the search results.
Find out where your website stands in Google by ordering Netregistry’s free Search Engine Ranking Report. Call 1300 638 734, or visit Netregistry.com.au.