How to Ensure That Google Knows Your Content is Original

In a perfect world, no one would rip off the content from your website and use it on theirs, and Google would know exactly who the original author of the content is. Creating original content is a major struggle for most companies and content creators, and some people take the easy route and copy content from other sources. Although Google is now coming down hard on low-quality websites, it is quite difficult for Google's search algorithms to figure out who the original author of a piece of content is and which websites copied it.

Some content creators also resort to sneaky techniques like copying original content from another website, changing the timestamp, and publishing the article to make it appear as though they published it first. In this video, Matt Cutts, head of Google's Webspam team, talks about how you can ensure that Google knows your content is original and how to take action against sites that have copied it.

Crawling the entire web simultaneously is not feasible

Some content marketers and companies worry that because Google's bots crawl their websites less frequently, the bots could miss important updates and original content. What if another website copies the content and benefits from it simply because Google's bots crawl that website more often? Matt notes that in a perfect world, Google would be able to crawl the entire web at the same time, with every website in perfect shape while being crawled.

However, this is not possible. According to Matt, Google's bots can only fetch a finite number of webpages at any given time. He indicates that Google has the architecture to fetch almost all the webpages on the web simultaneously, but doing so could crash the World Wide Web.

This is why Google prioritizes webpages based on PageRank and other factors to decide which pages to crawl first and more frequently, and this is where errors can creep in.

Tips to inform Google bots about your original content

Firstly, Matt agrees that taking content from another website and changing the timestamp is a shady move, and the practice can get websites penalized. He also describes several steps you can follow to let Google know when you publish original content and when your content has been copied.

When you publish an original blog post or article, tweet it so that your followers can share and engage with it; this helps Google's bots find the post or article much faster. Matt also suggests hooking up PubSubHubbub with your content. This protocol provides near-real-time notifications of new and updated content to subscribers, and Google uses it to improve its crawling process.
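As a rough sketch of how a PubSubHubbub "publish" ping works: once your feed declares a hub, your site tells that hub a feed has new content by POSTing `hub.mode=publish` along with the feed URL, and the hub then fetches the feed and pushes updates to subscribers (including crawlers). The hub and feed URLs below are placeholders for illustration, not endpoints mentioned in the video.

```python
# Minimal PubSubHubbub "publish" ping, using only the standard library.
# The URLs are placeholder assumptions; substitute your own hub and feed.
from urllib import parse, request


def build_publish_ping(hub_url, feed_url):
    """Build the POST target and form-encoded body for a publish notification."""
    body = parse.urlencode({
        "hub.mode": "publish",   # tells the hub new content is available
        "hub.url": feed_url,     # the feed the hub should re-fetch
    }).encode("utf-8")
    return hub_url, body


def ping_hub(hub_url, feed_url):
    """POST the notification to the hub; hubs typically reply 204 No Content."""
    url, body = build_publish_ping(hub_url, feed_url)
    req = request.Request(url, data=body, method="POST")
    with request.urlopen(req) as resp:
        return resp.status
```

After each new post, calling `ping_hub("https://hub.example.com/", "https://example.com/feed.xml")` would notify the hub immediately instead of waiting for the next scheduled crawl of the feed.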

If your content has already been copied and the other website has changed the timestamp, Matt assures that there are several ways to report such websites. One is to file a DMCA (Digital Millennium Copyright Act) takedown request; information on this can be found at http://www.google.com/dmca.html. This allows you to file a complaint against the website copying your content, and that website will be allowed to counter-notify. Rest assured that websites are penalized if they lie when they counter-notify, and they will most often agree to take down the copied content.

If an auto-generated site has copied your content and you see that it is ripping off content from other sources too, you can file a spam report with Google, because this is clearly a low-quality website. Watch this video to learn about the steps Google is taking to identify original authors accurately and the various measures you can take to help Google identify correct authors.