Dan Taylor is a UK-based digital marketer and blogger with a strong background in social media, email marketing and SEO. He’s kindly taken the time to share some SEO tips with us that can help increase your website’s traffic from Google organic search. Who wants more traffic? If you do, read on…
In the last couple of years, webmasters have faced a new foe: Google’s Panda update. Panda (most recently refreshed in July 2015) assesses your site’s content, rewarding high-quality, unique content and penalising sites with low-quality, thin content by ranking them lower in the SERPs (search engine results pages).
But what is low-quality, thin content? In Google’s eyes, ‘thin’ and ‘low quality’ are effectively synonymous. Remember too that to us, quality is subjective, but to Google, quality is a calculated determination, courtesy of the Panda update. Google is unable to measure the persuasiveness of your written copy, the manufacturing standards behind your products or even the scale of your accomplishments; it sees black and white text on a page.
So what constitutes low quality and how can it be avoided?
There are three types of duplicate content: internal duplicates, external duplicates and partial duplicates. Duplicate content alone is not what the Panda update was designed to tackle, but it certainly falls under its remit and can be damaging to your site.
It’s true that a few duplicates on your site won’t hurt you; Google is clever enough to filter them out. But at scale, often on ecommerce websites with hundreds, if not thousands, of product pages, duplication can become a serious issue. While duplicate content is not always indicative of thin content, it can certainly amplify any other Panda issues on the site.
The solution to this issue? Remove the content. There is an argument that you should add canonical tags instead, but telling Google that one URL is canonical only to link it to a number of other versions is not a solution; it prolongs and complicates the issue further.
Internal duplication is an issue faced by a number of different sites, but it is most commonly found on ecommerce sites or sites that list things (such as jobs or holidays).
For example, I took this screen grab from an ecommerce tyre site:
I then ran an exact-match search in Google (using quotation marks around the search query), taking the first sentence from this paragraph; it returned 24 results. So on this ecommerce tyre site there are 24 product pages with duplicate content, and to make matters worse this is the only copy on the page besides the product name, header/footer and pricing information.
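You can run the same sort of check across your own site programmatically. As a minimal sketch (the URLs and product copy below are hypothetical, not taken from the site mentioned), fingerprint each page’s body copy and group pages that share a fingerprint:

```python
import hashlib

# Hypothetical product pages: URL -> the body copy on that page.
pages = {
    "/tyres/goodride-175-65-r14": "Budget tyre offering excellent value for money.",
    "/tyres/goodride-185-60-r15": "Budget tyre offering excellent value for money.",
    "/tyres/michelin-205-55-r16": "Premium tyre with outstanding wet-weather grip.",
}

# Group URLs by a normalised fingerprint of their copy; any group
# with more than one URL is internally duplicated content.
fingerprints = {}
for url, copy in pages.items():
    digest = hashlib.sha256(copy.lower().strip().encode("utf-8")).hexdigest()
    fingerprints.setdefault(digest, []).append(url)

duplicates = [urls for urls in fingerprints.values() if len(urls) > 1]
print(duplicates)
# → [['/tyres/goodride-175-65-r14', '/tyres/goodride-185-60-r15']]
```

In practice you would feed this the scraped copy of every product page, but the grouping logic is the same.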
If you go down the route of creating individual product pages for the same product in every available size, you’re going to end up with duplication issues or a great number of near-blank pages. The solution this site implemented was to add internal canonical tags to all the pages.
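For reference, an internal canonical tag is a single line in the page’s `<head>`. As a sketch (the URLs here are hypothetical), every size-variant page points at one chosen main product URL:

```html
<!-- Placed in the <head> of each size-variant page, e.g.
     /tyres/goodride-175-65-r14 (hypothetical URLs): -->
<link rel="canonical" href="https://www.example.com/tyres/goodride" />
```

Google then treats the referenced URL as the version to index and rank.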
Google is becoming increasingly less tolerant of cross-site duplication, i.e. copying the content from someone else’s website. The general consensus is that copying other people’s content is a matter of legality and ethics. That may be the case, but Google is a machine, and its algorithm doesn’t care about the ethics. It sees the same content, word for word, across two or more sites, judges it low quality, and either ranks it lower than other sites or filters it out altogether.
So how do you resolve this? If you own or control all the sites displaying the duplicate content, then either rewrite the content so it is unique on each site, or apply a cross-domain canonical tag: choose which site is the source of the content, and have the sites copying it (with authorisation) acknowledge that the content isn’t originally theirs.
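A cross-domain canonical works exactly like an internal one, except the `href` points at another domain. As a sketch (both domains here are hypothetical), the authorised copy declares the original as canonical:

```html
<!-- Placed in the <head> of the authorised copy on partner-site.example,
     pointing back to the original source (hypothetical URLs): -->
<link rel="canonical" href="https://www.original-site.example/products/widget" />
```

This tells Google which version should receive the ranking credit.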
If other sites are copying your content and you don’t control or authorise them, you may have to file a DMCA notice and try to get the content taken down that way.
Partial duplicates are pages that vary by only a small amount of content, such as a couple of lines of text or a brand name. A common example is a reseller or affiliate site using its supplier’s product descriptions, or a recruitment site advertising on behalf of a client while the client also advertises the same job specification on its own site. Chances are, the original site will have the ranking advantage.
If you want to improve your ranking, you have to create unique content to support the borrowed content. It doesn’t necessarily have to be paragraphs you’ve paid copywriters to write or invested internal man-hours to create; it could simply be a short editorial piece unique to the product. Another option is leveraging user-generated content, such as product/agency reviews or an indexed social comments plugin.
An argument I’ve faced before is ‘we don’t have the time or the money to create that much content’. However, you don’t have to do it all at once: pick your top-level pages, your most-viewed pages and your best-selling products/converting pages, and start there.
A low ratio of unique content is very similar to duplicate content, but it manifests itself in a different way. Rather than repeating body copy across pages, sites with a low ratio of unique content have too much structural content and too little copy.
This can be a result of excessive navigation, repeated images, repetitive widgets and large amounts of footer text.
If your site structure is not something you can (easily) change, then you need to look at the pages themselves. If you have pages with fewer than 500 words of unique text but a large amount of structural content, be ruthless and ask yourself what value these pages bring. If they are absolutely necessary, beef them up with more unique content; if they’re not, get rid of them.
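Auditing this by hand is tedious, so it is worth scripting the word count. As a minimal sketch (the URLs and copy are hypothetical, and the 500-word threshold is just the rule of thumb above), flag any page whose unique copy falls below the threshold:

```python
import re

# Hypothetical pages: URL -> the unique body copy on that page
# (i.e. excluding navigation, footer and other structural content).
pages = {
    "/guides/choosing-winter-tyres": "Detailed winter tyre buying guide copy. " * 100,
    "/tyres/size/175-65-r14": "All tyres in size 175/65 R14.",
}

THIN_THRESHOLD = 500  # words of unique copy, per the rule of thumb above

def word_count(text: str) -> int:
    """Count the words in a page's unique copy."""
    return len(re.findall(r"\b\w+\b", text))

thin_pages = [url for url, copy in pages.items()
              if word_count(copy) < THIN_THRESHOLD]
print(thin_pages)  # → ['/tyres/size/175-65-r14']
```

Pages on the resulting list are the candidates to beef up or remove.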
Another issue that a lot of sites face is taxonomy pages being indexed. These include paginated pages, categories, alphabetical listings, tags and search results.
These are internal search results pages, and Google doesn’t like them: people don’t want to go from Google’s search results to yours; they want content that satisfies their search intent. These pages can also create a lot of duplication on the site. I would recommend asking your developer to noindex these pages.
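The noindex itself is a one-line meta tag. As a sketch, it goes in the `<head>` of every taxonomy and internal search results page:

```html
<!-- In the <head> of internal search result and taxonomy pages: -->
<meta name="robots" content="noindex, follow" />
```

The `follow` value keeps Google crawling the links on these pages (so your products still get discovered) while dropping the pages themselves from the index.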
Any change you make to your site that will impact your search index has to be carefully considered. Sites that have been penalised by the Panda update face a long road to recovery, but by addressing these issues before Panda addresses them for you, you can avoid the consequences.
Addressing thin content and increasing the level of uniqueness on your site, whether by removing low-value pages, adding supporting content to them or fixing taxonomy issues, will undoubtedly have a positive impact on your site’s SEO.
A great way to improve the level of uniqueness on your site is via your onsite blog (if you don’t have one, get one!). This list may be specific to travel companies, but it is the most comprehensive and detailed list of blog post ideas I have come across (123 to be precise).