Preventing duplicate content

Author Topic: Preventing duplicate content  (Read 5310 times)

Offline Ana Lucia (Topic starter)

Preventing duplicate content
« on: 07-07-2014, 03:46:40 »
Hi Everyone,

I'm developing a new website and would like some opinions on how to avoid duplicate content.

Duplicate content could be an issue because:
- The website lists products across several pages that can be sorted by type, price, colour, material, etc.
- The website also has an internal search engine

My recommendations would be:
- create a "view all" page for each product category, and on every variation generated by the filters apply rel="canonical" pointing to that page
- add a "noindex, follow" robots meta tag to each page generated by internal searches
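As an illustration, the two recommendations above would go in the <head> of the relevant pages roughly like this (the URLs and paths are made-up examples, not from any real site):

```html
<!-- On a filtered variation such as /shoes?colour=red&sort=price,
     point search engines at the consolidated "view all" page: -->
<link rel="canonical" href="https://www.example.com/shoes/view-all">

<!-- On a page generated by the internal search engine, keep it out
     of the index but still let crawlers follow its links: -->
<meta name="robots" content="noindex, follow">
```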

In your opinion, are these procedures correct?

Now, suppose the filters and searches only append "#filter" and "#search" to the original URLs. Would duplicate content still be a problem?
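For what it's worth, everything after "#" is a URL fragment: the browser handles it client-side and it is not sent to the server, so crawlers generally treat URLs that differ only in their fragment as one and the same resource. A quick sketch with Python's standard library (the URLs are made-up examples):

```python
from urllib.parse import urldefrag

# Two filter/search variations that differ only in their fragment:
url_a, frag_a = urldefrag("https://www.example.com/shoes#filter")
url_b, frag_b = urldefrag("https://www.example.com/shoes#search")

# Stripped of their fragments, both resolve to the same URL,
# so a search engine would see a single page:
print(url_a == url_b)  # True
print(url_a)           # https://www.example.com/shoes
```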

Any help on this would be very much appreciated.


Thank you,
Ana Lucia


Offline mts

Re: Preventing duplicate content
« Reply #1 on: 07-08-2014, 12:26:51 »
This article should explain some things:
http://moz.com/learn/seo/robotstxt
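For reference, keeping crawlers out of internal search results via robots.txt (the mechanism the linked article covers) would look something like this; the /search/ path is only an example, so use whatever path your search engine actually generates:

```
# robots.txt at the site root (example path)
User-agent: *
Disallow: /search/
```

One caveat: robots.txt blocks crawling, not indexing. A page disallowed here can still be indexed from external links, and a "noindex" meta tag on it will never be seen because the page is never fetched. So use robots.txt or the "noindex, follow" meta tag for a given set of pages, not both.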

Offline Megan Brown

Re: Preventing duplicate content
« Reply #2 on: 07-11-2014, 03:33:21 »
Nice article! Thanks for sharing the link.
