Duplicate pages

Started by Shantaram, 02-11-2017, 16:17:10


Shantaram (Topic starter)

Hi everyone, I'm a newbie in the SEO business and just started working at a company that runs a yellow-pages site. The format is a directory of professionals (e.g. electricians, technicians...) from all fields; people looking for a certain professional can call a number that appears on the page, depending on the city they live in and the professional they need.
The problem is that the number of duplicated pages is currently enormous. There are about 1,300 cities (possibly more) and about 700 professions, and each page for each topic was written once and then copied to every city with only the city name changed.
From what I understand, that invites a heavy penalty from Google.

What do you suggest I do? Maybe dynamic pages?

Thank you!


wellliving

Click the Pages Menu from the top bar of the Editor.
Click the relevant page.
Click the Show More icon.
Click Duplicate.
Type a new page name.
Click Done.


ContentHeat

I suggest a few solutions that could be helpful (see the sketch after the list for points 1 and 4):

1. Use a 301 redirect from the duplicate URL to the original content page.
2. Use the rel="canonical" attribute – it tells search engines that a specific page, with all its links and content metrics, should be treated as a copy of a specified URL.
3. Set up your preferred domain and/or the parameter handling tool in Google Search Console, depending on your URL structure.
4. Use meta robots tags with noindex/nofollow values.
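
For illustration only, here is roughly what points 1 and 4 might look like. The example domain, the paths, and the assumption of an Apache server are mine, not something from the original post:

# Point 1: 301 redirect from a duplicate URL to the original page (.htaccess on Apache; paths are hypothetical)
Redirect 301 /springfield/electricians-copy/ https://www.example.com/springfield/electricians/

<!-- Point 4: meta robots tag in the head of a thin duplicate you want kept out of the index -->
<meta name="robots" content="noindex, follow">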

kaliya

Yes, for someone new to SEO this is a common problem with such huge sites. Include the location and city name in each page's title, and also include the physical address of each service provider so every page carries some unique, real information. You should also block the duplicate URLs that your site's search filters generate by listing them in robots.txt.
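
As a rough sketch only (the parameter names below are hypothetical; substitute whatever your search filters actually append to URLs), the robots.txt rules could look like this:

User-agent: *
# keep crawlers out of filter-generated duplicate URLs
Disallow: /*?sort=
Disallow: /*?filter=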

infosteve

You should add a canonical tag to your web pages to resolve this issue.
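
A minimal sketch, assuming a URL structure like /city/profession/ (which is only a guess at how the site is organized): each copied city page would carry a tag in its head section pointing at the URL you want treated as the main one, for example:

<!-- on the duplicate/variant page; the URL is hypothetical -->
<link rel="canonical" href="https://www.example.com/springfield/electricians/">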


nickeypickorita

You should definitely run the pages through Copyscape (https://www.copyscape.com/) and see whether they are treated as duplicate pages in Google's eyes.
If they are, I would change them to make them unique; otherwise you are risking a penalty.

damponting44

Creating a duplicate post or page allows you to work on it without affecting the existing version; you can quickly duplicate a WordPress page or post with all its settings.

amayajace

Place your cursor at the beginning of the page you want to copy.
Click and drag the cursor to the bottom of the page you want to copy.
Press Ctrl + C on your keyboard. Tip: another way to copy your highlighted text is to click Home > Copy.


ShreeVaghani

Quote from: Shantaram on 02-11-2017, 16:17:10
Hi everyone, I'm a newbie in the SEO business and just started working at a company that runs a yellow-pages site. The format is a directory of professionals (e.g. electricians, technicians...) from all fields; people looking for a certain professional can call a number that appears on the page, depending on the city they live in and the professional they need.
The problem is that the number of duplicated pages is currently enormous. There are about 1,300 cities (possibly more) and about 700 professions, and each page for each topic was written once and then copied to every city with only the city name changed.
From what I understand, that invites a heavy penalty from Google.

What do you suggest I do? Maybe dynamic pages?

Thank you!

Hello,



1. Use a 301 redirect from the duplicate URL to the original content page.
2. Use the rel="canonical" attribute – it tells search engines that a specific page, with all its links and content metrics, should be treated as a copy of a specified URL.
3. Set up your preferred domain and/or the parameter handling tool in Google Search Console (GSC), depending on your URL structure.
4. Use meta robots tags with noindex/nofollow values.