My New Site Faces Crawling And Indexing Issues

Started by LindaPeterson, 12-15-2018, 04:02:02


LindaPeterson (Topic starter)

I have created a new social networking site where users can upload their photos and build a community around them. It launched a few weeks ago. I submitted my site in Google Webmaster Tools for indexing, but not a single page has been indexed. I have also tried fetching my site, but I still face the same problem.

When I checked for crawl errors in Google Webmaster Tools, it said the crawl failed and asked me to wait.

Can anyone help me clear up this issue?


DeveloperOnRent

Generally, it takes anywhere from 4 days to 4 weeks for a new site to get indexed, so wait at least that long. If it still isn't indexed after that, make sure your pages carry original content and that your server has enough space and no issues.
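
If you want to rule out server trouble quickly, a small script like the one below can confirm the homepage answers with HTTP 200. This is just a sketch: the example.com URL is a placeholder for your own domain, and the user-agent string only approximates Googlebot's.

# Sketch: check that the server answers with HTTP 200 for the homepage.
# Replace https://example.com/ with your own site.
from urllib.request import Request, urlopen

req = Request(
    "https://example.com/",
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1)"},  # approximate Googlebot UA
)
with urlopen(req, timeout=10) as resp:
    print(resp.getcode(), resp.headers.get("Content-Type"))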


levandau

Have you added a sitemap, pinged the search engines, and shared the site on social media?
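
On the ping point: at the time of this thread, Google supported a simple sitemap ping endpoint you could open in a browser (the example.com sitemap URL is a placeholder for your own):

http://www.google.com/ping?sitemap=https://example.com/sitemap.xml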

Rattan11

Why does Google allow you to submit URLs?
Whenever you make changes to existing content on your site, or if you add something new, Google will need to crawl the page to identify the changes and update its index accordingly.

This helps to expedite the process. Whether you've launched an entirely new website, added new pages, or Google isn't indexing a page that it should, this gives people the option to let Google know about a particular page.

Assuming that your website gets a decent amount of traffic, Google is typically able to update its index without your help. So if you need to use this tool on a regular basis, there may be bigger issues at hand that should be addressed.

In my view, you should submit your sitemap in Google Webmaster Tools (Search Console). Google will then crawl all your pages automatically, often within 24 hours.

asif bams

1.  Start with the Sitemap
Having a sitemap in place is SEO 101 and is critical for helping Google identify all the pages on your website. A missing sitemap is also one of the most significant, yet easiest to fix, SEO issues that websites run into.

So what is a sitemap? You can think of a sitemap as an easier way for Google to identify and crawl the pages on your website. Instead of crawling each page individually and jumping from one page to another, a sitemap lists every page on your website in an easy-to-digest format for a bot. Google has a good page explaining what a sitemap is and why it matters.

Creating a sitemap is fairly easy and can be done for free on a number of sites like XML-Sitemaps.com or with tools like Screaming Frog SEO Spider. If you use a popular CMS like WordPress, there are plugins that can generate a dynamic sitemap for you.
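
Whichever tool you use, the output is a plain XML file. A minimal sketch of what a generated sitemap.xml looks like (the example.com URLs and the date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2018-12-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/photos/</loc>
  </url>
</urlset>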

Important – Once you create a sitemap, it needs to be submitted to Google. This is done through Google Search Console.

If you're looking for options for creating a sitemap for your website, this is a great resource.


MVMInfotech18

Crawling:

The first thing we need to look at is making sure that all of our target pages can be crawled by the search engines. I say "target pages" because there will be occasions when you may want to actively stop certain pages from being crawled, which I'll cover shortly.
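
As a sketch of how that blocking is usually done: a robots.txt file at the root of the site lets the bots crawl everything except the sections you list. The /admin/ and /private/ paths and the example.com URL below are placeholders, not anything specific to the poster's site.

User-agent: *
Disallow: /admin/
Disallow: /private/
Sitemap: https://example.com/sitemap.xml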

indexing issue:

In the Webmaster Tools index status report, a spike shows when Google increased the number of URLs it has classified as "not selected". This means Google has found those pages to be very similar to other pages, or they are redirected. I spent a bit of time looking at this and believe the cause was a faulty plugin that generated links to lots of duplicate URLs – but I've been lazy, as it is my own site, and haven't fixed it yet!

If you're constantly adding new pages to your website, a steady and gradual increase in the pages indexed probably means they are being crawled and indexed correctly. On the other hand, if you see a big, unexpected drop, it may indicate problems and that the search engines are not able to access your website correctly.

soniya_ss

Sure, but before that, can you please share a screenshot of the error? It will help me and others understand the problem better.

ifixservices

A crawl block in robots.txt could also be the culprit if Google isn't indexing a single web page. To check if this is the case, paste the URL into the URL inspection tool in Google Search Console.
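
If you want to check this outside of Search Console too, Python's standard library can test a URL against your live robots.txt. This is only a sketch; the example.com URLs are placeholders for your own domain and page.

# Sketch: test whether robots.txt blocks Googlebot from a given URL.
# The example.com URLs are placeholders.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt
print(rp.can_fetch("Googlebot", "https://example.com/some-page/"))  # False means the URL is blocked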