Robots.txt file blocking description in Google

Started by CreativeDreamrz, 11-24-2016, 04:52:08


CreativeDreamrz (Topic starter)

Hi all, please help me!

I have an indexing problem on my website. My domain is rd-fitness dot com. When I type the plain website address, "www.rd-fitness.com", into Google, it shows this message: "A description for this result is not available because of this site's robots.txt".

But with HTTPS, "https://www.rd-fitness.com", the description is shown.

My website's robots.txt file is:
<--------------- robots.txt---------------->

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

And
<--------------- .htaccess file---------------->
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>


RewriteEngine On
# Serve a separate deny-all robots file in place of robots.txt when the request comes in over HTTPS
RewriteCond %{HTTPS} =on
RewriteRule ^robots\.txt$ robots-deny-all.txt [L]
# END WordPress


The sitemap was submitted successfully in Webmaster Tools and is not showing any warnings or errors. I have also fetched the website's URL in Webmaster Tools, but Google still hasn't indexed a single URL. Where is the problem? Can anyone help me, please?

Thanks in advance!


anjali0222

#1
The discrepancy between how your website is indexed with and without HTTPS can have several causes. Here are some factors that might be affecting your site's indexing:

1. Robots.txt file: The robots.txt file on your website might be blocking content from being indexed by search engines. Ensure that its directives are not preventing Googlebot from accessing and indexing your pages (see the robots.txt sketch after this list).

2. HTTPS configuration: The fact that Google displays the description for the HTTPS version of your website but not the plain-HTTP version may point to a configuration issue related to SSL/HTTPS. Verify that your SSL certificate is installed correctly and that there are no mixed-content issues (i.e., both secure and non-secure resources served on the same page). Picking one canonical protocol and redirecting everything else to it also helps (see the redirect sketch after this list).

3. Sitemap and fetching: You mentioned that the sitemap was submitted successfully in Google Webmaster Tools and that fetching was done; still, make sure the sitemap is up to date, includes all relevant URLs, and isn't hitting errors during crawling. Referencing the sitemap from robots.txt doesn't hurt either (the robots.txt sketch below includes a Sitemap line).

4. Canonicalization: Check for proper canonical tags on your website to avoid duplicate-content issues. Canonical tags specify the preferred version of a page when multiple versions exist, such as HTTP and HTTPS (see the tag example after this list).

5. Crawl Errors: Review Google Search Console for any crawl errors or issues that might be preventing Google from properly indexing your website. Resolve any reported errors and re-submit the sitemap if necessary.

6. Content Quality and Relevance: Ensure that the content on your website is of high quality, relevant, and original. Google prioritizes websites with valuable content for indexing.
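For point 1, a minimal, permissive robots.txt for a WordPress site usually looks like the sketch below. The Sitemap line is optional but recommended; the sitemap.xml path is an assumption on my part, so adjust it to wherever your sitemap actually lives:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://www.rd-fitness.com/sitemap.xml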
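For point 2, rather than serving a different robots.txt per protocol (as your current rewrite rule does), a common approach is to pick one canonical protocol and 301-redirect everything else to it. Here is a sketch for .htaccess, assuming Apache with mod_rewrite enabled and HTTPS as your preferred version; place it outside the # BEGIN/# END WordPress markers so WordPress doesn't overwrite it:

RewriteEngine On
# Send every plain-HTTP request to the HTTPS version of the same URL
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.rd-fitness.com/$1 [R=301,L]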
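For point 4, a canonical tag sits in the <head> of each page and names the preferred URL. For example, on your homepage it might look like this, again assuming HTTPS is the preferred version:

<link rel="canonical" href="https://www.rd-fitness.com/" />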

To address the problem, I recommend thoroughly reviewing each of these potential issues, making any necessary adjustments, and monitoring the impact on the indexing of your website. Additionally, consider seeking further assistance from SEO specialists or web developers who can conduct a detailed analysis and provide targeted solutions for improving the indexing of your website.


CreativeDreamrz (Topic starter)

Thanks for answering! But I'm still not sure I follow the points you're raising.

Now I have removed the content from my robots.txt file (meaning the robots.txt file is now empty), but Google still shows the "description is not available" message... I am frustrated.