Seo Forum

Search Engine Optimization => SEO Basics => Topic started by: danielnash on 03-08-2016, 22:25:58

Title: What is robots.txt used for?
Post by: danielnash on 03-08-2016, 22:25:58
I learned here about the sitemap.xml file, but now I'm a little confused: what is robots.txt used for?

Please clear this up for me, guys.
Title: Re: What is robots.txt used for?
Post by: hoahoa1909 on 03-09-2016, 00:54:23
- Non-image files
For non-image files (that is, web pages), robots.txt should only be used to control crawling traffic, typically because you don't want your server to be overwhelmed by Google's crawler or to waste crawl budget crawling unimportant or similar pages on your site. You should not use robots.txt as a means to hide your web pages from Google Search results, because other pages might link to your page and it could get indexed that way despite the robots.txt file. If you want to block a page from search results, use another method such as password protection or a noindex tag or directive (see the example at the end of this post).

- Image files
robots.txt can be used to prevent image files from appearing in Google search results. (However, it does not prevent other pages or users from linking to your image.)

- Resource files
You can use robots.txt to block resource files such as unimportant image, script, or style files if you think that pages loaded without these resources will not be significantly affected by the loss. However, if the absence of these resources makes the page harder for Google's crawler to understand, you should not block them; otherwise Google won't do a good job of analyzing the pages that depend on those resources.
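To make the three cases above concrete, here is a minimal, hypothetical robots.txt (placed at the site root, e.g. https://example.com/robots.txt); all the paths are made-up placeholders:

    User-agent: *
    # Crawl control for ordinary pages (this does NOT reliably keep them out of search results)
    Disallow: /drafts/
    # Keep an image file out of Google image search results
    Disallow: /images/decorative-banner.png
    # Block an unimportant script, but only if pages still render sensibly without it
    Disallow: /scripts/tracking-widget.js

If the goal is to keep a page out of search results entirely, the usual alternative is a noindex directive in the page itself, for example <meta name="robots" content="noindex"> in the HTML head.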
Title: Re: What is robots.txt used for?
Post by: Nimit.Suri on 03-09-2016, 03:12:53
The robots.txt file is a simple text file placed on your web server that tells web crawlers like Googlebot whether or not they may access a file.
Improper usage of the robots.txt file can hurt your ranking.
The robots.txt file controls how search engine spiders see and interact with your webpages. The first thing a search engine spider like Googlebot looks at when it visits a site is the robots.txt file, because it wants to know whether it has permission to access that page or file. If the robots.txt file says it can enter, the search engine spider then continues on to the page files.
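As a rough illustration (not Googlebot's actual code), that permission check can be sketched with Python's standard urllib.robotparser; the rules and URLs below are placeholders:

    import urllib.robotparser

    # Pretend these lines were fetched from https://example.com/robots.txt
    rules = [
        "User-agent: *",
        "Disallow: /private/",
    ]

    rp = urllib.robotparser.RobotFileParser()
    rp.parse(rules)

    # A polite crawler asks for permission before fetching each URL
    print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False: blocked
    print(rp.can_fetch("Googlebot", "https://example.com/blog/post.html"))     # True: allowed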

Hope this helps
Title: Re: What is robots.txt used for?
Post by: RH-Calvin on 03-29-2016, 00:46:50
Robots.txt is a text file placed on your website that contains instructions for search engine spiders. The file lists which webpages are allowed and which are disallowed for search engine crawling, which helps you control the crawling activity on your website.
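For example, a small hypothetical file that combines allowed and disallowed paths, and also advertises the sitemap.xml mentioned in the first post, could look like this (the Allow and Sitemap lines are supported by Google and most major crawlers; the paths are placeholders):

    User-agent: *
    Disallow: /admin/
    Allow: /admin/help.html
    Sitemap: https://example.com/sitemap.xml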
Title: Re: What is robots.txt used for?
Post by: carldweb on 03-30-2016, 04:06:26
The robots exclusion protocol, or robots.txt, is a text file webmasters create to instruct robots (typically search engine robots) how to crawl and index pages on their website.
Title: Re: What is robots.txt used for?
Post by: jaysh4922 on 04-12-2016, 23:39:58
Robots.txt is a text file placed on your website that contains information for search engine robots. The file lists webpages that are allowed and disallowed for search engine crawling.