What is robots.txt used for?


Offline danielnash (Topic starter)

What is robots.txt used for?
« on: 03-08-2016, 22:25:58 »
I learnt about the sitemap.xml file here. Now I'm a little confused: what is robots.txt used for?

Please clear it up, guys.


Offline hoahoa1909

Re: What is robots.txt used for?
« Reply #1 on: 03-09-2016, 00:54:23 »
-Non-image files
For non-image files (that is, web pages), robots.txt should only be used to control crawling traffic, typically because you don't want your server to be overwhelmed by Google's crawler or to waste crawl budget crawling unimportant or similar pages on your site. You should not use robots.txt as a means to hide your web pages from Google Search results, because other pages might link to your page and it could get indexed that way, bypassing the robots.txt file. If you want to block a page from search results, use another method such as password protection or a noindex tag or directive.
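As a rough sketch (the paths below are made up for illustration), a robots.txt that only manages crawl traffic might look like this:

    User-agent: *
    Disallow: /search/    # internal search result pages waste crawl budget
    Disallow: /tag/       # near-duplicate listing pages
    Crawl-delay: 10       # honoured by some crawlers; Googlebot ignores it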

-Image files
robots.txt can be used to prevent image files from appearing in Google search results. (However, it does not prevent other pages or users from linking to your image.)

-Resource files
You can use robots.txt to block resource files such as unimportant image, script, or style files if you think that pages loaded without these resources will not be significantly affected by the loss. However, if the absence of these resources makes the page harder for Google's crawler to understand, you should not block them, or else Google won't do a good job of analyzing the pages that depend on those resources.
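For example (the file and folder names here are hypothetical), blocking one image for Google's image crawler and blocking an unimportant resource folder for everyone would look like this:

    # keep one image out of Google Images
    User-agent: Googlebot-Image
    Disallow: /images/private-photo.jpg

    # block an unimportant resource folder for all crawlers
    User-agent: *
    Disallow: /assets/legacy-widgets/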

Offline Fadia Sheetal

Re: What is robots.txt used for?
« Reply #2 on: 03-09-2016, 03:03:49 »
Simply put, robots.txt is used to stop search engine spiders from crawling a website or a web page.

Offline Nimit.Suri

Re: What is robots.txt used for?
« Reply #3 on: 03-09-2016, 03:12:53 »
The robots.txt file is a simple text file placed on your web server which tells web crawlers like Googlebot whether they should access a file or not.
Improper usage of the robots.txt file can hurt your ranking.
The robots.txt file controls how search engine spiders see and interact with your webpages. The first thing a search engine spider like Googlebot looks at when it visits a site is the robots.txt file, because it wants to know whether it has permission to access a page or file. If the robots.txt file says it can enter, the spider then continues on to the page files.
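If you want to see that permission check from the crawler's side, Python's standard library includes a robots.txt parser; here is a minimal sketch (the example.com URLs are placeholders):

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the site's robots.txt, as a well-behaved crawler would
    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()

    # Ask whether a given user agent may fetch a given URL
    print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))
    print(rp.can_fetch("*", "https://example.com/index.html"))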

Hope this helps

Offline ShreyaKoushik

Re: What is robots.txt used for?
« Reply #4 on: 03-25-2016, 04:57:30 »
Robots.txt is the common name of a text file that is uploaded to a website's root directory; crawlers request it from that standard location, so it does not need to be linked in the site's HTML. The robots.txt file is used to provide instructions about the website to web robots and spiders. Web authors can use robots.txt to keep cooperating web robots from accessing all or parts of a website that they want to keep private.


Common uses of robots.txt (see the sketch below for what these rules look like):
  • Ban crawlers from visiting private folders or content that gives them no additional information.
  • Allow access to specific crawlers only.
  • Allow everything apart from certain patterns of URLs.
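For instance, as three separate snippets (the folder names and URL pattern are just examples):

    # ban all crawlers from a private folder
    User-agent: *
    Disallow: /private/

    # give one specific crawler free access while blocking the rest
    User-agent: Googlebot
    Disallow:

    User-agent: *
    Disallow: /

    # allow everything apart from URLs matching a pattern
    # (wildcards like * are supported by major crawlers such as Googlebot)
    User-agent: *
    Disallow: /*?sessionid=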


Offline RH-Calvin

Re: What is robots.txt used for?
« Reply #5 on: 03-29-2016, 00:46:50 »
Robots.txt is a text file placed on your website that contains instructions for search engine spiders. The file lists which webpages are allowed and which are disallowed for search engine crawling, which helps control the crawling activity on your website.
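A short, hypothetical example of such a listing (example.com and the paths are placeholders; the Allow directive and the Sitemap line are supported by major crawlers such as Googlebot):

    User-agent: *
    Disallow: /admin/
    Disallow: /checkout/
    Allow: /admin/help.html    # an exception inside a disallowed folder

    Sitemap: https://www.example.com/sitemap.xml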

Offline carldweb

Re: What is robots.txt used for?
« Reply #6 on: 03-30-2016, 04:06:26 »
The robots exclusion protocol, or robots.txt, is a text file webmasters create to instruct robots (typically search engine robots) how to crawl and index pages on their website.

Offline jaysh4922

Re: What is robots.txt used for?
« Reply #7 on: 04-12-2016, 23:39:58 »
Robots.txt is a text file placed on your website that contains instructions for search engine robots. The file lists which webpages are allowed and which are disallowed for search engine crawling.

 
