Is robots.txt good or bad for a site?

Started by Nahin, 02-11-2014, 00:03:37

Previous topic - Next topic

NahinTopic starter

Today I learned about robots.txt, and I also know that we can set it to allow or disallow crawling. But I have no idea whether using it is good or bad for a site. If we disallow everything, Google will never index the site, so there is no way to be found in search; but if we do add it, will it hurt us when Google updates an algorithm like Panda, Hummingbird, or Penguin?  :(


emedianetwork

Using a robots.txt file is not mandatory; you can use it as you see fit. It tells Google which links should or should not be crawled. By default, Google indexes and follows all links on a website. There is no problem with using it, but use it wisely, otherwise it can accidentally stop some important pages from being indexed.


kumar29

Robots.txt is a text file placed in the root of the server that tells search engine crawlers which parts of the site are off limits. We can perform various tasks with robots.txt: for example, prevent search engine crawlers from crawling the entire site, or keep them out of specific pages or directories.
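As a sketch, a minimal robots.txt (served from the site root, e.g. https://example.com/robots.txt, where example.com is a hypothetical domain) that keeps all crawlers out of two directories while leaving the rest of the site crawlable might look like this:

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/
```

`User-agent: *` means the rules apply to every crawler; each `Disallow` line blocks one path prefix. Everything not listed stays crawlable.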

Emmaballet20

Robots.txt can be good or bad; it all depends on how it is used! Basically, robots.txt is useful because it helps Google understand which links it may index.

jaintech

Good bots, like Google's spider, crawl your site to index it for search engines or provide some other symbiotic use. Others spider your site for more nefarious reasons, such as stripping out your content for republishing, downloading whole archives of your site, or extracting your images. Hence robots.txt is good for a site.


jayanta1

The robots.txt file of a website works as a request to specific robots to ignore the directories or files listed within it. Websites with sub-domains generally need a separate robots.txt file for each sub-domain, so that information that is not meant to be public is not picked up by a keyword search.

jobtardis

The Robot Exclusion Standard, also known as the Robots Exclusion Protocol or robots.txt protocol, is a convention for advising cooperating web crawlers and other web robots about accessing all or part of a website that is otherwise publicly viewable. Robots are often used by search engines to categorize and archive web sites, or by webmasters to proofread source code. The standard is different from, but can be used in conjunction with, Sitemaps, a robot inclusion standard for websites.
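To illustrate how exclusion and inclusion work together, a robots.txt file can block a directory while also pointing crawlers at a sitemap via the `Sitemap` directive (the domain and paths below are hypothetical):

```
User-agent: *
Disallow: /drafts/

Sitemap: https://example.com/sitemap.xml
```

The `Sitemap` line tells crawlers where to find the list of URLs you do want indexed, while the `Disallow` rule keeps them out of unfinished content.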

arindamdutta16

The robots.txt is a simple text file on your web site that informs search engine bots how to crawl and index the website's pages. It is great when search engines frequently visit your site and index your content, but there are often cases when you do not want parts of your online content indexed.


jaysh4922

It is very important for a website. Robots.txt is an on-page SEO technique, basically used to give instructions to web robots, also known as web wanderers, crawlers, or spiders. A crawler is a program that traverses the website automatically, and robots.txt helps popular search engines like Google index the website and its content.
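A well-behaved crawler reads robots.txt before fetching pages. As a minimal sketch of how that check works, here is Python's standard-library robots.txt parser applied to a hypothetical rule set and hypothetical example.com URLs:

```python
# Sketch: how a cooperating crawler decides whether it may fetch a URL.
# The rules and URLs below are hypothetical, not from any real site.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())  # normally fetched from https://<site>/robots.txt

# Allowed: not under a Disallow prefix
print(rp.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
# Blocked: matches the Disallow: /private/ rule
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
```

Note that this is advisory: only cooperating bots consult robots.txt, so it is not an access-control mechanism.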


arunavaseo

Robots.txt is very useful in SEO. If you want to block any pages of your website that you think could be harmful for your site, you can use robots.txt to keep the Google crawler from indexing them.