Should I use a separate robots.txt file for each subdomain on a root domain?


seenalal (Topic starter)
My root domain is www.dontworry.com and I have multiple subdomains. Among them, two are:

(1) https://dontworry.com/realestate/ and (2) https://classified.dontworry.com/

Should I add a separate robots.txt file for each of them? I have domains in the form of (1) above.

What should I do? Please help.


John - Smith
It is okay to include the sitemap links of the subdomains in the root domain's robots.txt file. Since it is the root domain, once the main robots.txt file is crawled, the rest will be picked up as well.
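
For example (just a sketch — the sitemap URLs below are placeholders, assuming each site publishes a sitemap.xml at its own root), the file at https://dontworry.com/robots.txt could look like this:

    # Allow all crawlers and list the sitemaps of the root site and the subdomain
    User-agent: *
    Disallow:

    Sitemap: https://dontworry.com/sitemap.xml
    Sitemap: https://classified.dontworry.com/sitemap.xml

Keep in mind that crawlers request robots.txt per hostname: a folder URL like https://dontworry.com/realestate/ is already covered by this one root file, while a true subdomain such as https://classified.dontworry.com/ is served its own robots.txt from https://classified.dontworry.com/robots.txt if you want crawl rules there too.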

 
