SEO Forum

Search Engine Optimization => SEO Basics => Topic started by: seenalal on 04-28-2019, 00:13:12

Title: Should I use a separate robots.txt file for each subdomain on a root domain?
Post by: seenalal on 04-28-2019, 00:13:12
My root domain is www.dontworry.com and I have multiple subdomains. Two of them are:

(1) https://dontworry.com/realestate/ and (2) https://classified.dontworry.com/

So should I add a robots.txt file for each of them?
I also have other addresses in the same form as (1) that I wrote above.

So what should I do? Please help.
Title: Re: Should I use a separate robots.txt file for each subdomain on a root domain?
Post by: John - Smith on 05-19-2019, 23:06:05
It is okay to include the sitemap links of your subdomains in the root domain's robots.txt file. Since it is the root domain, once crawlers fetch the main robots.txt file, the listed sitemaps will be picked up as well.
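For example, a root robots.txt along those lines could simply list each property's sitemap. This is only a sketch; the sitemap locations below are placeholders, not the site's actual files:

    User-agent: *
    Allow: /

    # Placeholder sitemap locations - adjust to wherever the sitemaps actually live
    Sitemap: https://www.dontworry.com/sitemap.xml
    Sitemap: https://dontworry.com/realestate/sitemap.xml
    Sitemap: https://classified.dontworry.com/sitemap.xml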
Title: Re: Should I use a separate robots.txt file for each subdomain on a root domain?
Post by: heenamajeed on 07-29-2019, 20:33:33
If your subdomain's folder is configured as its own virtual host, it needs its own robots.txt in that folder too (this is the most common setup). However, if you route the subdomain's traffic through an .htaccess file, you could modify it to always use the robots.txt from the root of your main domain.
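Here is a rough sketch of that .htaccess idea, assuming mod_rewrite is enabled and the rule sits in the subdomain's own .htaccess at its document root (the target URL is just this thread's example domain):

    RewriteEngine On
    # Send requests for this subdomain's robots.txt to the main domain's file
    RewriteRule ^robots\.txt$ https://www.dontworry.com/robots.txt [R=301,L]

Crawlers that follow the redirect will then apply the fetched rules to this subdomain.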
Title: Re: Should I use a separate robots.txt file for each subdomain on a root domain?
Post by: amayajace on 08-03-2019, 00:08:42

If you move your web traffic from the subdomain via an .htaccess file, you could modify it to always use the robots.txt from the root of your main domain. Otherwise, when a crawler fetches test.domain.com/robots.txt, that is the robots.txt file it will see; it will not see any other robots.txt file for that host.
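In other words, if you keep separate files, each hostname simply serves its own. A minimal sketch of what https://classified.dontworry.com/robots.txt might contain (the disallowed path and sitemap URL are placeholders):

    User-agent: *
    # Placeholder path - block whatever sections of the classifieds site should stay out of search
    Disallow: /search/
    Sitemap: https://classified.dontworry.com/sitemap.xml

Note that (1) in the original question is just a path on the root domain, so it is already covered by https://dontworry.com/robots.txt; only separate hostnames like (2) need their own file.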