Won't be too hard to do this if we want to: https://github.com/wikimedia/operations-mediawiki-config/blob/44747b0002359f29c336cd3dab6b9f5d7d1c4947/w/robots.php
- would need to change this: https://github.com/miraheze/puppet/blob/4e68cce031a164027d28b44029ecba9958c04ab2/modules/mediawiki/files/robots.php
- Would need to make robots.php part of /srv/mediawiki/w/, not just /srv/mediawiki/
Alternatively, we could make robots.php part of MirahezeMagic, or even /srv/mediawiki/config with some work, I think.
Technically the location of the file means little, and moving it into MirahezeMagic would seem out of scope, as it's a root-level file. Keeping it within puppet seems easiest, since there's no gain from moving it to mw-config.
If or when this is implemented, the global robots.txt content should always be included in the served robots.txt, even for wikis that do not keep it in their MediaWiki:Robots.txt, as web crawlers could otherwise cause global incidents.
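The merge policy above could be sketched roughly as follows. This is a minimal illustration in Python rather than PHP; the function name, the `GLOBAL_ROBOTS` constant, and the rule strings are all hypothetical, not taken from the actual robots.php:

```python
# Hypothetical global rules that must apply to every wiki regardless of
# what its local MediaWiki:Robots.txt contains (placeholder content).
GLOBAL_ROBOTS = """User-agent: *
Crawl-delay: 1
"""


def build_robots_txt(per_wiki_rules: str) -> str:
    """Merge global rules with a wiki's MediaWiki:Robots.txt content.

    The global block is always emitted first, so site-wide crawler
    policy holds even when a wiki has removed it locally.
    """
    parts = [GLOBAL_ROBOTS.strip()]
    if per_wiki_rules.strip():
        parts.append(per_wiki_rules.strip())
    return "\n\n".join(parts) + "\n"
```

The real implementation would fetch the per-wiki page content from MediaWiki, but the ordering guarantee is the point: per-wiki rules only ever extend the global baseline, never replace it.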