For example: https://en.wikipedia.org/robots.txt transcludes https://en.wikipedia.org/wiki/MediaWiki:Robots.txt after the global robots.txt. Can we do the same thing for Miraheze?
Thanks!
This would possibly be normal priority, but I will triage it as low, based on the triaging of similar requests.
It won't be too hard to do this if we want to: https://github.com/wikimedia/operations-mediawiki-config/blob/44747b0002359f29c336cd3dab6b9f5d7d1c4947/w/robots.php
Alternatively, we could make robots.php part of MirahezeMagic, or even /srv/mediawiki/config with some work, I think.
@Universal_Omega Cool. I really like the idea of making robots.php part of MirahezeMagic.
Technically, the location of the file means little, and moving it into MirahezeMagic would seem out of scope, as it's a root-level file. Keeping it within Puppet seems easiest, as there's no gain from moving it to mw-config.
@John: Oh, that makes sense. Any objections to implementing this through the Puppet robots.php, though?
If or when this is implemented, the global robots.txt's rules should still be included in every wiki's served robots.txt even if wikis do not keep them in their MediaWiki:Robots.txt, as web crawlers could otherwise cause global incidents.
If Miraheze sysadmins implement the WMF version verbatim, it should simply append the local content below the global one.