
Let each wiki control their robots.txt via MediaWiki:Robots.txt
Closed, ResolvedPublic

Description

For example: https://en.wikipedia.org/robots.txt transcludes https://en.wikipedia.org/wiki/MediaWiki:Robots.txt after the global robots.txt. Can we do the same thing for Miraheze?

Thanks!
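The scheme described above (the farm-wide rules served first, with the wiki's own MediaWiki:Robots.txt appended below) can be sketched as a small pure function. This is an illustrative sketch only; the function name and signature are assumptions, not Miraheze's actual implementation:

```python
def build_robots_txt(global_rules: str, local_rules: str) -> str:
    """Combine the farm-wide robots.txt with a wiki's own rules.

    The global rules always come first, so crawler protections applied
    by sysadmins cannot be removed by a local MediaWiki:Robots.txt edit.
    """
    local_rules = local_rules.strip()
    if not local_rules:
        # Wiki has no MediaWiki:Robots.txt (or it is empty):
        # serve the global file unchanged.
        return global_rules
    return global_rules.rstrip("\n") + "\n\n" + local_rules + "\n"
```

For instance, `build_robots_txt("User-agent: *\nDisallow: /w/\n", "Disallow: /private/")` produces the global block followed by the wiki's extra rule, while an empty local page leaves the global file untouched.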

Event Timeline

Dmehus triaged this task as Low priority.Feb 21 2021, 19:36
Dmehus subscribed.

This would possibly be normal priority, but I will triage it as low based on the triaging of similar requests.

Technically, the location of the file means little, and moving it into MirahezeMagic would seem out of scope, as it's a root-level file. Keeping it within puppet seems easiest, as there's no gain from moving it to mw-config.

Unknown Object (User) added a comment.Feb 24 2021, 02:13

@John: Oh, that makes sense. Any objections to implementing this through the puppet robots.php, though?

In T6881#135656, @John wrote:

Technically, the location of the file means little, and moving it into MirahezeMagic would seem out of scope, as it's a root-level file. Keeping it within puppet seems easiest, as there's no gain from moving it to mw-config.

Ah, true. Makes sense. I'm not fussed about which repo it's in.

If or when this is implemented, the global robots.txt's content should be added to the generated robots.txt file even if wikis do not keep it in their MediaWiki:Robots.txt, as there may otherwise be global incidents caused by web crawlers.

In T6881#135941, @R4356th wrote:

If or when this is implemented, the global robots.txt's content should be added to the generated robots.txt file even if wikis do not keep it in their MediaWiki:Robots.txt, as there may otherwise be global incidents caused by web crawlers.

If Miraheze sysadmins implement the WMF edition verbatim, it should simply append the local edition below the global one.
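A script like the puppet robots.php could fetch the local page's source through MediaWiki's standard `action=raw` endpoint. The helper below is a hypothetical sketch: the URL shape assumes the stock index.php interface at `/w/`, not whatever paths Miraheze's actual code uses:

```python
from urllib.parse import urlencode


def raw_page_url(wiki_base: str, page: str = "MediaWiki:Robots.txt") -> str:
    """Build the action=raw URL for a wiki page.

    action=raw returns the page's source text with no HTML wrapping,
    which is exactly the form needed for appending to a global
    robots.txt. The /w/index.php path is an assumption here.
    """
    query = urlencode({"title": page, "action": "raw"})
    return f"{wiki_base}/w/index.php?{query}"
```

A caller would fetch this URL, treat a missing page (HTTP 404) as "no local rules", and append any returned text below the global rules.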

Unknown Object (User) claimed this task.Mar 11 2021, 20:54
Unknown Object (User) closed this task as Resolved.Mar 16 2021, 00:03