Personal tracking project for revi.
Apr 3 2021
Seems to be resolved now.
Apr 2 2021
https://github.com/miraheze/DataDump/pull/21/ will hopefully fix it.
A message has gone up on Special:DataDump; if you need a dump, please request it here on Phab (create another task) until we fix the issue.
Seems related to the following error: https://graylog.miraheze.org/messages/graylog_138/a49b2810-9384-11eb-a3cc-0200001a24a4
I (SocDemWiki) also have the same issue. https://socdemwiki.miraheze.org/wiki/Special:DataDump
Apr 1 2021
Mar 31 2021
This is happening cross-wiki.
Mar 20 2021
Mar 16 2021
Mar 11 2021
Feb 28 2021
Feb 25 2021
If or when this is implemented, the global robots.txt's content should be added to each wiki's robots.txt file even if wikis do not include it in their MediaWiki:Robots.txt, as web crawlers could otherwise cause global incidents.
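The merge behaviour described above could look something like the following sketch. This is Python pseudocode of the logic only (the real file would be the PHP robots.php in puppet); `GLOBAL_RULES` and `fetch_wiki_robots` are hypothetical names, not actual Miraheze code.

```python
# Hypothetical global rules that must always be served, regardless of
# whether the wiki maintains its own MediaWiki:Robots.txt page.
GLOBAL_RULES = """User-agent: BadCrawler
Disallow: /
"""

def fetch_wiki_robots(wiki: str) -> str:
    """Hypothetical: return the wiki's MediaWiki:Robots.txt content, or ''."""
    return ""  # placeholder for illustration

def build_robots_txt(wiki: str) -> str:
    # Global rules are emitted first, so crawler limits apply even when
    # the wiki has no MediaWiki:Robots.txt of its own.
    per_wiki = fetch_wiki_robots(wiki)
    return GLOBAL_RULES + ("\n" + per_wiki if per_wiki else "")
```

The key point is that the per-wiki content is appended to, and can never replace, the global rules.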
Feb 24 2021
@John: Oh, that makes sense. Any objections to implementing this through the puppet robots.php, though?
Feb 21 2021
Technically the location of the file means little, and moving it to MirahezeMagic would seem out of scope, as it's a root-level file. Keeping it within puppet seems easiest, as there's no gain from moving it to mw-config.
Won't be too hard to do this if we want to: https://github.com/wikimedia/operations-mediawiki-config/blob/44747b0002359f29c336cd3dab6b9f5d7d1c4947/w/robots.php
- would need to change this: https://github.com/miraheze/puppet/blob/4e68cce031a164027d28b44029ecba9958c04ab2/modules/mediawiki/files/robots.php
- would need to make robots.php part of /srv/mediawiki/w/, not just /srv/mediawiki/
This would possibly be normal priority, but I will triage it as low based on the triaging of similar requests.
Feb 3 2021
https://github.com/miraheze/MirahezeMagic/blob/master/maintenance/generateMirahezeSitemap.php#L79 is, I assume, what @Paladox found to be at fault. This should be the wiki's domain.
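To illustrate the fix being proposed, sitemap entries should be built from the wiki's own domain rather than a fixed one, so the emitted links resolve on the correct wiki. This is a hedged Python sketch, not the actual PHP maintenance script; the function name, parameters, and path layout are assumptions for illustration.

```python
def sitemap_entry(wiki_domain: str, db_name: str, filename: str) -> str:
    # Use the wiki's own domain (hypothetical path layout shown) so the
    # sitemap links point at pages that actually exist on that wiki.
    return f"https://{wiki_domain}/sitemaps/{db_name}/sitemaps/{filename}"
```

For example, an entry for socdemwiki would start with `https://socdemwiki.miraheze.org/` rather than a shared static domain.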
Adding Paladox based on recent Magic repo commits
I don't see how the two bugs are related. The best solution is to fix the script to show links to pages that exist.
Raising back to how @Reception123 had it
@Reception123 feel free to raise this task's priority as I'm not quite sure where/how to prioritize it.
Interesting... Since we've (I think?) resolved the issue with sitemaps on private wikis, this seems like less of an issue, but it's not immediately clear to me what the best solution is for correcting the problem of 404 pages being linked.
Feb 2 2021
Midnight UTC for us; for other users of the extension, it would be whatever they configure it to be.