This is a known issue. A workaround that is currently working for me, on Firefox's latest stable version, is to go directly to Special:UserLogin on those wikis after logging in on Meta.
Tue, Apr 16
UO said he would look into this. The regular way of doing redirects here doesn't work for this case specifically, but hardcoding this in the nginx config of the cache proxies is always an option, even if it's one we would rather avoid if at all possible.
Fri, Apr 12
Note for other members of SRE seeing this error in the future, please see https://phabricator.wikimedia.org/T315737#8187139
Stalled on T12040
Thu, Apr 11
Requires using Chrome to submit evidence of site breakage...
Wed, Apr 10
Precondition failed: This Title instance does not represent a proper page, but merely a link target. from /srv/mediawiki/1.41/vendor/wikimedia/assert/src/Assert.php(49)
#0 /srv/mediawiki/1.41/includes/title/Title.php(3932): Wikimedia\Assert\Assert::precondition(boolean, string)
#1 /srv/mediawiki/1.41/includes/title/Title.php(3915): MediaWiki\Title\Title->assertProperPage()
#2 /srv/mediawiki/1.41/includes/Revision/RevisionStore.php(1844): MediaWiki\Title\Title->getId(boolean)
#3 /srv/mediawiki/1.41/includes/Revision/RevisionStore.php(1749): MediaWiki\Revision\RevisionStore->ensureRevisionRowMatchesPage(stdClass, MediaWiki\Title\Title)
#4 /srv/mediawiki/1.41/includes/Revision/RevisionStore.php(1625): MediaWiki\Revision\RevisionStore->newRevisionFromRowAndSlots(stdClass, NULL, integer, MediaWiki\Title\Title, boolean)
#5 /srv/mediawiki/1.41/includes/Revision/RevisionStore.php(2368): MediaWiki\Revision\RevisionStore->newRevisionFromRow(stdClass, integer, MediaWiki\Title\Title)
#6 /srv/mediawiki/1.41/includes/Revision/RevisionStore.php(1304): MediaWiki\Revision\RevisionStore->loadRevisionFromConds(Wikimedia\Rdbms\DBConnRef, array, integer, MediaWiki\Title\Title)
#7 /srv/mediawiki/1.41/includes/search/RevisionSearchResultTrait.php(51): MediaWiki\Revision\RevisionStore->getRevisionByTitle(MediaWiki\Title\Title, boolean, integer)
#8 /srv/mediawiki/1.41/includes/search/RevisionSearchResult.php(18): RevisionSearchResult->initFromTitle(MediaWiki\Title\Title)
#9 /srv/mediawiki/1.41/includes/search/SqlSearchResult.php(37): RevisionSearchResult->__construct(MediaWiki\Title\Title)
#10 /srv/mediawiki/1.41/includes/search/SqlSearchResultSet.php(60): SqlSearchResult->__construct(MediaWiki\Title\Title, array)
#11 /srv/mediawiki/1.41/includes/search/SearchResultSet.php(76): SqlSearchResultSet->extractResults()
#12 /srv/mediawiki/1.41/includes/search/SearchResultSet.php(187): SearchResultSet->count()
#13 /srv/mediawiki/1.41/includes/search/SearchEngine.php(206): SearchResultSet->shrink(integer)
#14 /srv/mediawiki/1.41/includes/search/SearchEngine.php(99): SearchEngine->maybePaginate(Closure)
#15 /srv/mediawiki/1.41/includes/specials/SpecialSearch.php(466): SearchEngine->searchText(string)
#16 /srv/mediawiki/1.41/includes/specials/SpecialSearch.php(248): MediaWiki\Specials\SpecialSearch->showResults(string)
#17 /srv/mediawiki/1.41/includes/specialpage/SpecialPage.php(727): MediaWiki\Specials\SpecialSearch->execute(NULL)
#18 /srv/mediawiki/1.41/includes/specialpage/SpecialPageFactory.php(1621): MediaWiki\SpecialPage\SpecialPage->run(NULL)
#19 /srv/mediawiki/1.41/includes/MediaWiki.php(357): MediaWiki\SpecialPage\SpecialPageFactory->executePath(string, RequestContext)
#20 /srv/mediawiki/1.41/includes/MediaWiki.php(960): MediaWiki->performRequest()
#21 /srv/mediawiki/1.41/includes/MediaWiki.php(613): MediaWiki->main()
#22 /srv/mediawiki/config/initialise/entrypoints/index.php(100): MediaWiki->run()
#23 /srv/mediawiki/config/initialise/entrypoints/index.php(95): wfIndexMain()
#24 {main}
Exception does not seem to have been logged. That will need to be solved first unfortunately.
Tue, Apr 9
Fixed, again. I see this happened after toggling the private setting on your wiki, which makes me think the setContainersAccess job is not running.
Sat, Apr 6
Done in https://github.com/miraheze/CreateWiki/commit/001c5a422feab8c0afa116ca90cbad16b4beb393, with follow-up fixes at https://github.com/miraheze/CreateWiki/commit/fa9cd74a0ce5a979ee822d03c8208f5282a9bc77, https://github.com/miraheze/CreateWiki/commit/d4ddc2ff2b96224c455e27d39ef24beb6d0046a7 and https://github.com/miraheze/CreateWiki/commit/45b622d624fff8f15303e68a51020f393118ec77 required for it to fully work.
Thu, Apr 4
Wed, Apr 3
script finished.
Tue, Apr 2
If you get this exception again, can you copy-paste the full exception (including the random string of letters) into the comments instead of taking a screenshot? I need a new one to find out what's going wrong here.
XML import of the Development: pages is done. I think this is now finished (though there's likely some cleanup and setup work to do depending on which extensions you had installed on your wikis, as well as probably moving all links to pages in dev: to their equivalent in Development:). Do let me know if something else is pending, though.
Image import is done!
Mon, Apr 1
@Thorbjorn disregard the above message, I've just downloaded your image dump thanks to help from another sysadmin. The SHA-256 hash matches the one you've given me, so I'll start the image import shortly. Let me know once you have that new XML dump with the Development: namespace.
There are no AAAA records for files.mapeditor.org, so I can't reach it over IPv6. If the file dump is small enough, you can try uploading it to this ticket instead.
Sun, Mar 31
RequestSSL actually does not use anything from ManageWiki at all (other than the logs, which only exist properly if ManageWiki is loaded), all the servername change magic happens through CreateWiki.
Hmmm, that's going to be complicated, because Miraheze's network, with the exclusion of the HTTP reverse proxies, only has public IPv6... I'd appreciate it if you could move the dump to a server I can reach over IPv6.
@Thorbjorn I'm unable to download your newest image dump.
Connecting to updates.themanaworld.org (updates.themanaworld.org)|2001:4ba0:babe:1913::|:443... failed: Connection refused.
I'm able to ping updates.themanaworld.org. Please make sure that 2602:294:0:b12::107 (mwtask181's IPv6) is not being blocked by your firewall.
Fri, Mar 29
I'll try to get the rest of the imports done by this weekend @Thorbjorn; sorry for not answering for a while. I should say I tried downloading the newest image dump you provided, and your server was blocking HTTP requests from our servers. Please do look into that.
In T11978#240335, @Thorbjorn wrote: Btw, the https://updates.themanaworld.org/wikimigration/mediawikidump.xml was also updated now that we've renamed all "Dev:" pages to "Development:". Is that alright, or do you need a file containing just the "Development:" pages?
A file with just that namespace would be greatly appreciated.
Thu, Mar 28
Wed, Mar 27
Tue, Mar 26
Getting logged out is a known issue caused by CentralAuth; the only workaround is to go directly to Special:UserLogin on the wiki you want to sign in to after logging in on Miraheze Meta. Per T11988#240178, it seems the issue with the passwords has already been solved, so there's not much to do here for now.
Using RDAP (preferably) or WHOIS is a better solution for these kinds of issues.
Hmm, actually can you try making another image dump, just in case?
GHSA published, CVE ID CVE-2024-29883 has been assigned to it.
Fixed. GHSA will be published shortly.
Mon, Mar 25
Available since commit 675caeb is the REST endpoint createwiki/v0/query_wiki_request/{id}. You can see it in action at https://meta.mirabeta.org/w/rest.php/createwiki/v0/query_wiki_request/5. Keep in mind the following:
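For illustration, here is a minimal sketch of calling that endpoint from Python. The URL is the one shown above; the structure of the response body is not documented in this thread, so the example just decodes whatever JSON comes back rather than assuming field names.

```python
import json
from urllib.request import urlopen

BASE = "https://meta.mirabeta.org/w/rest.php"

def query_wiki_request_url(request_id: int) -> str:
    """Build the URL for the CreateWiki request-query REST endpoint."""
    return f"{BASE}/createwiki/v0/query_wiki_request/{request_id}"

def fetch_wiki_request(request_id: int) -> dict:
    """Fetch and decode one wiki request (requires network access)."""
    with urlopen(query_wiki_request_url(request_id)) as resp:
        return json.load(resp)
```

For example, `fetch_wiki_request(5)` would hit the same URL as the demo link above.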
Sun, Mar 24
Part of this task is done: the canned responses are now available to JavaScript code on subpages of Special:RequestWikiQueue. However, I'm stalling the rest of this task on T11931 because my original idea of hacking the form in order to allow for canned responses to be edited is too hacky even for me. Having an API will allow for this to be done more cleanly.
XML import finished. However, I see some redlinks on the wiki: pages in the Dev: namespace have not been imported, as expected, but a few others may also have failed to go through. So, here's what's still pending:
Sat, Mar 23
Unable to reproduce the "redirects being broken" issue. Which template were you trying to recreate in the last image @Dr.Teatime?
Fri, Mar 22
@Thorbjorn the XML import is still ongoing as we speak. You might want to look at the image dump and check whether it has all the files, because that one has completed and only imported 6 files (unless something went wrong there, but that seems unlikely).
Mar 20 2024
Yep, it's fine, feel free to keep editing.
Mar 17 2024
Nah, it was okay to reopen this one. Can you post the new error on another comment, same as before?
Mar 16 2024
@Dr.Teatime In your last image, it would be very helpful if you could copy-paste the text in the red box into a new comment. We need it in order to find out the underlying reason for that error.
Mar 15 2024
So, to recap: @Agent_Isai found that users misconfiguring wgRCMaxAge was the cause behind at least some of the mysterious closures. The cause behind T11598 is unknown, or at least I don't remember anyone saying what happened to that wiki, though I assume it was unrelated to wgRCMaxAge being misconfigured. This is all I know regarding this topic as of posting this.
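For reference, a hedged sketch of the relevant setting in LocalSettings.php. The 90-day value is MediaWiki's default for this variable; the note about activity checks is my reading of how a misconfiguration could cause a wiki to look inactive, not something confirmed in this thread.

```php
// $wgRCMaxAge controls how long rows stay in the recentchanges table.
// MediaWiki's default is 90 days; setting it much lower purges the edit
// activity data that inactivity checks read, so an active wiki can end
// up looking (and being closed as) inactive.
$wgRCMaxAge = 90 * 24 * 3600; // seconds (the MediaWiki default)
```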
I'll test this on Mirabeta. If it doesn't blow up, I'll change config so that both can be enabled at the same time.
Mar 14 2024
Yes, I do intend to help with the backlog here; that's why I was asked to consider making this request in the first place. To elaborate on why I want to remain mostly focused on SWE for now: I'm very limited in terms of time right now due to IRL constraints, and am thus unable to really commit to day-to-day MWE work (though I'm trying to be online from 16:00 to 20:30 UTC+1, and mostly succeeding so far). SWE work, on the other hand, I can take care of relatively slowly over time, and it is less time-sensitive than bug reports.
Mar 13 2024
Mar 10 2024
Mar 8 2024
WHOIS unfortunately has no future: per ICANN's 2023 global amendment (https://www.icann.org/resources/pages/global-amendment-2023-en), gTLDs will no longer be guaranteed to have a WHOIS server; by 2025, only RDAP will be guaranteed to work with them. The only problem currently is the low adoption rate of RDAP among ccTLDs, since I believe they aren't yet required to run an RDAP server, but I think they'll follow suit sooner rather than later. RDAP also has a big advantage over WHOIS for our use case: it returns machine-readable JSON over HTTPS, which is much better than trying to parse WHOIS output (whose format is server-dependent). We should prepare for the future and support RDAP; I therefore still believe RequestSSL should use RDAP for this use case, and that we should write our own client library.
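To illustrate the machine-readability point, a short sketch of pulling one field out of an RDAP domain response. The sample response below is trimmed and hypothetical; the field names ("events", "eventAction", "eventDate") are the ones defined by RFC 9083 for domain objects.

```python
import json

# Trimmed, hypothetical RDAP domain response (RFC 9083 field names).
SAMPLE_RDAP = json.dumps({
    "ldhName": "example.org",
    "status": ["active"],
    "events": [
        {"eventAction": "expiration", "eventDate": "2025-08-13T04:00:00Z"},
    ],
})

def expiration_date(rdap_json: str):
    """Return the expiration date from an RDAP domain response, if any."""
    data = json.loads(rdap_json)
    for event in data.get("events", []):
        if event.get("eventAction") == "expiration":
            return event.get("eventDate")
    return None
```

Compare that with WHOIS, where the equivalent information has to be scraped out of free-form text whose layout differs per registry.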
Mar 7 2024
Unfortunately, while we can do that for CNAMEs (and will), we can't for NS.
I now know how to check whether we're the authoritative nameserver for a domain: I'm going to use RDAP. I don't like any of the libraries that exist (only one is maintained, and I don't really like it; the rest are unmaintained), so I'll make one and upload it to Packagist. @Universal_Omega, would it be OK to host the repo for this library on the Miraheze GitHub org?
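A rough sketch of what that authoritative-nameserver check could look like, assuming the domain's RDAP record has already been fetched elsewhere. The hostnames in OUR_NS are placeholders I made up for illustration; "nameservers" and "ldhName" are the RFC 9083 field names.

```python
import json

# Placeholder nameserver hostnames, for illustration only.
OUR_NS = {"ns1.miraheze.org", "ns2.miraheze.org"}

def uses_our_nameservers(rdap_json: str, ours=OUR_NS) -> bool:
    """True if any of our nameservers appears in an RDAP domain response."""
    data = json.loads(rdap_json)
    found = {ns.get("ldhName", "").lower().rstrip(".")
             for ns in data.get("nameservers", [])}
    return bool(found & ours)
```

Normalizing case and the trailing dot matters here, since RDAP servers vary in how they render LDH names.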