FOSS developer and Miraheze Sysadmin.
causing icinga to warn because nginx access logs were filled with 429s.
It was doing this with the old system anyway on occasion. I assume you mean more frequently. Are 429s something we can exclude from the check? Would that be better?
Will this problem persist indefinitely, or should we expect a solution soon?
@Paladox and @Southparkfan are working to deal with this. The rate limiter will probably exist for the foreseeable future unless we're confident our resources can cope with the traffic that was causing issues (DDoS/SQLi), but we are working to improve the solution so it doesn't affect legitimate traffic.
I've looked at the documentation, as we should be doing that when changing the option, and I see:
In MediaWiki 1.8 or older, if you change this after installation, you should run the maintenance/rebuildmessages.php script to rebuild the user interface messages (MediaWiki namespace). Otherwise, you will not see the interface in the new language, or a mix of the old and new languages. Note that running that script will override any custom interface messages you may have created.
Tue, Sep 22
Mon, Sep 21
Over to you to install, or should we just close and handle the rest in the parent task?
Sun, Sep 20
Could you reply to the dev on the github issue?
Per merged task
Sat, Sep 19
Hmm, I'd need to ask whoever asked that rule to see if we can switch it.
Yes, we did that in mid July. I'm just not going to manually blacklist the ~6000 entries it would take to resolve the errors in Search Console. Google shouldn't be indexing them any more; they're old errors.
Removing randomly subscribed people
We blocked indexing of all special pages, this was fixed months back.
@Helper: please see the replies at https://github.com/edwardspec/mediawiki-extension-JsCalendar/issues/2
As for the part about localising custom namespaces: that would require a change to ManageWiki.
Actually, you're blocked on wiki so it'll be email@example.com
Fri, Sep 18
Mon, Sep 7
Moving to high, a lot of wikis use this.
Sat, Sep 5
Aug 24 2020
Aug 22 2020
Script should run within 2 weeks now. Sub-tasks are open for the additional feature requests. Closing this as invalid, since it's simply a tracking task for 3 separate issues/requests.
A patch is upstream to make this work for all wikis, with it disabled for wikis whose db names are longer than this supports.
I'm marked as busy (see staffwiki) but any on the list that still trigger icinga alerts can be removed.
We have the keys. We just need to decide whether we are okay doing this.
Aug 21 2020
Now available on upstream REL1_34 branch
Aug 20 2020
Noting for when the time comes to read over https://phabricator.wikimedia.org/T260281#6399181 and ensure we won't face the same issues
No one has updated the database, gonna create a separate task.
Aug 19 2020
Adding the current security reviewers for their thoughts, and maybe you could invent a proficiency test.
Here the alerts go again
Aug 18 2020
Looking at how you've set it up, you'll have to add me (rhinosf1[at]gmail.com) to your search console property.
We don't control how Google ranks your wiki. It's not affected by being on *.miraheze.org, but I'll look.
Aug 16 2020
Seen, will check later.
ConfirmEdit is enabled now, I don't think there's any reason to change the cache type.
Aug 15 2020
@Paladox: could you run the alter on the tables now rather than waiting for upstream to merge?
WMF installed, auto-approved
No one known to be available and able. See email to sre@
Aug 14 2020
I'll see what others say
@Paladox: could we add https://github.com/miraheze/MirahezeMagic/commit/472a8acfcbb9798c37c8df88200cf23354209326 back or alter the table?
Well, this is very strange. I see the error is in the field site_global_key, which is defined as varbinary(32). While I cannot be sure of your naming convention, I assume you are identifying the wiki with an ID like datatrekwiki or datatrek, so it should easily fit.
The insert it's failing on is for a wiki with a db name of 38 characters, which is over 32. We seem to allow db names up to 64 characters.
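To make the mismatch concrete, here is a small hypothetical check. The 32-byte column size and 64-character db-name limit come from the comments above; the function name and everything else is illustrative, not Miraheze's actual code:

```python
# Hypothetical illustration of the length mismatch described above.
SITE_GLOBAL_KEY_MAX = 32  # site_global_key is varbinary(32) in the sites table
DB_NAME_MAX = 64          # db names are allowed up to 64 characters

def fits_sites_table(dbname: str) -> bool:
    """Return True if the db name fits into the site_global_key column."""
    return len(dbname.encode("utf-8")) <= SITE_GLOBAL_KEY_MAX

print(fits_sites_table("datatrekwiki"))  # a short name fits: True
print(fits_sites_table("x" * 38))        # the failing 38-char name does not: False
```

Any db name between 33 and 64 characters will hit the same insert failure until the column is widened.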
This will just get overridden once the script works. We need to find a method to add other wikis to the script that populates it and get that working.
I can see about the standard wikimedia ones if we can modify the script we reuse. I'd just leave the info on other wikis you want added here and we can work on it.
The easiest and most robust way to add sites to the sites table is using an XML file and the importSites.php script, as described here: https://www.mediawiki.org/wiki/Manual:Sites_table#Adding_a_new_site .
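A minimal sketch of what such an XML file looks like, with placeholder values only (the exact identifiers and URLs for a real wiki would differ; see the manual page linked above for the authoritative format):

```xml
<sites version="1.0">
  <site>
    <!-- Placeholder values: replace with the wiki's own identifiers and URLs -->
    <globalid>examplewiki</globalid>
    <localid type="interwiki">example</localid>
    <group>example</group>
    <path type="page_path">https://example.miraheze.org/wiki/$1</path>
    <path type="file_path">https://example.miraheze.org/w/$1</path>
  </site>
</sites>
```

It would then be loaded with something like `php maintenance/importSites.php sites.xml` from the MediaWiki root.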
I will provide a sample in T6041 for you to try.
That won't help: the script we use to populate the sites table is failing because the column isn't big enough for the data it needs to hold. We're going to have that issue whichever way we try to import the table.
@Paladox: You've dealt with similar issues before, would you mind proposing whatever change needs to be done and/or applying it to our tables?
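For illustration only, the kind of alter being asked for might look like the following, assuming the varbinary(32) column and 64-character db-name limit discussed above. The actual change should follow whatever schema upstream settles on:

```sql
-- Hypothetical: widen site_global_key to match the 64-character db-name limit.
-- Verify against the upstream sites table schema before applying.
ALTER TABLE sites
  MODIFY site_global_key VARBINARY(64) NOT NULL;
```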
And another question: are you only adding Miraheze wikis to sites table? Meaning you are discarding the configuration with the default groups like wikipedia, commons, wikidata and so on?
Only Miraheze wikis. It should also run when you turn the extension on, but I'll do a manual run for your wiki.