User Details
- User Since: Oct 27 2019, 05:18 (213 w, 2 d)
- Availability: Available
- GitHub User: Unknown
- Miraheze User: Jidanni
Jun 17 2022
Anyway, I have already assigned a new bureaucrat for https://radioscanningtw.miraheze.org/ , so I wish to retire myself there.
Jan 9 2021
Miraheze needs to take the advice of
https://webmasters.stackexchange.com/questions/117744/how-to-resolve-google-indexed-though-blocked-by-robots-txt
and then Google would no longer complain.
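The gist of that advice is that a page can only be de-indexed if Google is allowed to crawl it and see the noindex directive, so the page must not also be blocked in robots.txt. A rough way to check both pieces from the command line (assuming curl is available; abj.miraheze.org and Main_Page are purely illustrative):
# Is the wiki still telling crawlers to stay away?
$ curl -s https://abj.miraheze.org/robots.txt | grep -i disallow
# If crawling is allowed, the page itself should carry the noindex directive:
$ curl -s https://abj.miraheze.org/wiki/Main_Page | grep -io '<meta name="robots"[^>]*>'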
Oct 23 2020
I'm telling you guys something fishy is up; I still think Miraheze is being abused.
Checking the logs for just last week on https://abj.miraheze.org/ (my Granny's read-only archive wiki): are you sure this is normal?
Sep 19 2020
Yes, but with robots.txt:
All I know is that anybody who indexes their site with Google should be getting these. Here's what they look like:
Aug 24 2020
Whatever you think is best.
May 11 2020
OK, I made a workaround.
May 10 2020
Does anybody have a better way to solve https://phabricator.miraheze.org/T5504#107946 other than entering their names into the box one by one...?
May 4 2020
OK, I have converted half of my Talk pages, but I can't figure out (https://phabricator.wikimedia.org/T251717) a better way to finish the job than pasting their names one by one into that Special:EnableStructuredDiscussions box and clicking Submit. After that I also need to run it on all my User Talk pages. Does anybody have a better way? Thanks.
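I don't know of a supported bulk-conversion tool, but at least the list of titles to paste into that box can be generated automatically. A rough sketch using the standard MediaWiki API (namespace 1 is Talk, 3 is User talk; the /w/api.php path and the use of jq are assumptions on my part):
# List every page in the Talk namespace as raw material for Special:EnableStructuredDiscussions;
# rerun with apnamespace=3 for the User talk pages.
$ curl -s 'https://radioscanningtw.miraheze.org/w/api.php?action=query&list=allpages&apnamespace=1&aplimit=max&format=json' | jq -r '.query.allpages[].title'
If a namespace has more than 500 pages, the response includes an apcontinue value that would need to be followed.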
OK, thanks. And I can confirm another existing page I just converted looks good too.
May 3 2020
Alas, it appears the script had no effect...
Thanks. This is a page on the wiki that exhibits the problem, so you can observe whether running the script fixes it.
May 2 2020
I cannot proceed converting the previously existing talk pages on my wiki until this is done.
May 1 2020
Then how about https://www.mediawiki.org/wiki/Manual:$wgServer ?
Just use https://{{SERVERNAME}}/pagename or #External_links_to_internal_pages , both mentioned on https://www.mediawiki.org/wiki/Help:Links .
Apr 29 2020
OK, on https://www.mediawiki.org/wiki/Extension:StructuredDiscussions it says:
"To enable it on a single page, use Special:EnableStructuredDiscussions. This requires the flow-create-board right, which can be granted to any group."
But as you see, it seems to already have been granted!
P.S., on the same page one also notices this raw, unrendered text:
Metrolook Bartile: When "Metrolook Down Arrow" is enabled and "Metrolook Bartile" is enabled, the tile menu will be generated from [[MediaWiki:Metrolook-tiles]]. If "Metrolook Down Arrow" is not set and "Metrolook Bartile" is not set, then the tile menu will be generated from [[MediaWiki:Metrolook-tiles-second]].
Thanks, but making them clickable has nothing to do with security or with what rights one has when viewing the page. It's the same problem whether you are the wiki owner or not. They should simply be wrapped in <a href=...> tags.
And no, I don't know why a logged-in user ended up editing
https://radioscanningtw.miraheze.org/wiki/User_talk:180.217.159.3
Please do the following steps on my wiki:
https://www.mediawiki.org/wiki/Extension:StructuredDiscussions#Migrating_existing_pages
https://www.mediawiki.org/wiki/Extension:StructuredDiscussions#Old_Remnants
to fix existing pages.
I see it is already working for new pages:
https://radioscanningtw.miraheze.org/wiki/User_talk:Jidanni
I'm saying that there is a problem with the extension, but I wish Miraheze's staff would first isolate the problem and then report it directly to the extension's makers.
Yes.
The current way to see what I am talking about is:
- Browse https://meta.miraheze.org/wiki/Special:ManageWiki/settings
- Click "Styling"
Apr 27 2020
At first glance at the log, it appears you got one wrong:
 21:04, 26 April 2020 Paladox talk contribs block modified the namespace "User" for wiki "radioscanningtwwiki" (Enable flow on User namespace)
 21:00, 26 April 2020 Paladox talk contribs block modified the namespace "<Main>" for wiki "radioscanningtwwiki" (Enable flow on Talk namespace)
Also, within the parentheses mentioned in T5472, perhaps also add a note that one needs to take extra steps after installation, as the user doesn't know whether your scripts do all the work or not.
But everything still looks the same, e.g., this User Talk page.
Apr 26 2020
Indeed, after many actions, e.g.,
the user wants to see all the buttons back again. Sure, he can simply hit BACK in his browser, but he worries about resubmitting forms, which is what usually happens when one does that.
OK, I made a note of it.
Apr 23 2020
Better keep it the same, lest this become a maintenance headache for you guys.
Let's see if other users run into this problem first.
Apr 12 2020
So far so good...
OK, I changed <seo__ to <seo_ (removing one space).
That fixed whatever was new.
Apr 6 2020
Well, all I know is that apparently I want this stuff, https://phabricator.wikimedia.org/T249396 , whatever it is called.
Apr 4 2020
I see, you are saying I should find the appropriate templates and then use the procedure at https://www.mediawiki.org/wiki/Help:Templates#Copying_from_one_wiki_to_another to copy them to my wiki.
Apr 3 2020
Well, all I know is that users might discover Special:WikiDiscover by looking at Special:SpecialPages, but then they need a way to learn about Special:Log/farmer, as they probably won't check there themselves.
I also found https://en.wikipedia.org/wiki/Template:User_link .
Are you saying I should file an issue at https://phabricator.wikimedia.org/ ?
But Wikipedia users have been pinging each other for years.
And so can meta.miraheze.org users.
I'm saying that users on *.miraheze.org should be able to ping each other too (on the same wiki).
Apr 2 2020
OK, though DiscussionTools makes Talk pages look unfamiliar, at least it will do the ~~~~ job.
(In https://www.mediawiki.org/wiki/Extension_talk:DiscussionTools I do note there is no link to a real description.)
No! I am saying that, for all sites, you have a problem. Google will send a message about it to everybody who uses its Webmaster Tools. The problem is the one you can read about at the top of this bug report.
The only related setting is on
https://abj.miraheze.org/wiki/Special:ManageWiki/settings#mw-section-restricted
but users can't change it anyway.
Mar 25 2020
OK, do mention Special:Log/farmer on Special:WikiDiscover. But note that it is really good for finding the newest wiki, yet painful for finding the oldest.
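If that farmer log is exposed through the standard MediaWiki log API (an assumption on my part: letype=farmer mirroring Special:Log/farmer), finding the oldest entries is just a matter of flipping the sort direction:
# Newest wiki-creation entries first (the default order):
$ curl -s 'https://meta.miraheze.org/w/api.php?action=query&list=logevents&letype=farmer&lelimit=5&format=json'
# Oldest entries first, for digging up the very first wikis:
$ curl -s 'https://meta.miraheze.org/w/api.php?action=query&list=logevents&letype=farmer&ledir=newer&lelimit=5&format=json'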
Mar 24 2020
If establishment dates are not stored anywhere, then certainly some "wiki number" is.
The user could also find which wiki was the very first wiki.
(A workaround for users who have messed up a setting or two would be: find a new (T5351) wiki that nobody has customized yet, then check Special:ManageWiki/permissions etc. there to see what the defaults are.)
OK created https://phabricator.wikimedia.org/T248411 .
Maybe it was due to different settings being available when each wiki was born.
It says "Resolved".
Is there some link to what resolved it?
I see "Resolved".
Maybe the software was updated?
Maybe not?
I see "Resolved".
That's great.
However, how did it get resolved?
Mar 3 2020
P.S., if the page is restricted, shouldn't the log also be?
https://radioscanningtw.miraheze.org/wiki/%E7%89%B9%E6%AE%8A:%E6%97%A5%E8%AA%8C/datadump clearly shows something is wrong,
Feb 29 2020
Which we can't, as there is no button to do so.
Unless we remember to go to Special:SpecialPages on our own again...
OK, manually hitting ENTER on the browser URL bar's
https://radioscanningtw.miraheze.org/wiki/%E7%89%B9%E6%AE%8A:DataDump/view/xml
gives us a fresh view: "No dumps have been found, please generate one."
After deleting the dump, the name stays on the screen. In fact, the name is still even hyperlinked. In fact, sending the same POST (wpdelete_radioscanningtwwiki_xml_6104ed4ef6c712e92cbe.xml.gz: 1) over and over never deletes it.
So we see there is no longer a way to delete dumps, and thus no way to make new ones.
Feb 5 2020
And here we see that it can't be in robots.txt, else Google can't read the <meta name="robots" content="noindex,nofollow"/> ...
https://webmasters.stackexchange.com/questions/117744/how-to-resolve-google-indexed-though-blocked-by-robots-txt
OK, it seems even <meta name="robots" content="noindex,nofollow"/> isn't enough to prevent "Indexed, though blocked by robots.txt".
See https://yoast.com/x-robots-tag-play/
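For responses where a <meta> tag can't be embedded (images, PDFs, feeds), that yoast.com page describes the X-Robots-Tag HTTP header as the alternative. Whether Miraheze sends it can be checked from the headers alone; a quick sketch, with the URL chosen purely for illustration:
# Fetch only the response headers (-I) and look for an X-Robots-Tag directive:
$ curl -sI https://abj.miraheze.org/wiki/Main_Page | grep -i '^x-robots-tag'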
Jan 18 2020
Here's one that's on every wiki:
$ mech-dump --links https://abj.miraheze.org/|grep ^
login.miraheze.org
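For anyone without WWW::Mechanize (which ships mech-dump) installed, roughly the same check can be approximated with curl and grep (a crude substitute, not a drop-in replacement):
# Pull every URL out of the main page's HTML and see whether login.miraheze.org shows up:
$ curl -s https://abj.miraheze.org/ | grep -o 'https\?://[^"]*' | sort -u | grep login.miraheze.org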
Jan 7 2020
OK! But
$ w3m -dump https://abj.miraheze.org/wiki/Special:RottenLinks?stats=1
Status of external links
Jan 6 2020
I am not saying "please update it for my wiki".
I am saying "please give users a way to update it without needing a Steward's help."