
Duplicate log entries being added in DataDump extension
Closed, Resolved (Public)

Description

Bug: If you examine this wiki's DataDump public logs, you'll likely note that duplicate log entries are being added for every deletion action performed. Generating the dumps creates only a single log entry. Note also that I clicked the "delete" button only once for each of the two dumps. Thanks

Reproducible? Yes, this has been observed by other users on other wikis as well.

Event Timeline

Dmehus triaged this task as Normal priority. Thu, Jan 7, 17:03
Dmehus created this task.

I can't reproduce this - I deleted the ManageWiki backup on test2 and only one log entry was made.

In T6698#131809, @John wrote:

I can't reproduce this - I deleted the ManageWiki backup on test2 and only one log entry was made.

How many backup dumps were listed on the special page, though? The problem doesn't exist where only one dump file exists. It only exists where more than one dump exists.

In T6698#131809, @John wrote:

I can't reproduce this - I deleted the ManageWiki backup on test2 and only one log entry was made.

How many backup dumps were listed on the special page, though?

One.

The problem doesn't exist where only one dump file exists. It only exists where more than one dump exists.

I have generated 3 dumps, and deleted all three - only one log per deletion was produced.

See relevant IRC conversation with @JohnLewis, which may be helpful for debugging and troubleshooting purposes:

dmehus> @JohnLewis, No, if there are three, I suspect there would be two log entries for each file deleted. Basically, when more than one dump exists and you delete a dump, there are two deleted log entries for each deletion. If one dump exists and you delete the dump, there's only one deleted log entry
11:30 @JohnLewis Weird. I was just able to reproduce on TestWiki... https://publictestwiki.com/wiki/Special:Log/datadump
11:30 I generated three separate dumps, yet six delete log entries were produced
11:30 I can grant you local `sysop` rights if needed if you want to try reproducing there
11:30 <•JohnLewis> State every step you did please
11:32 <•dmehus> Okay
11:34 Step #1 - Proceed to [[Special:DataDump]]
11:34 Step #2 - Generated dump from item at the top of the drop-down list
11:34 Step #3 - Refreshed page
11:34 Step #4 - Generated dump second from top
11:34 Step #5 - Refreshed page
11:34 Step #6 - Generated dump third from top
11:34 Step #7 - Refreshed page
11:34 Step #8 - Waited until dumps processed
11:34 Step #9 - Deleted first dump
11:34 Step #10 - Refreshed page
11:34 Step #11 - Deleted second dump
11:34 Step #12 - Refreshed page
11:34 Step #13 - Deleted third dump
11:34 Step #14 - Refreshed page
11:34 Step #15 - Visited [[Special:Log/datadump]]
11:34 Want me to post ^ on the Phab task?
11:35 <•JohnLewis> Refreshed the page, how?
11:35 <•dmehus> With the "refresh this page" link at the bottom of Special:DataDump
11:35 <•JohnLewis> Hm, that's exactly how I do it
11:36 <•dmehus> Weird
11:36 Could it be a web browser specific incompatibility issue with DataDump?
11:36 I note there were duplicate deleted log entries from before today on test2wiki as well
11:38 Let me try testing on test2
11:38 Maybe it's an issue with test2wiki that isn't reproduced but is reproduced on production wikis
11:38 <•JohnLewis> Done it on testwiki, and can't reproduce either
11:40 <•dmehus> Weird
11:40 as I did it on test2wiki and did reproduce
11:41 <•JohnLewis> It has to be something you're doing
11:41 If you press F5 or refresh the browser, you will produce second logs, but if you press the button, it should take you to Special:DataDump directly
11:42 <•dmehus> Yeah, I didn't refresh the web browser...just used that refresh link on the special page
11:42 but I note that it did it for Reception123 as well...
11:42 
11:04, 21 December 2020 Reception123 deleted test2wiki_xml_9c0d2e2b7c6c163b6624.xml.gz dumps. (Deleted dumps)
11:04, 21 December 2020 Reception123 deleted test2wiki_xml_9c0d2e2b7c6c163b6624.xml.gz dumps. (Deleted dumps)
11:02, 21 December 2020 Reception123 generated test2wiki_xml_9c0d2e2b7c6c163b6624.xml.gz dump. (Generated dump)
11:42 What web browser do you and Reception123 use?
11:43 My web browser information is:
11:43

Vivaldi 3.5.2115.81 (Stable channel) (64-bit)
Revision 002de1f8aa3173b79493ce72f9bf13454e94751d
OS Windows 10 OS Version 2004 (Build 19041.685)
JavaScript V8 8.7.220.29
Flash 32.0.0.465 C:\WINDOWS\system32\Macromed\Flash\pepflashplayer64_32_0_0_465.dll
User Agent Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.107 Safari/537.36

@Reception123, can you perform a test as well, and also state what web browser you use?

I can't seem to reproduce the error either.

Closing as invalid: looking at the code, there is absolutely nothing that would cause this unless you reload the page a second time without clicking the "Refresh page" link. https://github.com/miraheze/DataDump/blob/f9dc4f686221606913c207fc94589c5f78723efc/includes/specials/SpecialDataDump.php#L80
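The failure mode described here, where reloading the page re-sends the previous state-changing request and writes a second log entry, is the classic duplicate-submission problem. The conventional guard is the Post/Redirect/Get pattern: answer the action request with a redirect so a browser reload repeats only a harmless GET. A minimal sketch in Python (the handler names and return shape are hypothetical; the real extension code is PHP in SpecialDataDump.php):

```python
# Sketch of the Post/Redirect/Get pattern that prevents a page reload
# from repeating a state-changing "delete" action.
# Handler names and the (status, value) return shape are illustrative.

deletion_log = []  # stands in for Special:Log/datadump


def handle_delete(filename):
    """POST handler: perform the deletion once, then redirect."""
    deletion_log.append(f"deleted {filename}")
    # Redirect back to the listing page with GET; reloading the browser
    # afterwards re-issues only the GET, not the deletion.
    return ("303 See Other", "/wiki/Special:DataDump")


def handle_list():
    """GET handler: safe to reload any number of times."""
    return ("200 OK", list(deletion_log))


status, location = handle_delete("test2wiki_xml.xml.gz")
# Simulate the user reloading the listing page twice after the redirect:
handle_list()
handle_list()
```

Because the deletion only happens inside the POST handler, any number of subsequent reloads leaves exactly one log entry.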

@Universal_Omega I'm reopening this task. In reference to:

I can't seem to reproduce the error either.

You actually did reproduce this. Per this (see screenshot below), you generated one unique dump, then deleted the dump, and two log entries were produced.

I realize we want to minimize our open tasks, but this needs further investigation; there's probably a way to recode this so it doesn't produce two deleted log entries. Let's set it aside as a low-priority task to give us time to rethink the extension's approach.
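One way to "recode this to not produce two deleted log entries" is to make the delete handler idempotent: write the log entry only when the dump file actually existed and was removed, so a duplicated or replayed request becomes a no-op. A sketch under that assumption (function and variable names are illustrative, not the extension's actual API):

```python
# Sketch of an idempotent delete: a second, duplicate request for the
# same dump finds nothing to delete and therefore logs nothing.
# Names here are illustrative; the real code lives in SpecialDataDump.php.
import os
import tempfile

log_entries = []  # stands in for the datadump deletion log


def delete_dump(path):
    """Delete the dump file, logging only if it actually existed."""
    if not os.path.exists(path):
        return False  # duplicate or stale request: nothing to log
    os.remove(path)
    log_entries.append(f"deleted {os.path.basename(path)}")
    return True


# A repeated request for the same dump logs only once:
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.close()
first = delete_dump(tmp.name)   # deletes and logs
second = delete_dump(tmp.name)  # no-op, no second log entry
```

The existence check is what breaks the duplication: however the second request arrives (browser reload, double-click, retried POST), it cannot add a second entry for a file that is already gone.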

Interesting I don't see that log entry like that.

Interesting I don't see that log entry like that.

Hrm, really? On Vector?

Interesting I don't see that log entry like that.

Hrm, really? On Vector?

On my own skin, Cosmos

Yay! Looks like the screenshot helped. I'm glad I pushed you to look into fixing this. I think the fix you implemented was a reasonable one. Thanks! :)