
Import for https://baobabarchives.miraheze.org/wiki/Main_Page
Open, Stalled, Normal, Public

Description

{F869776}

By the way, is there a way to export images?

Event Timeline

Dark_Helmet triaged this task as High priority.
MacFan4000 lowered the priority of this task from High to Normal.Nov 30 2018, 12:25
MacFan4000 removed Dark_Helmet as the assignee of this task.
Paladox claimed this task.Nov 30 2018, 21:43

Is it alright if I ask something here? Aside from wanting to know of a way to export images, we are having some trouble getting our templates and quotes to work. They worked fine on our previous hosts, but we can't seem to get them working right here. Is there a way to fix this?

https://baobabarchives.miraheze.org/wiki/Template:Title_infobox
https://baobabarchives.miraheze.org/wiki/Template:Quote

If this is not the place to ask this, then my apologies.

Also, how long will the import process take?

Paladox added a comment.Dec 14 2018, 22:27

@Dark_Helmet it doesn't look broken to me? Also, you can export your images by zipping or gzipping the images folder under your MediaWiki root and then uploading the archive to Google Cloud or another cloud storage provider.
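
A minimal sketch of what that could look like for someone with shell access to the old host; the MediaWiki root path and the archive name below are assumptions, not the actual setup:

# Sketch only: archive MediaWiki's images/ directory so it can be moved
# to external storage. Paths and names are placeholders.
import shutil

archive = shutil.make_archive(
    base_name="wiki-images-backup",   # produces wiki-images-backup.tar.gz
    format="gztar",
    root_dir="/var/www/mediawiki",    # assumed MediaWiki root on the old host
    base_dir="images",                # the uploads directory under that root
)
print("Created", archive)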

Paladox added a comment.Dec 14 2018, 22:29

Also, status update: the wiki is still importing.

Dark_Helmet added a comment.EditedDec 15 2018, 06:17

Thank you for the response, and sorry for the trouble. Anyway, here's the problem with the quote template. When we try to set it up on the new wiki, it shows "Script error: No such module 'Quote'" instead of:

"{{{1}}}"
―{{{2}}}

But on our old wikis, it never gave us this issue. Here's how it looked on our second site over at ShoutWiki: http://swholonet.shoutwiki.com/wiki/Template:Quote

And here's how the title infobox is supposed to look:
http://swholonet.shoutwiki.com/wiki/Template:Title_infobox

It worked fine there. It seems to be a script issue, I guess?

Paladox added a comment.EditedDec 16 2018, 21:21

The import has finished. Those two templates seem to work for me (comparing the ones on your Miraheze wiki to the ShoutWiki ones).

Please open a new task if the issue is still there :)

Paladox closed this task as Resolved.Dec 16 2018, 21:21

Thank you kindly, but were there only 5000 articles in the initial XML file I sent you? There should've been more, unless the file I made was incomplete. If I make another, should I post it here or make a new task?

Paladox added a comment.Dec 16 2018, 22:28

You can reopen this task with the new XML file :)

I have the new XML files ready. I've cut them into smaller parts, so hopefully they should be easier to import. File number 8 is kind of big, though. Is there a way to split it?

{F874587}
{F874586}
{F874590}
{F874588}
{F874589}
{F874806}
{F874603}
{F874593}
{F874592}
{F874594}
{F874595}
{F874596}
{F874598}
{F874597}
{F874599}
{F874600}
{F874601}
{F874602}
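
A large part like file number 8 can usually be split into smaller well-formed dumps by repeating the opening <mediawiki>/<siteinfo> header and the closing </mediawiki> tag around each batch of <page> blocks. A rough sketch, assuming the usual one-tag-per-line layout of MediaWiki exports; file names and chunk size are illustrative:

# Rough sketch: split a MediaWiki XML export into well-formed chunks of
# N pages each. Assumes <page> and </page> sit on their own lines, as in
# standard dumps; file names and chunk size are placeholders.

def split_dump(path, pages_per_chunk=5000, out_prefix="chunk"):
    header = []          # the <mediawiki ...> opening tag plus the <siteinfo> block
    out = None
    chunk_no = 0
    pages_in_chunk = 0
    in_page = False
    seen_page = False

    def open_chunk():
        nonlocal out, chunk_no, pages_in_chunk
        chunk_no += 1
        out = open(f"{out_prefix}-{chunk_no:02d}.xml", "w", encoding="utf-8")
        out.writelines(header)
        pages_in_chunk = 0

    def close_chunk():
        nonlocal out
        if out is not None:
            out.write("</mediawiki>\n")
            out.close()
            out = None

    with open(path, encoding="utf-8") as f:
        for line in f:
            tag = line.lstrip()
            if not seen_page:
                if tag.startswith("<page>"):
                    seen_page = True       # header is complete
                else:
                    header.append(line)
                    continue
            if tag.startswith("<page>"):
                in_page = True
                if out is None:
                    open_chunk()
            if in_page:
                out.write(line)
            if tag.startswith("</page>"):
                in_page = False
                pages_in_chunk += 1
                if pages_in_chunk >= pages_per_chunk:
                    close_chunk()
    close_chunk()

split_dump("baobab-part8.xml", pages_per_chunk=2000)

Each chunk produced this way carries its own header and closing tag, so it should import independently.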

Dark_Helmet reopened this task as Open.Dec 20 2018, 19:54

Wait, looks like I need to make some more changes...

Dark_Helmet closed this task as Resolved.Dec 25 2018, 04:00

Okay, disregard the last set of XML files. This is the one that should have everything:

{F875351}

Sadly, I could not split it into smaller files, but it should hold everything this time.

Dark_Helmet reopened this task as Open.Dec 26 2018, 17:18

Don't suppose there's any way we can keep track of the import progress?

Paladox added a comment.Dec 29 2018, 00:43

Hi, sorry for the delay. I have started the import.

Great! I thought you had already; I was more curious whether there was a way to see how far along the import was, like a progress bar of sorts.
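
For context, an import like this is typically run server-side with MediaWiki's importDump.php, which has no progress bar but can print a position-and-speed line every N pages; the "pages/sec" line quoted in the next comment is exactly that output. A hedged sketch of the sort of invocation involved, where the dump file name is an assumption and the working directory is taken from the paths in the log below:

# Illustrative sketch of a server-side import run; not the exact command used.
# The dump file name and working directory are assumptions.
import subprocess

subprocess.run(
    [
        "php", "maintenance/importDump.php",
        "--report=100",          # print position and speed every 100 pages
        "baobab-full.xml",
    ],
    cwd="/srv/mediawiki/w",      # MediaWiki root seen in the log paths below
    check=True,
)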

Paladox added a comment.Jan 6 2019, 14:21

Hi, I'm getting:

PHP Warning:  XMLReader::read(): uploadsource://a673fd9165d32b8150ba7b63523b6721:17557849: parser error : Extra content at the end of the document in /srv/mediawiki/w/includes/import/WikiImporter.php on line 545

Warning: XMLReader::read(): uploadsource://a673fd9165d32b8150ba7b63523b6721:17557849: parser error : Extra content at the end of the document in /srv/mediawiki/w/includes/import/WikiImporter.php on line 545
PHP Warning:  XMLReader::read(): *{{DB|character in /srv/mediawiki/w/includes/import/WikiImporter.php on line 545

Warning: XMLReader::read(): *{{DB|character in /srv/mediawiki/w/includes/import/WikiImporter.php on line 545
PHP Warning:  XMLReader::read():                ^ in /srv/mediawiki/w/includes/import/WikiImporter.php on line 545

Warning: XMLReader::read():                ^ in /srv/mediawiki/w/includes/import/WikiImporter.php on line 545
PHP Warning:  XMLReader::read(): Load Data before trying to read in /srv/mediawiki/w/includes/import/WikiImporter.php on line 836

Warning: XMLReader::read(): Load Data before trying to read in /srv/mediawiki/w/includes/import/WikiImporter.php on line 836
98 (0.10 pages/sec 29.71 revs/sec)
PHP Warning:  XMLReader::read(): Load Data before trying to read in /srv/mediawiki/w/includes/import/WikiImporter.php on line 755

Warning: XMLReader::read(): Load Data before trying to read in /srv/mediawiki/w/includes/import/WikiImporter.php on line 755
PHP Warning:  XMLReader::read(): Load Data before trying to read in /srv/mediawiki/w/includes/import/WikiImporter.php on line 622

Warning: XMLReader::read(): Load Data before trying to read in /srv/mediawiki/w/includes/import/WikiImporter.php on line 622
Done!

Could you regenerate your dump please? :)
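
The "Extra content at the end of the document" messages are what the XML parser reports when a dump is not well-formed, for example when it is truncated or has stray text after the closing </mediawiki> tag. A regenerated dump can be sanity-checked before uploading with a streaming parse; a small sketch, with an illustrative file name:

# Sketch: stream-parse the dump to confirm it is well-formed end to end,
# without loading the whole multi-GB file into memory. File name is assumed.
import xml.etree.ElementTree as ET

def check_dump(path):
    pages = 0
    try:
        for _event, elem in ET.iterparse(path, events=("end",)):
            if elem.tag.endswith("}page") or elem.tag == "page":
                pages += 1
                elem.clear()          # release parsed pages as we go
    except ET.ParseError as err:
        print(f"Not well-formed (after {pages} pages): {err}")
        return False
    print(f"Looks well-formed: {pages} pages")
    return True

check_dump("baobab-full.xml")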

I believe Helmet has that in progress... it's a big file.

I'm trying to upload it. It's about 62 GB. It was up to 90% last night, but I fell asleep and something went wrong with the connection. Now when I try to re-upload it, it says there was an upload failure and "there are no file data chunks in byte".

Paladox added a comment.Jan 12 2019, 20:37

@Dark_Helmet hi, where are you trying to upload that file?

Paladox added a comment.Jan 12 2019, 22:52

Was this on Phabricator?

Mitth.raw.nuruodo added a comment.EditedJan 18 2019, 06:27

Hello Paladox,
given that we've had some issues uploading the file to Phabricator directly, would it be preferable to host it on a third party and send it to you that way, in case our second attempt fails?

We also have a backup file, albeit much less complete, if we can't get anywhere with this one.

Paladox added a comment.Jan 18 2019, 15:05

Note that uploading them through Phabricator will cause our host to run out of space.

So please don't upload 40 GB+ through it :)

Thanks for the tip; the hosted link should be coming through later today.

Paladox added a comment.Jan 19 2019, 00:37

@Mitth.raw.nuruodo thanks, we need to increase our file storage before we can import it. We have just ~40 GB left.

I see. If you aren't planning on upgrading in the near future, I believe Helmet has another file we can use; it's smaller, but contains no pictures or post history.

Dark_Helmet added a comment.EditedJan 19 2019, 02:40

I got your message, Thrawn. Here's the smaller version.

Hopefully it'll work until they can import the 60 GB version. But does that mean we won't be able to edit anything if the bigger import still happens? Would anything new we add be overwritten?

Let's wait until we know what the timeline (if there is one) is for increasing the storage space. If it isn't too long, it's a non-issue.

Paladox, would you be able to import the shorter one for now?

Paladox added a comment.Jan 26 2019, 01:22

Yes, I can.

John added a subscriber: John.Feb 4 2019, 00:13

@Paladox It's been over a week; any updates?

Paladox added a comment.Feb 4 2019, 00:21

It seems mw1 going down stopped the import, so I'm going to restart it.

Hey Paladox, do you know if Miraheze has a timeframe in place for the server space upgrade? Also, if the upgrade goes through and we import the "large" file, will that overwrite the changes we've already made to the articles?

Paladox added a comment.Feb 9 2019, 13:31

We currently do not have a timeframe :( (since we need more funds before we can do it).

@Paladox Is it not possible to even import the smaller dump?

There have been some errors with the dump; could you please check the content?

I'll do my best to see if I can find a better and smaller XML.

Paladox added a comment.Feb 13 2019, 09:04

The 1 GB file is OK, but how are you generating the dump? With a tool?

Mitth.raw.nuruodo added a comment.EditedFeb 16 2019, 05:52

What kind of funding do you need, and for what kind of upgrades?

So wait, the 1 GB file is working? Can it be imported? Is it already being imported? I'll accept anything we can get. As for how the XMLs are made, a friend of mine handles their creation and he has a lot of wiki XMLs; however, he's been afk for a while.

Paladox added a comment.Feb 16 2019, 14:16

@Dark_Helmet I was referring to @Reception123's comment on smaller files :)

But are you using https://github.com/WikiTeam/wikiteam ? There seems to be an issue with that tool causing the file to be generated incorrectly (this happened to other users of ours who used it). Is there any way for you to get the dump from the wiki directly, using https://www.mediawiki.org/wiki/Manual:DumpBackup.php ?
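
For reference, the kind of invocation meant by that manual page looks roughly like the following; it has to be run by someone with shell access on the old host, and the paths and output name here are assumptions:

# Illustrative only: dumpBackup.php must be run on the host that holds the
# wiki's database. Paths and the output file name are assumptions.
import subprocess

subprocess.run(
    [
        "php", "maintenance/dumpBackup.php",
        "--full",                            # every revision, not just current ones
        "--output=gzip:pages-full.xml.gz",   # write a gzipped XML dump
    ],
    cwd="/var/www/mediawiki",                # assumed MediaWiki root on that host
    check=True,
)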

Also, we need more cash; see https://meta.miraheze.org/wiki/Finance .

So wait, the 1 GB file is working? Can it be imported? Is it already being imported? I'll accept anything we can get. As for how the XMLs are made, a friend of mine handles their creation and he has a lot of wiki XMLs; however, he's been afk for a while.

We have already tried to import it, but it seemed that there were errors during the import. That is why I asked you above to please verify whether the content was imported or not.

The dump isn't being generated; it's from an XML archive of the wiki in the state it was in a few years ago. We would prefer to import the wiki as it was then, rather than in its current state.

Paladox added a comment.Feb 17 2019, 15:38

Would you be able to confirm that it imported correctly? The dump does have a syntax error; we just don't know where.

I'm not sure how I would confirm that from the XML file. If that one isn't working, we do have another smaller one we'd like to try:
https://mega.nz/#!zQxygCBB!U6IK6w82qB-gVP_pCsJGNIvbVXngEwIPDD1boiJsUXc

Paladox, did the new file work?

Paladox added a comment.Sat, Feb 23, 03:28

Hi, I haven't tried it yet, though I'm going to compress your wiki ( https://www.mediawiki.org/wiki/Manual:CompressOld.php ) before doing that, since that file looks like it's 12 GB.
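
For context, that compression step is another command-line maintenance script; a minimal sketch of the sort of invocation, using default settings (on a wiki farm the target wiki would also have to be selected, e.g. via the standard --wiki option):

# Sketch: compress old revision text before importing more, to save DB space.
# The working directory is taken from the log paths above; treat it as an assumption.
import subprocess

subprocess.run(
    ["php", "maintenance/storage/compressOld.php"],  # default settings
    cwd="/srv/mediawiki/w",
    check=True,
)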

Paladox added a comment.Sat, Feb 23, 04:33

Hi, is there a smaller dump?

Only asking because that may use all the available storage on the DB.

Not to my knowledge. This one's already pretty small compared to the ~60 GB one we had earlier.

Paladox changed the task status from Open to Stalled.Mon, Feb 25, 17:12

We need to increase our DB storage (as it's also running low).

Paladox,
Would you be able to provide us any details as to how much Miraheze would need in donations for the server space to become available?

Paladox added a subscriber: labster.Tue, Feb 26, 10:59

Hi, I'm currently not sure how much we need. Pinging @John, @labster, or @Southparkfan for ^^?

Paladox added a comment.Tue, Feb 26, 12:45

Per my chat with local ops, we will work out how best to fix this.

Paladox added a comment.Tue, Feb 26, 14:06

The person I spoke to said this: “few bucks ($10/mo) will definitely help to get the process started”.