[00:12:46] deni49: I left some comments on the patch. The other major thing is that the plain "sh" code needs to have an empty replacement table
[00:13:23] Alternatively, in some other languages (like tg and tly), they have just "tg" and "th-cyrl" with no "tg-latn"
[00:14:28] I've added Amir80 and cscott as reviewers to the patch; i think they are the people who know most about lang converters
[00:15:31] Ideally i'd like one of them to comment because i am not very familiar with lang converter, but if after fixing the things i mentioned you still have problems getting it reviewed, feel free to ping me again
[00:16:35] Thanks bawolff. I'll make sure that we look into these as soon as possible and get back to you.
[00:16:53] Hopefully Amir80 and cscott get back to us soon.
[00:37:29] Hmm, i wonder if https://gerrit.wikimedia.org/r/c/mediawiki/core/+/844427 should be backported to 1.39
[00:37:40] Not sure what our policy on what should be backported is nowadays
[00:39:17] deni49: Ok. Managed to register with NickServ.
[00:39:24] I am imdeni now.
[03:37:16] I cherry-picked https://gerrit.wikimedia.org/r/c/mediawiki/core/+/844427 to REL1_39
[14:07:57] Hello and thank you for the awesome MediaWiki software
[14:09:16] I'm trying to upgrade a bunch of wikis to 1.35.8, but 'composer update --no-dev' fails miserably. My upgrade method is using git and I am moving the vendor folder away before starting the upgrade. Could this be due to a too-old PHP version? I seem to have only 7.3 atm
[14:11:05] Here is my method to upgrade these minor point releases. These instructions have worked well up to now: https://paste.debian.net/1258020/
[14:12:23] I'm thinking maybe my composer is out of date. Or maybe I should try without moving the vendor folder away (which I do to avoid some old stuff lurking around in the dependencies). Any help would be much appreciated, cheers. And looking forward to the new LTS 1.39
[14:33:40] "fails miserably"
[14:33:44] what does that mean?
[14:41:17] TheDJ[m]: running 'composer update --no-dev' screams hundreds of lines of white-on-red about unregistered dependencies or something similar. I could recreate the situation in like 5 minutes and have the exact error
[14:45:14] if you move the vendor directory out of the way, shouldn't you be using composer install instead?
[14:50:03] also, you've updated extensions, but not skins
[14:50:24] TheDJ[m]: good points
[14:50:49] I recreated the situation once again and it seems to complain about one unknown package
[14:51:17] Unknown package has no name defined ([{"name":"christian-riesen\/bas
[14:51:17] e32","version":"1.4.0","version_normalized":"1.4.0.0","source":{"typ
[14:51:17] e":"git","url":"https:\/\/github.com\/ChristianRiesen\/base32.git"
[14:51:51] I'll try 'composer install --no-dev' instead. Sorry, a bit brainfogged
[14:52:55] same error with "install" instead of "update"
[14:53:59] Maybe I should update composer. I have PHP 7.3, which may be affecting this situation, as I understand it is a bit old nowadays
[15:00:35] Uh-oh: Composer 1.8.4 ... I should probably get version 2, which has self-update included, if I browsed correctly
[15:02:07] running even 'composer --version' in that directory will complain with those same hundreds of lines about "christian-riesen/base32" being an unknown package with no name defined
[15:04:11] I'd better try upgrading composer to 2.4.3 on a testing server and see if I can get that to work and if it helps (which it probably will), and then upgrade it on the production servers
[15:18:52] yeah, based on installing the newest 2-series composer on a test server, the problem was eradicated. I will now upgrade the composers on the production servers and upgrade the wikis
[16:14:14] 6 wikis upgraded, but now a problem with the wiki family of 2 wikis on a distinct server. I cannot upgrade composer because I am unable to run the php to download it as described in https://getcomposer.org/download/. This is not a MediaWiki problem, just a problem I ran into trying to keep mah mediawikis up and running
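
For anyone hitting the same "Unknown package has no name defined" wall: Composer 1.8 predates the current package metadata format, and the log above shows the errors disappearing once Composer 2 is installed. A minimal sketch of that upgrade path, assuming a typical Linux host; getcomposer.org also documents a SHA-384 checksum check on the installer, omitted here, and every path below is an example:

    # Install Composer 2 system-wide, then redo the MediaWiki dependency install.
    cd /tmp
    php -r "copy('https://getcomposer.org/installer', 'composer-setup.php');"
    php composer-setup.php --install-dir=/usr/local/bin --filename=composer
    php -r "unlink('composer-setup.php');"
    composer --version             # should now report 2.x

    cd /path/to/wiki               # the MediaWiki installation directory
    composer install --no-dev      # "install", not "update", when vendor/ was moved away
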
[16:48:40] CosmicAlpha: Hello, are you able to answer some questions about https://meta.miraheze.org/wiki/Backups ? Does the sentence "any wiki administrator can create an XML or image backup of their wiki by going to Special:DataDump on their wiki" mean that Miraheze no longer makes regular XML dumps?
[16:50:26] Over at WikiTeam we stopped taking dumps of Miraheze in 2017 https://wiki.archiveteam.org/index.php/Miraheze as we understood that server-side dumps would be archived on archive.org by the sysadmins, but it seems that didn't happen for most wikis. So we'll have to make dumps ourselves for all wikis.
[16:51:31] This seems rather wasteful. It would be nicer if Miraheze itself could update those dumps, maybe every 6 months or so if a more regular schedule isn't possible. What could help make it happen? Would donations be needed to cover extra costs?
[16:56:19] Cc Agent
[17:41:36] Nemo_bis: Reception123 is the person to ask about archive.org
[17:41:48] I know we always upload a copy prior to deletion
[18:22:57] RhinosF1: thanks. Yeah I wanted to ask Reception123 but they're not online at the moment. :)
[18:25:10] Nemo_bis: now they are
[18:26:00] Nemo_bis: Reception123 publishes database dumps for all wikis every few months; this mainly occurs before deleting any inactive wikis. Schedule seems to be every 4-6 months.
[18:26:11] Reception123: Nemo_bis would like to ask you about archives
[18:27:03] I haven't done one in a while if that's the question
[18:27:32] I'm planning on doing one soon, though I need to remove some larger wikis from the main batch as otherwise the file is too lardge
[18:27:34] Yes, we'd like to have dumps in archive.org for all the public wikis.
[18:27:36] Large*
[18:27:43] Too large for what?
[18:27:56] For the server :p
[18:28:03] I can likely have it done this week
[18:28:05] Can't you compress it on the go?
[18:28:18] Out of curiosity, what do you need it for?
[18:28:33] We archive all wikis https://archive.org/details/wikiteam
[18:29:03] Ah you're part of wikiteam. I'll set a reminder and get it done soon
[18:29:32] Thanks. For all wikis?
[18:30:05] Is there a way we can help automate this?
[18:35:19] Nemo_bis: can you automate uploads to archive.org from importDump?
[18:37:07] We're currently working on a way to integrate backups into our deleteWikis.php script in order to have backups made automatically before wikis are deleted, so that's one step towards that for wikis that are being deleted. Perhaps we could add an option to have the script upload this to archive.org automatically too, and from there work out a script to back up all wikis automatically and upload them to archive.org
[18:37:08] https://phabricator.miraheze.org/T9665
[18:38:31] I mean I'd be hoping to save us time; the automating to archive.org is something archive team already had, Agent[m]
[18:38:39] The all-wikis part gets easy
[18:38:50] We could run it in threads and delete the file after
[19:00:37] RhinosF1: so what's the issue, that you can't do all the wikis or the server will run out of space while doing the dumps?
[19:03:54] Uploading thousands of individual files may not be the most efficient method, it would perhaps be better to have a single tar/zip with all wikis (assuming we can include some basic metadata, at least the siteinfo). Uploading one file to one new item should be a rather easy call to the internetarchive cli or Python module. If you can create/link a start of a script/cronjob/whatever where the dumping
[19:04:00] could happen, I can write the upload part.
[19:09:17] Nemo_bis: yes
[19:09:29] Maybe we could have a script that batches by letter or something
[19:09:49] Generate all the 'a' wikis, archive them, upload, delete, move on
[19:09:54] Move to b
[19:10:00] Etc
[19:10:25] I can help with the dumping part, I get fairly busy, but if you can send some uploading code then I'll look
[19:10:59] Or Reception123 could say what the big wikis are to be excluded from the main dump
[19:11:16] And I could make it do them individually
[19:20:23] Yes, I usually do them individually
[19:20:30] Or at least do a smaller wikis dump + large wikis
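
A rough sketch of the batch-by-letter idea discussed above, assuming a flat file of database names (one per line), a farm-style --wiki selector for the maintenance scripts, and the internetarchive CLI (pip install internetarchive, then ia configure). Item names, metadata and paths are placeholders, not Miraheze's actual layout; dumping one wiki at a time and deleting the file afterwards keeps disk use bounded, as suggested above:

    # For each letter, dump every matching wiki, compress, upload, then delete.
    for letter in {a..z}; do
        grep "^${letter}" all-wikis.txt | while read -r dbname; do
            dump="${dbname}-$(date +%Y%m%d)-history.xml.gz"
            php /srv/mediawiki/w/maintenance/dumpBackup.php \
                --wiki="$dbname" --full --quiet | gzip > "$dump"
            ia upload "wiki-${dbname}" "$dump" \
                --metadata="title:XML dump of ${dbname}" \
                --metadata="mediatype:web" \
                --metadata="collection:opensource"    # placeholder collection
            rm -f "$dump"
        done
    done

Per Nemo_bis's point about thousands of items, a variant that tars one letter's dumps into a single file and uploads one item per batch would cut the number of archive.org items considerably.
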
[19:34:12] Hi, is there a way to export mediawiki pages as plain text in csv format? Thanks
[19:49:15] Any help please?
[19:53:26] Guest11, please state your problem
[19:53:39] Eytirth1: they did
[19:53:47] 20:34:13 Hi, is there a way to export mediawiki pages as plain text in csv format? Thanks
[19:54:01] Oh, I cleared my window
[19:54:24] * RhinosF1 does not have an answer
[19:55:18] I was actually wondering if there is any way to export a mediawiki page in text format.
[19:56:08] I remember somewhere on GitHub there is a script that can "parse" XML dumps into plain text
[19:59:35] Oh ok, that will work I guess, but can you recall what the script is called?
[20:00:07] RhinosF1: yes, batching by first letter is something I've done before! It's sensible
[20:01:33] We could also set up some kind of staging server where you'd send the (compressed) dumps before batching them. It could even just be a vps in the https://wikitech.wikimedia.org/wiki/Nova_Resource:Dumps project
[20:01:34] Guest11, I think it was this: https://github.com/rspeer/wiki2text
[20:01:47] https://github.com/rspeer/wiki2text
[20:02:04] Shoot, I found the exact same page lol
[20:02:17] Thanks, I'll look at this one.
[20:02:53] https://github.com/yohasebe/wp2txt - this one may work if the other does not
[20:07:54] Ok, thank you for your time.
[20:17:07] Nemo_bis: cool, we can talk about that somewhere
[20:21:37] RhinosF1: ok, ideally email but I can also idle on some other channel if you prefer
[20:22:01] Nemo_bis: email is cool
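
On the plain-text/CSV question: neither wiki2text nor wp2txt emits CSV directly, but both take a MediaWiki XML dump and strip the markup. A hedged sketch, assuming wiki2text reads the XML on stdin (check its README for the exact invocation); the wiki URL and page titles are placeholders, and Special:Export accepts a newline-separated list of titles:

    # Export a few pages to XML via Special:Export, then strip the wiki markup.
    curl -d "pages=Main_Page%0ASome_Other_Page" -d "curonly=1" \
        "https://wiki.example.org/index.php?title=Special:Export" > pages.xml
    wiki2text < pages.xml > pages.txt
    # Turning the result into CSV (say, one row per page with title and text)
    # still needs a small post-processing step of your own.
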