[00:08:46] ah, that makes sense
[00:32:16] [1/3] https://github.com/miraheze/mw-config/pull/6274
[00:32:16] [2/3] I worked a bit more on the Chart extension. The current solution for the competing use of the `.tab` page title suffix is to create a `.data` suffix that is local to the wiki and intended to be used together with Chart.
[00:32:16] [3/3] If we follow through with this workaround I can create a Help:Chart page on Meta and link to it in ManageWiki's extension help message.
[00:36:31] Incredible work man, thank you
[01:07:34] Wait, no, that's an AbuseFilter issue: `Unknown column 'afl_ip_hex' in 'INSERT INTO'`. I thought ProtectSite changed the AF behavior in weird ways, but maybe a missing AF SQL script is the cause.
[01:08:51] What wiki was this on?
[01:10:19] Public test wiki
[01:10:23] https://cdn.discordapp.com/attachments/1006789349498699827/1470225284442620037/message.txt?ex=698a857f&is=698933ff&hm=9e342d1d1aa02d6d6ff1611c809d542493bd6256e570fa18f594a231f21914bc&
[01:10:25] afl_ip_hex was added in 1.45 and it exists on PTW.
[01:10:43] However, it might be because it doesn't exist on the global abuse filter on metawiki.
[01:10:53] I can fix that though
[01:11:01] Oh, that makes a lot of sense
[01:11:25] @MacFan4000: do you think it might be worth relaying the Discord volunteering channel (1225560610628964423)? There's quite a bit of activity on it rather consistently, and the most recent [[RfP]] cites not being able to partake in the conversations in there
[01:11:50] fine, guess I'll just do this: https://meta.miraheze.org/wiki/Meta:Requests_for_permissions
[01:11:58] pweeeeeeeeease?
[01:12:06] i have notifs off on discord and rely on irc to get them
[01:12:19] We can
[01:12:38] Btw, how do we retrieve the list of wikis by enabled extension? I thought of running some SQL, but the `sql` command expects a specific wiki database.
[01:12:52] The final decision, however, rests with the infra specialists as the ones who run the bot
[01:13:12] i can't remember, but don't you just need access to mhglobal
[01:13:36] There is a maint script for it.
[01:13:42] even better
[01:13:54] GenerateExtensionDatabaseList or something
[01:14:08] Thanks! I'll look for it then.
[01:14:22] It's in ManageWiki IIRC
[01:14:48] @Infrastructure Specialists for your consideration ^
[01:15:02] I have no problems with it.
[01:16:07] Found it in MirahezeMagic. I'm assuming that the list gets generated as JSON and is then passed to `foreachwikiindblist` for sending notifications?
[01:16:42] Oh, I thought I moved it to ManageWiki lol
[01:17:11] It generates a PHP array, but that can be used with foreachwikiindblist.
[01:17:29] also mwscript supports it using --extension= rather than passing a wiki
[01:18:48] However, it will lose spaces in arguments, even with the `'"..."'` quoting format, if used via mwscript.
[01:20:10] we should invest in not-calling-the-shell tbh
[01:24:42] [1/5] I tried this with `--extension=gadgets` and somehow `mwscript` automatically converts it to
[01:24:43] [2/5] ```bash
[01:24:43] [3/5] sudo -u www-data php /srv/mediawiki/1.45/maintenance/run.php MirahezeMagic:GenerateExtensionDatabaseList --wiki=metawikibeta --extension=gadgets --directory=/tmp
[01:24:43] [4/5] sudo -u www-data /usr/local/bin/foreachwikiindblist /tmp/gadgets.php /srv/mediawiki/1.45/maintenance/run.php /srv/mediawiki/1.45/maintenance/MirahezeMagic:GenerateExtensionDatabaseList.php
[01:24:44] [5/5] ```
[01:25:47] The `foreachwikiindblist` line makes no sense though, since it's supposed to run some other user-specified script, not GenerateExtensionDatabaseList
[01:26:32] Oh, I see what you mean. mwscript recognizes `--extension` and calls the maintenance script automatically
[01:32:16] If you run it with a script as well, it does both, following up with foreachwikiindblist on the newly generated list.
[01:35:29] I think this sends duplicate notifications though.
If a wiki has more than one extension enabled, it'll receive more than one notification.
[01:35:51] That's true, yes.
[01:36:47] [1/10] And yeah, it seems that foreachwikiindblist also doesn't respect quotation marks.
[01:36:47] [2/10] ```
[01:36:48] [3/10] sudo -u www-data /usr/local/bin/foreachwikiindblist /tmp/modern.php /srv/mediawiki/1.45/maintenance/run.php MirahezeMagic:NotifyWikiUsers \
[01:36:48] [4/10] --header='"Extension removals"' \
[01:36:48] [5/10] --message='"The technology team plans to remove one or more extensions currently used by your wiki. Please check the tech noticeboard on Meta to discuss."' \
[01:36:49] [6/10] --link='"m:Tech:Noticeboard/Removing_extensions_for_the_MediaWiki_1.45_upgrade"' \
[01:36:49] [7/10] --link-label='"Discussion page"' \
[01:36:49] [8/10] --group=bureaucrat \
[01:36:49] [9/10] --group=sysop
[01:36:50] [10/10] ```
[01:44:29] [1/2] Turns out changing `/usr/bin/php ${*:2} --wiki $wiki` to `/usr/bin/php "${@:2}" --wiki $wiki` in foreachwikiindblist worked. In fact, it worked so well that even the quotation marks are preserved.
[01:44:29] [2/2] https://cdn.discordapp.com/attachments/1006789349498699827/1470233864319275230/image.png?ex=698a8d7c&is=69893bfc&hm=78db258b24c743b06a3eccf48f3da0e861611135c08f0c86f53b300067af9fe7&
[01:48:00] [1/10] Adjusted script
[01:48:01] [2/10] ```
[01:48:01] [3/10] sudo -u www-data ./each.sh /tmp/listofallwikis.php /srv/mediawiki/1.45/maintenance/run.php MirahezeMagic:NotifyWikiUsers \
[01:48:01] [4/10] --header='Extension removal notice' \
[01:48:02] [5/10] --message='The technology team plans to remove one or more extensions currently used by your wiki. Please check the tech noticeboard on Meta to discuss.' \
[01:48:02] [6/10] --link='m:Tech:Noticeboard/Removing_extensions_for_the_MediaWiki_1.45_upgrade' \
[01:48:02] [7/10] --link-label='Discussion page' \
[01:48:02] [8/10] --group=bureaucrat \
[01:48:03] [9/10] --group=sysop
[01:48:03] [10/10] ```
[02:19:58] [1/17] @cosmicalpha I think I have the procedure ready now:
[02:19:59] [2/17] 1. `sudo -u www-data mkdir /tmp/exts`
[02:19:59] [3/17] 2. Skipping OrphanedTalkPages for now since it seems to be working (somehow?)
[02:19:59] [4/17] ```
[02:20:00] [5/17] sudo -u www-data php /srv/mediawiki/1.45/maintenance/run.php MirahezeMagic:GenerateExtensionDatabaseList --wiki=metawiki --extension=flow --extension=imagerating --extension=3d --extension=protectsite --extension=quizgame --extension=randomgameunit --extension=numberheadings --directory=/tmp/exts
[02:20:00] [6/17] ```
[02:20:00] [7/17] 3. Use a simple PHP script to merge these into a single list: `php dblist_merge.php list.php /tmp/exts/*.php`. Do a sanity check with `wc -l` after the merge. Both 3d and protectsite have around 2000 lines before the merge, so there should be around 3000 lines after the merge.
[02:20:01] [8/17] 4. Use the modified `foreachwikiindblist`.
[02:20:01] [9/17] ```
[02:20:01] [10/17] sudo -u www-data ./each.sh ./list.php /srv/mediawiki/1.44/maintenance/run.php MirahezeMagic:NotifyWikiUsers \
[02:20:01] [11/17] --header='Extension removal notice' \
[02:20:02] [12/17] --message='The technology team plans to remove one or more extensions currently used by your wiki. Please check the tech noticeboard on Meta to discuss.' \
[02:20:02] [13/17] --link='m:Tech:Noticeboard/Removing_extensions_for_the_MediaWiki_1.45_upgrade' \
[02:20:03] [14/17] --link-label='Discussion page' \
[02:20:03] [15/17] --group=bureaucrat \
[02:20:04] [16/17] --group=sysop
[02:20:04] [17/17] ```
[02:22:21] Sounds good to me.
[02:41:35] Sending notifications. It'll probably take half an hour since there are close to 2000 wikis that need to be notified.
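[Editor's note] The `${*:2}` vs `"${@:2}"` fix discussed above can be demonstrated in isolation. This is a minimal bash sketch; the `worker`, `buggy`, and `fixed` functions are illustrative stand-ins for the PHP invocation inside foreachwikiindblist, not the real wrapper.

```bash
# worker stands in for the command that foreachwikiindblist forwards
# arguments to; here it just reports how many arguments it received.
worker() { echo "$#"; }

# Buggy form: unquoted ${*:2} joins args 2..N into one string, which the
# shell then re-splits on whitespace, breaking multi-word arguments.
buggy() { worker ${*:2}; }

# Fixed form: "${@:2}" expands to args 2..N as separate words, each
# preserved exactly as passed, spaces and all.
fixed() { worker "${@:2}"; }

buggy /tmp/list.php script.php --header='Extension removal notice'
# prints 4: the three-word --header value was split into three arguments
fixed /tmp/list.php script.php --header='Extension removal notice'
# prints 2: script.php plus the intact --header argument
```

This is why the one-character-class change in the wrapper preserved quoting: `"$@"` is the only expansion that round-trips an argument list unchanged.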
[02:52:07] Do you wanna bump the non-ping to a minor announcement
[02:58:26] Sure.
[02:58:42] I just noticed that I sent a notification to PTW since it has 3d enabled, which is probably unwise.
[02:59:12] Hey, thanks for the email
[02:59:22] XD
[03:01:01] Omg, you sent it to sysops 🤣🤣
[03:01:31] That's awesome lol
[03:02:44] Yep. 90+ notifications, though most of them probably don't want one.
[03:03:20] Oops, I'm getting notices for a bunch of wikis I closed for compliance issues
[03:03:52] they exist in the database so likely got included as well
[03:04:27] https://publictestwiki.com/wiki/TestWiki:Community_portal#Email_from_Technology_Team
[03:04:45] RIP to your email
[03:05:02] Yep, still weird I got the email. Seems like role scoping didn't work as intended, maybe?
[03:05:22] Hmm. That's very odd, since `NotifyWikiUsers` was intended to only send emails to bureaucrats and sysops.
[03:05:45] Uh oh
[03:06:20] I only got it on wikis I have sysop on.
[03:06:37] In the future we can filter by wiki state in `GenerateExtensionDatabaseList` to exclude deleted wikis.
[03:06:40] And TIL I have sysop on some random wikis lol
[03:07:21] I think I should've just sent notifications for Flow as a test. Doing it for 7 extensions at once was probably too risky.
[03:07:38] Oh well, it's kinda comedic
[03:07:58] At least you didn’t drop PTW’s DB
[03:08:08] Seems like it's limited to wikis I took a remote action on at some point
[03:08:18] So that helps narrow down the why
[03:08:34] Did you once add sysop to yourself remotely?
[03:08:44] Possibly
[03:08:56] Probably the why then.
[03:09:23] Still might be kinda funny that deleted wiki admins are getting notices for extension removals.
[03:09:28] Remote at the moment, so I don't have time to debug
[03:09:44] User retention rates go up
[03:09:47] 😂
[03:10:29] Cause I assume it may apply to wikis that hit the dormancy policy
[03:10:38] As well
[03:12:16] They get automated emails anyway.
[03:12:49] Ever since I fixed it a couple years ago, anyway. Before then, wikis got deleted and no one ever even knew they were inactive, as the emails were broken lol.
[03:15:20] Lol
[03:15:27] It's a feature, not a bug /j
[03:18:33] @posix_memalign I wonder if there is a way to find wikis that have Flow enabled but don't have any usage of it at all, and clear them first...
[03:20:58] We could either use a SQL query to find pages with the `flow-board` content model or see if there are any pages in the Topic NS.
[03:24:09] maybe? because they still exist but I didn't see the extension in ManageWiki. https://commons.miraheze.org/wiki/Special:Contributions/Flow_talk_page_manager
[03:25:09] Now an orphaned Flow page from 2022 has appeared: https://commons.miraheze.org/wiki/Special:OrphanedTalkPages
[03:25:53] I think that's the result of removing Flow without a proper cleanup. All Flow pages become inaccessible because they have a content model that cannot be handled.
[03:55:15] I need to see what pages use Flow on PTW and just delete them unless it's a non-test page…. Is there an easy way to see which pages are using Flow?
[03:56:45] [1/2] @zppix outran the script
[03:56:45] [2/2] https://cdn.discordapp.com/attachments/1006789349498699827/1470267150756352214/image.png?ex=698aac7c&is=69895afc&hm=5e3a4d84644e299cc93e16f12ae80e9b54e85a2299f38db15fb907043ecc4012&
[04:09:23] [1/127] ```
[04:09:24] [2/127] stdClass Object
[04:09:24] [3/127] (
[04:09:24] [4/127]     [page_namespace] => 3
[04:09:24] [5/127]     [page_title] => RhinosF1/2
[04:09:25] [6/127] )
[04:09:25] [7/127] stdClass Object
[04:09:25] [8/127] (
[04:09:26] [9/127]     [page_namespace] => 3
[04:09:26] [10/127]     [page_title] => Huawei251
[04:09:26] [11/127] )
[04:09:27] [12/127] stdClass Object
[04:09:27] [13/127] (
[04:09:28] [14/127]     [page_namespace] => 1
[04:09:28] [15/127]     [page_title] => Flow
[04:09:29] [16/127] )
[04:09:29] [17/127] stdClass Object
[04:09:30] [18/127] (
[04:09:30] [19/127]     [page_namespace] => 1
[04:09:31] [20/127]     [page_title] => Locked_Flow
[04:09:31] [21/127] )
[04:09:32] [22/127] stdClass Object
[04:09:32] [23/127] (
[04:09:33] [24/127]     [page_namespace] => 3
[04:09:33] [25/127]     [page_title] => Huawei251-test
[04:09:34] [26/127] )
[04:09:34] [27/127] stdClass Object
[04:09:35] [28/127] (
[04:09:35] [29/127]     [page_namespace] => 3
[04:09:36] [30/127]     [page_title] => Dmehus
[04:09:36] [31/127] )
[04:09:37] [32/127] stdClass Object
[04:09:37] [33/127] (
[04:09:38] [34/127]     [page_namespace] => 3
[04:09:38] [35/127]     [page_title] => Flow_talk_page_manager
[04:09:39] [36/127] )
[04:09:39] [37/127] stdClass Object
[04:09:40] [38/127] (
[04:09:40] [39/127]     [page_namespace] => 5
[04:09:41] [40/127]     [page_title] => Request_permissions
[04:09:41] [41/127] )
[04:10:42] https://cdn.discordapp.com/attachments/1006789349498699827/1470270661933924443/pages.txt?ex=698aafc2&is=69895e42&hm=2f0fc10aa7a75bbf2cda77b8525704ae49e88f5807ebf7709d45713b8404c93a&
[04:11:23] 1 is Talk. 3 is User_talk. 5 is Project_talk.
[04:12:38] My inbox is a mess, but I quarantined the majority
[04:14:09] Ah crap, I can't just change content models
[04:14:18] Sadge
[04:14:59] Guess I'll just delete the pages we don't need
[04:15:04] If you'd like, I can run a poorly tested script to bulk convert them to wikitext and replace the original page.
[04:15:36] Nah, I don't need that done until the ext is gone per se, I'll go through the list and delete the ones we don't need
[04:17:02] I spent hours fixing PTW during the 1.44 upgrade, as a ton of Flow pages were corrupted also. I ended up purging some from the DB and reindexing.
[04:17:36] Likely artifacts from when PTW was dropped, I think.
[04:18:31] Pls no dropping my poor PTW again 😂
[04:18:55] It can only handle it once per lifetime lol
[04:18:57] Although I did have some trouble with a few to fix on ATT also. Don't think it was ever dropped lol.
[04:19:08] Not quite as much as PTW though
[04:19:17] No, but it's also survived since like Orain days
[04:19:31] So who knows what's happened over the years
[04:20:40] Well, that was easy to clean up the unneeded pages
[04:22:42] Btw, how has login been working for everyone since SUL3? I've managed to break my login a couple times until I clear cookies, but I think I have to try hard lol
[04:23:17] i can't remember having a single issue with not being logged in on a new wiki
[04:23:22] I haven't broken it yet
[04:23:23] One time I managed to fatal metawiki for me only because something happened and I was logged in but wasn't logged in. It was super odd and probably just me logging in and out 40 times at once.
[04:23:24] very smooth sailing
[04:23:39] I'll keep trying harder to break it
[04:23:50] (Opens up a botnet) /j
[04:24:35] This seems to have been a one-time issue though, as I could never reproduce it again and graylog only ever showed my case there.
[04:24:56] That's good. I do love not getting logged out every 5 minutes lol
[04:25:25] I'm still too scared to tempt that 😂 SUL2 has me trained too well
[06:03:01] https://github.com/facebook/mcrouter/wiki/Routing-Prefix
[06:03:07] how did they add macros in a configuration file
[11:44:35] I still get logged out occasionally, but I think it was worse on SUL2
[11:52:16] I sometimes don't stay auto-logged-in on wikis, but I don't have to log in again; just clicking the login link logs me in. I actually wish all wikis were like that so we don't auto-attach lol
[11:59:44] yeah, true, the amount of times I've had to enter my password has definitely decreased
[12:00:32] although it still happens maybe on average once a day?
[12:01:57] It doesn't happen to me very often. I think it has happened one time since SUL3, and that may have been something I did in the browser, I'm not certain. But I stay logged in for the most part now.
[12:02:27] But even if it were once a day, I would gladly accept that over the every-5-minutes of SUL2 lol
[23:55:49] https://issue-tracker.miraheze.org/T14893#298492
[23:59:19] Also, I will finish off the remaining testing myself if needed.
[23:59:47] Whoops, wrong channel lol