[06:30:09] can anyone help me with Manual:Shared database
[06:30:54] just been getting errors trying to login and stuff 😓
[06:36:04] Miraheze uses CentralAuth instead of shared databases. You may have better luck asking on the MediaWiki Discord server.
[15:40:39] can such a thing be done on mh:communities: tbh?
[20:33:33] [1/2] Does anybody have any objections to archiving ?
[20:33:34] [2/2] CA moved the repo to a while ago and we're not using the GitHub one anymore
[20:40:00] go ahead
[20:40:29] if we're supposed to use the one on Gerrit then we're supposed to use the one on Gerrit
[20:43:20] done
[20:43:35] should probably do the same for JavascriptSlideshow which was also moved
[20:45:28] actually, I don't have access to that; if anybody could merge and archive it afterwards that would be great
[21:13:14] wtf
[21:40:00] Done
[21:40:55] For some reason only @.labster had access
[21:45:53] @justman10000 do you have any idea what you're doing with https://github.com/miraheze/mw-config/pull/6154 or are we just typing random words and hoping something sticks?
[21:46:17] If it's the latter, get someone to talk you through it please
[21:50:21] ?
[21:50:22] I have already explained...
[21:50:47] And I would know right away if I could test it!
[21:51:03] 😭 sorry what
[21:51:06] you want to test in prod
[21:51:50] Didn't say that!
[21:51:57] I'm talking about a test env
[21:51:59] it sounded like that to me
[21:53:54] btw can anybody confirm if the latest commits to CreateWiki and ManageWiki are compatible with 1.43?
[21:53:58] We can test your changes on beta, but not if it's obvious that they're not going to work as intended based on the code
[21:55:23] CreateWiki requires 1.44 now per , so probably not
[21:55:36] Same for ManageWiki
[21:57:36] :catThumbsUp: okay, I need to update WO to 1.44 eventually but a starting point is CW/MW now lol
[21:58:29] Why wouldn't these work? I simply duplicated and modified the code! It cannot fail to function!
[21:58:43] What could be wrong with that?
[21:59:04] That is explained in the review comments
[21:59:55] To which I already replied!
[22:00:13] Well, you didn't fix the issues
[22:00:20] `corge` is not a valid value for that setting
[22:00:24] it should be added to ManageWikiSettings
[22:00:25] No point entertaining their charade lol
[22:00:36] and `$wgPageImagesOpenGraph` is missing in LocalSettings
[22:01:42] You do not need a test environment to work out the basics of configuring MediaWiki
[22:01:53] Not having a clue what you're talking about is perfectly acceptable
[22:02:05] But if you don't and want to contribute, you need to be supported
[22:02:14] My advice would be to fork them so you can merge in the changes you want
[22:02:19] Not left to randomly throw nonsense at a PR and hope it sticks
[22:02:38] Since MH is insistent on bumping the version requirement with no warning
[22:03:08] tbh not a bad shout because for now I want to leave the ai crap out
[22:03:34] I know some of it is in it but I'm not sure how much and I really don't want to deal with it lol
[22:03:49] I'm not sure how much of it is still forced but yeah
[22:04:13] Yeah ditto on the AI shit lmao
[22:04:44] The local AI was quite frankly a case study in how not to do software development
[22:05:24] the only config changes I see after the split from CreateWiki managing wiki config caches are ManageWikiCacheDirectory and ManageWikiCacheType and ManageWikiServers but I don't need that for a monolith lol
[22:05:27] Then just say so! So... once again... I was told to use `list` for `wgPageImagesOpenGraph`, even though it's a Boolean! Then I was told to use this `list` for `wgLakeusWikiDefaultColorScheme`! But how, if I haven't registered it in `ManageWikiSettings.php`?
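(For context: registering a setting in `ManageWikiSettings.php` looks roughly like the sketch below. This is a non-authoritative sketch — the key names mirror the pattern used by existing entries in miraheze/mw-config, and the `section` values and Lakeus option values shown are placeholders to illustrate the `check` vs `list` distinction being discussed, not the real ones.)

```php
// Sketch only: verify field names against a real entry in
// miraheze/mw-config's ManageWikiSettings.php before copying.
// 'check' renders a checkbox (for Booleans); 'list' renders a dropdown.
'wgPageImagesOpenGraph' => [
	'name' => 'Enable Open Graph meta tags (PageImages)',
	'from' => 'pageimages',   // the extension providing the setting
	'type' => 'check',        // Boolean setting => checkbox, not 'list'
	'overridedefault' => true,
	'section' => 'seo',       // placeholder section name
	'help' => 'Adds og:image meta tags generated by PageImages.',
],
'wgLakeusWikiDefaultColorScheme' => [
	'name' => 'Default Lakeus color scheme',
	'from' => 'lakeus',       // assumed registration name for the skin
	'type' => 'list',         // enumerated choice => dropdown
	'options' => [            // placeholder option values
		'Automatic' => 'auto',
		'Light' => 'light',
		'Dark' => 'dark',
	],
	'overridedefault' => 'auto',
	'section' => 'styling',   // placeholder section name
	'help' => 'Default color scheme for the Lakeus skin.',
],
```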
[22:05:32] Dudes, do not confuse me
[22:05:36] 🤯
[22:06:07] I can't believe it's cheaper to maintain an entire GPU server than rely on the OpenAI API
[22:06:08] You have been told that the latter should be in ManageWiki
[22:06:14] Please re-read the comment at regarding `list`
[22:06:45] We have a few uses for the GPU, not just AI
[22:07:11] gaming? rendering something?
[22:07:20] (but also just to clarify, open API performed significantly better than whatever local model Agent was using)
[22:07:26] OpenAI not open API lol
[22:07:52] Yes! NOW
[22:07:54] Thumbnailing is one of them ye
[22:08:00] Right, that's what I just said! But it doesn't say anything about registering it in `ManageWikiSettings.php`!
[22:08:03] I told you that in my first review
[22:08:06] Taking over the federal government, obv
[22:08:13] even on the flex request tier I would assume that it's probably faster than locally running an LLM
[22:08:15] 5 days ago
[22:08:25] https://github.com/miraheze/mw-config/pull/6154#discussion_r2470870148
[22:08:29] No, that's discussed in
[22:08:29] It's probably cheapest to abuse free API usage provided by services like Google's Gemini. MH has about 60 wiki requests per day, which is under the usage limit IIRC.
[22:08:35] batch requests usually end up processing fairly quickly from my experience
[22:08:44] Probably, but it's done in a job so not time sensitive in any case
[22:08:44] true true
[22:08:58] flex request tier isn't time sensitive at all
[22:09:04] it completes it within 15m
[22:09:10] and has the 50% discount
[22:09:42] Yeah, which is why I'm saying it doesn't really matter that it's quicker, as it's not time sensitive in any case
[22:09:49] true
[22:10:04] any time is less than the time for a human to check it
[22:10:08] well not exactly but
[22:10:09] you get the point
[22:10:12] The job queue is supposed to just look like it's processing stuff
[22:10:15] Hmm... Fact! Must have overlooked it! Anyway, now I know what to do!
[22:10:22] There is absolutely no guarantee it will do it quickly
[22:10:45] It's been a misnomer sometimes because ours hasn't been that bad when it does work
[22:10:55] You can wait days for jobs to finish on Wikimedia wikis
[22:11:06] That are low priority
[22:11:31] The job system in MW needs overhauling significantly
[22:11:42] I have no idea how well my job queue works
[22:11:49] Describing it as a system is generous
[22:11:51] I don't bother looking too deep at it
[22:12:03] I know it runs every 5 minutes and that's good enough
[22:12:09] It's the job queue so it running well would be a miracle @zipppee
[22:12:16] [1/2] Even though it didn't quite turn out the results we were hoping for, the idea of moving away from reliance on paying 3rd parties that may implode/have dubious business practices was enticing.
[22:12:16] [2/2] I would love to see a more open-ended framework to utilize different LLM endpoints down the line so we're not hamstrung to a single provider.
[22:12:27] The infrastructure surrounding running the job queue at scale is awful
[22:12:39] I don't have 20000 wikis writing jobs to it sooo
[22:12:43] It's about the worst designed part of MediaWiki infrastructure
[22:12:56] Although it probably wasn't designed
[22:13:08] :lol:
[22:13:35] Trying to think of something worse rn but came up blank
[22:14:09] I mean on a monolith with about 350 wikis using it, it works well enough
[22:14:18] I don't get complaints about stale data generally
[22:15:37] Once you get into the realm of job runners though it does become a bit
[22:18:47] Which setup are you using?
[22:19:19] LEMP stack with cron doing job running iirc
[22:19:26] [1/3] I think the thing that pisses me off about MediaWiki jobs is that MediaWiki starts a transaction BEFORE the job starts. Which is a huge issue if, say, you A) insert some data in the job queue
[22:19:27] [2/3] B) then need to run a maintenance script within the job that needs access to that data.
[22:19:27] [3/3] Which is impossible, because the transaction hasn't been committed yet
[22:19:46] Of course, you can get around it by using a deferred update, but why should I have to use a deferred update from inside a job? It makes no sense
[22:21:03] When they rewrite it
[22:21:13] Which will be a long way away
[22:21:26] It will probably never happen because they are too busy fucking everyone over with parsoid
[22:21:28] they could test the code on a project with a large job queue
[22:21:33] rather than a dev env
[22:21:40] who really needs simplewikipedia anyway
[22:25:39] Ye doing it via cron doesn't scale at all
[22:25:55] We have over 20k wikis now
[22:26:11] more like the queue is gonna scale
[22:28:03] true but
[22:28:07] we aren't scaling to that yet
[22:28:26] I can still single-handedly manage to keep things running
[22:28:31] which is part of the sign
[22:28:38] there are others but none of them know how it works lol
[22:37:32] [1/3] Because? The extension already defines a default oneself!
[22:37:32] [2/3] https://www.mediawiki.org/wiki/Extension:PageImages
[22:37:32] [3/3] https://cdn.discordapp.com/attachments/1006789349498699827/1434672802736111626/image.png?ex=69092eab&is=6907dd2b&hm=2cb16baf790229f4ebce4dbce07f837016f15f0056d425e5a1a9ef6753ec8142&
[22:37:33] *itself
[22:38:07] Or is ManageWiki registering this separately?
[22:59:44] So... Fixed
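(For context on the transaction complaint above: because MediaWiki may call a job's `run()` inside an already-open database transaction, rows written there are not yet committed, so anything spawned from the job that opens its own connection — such as a maintenance script — cannot see them. The deferred-update workaround looks roughly like this. A minimal sketch only: `Job` and `DeferredUpdates::addCallableUpdate()` are real MediaWiki core APIs, but `SyncDataJob`, the table, and the `syncData.php` script are hypothetical names, and the `DeferredUpdates` namespace varies across MediaWiki versions.)

```php
<?php
// Sketch: SyncDataJob, my_sync_table, and syncData.php are hypothetical.
use MediaWiki\Deferred\DeferredUpdates;
use MediaWiki\MediaWikiServices;

class SyncDataJob extends Job {
	public function __construct( $title, array $params ) {
		parent::__construct( 'syncData', $title, $params );
	}

	public function run() {
		// This write may occur inside a transaction core opened before
		// run() was called, so it is not yet visible outside it.
		$dbw = MediaWikiServices::getInstance()
			->getConnectionProvider()
			->getPrimaryDatabase();
		$dbw->insert( 'my_sync_table', [
			'ms_payload' => $this->params['payload'],
		] );

		// Anything that must observe the committed row goes into a
		// deferred update; core runs these after the transaction commits.
		DeferredUpdates::addCallableUpdate( static function () {
			// e.g. a maintenance script that reads my_sync_table
			shell_exec( 'php maintenance/run.php syncData' );
		} );

		return true;
	}
}
```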